International Journal of Computer Games Technology
Volume 2011 (2011), Article ID 570210, 7 pages
Research Article

Out of the Cube: Augmented Rubik's Cube

1Department of Computer Science, Ben-Gurion University of the Negev, P.O.B 653 Be'er Sheva 84105, Israel
2Screen-Based Arts, Bezalel Academy of Arts and Design, Jerusalem 91240, Israel

Received 23 January 2011; Accepted 2 May 2011

Academic Editor: Suiping Zhou

Copyright © 2011 Oriel Bergig et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


Computer gaming habits tend to evolve with technology, and the best games immerse both our imagination and our intellect. Here, we describe a new game platform, an Augmented Reality Rubik’s cube. The cube acts simultaneously as both the controller and the game board: gameplay is controlled by the cube, and game assets are rendered on top of it. Shuffling and tilting operations on the cube are mapped to game interactions. We discuss the game design decisions involved in developing a game for this platform, as well as the technological challenges in implementing it. Finally, we describe two games and discuss the conclusions of an informal user study based on them.

1. Introduction

Augmented Reality (AR), where computer-generated graphics are rendered and registered on the real world in real time, has existed as an academic field since the 1960s. As anticipated by Bolter and Grusin [1], AR is now gaining wider public acceptance as AR applications are being demonstrated in art, entertainment, and gaming.

In 2007, the first commercial AR game was produced. In The Eye of Judgment (http://www.eyeofjudgment.com/), an AR game for the Sony PlayStation, a special set of boards and cards was designed. Since The Eye of Judgment saw the light, a constantly growing number of commercial AR games have been developed every year, motivating research into AR game technologies.

In this work, we present a game technology that extends an existing game platform, a toy. While some game platforms are developed and tailored to support AR experiences (e.g., The Eye of Judgment game board), others can be based on existing ones. For example, augmenting regular cards may lay the foundations for a new game technology based on an existing game platform.

One of the advantages of exploiting an existing game platform (e.g., cards) to create new digital experiences lies in the fact that people are already familiar with the underlying game mechanics (e.g., piling cards). In addition, although the game uses a tangible platform, distribution becomes simpler; for example, the game software can be downloaded online.

Augmented Reality game technologies that revolve around familiar game platforms can exploit the interactions of the underlying platform. For example, piles of cards could translate to grouping the models represented by those cards. In many cases, it seems natural to preserve the meaning of interactions in the underlying platform and map them to the AR experience. Furthermore, new game interactions become possible that have meaning only within the scope of the extended AR platform. For example, tilting a card to one side can cause an augmented model to slip aside.

In this work, we developed Out of the Cube (OOTC), an AR extension of the traditional Rubik’s cube. To play the game, the player modifies the cube using our sticker kit. We define a set of interactions and provide a natural mapping between those interactions and events that take place in the digital world. One of the early design decisions was to base all of the interactions on the cube itself, essentially creating a controller-free interface. The traditional meaning of the “shuffling the cube” interaction is preserved across the AR experience and used for virtual puzzle solving. Additional interactions are only possible with the augmented cube, as described later.

To explore Rubik’s cube as an AR game platform, we designed and implemented two games. Figure 1(a) is a screenshot from a puzzle game, which is organized in levels. The game was designed to advance the United Nations Millennium Development Goals (http://www.un.org/millenniumgoals) with the premise of aiding the development of poor countries. In the game, virtual villages are augmented on the cube faces. These villages develop according to the resources they receive, which are controlled by the arrangement of the cube. The goal is to reach an equilibrium of resources across all villages. To achieve this, the player shuffles virtual assets between the villages. Figure 1(b) is a screenshot from our second game, a skill-based maze-walking challenge. In this game, the player has to help a child reach his goal of education and happiness by tilting the cube and walking him carefully along a narrow path. Failing to navigate the character carefully may result in a free fall from the cube’s face to the ground.

Figure 1: Augmented Reality games based on Rubik’s Cube. (a) A village puzzle game and (b) a skill-based maze walking game.

This work participated in the Microsoft Imagine Cup, won first place in the national phase in Israel, and was demonstrated at the Imagine Cup International Expo in Cairo.

The rest of the paper is structured as follows. We begin by describing background work in Section 2. We then describe the design of OOTC in Section 3. In Section 4, we describe the architecture and implementation of the game application. The conclusions of an initial and informal user study are described in Section 5. Section 6 proposes ideas for additional games. In Section 7, we conclude and present directions for future work.

2. Background

OOTC emerged from a quest to combine the popular and addictive Rubik’s Cube toy with the interactivity and immersiveness of AR games. We first describe the history of Rubik’s Cube and its use as a platform. We continue by describing AR games based on cubical platforms and finally extend the discussion to AR tangible interactions and AR games.

2.1. Rubik’s Cube Platform

Rubik’s cube is a mechanical puzzle invented in 1974 and sold commercially since 1980. As of January 2009, 350 million cubes had been sold worldwide, making it the world’s top-selling puzzle game [2]. It is widely considered to be the world’s best-selling toy [3].

The traditional Rubik’s cube mechanics have been used to develop new games extending the original puzzle. The cube was extended to 4 × 4 and 5 × 5 versions by Seven Towns Ltd, the company that owns the Rubik’s cube brand. The company also offers Rubik's Custom Sticker Kits (http://www.rubiks.com/shop) that let people create their own games; each kit comes with five A4-size sheets of blank stickers for use in a color printer. Different designs are used to promote businesses, and special designs are sold commercially (e.g., an NBA team logo cube). In this work, we created a sticker kit very similar to those offered by Seven Towns Ltd. The symbols on our sticker kit allow pose estimation and identification of each cubelet (one of the nine squares of a Rubik’s Cube face).

Electronic games based on the cube mechanics exist as well. Rubik’s TouchCube (http://www.rubikstouchcube.com/) is an electronic cube with touch sensor technology; the challenge is solved by swiping a finger on the cube rather than shuffling it. Rubik’s Puzzle World is an abstract environment populated by cubelets which make up the game’s DNA, and a collection of games based on this world is available for the Nintendo DS and Wii game consoles. In this work, we developed electronic games based on the cube played in front of a computer with a webcam.

2.2. Cube Platforms Used in AR

Cube platforms have been used for creating AR games. Magic Cubes [4] is a research project exploring unique user interfaces made of two cubes with markers in an AR environment. In Jumanji Singapore [5], cubes are used as dice and control tools for a Monopoly-style game; its purpose is to take users on a virtual three-dimensional tour of Singapore’s attractions while playing a board game competition. LevelHead (http://ljudmila.org/~julian/levelhead/) is a spatial memory game using three small plastic cubes with a unique marker on each face, creating the impression that a room is somehow inside each cube. Our work shares some interaction metaphors with all of the above games; however, it is based on the traditional Rubik’s cube, where the shuffling interaction is used to solve an AR puzzle.

2.3. Related AR Interactions and Games

Some of the interactions available in OOTC are not entirely new. For example, occlusion-based interaction has been described in [6]. Tilting and moving markers to interact with augmented 3D content are described in [7]. Our goal is to explore the design decisions and implications involved in using them in a game based on Rubik’s cube as a game platform.

Augmented Reality games that are real-world extensions of existing purely virtual games have been developed to accentuate different aspects of the game environment. ARQuake [8] and Human Pacman [9] are augmented versions of Quake and Pacman that are played outdoors. Invisible Train [10] and Smart Memory [11] are based on popular game mechanics and played on handheld devices. Other AR games were designed to demonstrate interaction between virtual and physical objects. Neon Racer [12] allows players to steer vehicles with traditional gamepads, while spectators (and players) can use real objects to influence the race. Monkey Bridge [13] demonstrates how virtual objects can react to events in the virtual and physical worlds; its physical world is a tabletop setup with objects like bricks and wooden blocks that take part in the game. In this work, our games were designed to demonstrate how Rubik’s cube can be used to interact with virtual content.

The AR.Drone by Parrot (http://ardrone.parrot.com/parrot-ar-drone/usa/) is a physical platform designed for AR games. It consists of a quadricopter equipped with two cameras and can be controlled using a computer or a phone that displays the live camera feed. In AR.Pursuit, the video is augmented with virtual content in a combat game.

3. Game and Technology Design

Designing AR games based on an existing game platform is challenging, and preserving the nature of the underlying platform is usually preferred [14]. In this work, we harness the tangible interactions embedded in the design of Rubik’s cube while preserving the cube shuffling interaction (see Figure 2), and we extend interactivity with additional AR interactions. The Rubik’s cube thus turns into an AR interface, making it a platform for AR games.

Figure 2: Shuffling the cube shuffles the assets arranged around the village accordingly. Here, shuffling the right side of the cube will cause the laptop and the lab assets to be replaced with other assets, depending on the cube’s new arrangement.

In the village puzzle game (see Figure 1(a)), the player is responsible for the world that is literally in the palm of his hand. The playground of the game consists of a computer and a webcam. To play, the player modifies a Rubik’s cube using an OOTC sticker kit and interacts with the cube by rotating and shuffling it. While manipulating the cube, the player sees his own hands holding a small village and its surroundings. The village and the assets around it change according to the cube combinations.

The village puzzle game is a 3D educational puzzle experience that entrusts the responsibility of developing poor villages to the hands of the player. The augmented space includes six villages tied together by different visual themes. Each village has its own unique story and environmental problems. The villages are overlaid in 3D on top of the Rubik’s cube faces, and each village is surrounded by the virtual assets that are the necessary ingredients for its development. Through each level, the player has to wisely distribute different assets inside a village and between villages in order to find an arrangement that brings equilibrium. In some cases, it is necessary to combine two ingredients to make them more effective; in such cases, the two ingredients need to be placed next to each other.

The maze game (see Figure 1(b)) is a skill-based challenge where a virtual character is controlled by tilting the cube to collect different assets. Two items are spread on the maze, and the character has to collect one item and bring it close to the other. The challenge is to steer the character along a narrow path laid out in a maze structure. The path is organized in tiles that hang in the air, and stepping over the boundary results in a free fall from the cube.

We magnified the experience by designing the maze as a minigame of the puzzle game. In the puzzle game, when two assets are placed nearby, they can have a stronger effect on the development of the village. To unlock this effect, the maze minigame has to be played: the virtual character has to pick up one asset and bring it to the other by walking on the maze. For example, shuffling the cube to bring a lab resource and a laptop resource to the village helps its development, as each asset contributes on its own; however, if the two are placed next to each other and one is then carried to the other by the character in the maze minigame, the village’s development is boosted.

The OOTC platform design enables interactions with virtual content augmented on a Rubik’s cube. We designed and developed five different interaction metaphors (examples are depicted in Figure 3). These are (1) shuffling to change game assets, (2) tilting to move items around, (3) rotating faces to see different views, (4) hiding a cubelet, which is one of the nine squares of a Rubik’s Cube’s face, to press a button, and (5) hotspots to choose between menu items. These interactions are further detailed below.

Figure 3: Interaction metaphors. (a) Shuffling to change game assets; (b) hiding a cubelet to press a button; (c) rotating faces to see different views; (d) hotspots to choose between menu items.

Shuffling the cube is based on the traditional mechanics of the Rubik’s cube for level solving. We preserve the familiar context of these mechanics: the player shuffles the cube to spatially arrange virtual assets in order to solve the puzzle. Rotating the whole cube reveals different views of the puzzle. Each face is augmented with one of six different villages sharing the available resources; it follows that shuffling the cube adds ingredients to one village but takes away ingredients from another. Rotating the whole cube, which changes the currently viewed and augmented face, is necessary to find out what is available for each village. The player seeks an equilibrium between villages while responding to unpredicted events, for example, a storm that demolishes the food reserves. This is similar to the Rubik’s cube mechanics, where players constantly check the different faces to make sure that the last shuffle has not caused serious damage to one of the villages.

We borrowed existing AR interactions and mapped them to the cube, creating a self-contained platform. Tilting the cube causes virtual objects rendered on it to move around according to the tilt direction and magnitude; in our game, this is used to direct a boy character through a maze to collect farm ingredients. Hiding a cubelet has an effect similar to pressing a button: when two complementary ingredients (augmented on cubelets) have been placed nearby, the player hides one of these cubelets with a finger, which fires the maze minigame described earlier. Finally, the screen corners can be used as hotspots, and moving the cube to one of these corners is equivalent to choosing an action from a menu. For example, when a level is over, the player can choose to continue to the next level, open a relevant United Nations Millennium Development Goals webpage, or contribute by donating a dollar. The last corner is used to save and exit the game.

4. Architecture and Implementation

4.1. Platform Architecture

Our architecture decouples platform and game implementation allowing various games to be developed on the same platform.

OOTC is organized in two layers. The core layer analyzes the live video feed and detects the pose of the cube in each frame. The reasoning layer keeps state history and maps core layer detections to game interactions. The core layer is developed using OpenCV, the open-source computer vision library (http://opencv.willowgarage.com/), and ARToolKit (http://www.hitl.washington.edu/artoolkit/), an open-source library for marker-based AR. Traditional ARToolKit markers are square black frames with symbols inside; here, we designed a sticker kit (see Figure 4) with a different appearance, which required preprocessing the image before sending it to ARToolKit.
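The preprocessing step mentioned above can be sketched as follows, using NumPy as a stand-in for the actual OpenCV pipeline. The brightness cutoff and the bounding-box simplification are our own assumptions, not details from the implementation:

```python
import numpy as np

def invert_white_regions(gray, bright_thresh=200):
    """Invert the white-background sticker area of a grayscale frame so it
    resembles a conventional dark ARToolKit pattern.
    bright_thresh is a hypothetical brightness cutoff."""
    mask = gray >= bright_thresh
    if not mask.any():
        return gray.copy()
    # Simplification: invert only the bounding box of the bright area.
    # A real pipeline would segment the sticker quadrilateral instead.
    ys, xs = np.nonzero(mask)
    out = gray.copy()
    region = (slice(ys.min(), ys.max() + 1), slice(xs.min(), xs.max() + 1))
    out[region] = 255 - out[region]
    return out
```

After inversion, the frame can be handed to the regular ARToolKit detection call unchanged.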

Figure 4: The OOTC sticker kits.

The reasoning layer analyzes and accumulates the core layer detections received at every frame. It also stabilizes registration and interactions. Supporting the interactions requires identifying the following elements: the viewable face, shuffling, hidden cubelets, and hotspots.

4.2. Identifying a Cube’s Face

Determining the currently viewed face is performed by the core layer and forms the basis for the other interactions as well. The sticker kit includes a unique sticker for each face’s central cubelet. The sticker design is preconfigured as a marker ID for the ARToolKit library. The stickers’ background is white with a black symbol in the middle, while ARToolKit markers have black frames with a black symbol in the middle. We explored the possibility of using standard ARToolKit markers, but the cube has thin black areas between cubelets that tend to merge with the black frame of the sticker, which made the tracking unstable.

4.3. Pose Estimation and Tracking

Designing a sticker kit that can support robust registration and provide an appealing game design is a challenging task. We experimented with different sticker kit designs and chose a white background with a black symbol in the middle for the central cubelet. We first identify the white backgrounds and invert their color so that ARToolKit can process them. While black tends to be relatively preserved under different lighting conditions, white areas vary widely with lighting and camera quality. Figure 5 depicts a white area under four combinations of lighting conditions and face angles, showing wide variations in the white color. To overcome these variations, we assume one of the six patterns is present in the image. The first step of ARToolKit is thresholding the image, and a parameter can be used to control the threshold level. We therefore try different threshold values until one of the six patterns is found.
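The threshold sweep can be sketched as follows; the threshold range and the `detect` callback are hypothetical stand-ins for the actual ARToolKit binarization and pattern-matching call:

```python
def find_pattern(frame, detect, thresholds=range(40, 220, 20)):
    """Sweep binarization thresholds until one of the six known face
    patterns is detected. `detect(frame, t)` stands in for a call that
    thresholds `frame` at level `t` and returns a pattern id or None."""
    for t in thresholds:
        pattern = detect(frame, t)
        if pattern is not None:
            return pattern, t  # remember the winning threshold
    return None, None
```

In practice, the threshold that succeeded in the previous frame is a good first guess for the next one.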

Figure 5: The appearances of white under different angles. Values are given in HSV.

We experienced jittering and classification failures caused by the small size of the marker, the white background, and the webcam quality. We overcame these effects by keeping a short history of poses in the reasoning layer, dropping outlier poses, and smoothing the inliers using DESP [15]. We now turn to describe the identification of the cubelets around the central one.
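A minimal sketch of the smoothing step, assuming a scalar pose component and hypothetical tuning values (`alpha`, `max_jump`); the actual implementation follows LaViola's DESP [15] and operates on full poses:

```python
class DESP:
    """Double exponential smoothing predictor used to smooth inlier poses."""
    def __init__(self, alpha=0.5):
        self.alpha = alpha
        self.s1 = None  # first smoothing stage
        self.s2 = None  # second smoothing stage

    def update(self, x):
        if self.s1 is None:
            self.s1 = self.s2 = x
        a = self.alpha
        self.s1 = a * x + (1 - a) * self.s1
        self.s2 = a * self.s1 + (1 - a) * self.s2
        return 2 * self.s1 - self.s2  # current smoothed estimate

    def update_robust(self, x, max_jump=15.0):
        """Outlier rejection: ignore samples that jump too far from the
        last smoothed estimate (max_jump is a hypothetical tuning value)."""
        if self.s1 is not None and abs(x - (2 * self.s1 - self.s2)) > max_jump:
            return 2 * self.s1 - self.s2  # keep the previous estimate
        return self.update(x)
```

A sudden pose jump well beyond `max_jump` is treated as a misdetection and dropped, while gradual motion passes through the smoother.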

4.4. Identifying a Cube’s Arrangement

Identifying the shuffle of the cube is performed using Shape Context signatures [16]. Shape Contexts are designed to identify shapes across Euclidean transformations rather than projective ones. Hence, we first rectify the face image to restore its planar state. We then mask each of the eight cubelets around the center and match them against the set of eight cubelets learned through a calibration step.
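Once the face image is rectified (in a real implementation, by warping with the estimated homography), masking the eight outer cubelets amounts to slicing a 3 × 3 grid. A minimal sketch, assuming the rectified face is a square array:

```python
import numpy as np

def outer_cubelet_patches(face, n=3):
    """Split a rectified square face image into an n x n grid and return
    the patches around the center, keyed by grid position (row, col).
    The central cubelet is skipped: it carries the tracked marker."""
    h, w = face.shape[:2]
    ch, cw = h // n, w // n
    patches = {}
    for r in range(n):
        for c in range(n):
            if (r, c) == (n // 2, n // 2):
                continue
            patches[(r, c)] = face[r * ch:(r + 1) * ch, c * cw:(c + 1) * cw]
    return patches
```

Each patch is then matched against the eight calibrated cubelet signatures.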

4.5. Hidden Cubelets

The cubelets can be used as virtual buttons by hiding a single cubelet with a finger. The occlusion formed by the finger is identified by examining the Shape Context signatures. A finger might cover more than one cubelet at a time; hence, if one, two, or three neighboring cubelets are not identified while all the others are, we conclude that the user is pressing a button. It follows that only one button can be pressed at a time.
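The occlusion test can be sketched as follows. We index the eight outer cubelets as a ring around the center and require the unidentified ones to be contiguous on that ring; the ring ordering and the choice of reported cubelet are our own simplifications:

```python
# Grid positions of the eight outer cubelets, in ring order around the center.
RING = [(0, 0), (0, 1), (0, 2), (1, 2), (2, 2), (2, 1), (2, 0), (1, 0)]

def pressed_cubelet(identified):
    """Given the set of outer cubelets recognized this frame, decide whether
    the player is pressing a button. A finger may cover 1-3 cubelets, so the
    missing ones must form one contiguous run on the ring (with wraparound)."""
    missing = [i for i, pos in enumerate(RING) if pos not in identified]
    if not 1 <= len(missing) <= 3:
        return None
    k = len(missing)
    for start in missing:
        if all(((start + j) % 8) in missing for j in range(k)):
            return RING[start]  # report the first covered cubelet
    return None
```

Non-contiguous misdetections (e.g., two cubelets on opposite corners) are rejected rather than reported as a press.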

4.6. Hotspots

Hotspots are areas on the screen that trigger an event when the cube is aligned with them. This can be easily determined using the homography’s translation vector. The reasoning layer accumulates hotspot events with their IDs for several consecutive frames before invoking a callback with the hotspot’s ID.
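A minimal sketch of hotspot handling, assuming the cube's screen position has already been extracted from the homography's translation; the corner margin and the number of consecutive frames are hypothetical tuning values:

```python
class HotspotDetector:
    """Map the cube's screen position to corner hotspots, firing only after
    the cube stays in the same corner for several consecutive frames."""
    def __init__(self, width, height, margin=0.2, frames_needed=10):
        self.w, self.h = width, height
        self.margin = margin
        self.frames_needed = frames_needed
        self.last = None
        self.count = 0

    def _corner(self, x, y):
        mx, my = self.w * self.margin, self.h * self.margin
        if x < mx and y < my:
            return "top-left"
        if x > self.w - mx and y < my:
            return "top-right"
        if x < mx and y > self.h - my:
            return "bottom-left"
        if x > self.w - mx and y > self.h - my:
            return "bottom-right"
        return None

    def update(self, x, y):
        corner = self._corner(x, y)
        if corner and corner == self.last:
            self.count += 1
        else:
            self.count = 1 if corner else 0
        self.last = corner
        if corner and self.count >= self.frames_needed:
            self.count = 0
            return corner  # fire the callback for this hotspot
        return None
```

The consecutive-frame requirement prevents a hotspot from firing when the cube merely passes through a corner.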

4.7. Technical Challenges

So far, we have described all the ingredients of the OOTC platform and our implementation, while only lightly touching on the numerous attempts we made to realize it. One of the most time-consuming tasks was figuring out how to perform cube face detection and cubelet identification while supporting the design decisions for the sticker kit.

While the final central-cubelet marker is a black symbol on a white background, as explained earlier, our first attempt used a black frame surrounding the entire cube face. It was straightforward to implement and covered the maximal possible face area, resulting in stable pose estimation. The main drawback of this approach is that the cube itself is black: when tilting the cube, the black frame printed on the stickers merges with the spaces between the cubelets. We hence turned to a single central-cubelet solution. We wanted it to remain colored, as in the original design, for better aesthetics; however, robust color identification under different lighting conditions remains a challenging task, and even a black pattern with a white frame can be difficult to track, as explained above and depicted in Figure 5.

Cubelet identification was initially planned to use ARToolKit rather than Shape Contexts. During implementation, we found that this constrained the stickers’ design to relatively small symbols with a dominant white background, similar to the central cubelet. In addition, it introduced more patterns into the ARToolKit pattern set, which yielded a higher identification error rate than with six patterns.

5. User Study

We developed a village puzzle game with a maze minigame to demonstrate the platform. The games were presented at several events. At the Imagine Cup International Expo (see Figure 6), many attendees played the game and provided positive feedback.

Figure 6: Users trying the system at Imagine Cup Expo.

We describe here results from user studies we performed at the design stage of the platform, where we tested different interactions and possible hardware setups. We found that the position of the camera relative to the user and the screen meaningfully affects usability. Furthermore, the tilt interaction is not as trivial as expected, and controlling direction and speed can be confusing. Finding the “natural” orientation of the cube (which is mapped to “no movement” of the augmented character) is challenging, since the cube is held in the player’s hands. On the other hand, shuffling the cube was natural and required minimal practice. The main user studies we performed follow.

5.1. Camera Position

To explore the most natural camera setup, we experimented with three camera positions, where the camera was (1) fixed on a hat, (2) mounted on a laptop screen, and (3) fixed to a down-facing stand. We experimented with ten students selected at random around the campus. Each student tried the game with each camera position for one minute. We performed an objective test in which the mission is to make a character cross a simple maze augmented on top of a cube face; success means crossing without falling off the path, and each time the character falls, the player has to start from the beginning. Table 1 summarizes the results. The camera fixed on a hat led to a poor game experience: while users noted the advantage of the eye-view direction, their head movements made the scenario too hard to control. The camera mounted on the laptop screen generated constant confusion, regardless of our attempts to mirror and flip the image. Finally, we mounted a down-facing camera on a stand so that the bottom of the image reflects the player’s direction; with this setup, all players exhibited a shorter average time to complete the mission.

Table 1: Camera position.
5.2. Determining Tilt Interaction Speed

We initially assumed that the speed of a character moving on a tilted cube should correspond to the magnitude of the tilt. However, our experiments revealed that this made the character’s movements hard to control. We created three tasks and asked fifteen students, picked at random around the university campus, to complete them in three different configurations. The tasks were (a) follow a virtual line, (b) cross a cube face along its two diagonals, and (c) go around the face following a square path. The objective was to perform the tasks as fast as possible. We measured the combined time it took the player to complete the three tasks under three different selections of character speed: (1) fixed speed, where the magnitude of the tilt is ignored and the character stands or walks in the direction of the tilt; (2) two speeds, where the character stands, walks, or runs depending on the tilt; and (3) continuous, where character speed is a linear function of tilt magnitude. From Table 2 we conclude that the two-speed configuration led to shorter task-completion times.
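The two-speed configuration can be sketched as a simple mapping from tilt magnitude to character state; the threshold angles below are hypothetical tuning values, not the ones used in the study:

```python
def tilt_to_speed(tilt_deg, walk_thresh=8.0, run_thresh=20.0):
    """Two-speed configuration: the character stands inside a dead zone,
    walks for moderate tilt, and runs for strong tilt. The thresholds are
    hypothetical tuning values."""
    if tilt_deg < walk_thresh:
        return "stand"
    if tilt_deg < run_thresh:
        return "walk"
    return "run"
```

The dead zone is what makes the "natural" cube orientation forgiving: small unintentional tilts of the handheld cube do not move the character.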

Table 2: Determining tilt interaction speed.

Finally, we were interested in the subjective opinions of the players who participated in the experiments above and then played our games for ten minutes. Table 3 summarizes the results. Most players preferred the camera mounted on a stand and the ability to toggle between standing, walking, and running. Some players spent a considerable amount of time trying to figure out the correct combination of the cube and asked for help, indicating a deep level of involvement in the game. In the minigame, some players tried to catch the augmented character when it flipped over the cube.

Table 3: Number of users who preferred a configuration.

6. Additional Games

To further illustrate how OOTC can be used as a game platform, we briefly describe three additional game concepts that were considered as alternatives to the games we finally developed. The first game is aimed at toddlers and features an image puzzle. The goal is to assemble, piece by piece, six images of different animals on the cube’s faces. Each animal image is broken into nine pieces, and the player has to shuffle the cube until the six images are complete. Once an image is fully assembled, a 3D model appears on top of the cube and interacts with the player. For example, the player can feed the 3D animal by moving the cube to a hotspot with food and play with it by tilting the cube.

In another possible game, the goal could be to experiment with tweaked combinations of animals. Animal images are broken into nine pieces on the cubelets, and the arrangement of the cube implies a combination of animal pieces. Morphed 3D models can then be created from the different pieces and augmented. Once an interesting creature is crafted, the player can share it with friends.

The third game is a skill-based challenge organized in levels. The maze is defined dynamically by shuffling the cube, and the player has to control a character on it. The maze is made of tiles that carry collectable items providing different rewards. The goal is to cross a cube face from point A to point B in limited time while collecting as many items as possible. Tiles may be missing, making crossing impossible without shuffling the cube; rearranging the maze can also help earn rewards by revealing more items. While collecting as many points as possible, the player has to reach point B in time to complete the level.

7. Conclusion

In this work, we introduced OOTC, an Augmented Reality Rubik’s Cube game platform. We developed several games to demonstrate the platform. We discussed design issues for games based on OOTC and described several design decisions taking limiting factors into account. We also explored the design and implementation of five interaction metaphors. The cube shuffling action preserves its original usage pattern from a Rubik’s cube. Other interactions are borrowed from other tangible AR experiences and studies on the Rubik’s cube environment.

We expect more games based on OOTC and would like to extend the user study with more players and a deeper investigation of the cube shuffle interaction.


Acknowledgments

This work was supported by the Tuman Fund and the Kreitman Foundation Fellowships. The authors would like to thank the reviewers for their suggestions, which helped improve the paper, and the players who provided valuable feedback.


References

  1. J. D. Bolter and R. Grusin, Remediation: Understanding New Media, MIT Press, Cambridge, Mass, USA, 2000.
  2. L. A. William, “The Rubik's cube: a puzzling success,” Time, 2009.
  3. A. Jamieson, “Rubik's cube inventor is back with Rubik's 360,” The Daily Telegraph, 2009.
  4. Z. Z. Ying, C. A. David, L. Yu, and K. Hirokazu, “Magic cubes for social and physical family entertainment,” in Proceedings of the International Conference for Human-Computer Interaction (CHI '05), pp. 1156–1157, Portland, Ore, USA, 2005.
  5. Z. Z. Ying, C. A. David, C. Tingting, and L. Yu, “Jumanji Singapore: an interactive 3D board game turning hollywood fantasy into reality,” in Proceedings of the International Conference on Advances in Computer Entertainment Technology (ACM SIGCHI '04), 2004.
  6. G. A. Lee, M. Billinghurst, and G. J. Kim, “Occlusion based interaction methods for tangible augmented reality environments,” in Proceedings of the ACM SIGGRAPH International Conference on Virtual Reality Continuum and its Applications in Industry (VRCAI '04), pp. 419–426, June 2004.
  7. M. Billinghurst, H. Kato, and I. Poupyrev, “Tangible augmented reality,” in Proceedings of the ACM SIGGRAPH ASIA courses, December 2008.
  8. B. Thomas, B. Close, J. Donoghue, J. Squirs, P. De Bondi, and W. Piekarski, “ARQuake: an outdoor/indoor augmented reality first person application,” Journal of Personal and Ubiquitous Computing, vol. 6, no. 1, 2002.
  9. A. D. Cheok, S. W. Fong, K. H. Goh, et al., “Human Pacman: a mobile entertainment system with ubiquitous computing and tangible interaction over a wide outdoor area,” in Proceedings of the 5th International Symposium on Human-Computer Interaction with Mobile Devices and Services, vol. 2795, pp. 209–223, Udine, Italy, September 2003.
  10. D. Wagner, T. Pintaric, F. Ledermann, and D. Schmalstieg, “Towards massively multi-user augmented reality on handheld devices,” in Proceedings of the 3rd International Conference on Pervasive Computing, pp. 208–219, May 2005.
  11. M. Rohs, “Marker-based embodied interaction for handheld augmented reality games,” Journal of Virtual Reality and Broadcasting, vol. 4, no. 5, 2007.
  12. M. Weilguny, Design aspects in augmented reality games, Diploma thesis, 2006.
  13. I. Barakonyi, M. Weilguny, T. Psik, and D. Schmalstieg, “MonkeyBridge: autonomous agents in augmented reality games,” in Proceedings of the International Conference on Advances in Computer Entertainment Technology (ACM SIGCHI '05), 2005.
  14. S. Hinske and M. Langheinrich, “W41K: digitally augmenting traditional game environments,” in Proceedings of the 3rd International Conference on Tangible and Embedded Interaction (TEI '09), pp. 99–106, USA, February 2009.
  15. J. J. LaViola, “Double exponential smoothing: an alternative to Kalman filter-based predictive tracking,” in Proceedings of the Work-Shop on Virtual Environments, pp. 199–206, 2003.
  16. S. Belongie, J. Malik, and J. Puzicha, “Shape matching and object recognition using shape contexts,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 24, no. 4, pp. 509–522, 2002.