The most compelling augmented reality experiences are the ones that push the interplay between digital objects and the physical world to its limits without sacrificing or obscuring either. Today, we are going to give you the basics of what you can accomplish on widely available mobile phones. As an experience designer, now is the time to start thinking about digital interfaces that are not constrained by the screen, mouse pointer, and touch.
Chances are you have used apps like Snapchat’s Face Filters or Apple’s Animoji. These experiences use augmented reality technology to turn the user’s face into a new input method: the user’s facial features are tracked to determine where content is placed, and detected expressions are used to provide contextual responses.
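To make this concrete, here is a minimal sketch of expression-as-input using ARKit’s face tracking on iOS. The smile threshold and the showHearts response are hypothetical stand-ins for your own content.

```swift
import ARKit

// A minimal sketch of expression-as-input with ARKit face tracking.
// showHearts() is a hypothetical stand-in for your own response.
class FaceInputController: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            // Blend shapes report expression strengths from 0.0 to 1.0.
            let smile = faceAnchor.blendShapes[.mouthSmileLeft]?.floatValue ?? 0
            if smile > 0.7 {
                showHearts(at: faceAnchor.transform) // hypothetical response
            }
        }
    }

    func showHearts(at transform: simd_float4x4) { /* place content here */ }
}
```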
These notions of tracking and awareness are key when designing augmented reality experiences. Understanding the data that augmented reality systems provide is essential for designing the magical, delightful moments that are not possible on any other platform. A few examples:
- Subject matter in paintings transforms into provocative and engaging experiences - https://vimeo.com/223223058
- An infographic in a magazine becomes an engaging animatic with much more detailed information - https://www.youtube.com/watch?v=wwmmgAw8ZAE
- Real world objects are annotated to improve efficiency and visualize information - https://www.youtube.com/watch?v=ML5jH1JI5sU
Device Tracking
Tracking is foundational to augmented reality experiences. It means the device keeps track of where it is and how it is oriented in physical space at all times. Your smartphone has several software and hardware systems that each provide partial information for tracking. For example, the accelerometer can detect the device’s rotation relative to the Earth’s gravity and whether it is moving in a particular direction. But this is only part of the tracking equation, as the accelerometer alone can’t give you an absolute position.
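For a feel of what those sensors report, here is a hedged sketch using Apple’s CoreMotion framework. Gravity tells you tilt and user acceleration hints at movement, but integrating acceleration drifts quickly, which is why it can’t yield an absolute position on its own.

```swift
import CoreMotion

// A sketch of reading the motion sensors directly with CoreMotion.
let motionManager = CMMotionManager()

func startMotionUpdates() {
    guard motionManager.isDeviceMotionAvailable else { return }
    motionManager.deviceMotionUpdateInterval = 1.0 / 60.0
    motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let motion = motion else { return }
        let gravity = motion.gravity               // device tilt relative to Earth
        let acceleration = motion.userAcceleration // movement, gravity removed
        print("tilt: \(gravity), accel: \(acceleration)")
    }
}
```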
“Aha! GPS gives position!”, you exclaim. Unfortunately, GPS isn’t the answer. GPS is useful for coarsely determining whether a user is near a particular location, but it isn’t accurate enough for the real-time, millimeter-scale position calculations augmented reality requires. It also doesn’t work indoors.
To get that precise positioning, the tracking system depends on computer vision algorithms that constantly analyze the live camera feed for movement and environmental features. This sheds light on why a well-lit space with lots of visual interest, like patterned surfaces or furniture, is optimal. (For more on how to compose your scenes to take advantage of detail without cluttering your design, read my tips on presenting your augmented reality work.)
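In ARKit terms, that feedback surfaces through the camera’s tracking state. The sketch below, assuming an ARKit session on iOS, logs why tracking has degraded; the reasons map directly to the lighting and visual-detail advice above.

```swift
import ARKit

// A sketch of observing tracking quality. World tracking fuses the
// motion sensors with camera-based feature detection, and reports
// exactly why it is struggling.
class TrackingQualityObserver: NSObject, ARSessionDelegate {
    func session(_ session: ARSession, cameraDidChangeTrackingState camera: ARCamera) {
        switch camera.trackingState {
        case .normal:
            break // plenty of features, smooth motion
        case .limited(.insufficientFeatures):
            print("Point at a surface with more visual detail")
        case .limited(.excessiveMotion):
            print("Slow down; the camera is moving too fast")
        case .limited, .notAvailable:
            print("Tracking degraded; hold on")
        }
    }
}
```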
These tracking systems provide only a vague idea of the physical world that the device, and presumably the user, occupy at any given time. Vague may be an understatement. Current tech is limited to detecting flat surfaces as planes described by location, size, and orientation. There is no information about whether a surface is a floor, table, wall, or anything else.
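A short ARKit sketch shows just how anonymous these planes are. Everything the delegate receives is a position, a size, and a horizontal-or-vertical alignment.

```swift
import ARKit

// A sketch of plane detection: the system hands back anonymous slabs
// of geometry with no notion of floor, table, or wall.
class PlaneWatcher: NSObject, ARSessionDelegate {
    func start(_ session: ARSession) {
        let config = ARWorldTrackingConfiguration()
        config.planeDetection = [.horizontal, .vertical]
        session.delegate = self
        session.run(config)
    }

    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for case let plane as ARPlaneAnchor in anchors {
            // All we get: where it is, how big it is, which way it faces.
            print("plane at \(plane.center), size \(plane.extent), \(plane.alignment)")
        }
    }
}
```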
While there are some interesting interactions that can be built on this baseline, it is pretty limited. What we need is more contextual awareness. Eventually these systems will become sophisticated enough to parse the viewer’s environment and make the same kinds of assumptions humans do. What is available to us today is object recognition and tracking.
Environmental Awareness
Since the tracking system is already closely watching the camera feed, most AR platforms let you specify images to be tracked. When an image is found in the real world, the image tracker gives you fast, accurate updates on its position and orientation. This unlocks two key pieces of information: the real-world position of a physical object other than the device, and some predetermined contextual awareness. Attaching virtual content to a real-world object like a business card is a commonly cited example of this sort of image-triggered annotation.
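Here is a minimal ARKit sketch of that flow. The “AR Resources” asset-catalog group is a hypothetical container for your reference images, such as a scan of the business card.

```swift
import ARKit

// A sketch of image detection. "AR Resources" is a hypothetical
// asset-catalog group containing the reference images.
class ImageDetector: NSObject, ARSessionDelegate {
    func start(_ session: ARSession) {
        let config = ARWorldTrackingConfiguration()
        if let images = ARReferenceImage.referenceImages(
            inGroupNamed: "AR Resources", bundle: nil) {
            config.detectionImages = images
        }
        session.delegate = self
        session.run(config)
    }

    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for case let imageAnchor as ARImageAnchor in anchors {
            // The anchor carries the physical image's real-world position
            // and orientation, plus which reference image was matched.
            let name = imageAnchor.referenceImage.name ?? "unknown"
            print("found \(name) at \(imageAnchor.transform)")
        }
    }
}
```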
This example, while pretty cool, doesn’t even begin to scratch the surface of what we can do with image tracking.
You can also use the tracked object as a reference point in the physical world, which opens up multiple possibilities. One effect is to completely obscure the real-world image with virtual content. Perhaps a picture in a frame transforms into a portal from which the user interface appears. Remember: the most magical moments happen when the UI seems both aware of the physical environment and able to react in a contextually relevant way.
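A sketch of that attachment step using ARKit with SceneKit follows. makePortalNode() is a hypothetical builder for your own portal content; the key point is that the node inherits the physical image’s position and orientation automatically.

```swift
import ARKit
import SceneKit

// A sketch of attaching virtual content to a tracked image.
// makePortalNode() is a hypothetical builder for your own content.
class PortalDelegate: NSObject, ARSCNViewDelegate {
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let imageAnchor = anchor as? ARImageAnchor else { return }

        // Cover the physical image with a plane of the same size.
        let size = imageAnchor.referenceImage.physicalSize
        let cover = SCNNode(geometry: SCNPlane(width: size.width, height: size.height))
        cover.eulerAngles.x = -.pi / 2 // SCNPlane stands vertical by default
        node.addChildNode(cover)

        node.addChildNode(makePortalNode()) // hypothetical portal content
    }

    func makePortalNode() -> SCNNode { SCNNode() }
}
```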
Another approach is to overlay the image with content that matches the original but is richer in presentation and interaction.
Image tracking also opens up the possibility of using images as an input method, with conditional responses and branching logic depending on which images are detected at any given time.
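In code, that branching can be as simple as switching on the detected image’s name. The poster names and responses below are hypothetical.

```swift
import ARKit

// A sketch of branching logic keyed off which image was detected.
// Image names and responses are hypothetical.
class ImageRouter: NSObject, ARSessionDelegate {
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for case let imageAnchor as ARImageAnchor in anchors {
            switch imageAnchor.referenceImage.name ?? "" {
            case "poster-concert":
                playTrailer()  // hypothetical response
            case "poster-sale":
                showCoupon()   // hypothetical response
            default:
                break
            }
        }
    }

    func playTrailer() { /* ... */ }
    func showCoupon() { /* ... */ }
}
```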
Coming Soon to Torch: Image Tracking and Real World Annotation
Image and object tracking open up a whole new range of possibilities for interactions that are unique to augmented reality. These technologies have become more robust and reliable over the years, but they have only been accessible through programming interfaces. That is about to change. At Torch we have been working on an image tracking feature that will allow you to trigger interactions based on detected images and objects.
Tracking images in Torch will give you new ways to interact with and set up your prototypes. Take our wayfinding prototype as an example: in the original project we used a manually and arbitrarily set project anchor. That works fine if you can reliably place the anchor each time you open the project. With image tracking, we could instead specify a scene anchor based on a landmark image in the building lobby. Ideally, the image is something that isn’t moved easily and can be found right at the entrance. This gives the experience a reliable starting anchor point that not only makes it easier for you to work with the prototype across multiple sessions, but also makes it much easier for collaborators to open your project without explicit guidance for setting the project anchor.
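For the curious, the same idea can be expressed directly in ARKit: when a known landmark image is detected, shift the session’s world origin to it so every session shares one coordinate frame. The image name below is hypothetical.

```swift
import ARKit

// A sketch of anchoring a whole experience to a landmark image.
// "lobby-landmark" is a hypothetical reference image name.
class LandmarkAnchorDelegate: NSObject, ARSessionDelegate {
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for case let imageAnchor as ARImageAnchor in anchors
        where imageAnchor.referenceImage.name == "lobby-landmark" {
            // Move the world origin to the landmark so all content
            // positions are consistent across sessions.
            session.setWorldOrigin(relativeTransform: imageAnchor.transform)
        }
    }
}
```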
Great things are coming, and we can’t wait to see what you build. As always, email support if you need anything, and if you are in the mood to share your designs, try the video record feature in Torch Play mode or post using the #builtwithtorch hashtag.
Thanks!
Paul and the Torch Team