
To Build a Bandersnitch: 3 Lessons in AR Storytelling

Storytelling in AR is Like Writing in 3D

The immersive nature of augmented reality allows designers to tell dynamic and complex stories, but the process shouldn’t be complicated. By focusing on three core aspects of spatial design, storytelling in AR can be a snap. In my last post, I discussed the power of narrative in AR in the context of the Bandersnitch choose your own adventure experience I designed in Torch. In this post, I’m going to dig into the actual mechanics of creating an immersive experience in AR.

We’ll explore three components that designers commonly use to shape the user’s experience in augmented reality: project assets (2D, 3D, and audio), interactions, and the world layer, the physical environment in which the experience takes place. I’ll also offer a classification of interaction types (active vs. passive), highlight the tension in an immersive narrative between the user’s freedom to explore and the linearity of the story, and share some design recommendations for overcoming that tension.

The opening scene establishes mood and motifs repeated throughout.


For instance, a furniture placement app like Ikea Place is in constant dialogue with the physical environment and usually has very few interactions associated with it. On the other hand, Bandersnitch employs more than a dozen interactions and multiple environments, and it occludes the world layer during specific segments of the experience, as much as we deemed safe. Same components, different mix.

In Ikea Place, the goal is to let you try and buy as many products with as little hassle as possible. With Bandersnitch, as with most narratives, the author seeks to shape, guide, and even control the user’s progression through the various parts of the story. This dictated how I treated each major component.

Accommodating the world layer

Contrary to the design goal of most AR apps, we obscured the world as much as possible here to maintain forward narrative momentum while keeping the user focused on the choices in front of her. Don’t overlook the more immersive possibilities of AR: a room-scale model can draw a user in and lead to the suspension of disbelief that makes a narrative engaging, even when viewed through a phone screen.

That’s not to say we ignore the world layer. For starters, there are safety concerns associated with totally obscuring a person’s view. (The Google AR design guidelines are a great resource for learning about these.) And while we are trying to create a dark, ominous experience in the cave, ARKit still requires a well-lit space. Gloomy just isn’t an option in mobile AR.

The bridge joins two cave sections but also provides you with a way to remain aware of your surroundings.

At the outset, we inform anyone trying the experience that they will need about eight feet of unobstructed space. Because the cave model completely occludes the real world, we created a gap between its two sections, connected by a rickety bridge that ends up serving a narrative purpose, so the user can keep track of where they are in the physical world. Of course, using a mobile device instead of a headset is also a big help here. Especially on an iPad, you really feel like you are holding a viewer into a different world.
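
Torch handles this kind of setup without code, but for anyone curious about the underlying mechanics, here is a minimal sketch of the two environmental checks discussed above, written directly against ARKit: one watches the light estimate, the other waits for a horizontal plane large enough for a room-scale model. The class name and both thresholds are illustrative assumptions, not values from Torch or Bandersnitch.

```swift
import ARKit

// Minimal sketch (not Torch's implementation) of two pre-placement checks:
// is the room bright enough for ARKit to track reliably, and has ARKit found
// a horizontal plane roughly eight feet (~2.4 m) across before we anchor the cave?
final class PlacementGate: NSObject, ARSessionDelegate {
    private let minimumAmbientIntensity: CGFloat = 400   // lumens; ~1000 is a well-lit room (assumed cutoff)
    private let requiredExtentMeters: Float = 2.4        // about eight feet

    var onTooDark: (() -> Void)?
    var onEnoughSpace: ((ARPlaneAnchor) -> Void)?

    // Called every frame; warn the user if the light estimate drops too low.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let estimate = frame.lightEstimate else { return }
        if estimate.ambientIntensity < minimumAmbientIntensity {
            onTooDark?()   // e.g. show a "find a brighter spot" prompt
        }
    }

    // Called as detected planes grow; fire once a plane is big enough for the model.
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let plane as ARPlaneAnchor in anchors where plane.alignment == .horizontal {
            if max(plane.extent.x, plane.extent.z) >= requiredExtentMeters {
                onEnoughSpace?(plane)   // safe to place the room-scale cave here
            }
        }
    }
}
```

You would assign an instance of this as the ARSession’s delegate and react to the two callbacks in your UI; the thresholds are the kind of thing you would tune by walking the space yourself.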

Key Tip: Make a clear and definitive choice about integrating (or not) the world layer into your experience. You should have a compelling reason for hiding the world layer, just as you should have a compelling reason for why this story should be told in AR. It’s fine to immerse a user by hiding the world layer as long as you do so safely and with a clear understanding of accessibility concerns.

Using 3D models, 2D assets, and audio

As we’ve found again and again with AR, 2D assets and text are effective and inexpensive, an advantageous combination, especially during the prototyping phase. We chose Apple ][ as our font, which communicated a “1980s command line” aesthetic, and in some scenes I applied a 3D background to the text to give it substance and reliable legibility no matter the context. We used audio in the Hatch/Psychologist sequence and later in the tunnel, but the secret sauce of this AR experience is its 3D assets.

While audio and 2D assets are the workhorses of most AR experiences (important supporting players), the tone and character of a story are set by the 3D objects. Between Sketchfab and Poly, I found everything I needed relatively quickly. I “constructed” the start screen sequence, including the Bandersnatch monolith (so many legos), as well as the tunnel and Dax (the Bandersnitch), all from found models. I used low-poly assets mainly for performance reasons but ultimately liked the overall feel they provided, evoking, along with the text, the video-game play of a bygone era.

Text, audio, video, and 3D assets all play their part in the experience but the 3D assets really shine.

During the design process, quite unintentionally, or at least subconsciously, a coherent image system emerged. Doors, hatches, and tunnels all represented points of decision and transition.

These liminal points are where users spend most of their time agonizing over choices. The tape, a musical object evocative of endless loops, was perfect for the Rick Roll. The bridge, rickety and uninviting, creates a mood of apprehension even before Dax appears.

We discovered, and then doubled down on, emergent themes, which made building this experience feel just like writing; it was one of the best parts of building Bandersnitch. Writing is world building, and an author has a rich set of literary devices and tools at her disposal to amplify or foreshadow narrative threads. The same is true when designing in AR. For example, we added the hatch branch of the experience and played up choices and their consequences. In this branch, the Psychologist provides the player with a boomerang to fight Dax, fitting the theme that violence often hurts its purveyor as well as its target.

Key Tip: Use 2D assets and audio to inform and guide, and focus on 3D models to create mood and thematic unity that will engage your users. I’d even suggest scouting 3D assets ahead of time for inspiration.

Interactions

The Bandersnitch CYOA experience challenged me to think differently about Torch’s interaction system and how I could use trigger types to engage and reward users. We tend to focus on utilitarian use cases for AR with Torch (wayfinding, pre-sales visualization, travel and leisure experiences), which are typically more transactional in nature and place a greater emphasis on presenting data and soliciting input. In those cases, we want to move a person through the experience as efficiently as possible.

A chart with interactions, transitions, and major scenes.

With Bandersnitch, the goal is a little different. We want the user to linger, consider, study, and puzzle out meanings. We used three interaction trigger types: “select,” “enter proximity,” and “gaze at.” Users tap an object to express a choice, such as selecting a cereal brand. We used proximity triggers (fired when a user approaches an object) and gaze triggers to drive responses, typically changes to object properties, that reward exploration and movement. Examples of this are found in the end tunnel sequence, where proximity triggers cause additional directions, and ultimately Dax, to appear.
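
Torch exposes these triggers as visual building blocks, but if you were implementing similar proximity and gaze triggers yourself on top of ARKit and SceneKit, the per-frame checks might look roughly like the sketch below. The class name, radius, and gaze threshold are my own illustrative assumptions, not anything from Torch or Bandersnitch.

```swift
import ARKit
import SceneKit
import simd

// Illustrative sketch only: how "enter proximity" and "gaze at" triggers might be
// evaluated each frame against the ARKit camera. All names and thresholds are assumptions.
final class TriggerEvaluator {
    var proximityRadius: Float = 1.0          // meters; fire when the user gets this close
    var gazeCosineThreshold: Float = 0.985    // cos(~10 degrees); higher means a narrower gaze cone

    // Passive "enter proximity" trigger: compare the camera position with the node position.
    func isWithinProximity(of node: SCNNode, camera: ARCamera) -> Bool {
        let cam = camera.transform.columns.3
        let cameraPosition = SIMD3<Float>(cam.x, cam.y, cam.z)
        return simd_length(cameraPosition - node.simdWorldPosition) < proximityRadius
    }

    // Passive "gaze at" trigger: is the node near the center of the user's view?
    func isGazedAt(_ node: SCNNode, camera: ARCamera) -> Bool {
        let cam = camera.transform.columns.3
        let cameraPosition = SIMD3<Float>(cam.x, cam.y, cam.z)
        // In ARKit the camera looks down its local -Z axis.
        let forward = -SIMD3<Float>(camera.transform.columns.2.x,
                                    camera.transform.columns.2.y,
                                    camera.transform.columns.2.z)
        let toNode = simd_normalize(node.simdWorldPosition - cameraPosition)
        return simd_dot(forward, toNode) > gazeCosineThreshold
    }
}
```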

Working on this experience underscored an important difference between active interactions (tap and, sometimes, gaze) and passive interactions (proximity and gaze), and it helped me understand their distinct effects.

Active interactions encourage the user to express clear intention, increasing their sense of agency and engagement; this is, after all, a choose your own adventure experience. Passive interactions, on the other hand, are often triggered by conscious and unconscious physical movement, yet they produce very real responses: assets appear out of thin air, or ambient audio starts playing in the background of a scene. Passive interactions can create a sense of surprise, and they reward movement and exploration.
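
For completeness, here is the active counterpart sketched the same way: a tap hit-tests the scene and hands whatever node the user picked to the story logic. As before, this is an assumed illustration of a “select” trigger, not Torch’s implementation, and the handler name and callback are hypothetical.

```swift
import ARKit
import SceneKit
import UIKit

// Sketch of an active "select" trigger: a tap hit-tests the SceneKit scene and
// passes the chosen node's name to the story logic. Names here are illustrative.
final class SelectTriggerHandler: NSObject {
    private let sceneView: ARSCNView
    private let onChoice: (String) -> Void   // e.g. record the choice and load the next scene

    init(sceneView: ARSCNView, onChoice: @escaping (String) -> Void) {
        self.sceneView = sceneView
        self.onChoice = onChoice
        super.init()
        let tap = UITapGestureRecognizer(target: self, action: #selector(handleTap(_:)))
        sceneView.addGestureRecognizer(tap)
    }

    @objc private func handleTap(_ gesture: UITapGestureRecognizer) {
        let location = gesture.location(in: sceneView)
        guard let hit = sceneView.hitTest(location, options: nil).first,
              let name = hit.node.name else { return }
        onChoice(name)   // e.g. "cereal_frosties" -> branch the narrative accordingly
    }
}
```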

 

Animations triggered by interactions, especially passive ones, were also essential in maintaining interest in the experience. I am certain that if Bandersnitch were simply someone clicking through a series of choices, without freedom to explore and without encountering unexpected surprises, it would not have been nearly as engaging.

Key Tip: A mix of active and passive interactions creates an experience that engages and delights, which neatly summarizes our aspirations for AR here at Torch. Bottom line: use both! The people trying your experience will appreciate the decision even if they don’t notice it.

Denouement

With its ability to transform a viewer’s environment and immerse them in a narrative, AR storytelling gives designers opportunities to create deep connections and lasting impressions. It wasn’t until I built Bandersnitch in Torch that I understood that AR was also a good “writing” tool. I found I could express ideas in a way that wasn’t quite cinematic or literary, and certainly wasn’t just a straight video game rip-off (something I was conscious of from the outset). Developing this narrative project helped me understand AR storytelling as a unique medium with amazing affordances and potential. That is pretty damn exciting and almost certainly worthy of a bunch of sequels.

Special thanks to Kami Karras for her help with this post and the Bandersnitch experience.

Feeling inspired? Download Torch and start writing in AR!