Spatial Controller

Timeframe:
Spring 2019
2 Weeks (ongoing)
In this project I built a method that turns the iPhone into a powerful 3D input device for manipulating and interacting with digital 3D environments.

I had the humbling opportunity to present and discuss this project as well as my Hybrid Interactions project directly to Jony Ive’s Interaction Architecture team at Apple. If you are interested in learning more about it, feel free to send me a message!



Problem Space


With the advent of AR, VR, and related technologies, 3D interfaces are becoming increasingly common. Yet when we view them on a 2D computer screen and manipulate them with a 2D tool (the mouse), we are using incompatible tools and practices to work with 3D realities.

Using a mouse to navigate and manipulate 3D space takes a long time to master and never quite feels intuitive, because the mouse is not inherently spatial.
As you can see from the video above, even after using Unity for a while I still take a long time to precisely position and navigate 3D objects with a mouse and keyboard.

This project is an ongoing set of explorations that reimagine more intuitive ways to navigate, explore, manipulate, and otherwise interact with digital 3D environments using a phone-based input controller.


Process


Because of the inherent complexities of this project, I broke my design and prototyping process down into several threads: designing the input software, establishing the computer-phone communication, exploring the interactions themselves, and refining everything into a cohesive whole.

Input App
In order to achieve this interaction paradigm, I had to create a custom app with Swift and Xcode that streamed real-time position and orientation data to my computer. This is essentially a tool that allowed me to create more tools. I fully intend this controller to eventually have a far more refined UI.
Debug UI version of the app I developed. I hope to eventually create a more polished version with clearer controls and functions.
I created this input app in Xcode with Swift using the SwiftOSC framework.
This project was the first time I used Swift!
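For the curious, here is a minimal sketch of what the streaming plumbing can look like with SwiftOSC. The address pattern, IP, and port are placeholders for illustration rather than my exact setup:

```swift
import simd
import SwiftOSC

// OSC client pointed at the machine running Unity.
// The IP address and port are placeholders for your own setup.
let client = OSCClient(address: "192.168.1.10", port: 8000)

// Stream one 6DoF pose sample: position (x, y, z) plus
// orientation as a quaternion (qx, qy, qz, qw).
func send(position: SIMD3<Float>, orientation: simd_quatf) {
    let message = OSCMessage(
        OSCAddressPattern("/controller/pose"),
        position.x, position.y, position.z,
        orientation.vector.x, orientation.vector.y,
        orientation.vector.z, orientation.vector.w
    )
    client.send(message)
}
```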


Functional Iterations
This project really started when I created the first functional prototype, which mapped physical device rotation to the rotation of a digital 3D object. (Check out my Twitter thread for a better idea of my iterative process!)

Eventually, I was able to use ARKit to give my digital box positional data as well, completing the six degrees of freedom (6DoF) tracking needed to make meaningful and intuitive interactions.
First time passing position to Unity through OSC using ARKit.
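The ARKit side is small enough to sketch: a session delegate reads the camera transform out of every frame and splits it into the position and orientation that get streamed out, reusing the hypothetical send(position:orientation:) helper from the sketch above:

```swift
import ARKit

// Reads the device's 6DoF pose from each ARKit frame and forwards
// it over OSC via the send(position:orientation:) helper above.
class PoseStreamer: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        session.delegate = self
        session.run(ARWorldTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        let transform = frame.camera.transform  // simd_float4x4
        let position = SIMD3<Float>(transform.columns.3.x,
                                    transform.columns.3.y,
                                    transform.columns.3.z)
        let orientation = simd_quatf(transform) // rotation component
        send(position: position, orientation: orientation)
    }
}
```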
Interaction Explorations
Through this project I was designing an approach to how we interface with 3D spaces through our 2D screens. There are two guiding principles that I tried to embody in these designs:

1. The nature of the input interactions changes the nature of the output.
2. If interactions are more intuitive, we have a greater capacity to be more thoughtful about what we’re actually trying to accomplish.

Here is a set of quick interaction explorations, at varying scales, that point toward more creative and meaningful ways this tool could be used.
Interaction 1: Haptic Zones
In each of these grey volumes, different kinds of haptic feedback are being triggered.
Though impossible to see, each of the grey volumes is embedded with a different ‘haptic viscosity’: when you enter or stay inside one with your physical 3D cursor, your device gives you haptic feedback of varying intensity and character.

I started to see how rich this passive information could be when objects have different characteristics, and as our haptic vocabulary becomes more diverse.
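As a rough sketch of the phone-side feedback, assuming the scene sends back a 0-to-1 ‘viscosity’ value over OSC whenever the cursor enters a volume (the names here are illustrative):

```swift
import UIKit

// Plays zone-specific haptics when the scene reports that the 3D
// cursor has entered a grey volume.
let impact = UIImpactFeedbackGenerator(style: .medium)

func cursorEnteredZone(viscosity: CGFloat) {
    impact.prepare()                            // warm up to cut latency
    impact.impactOccurred(intensity: viscosity) // stronger tap in "thicker" zones
}
```

While the cursor stays inside a zone, repeating these pulses at a rate tied to the viscosity makes thicker volumes feel like a denser stream of taps.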


Interaction 2: Camera Control
Camera rotation based on orientation of iPhone.
This was a quick prototype where I change the camera angle of a 3D scene (orbiting around a pivot or moving freeform) based on the orientation of the device.

Orbiting in 3D software is such a distraction from my flow, and having a more natural way to change my camera angle allows me to be that much more focused on what I’m trying to do.

Next time, I think I would toggle the visibility of the cursor, as it is slightly distracting.
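Since this interaction only needs orientation, the phone side doesn't strictly need ARKit. Here is a hedged sketch using Core Motion instead, reusing the OSC client from the input-app sketch (/camera/orbit is a placeholder address):

```swift
import CoreMotion
import SwiftOSC

let motionManager = CMMotionManager()

// Streams device attitude at 60 Hz; the computer side can ease its
// camera toward each incoming orientation.
func startCameraControl() {
    motionManager.deviceMotionUpdateInterval = 1.0 / 60.0
    motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let q = motion?.attitude.quaternion else { return }
        client.send(OSCMessage(
            OSCAddressPattern("/camera/orbit"),
            Float(q.x), Float(q.y), Float(q.z), Float(q.w)
        ))
    }
}
```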


Interaction 3: 3D Touch Manipulation
Using the iPhone's 3D Touch capabilities to manipulate digital 3D objects in different ways.
In this interaction, I map the force of the touch event to an attribute of the digital 3D object. I was curious to see whether this was an intuitive way to interface with a digital space and its content.

Another important detail to highlight is the haptics in this interaction: haptic feedback is triggered when the digital cursor collides with and exits a digital object. It makes your brain believe that there are “invisible objects” in reality!

In terms of design, this interaction helped me understand how 3D Touch input can be used in a 3D context. I started asking questions like: “Is this interaction an appropriate metaphor for making something brighter? Faster?”
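The force mapping itself is tiny. A sketch of the touch handler, again with an illustrative address and the OSC client from the earlier sketch:

```swift
import UIKit
import SwiftOSC

// Normalizes 3D Touch pressure to 0–1 and streams it; the scene can
// bind the value to any attribute (scale, brightness, speed...).
class ForcePadView: UIView {
    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first,
              touch.maximumPossibleForce > 0 else { return } // no 3D Touch hardware
        let normalized = Float(touch.force / touch.maximumPossibleForce)
        client.send(OSCMessage(OSCAddressPattern("/touch/force"), normalized))
    }
}
```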


Interaction 4: Collaborative Annotations
A video highlighting the back-and-forth interaction between the digital 3D scene and the device itself.
In this interaction, I wanted to leverage the impressive input capabilities that the iPhone has to offer. I created a tool that allows one or more users to use their own devices to navigate a 3D scene and place thoughts in it locally.

When you aren't as concerned with perfectly placing and orienting a comment, it's easier to string thoughts together and stay focused in that headspace.
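The message for placing a note can stay very small. A sketch, with an illustrative address and argument layout:

```swift
import simd
import SwiftOSC

// `client` is the OSCClient from the input-app sketch.
// Drops a text annotation at the cursor's current position; the
// scene spawns a label there.
func placeAnnotation(_ text: String, at position: SIMD3<Float>) {
    client.send(OSCMessage(
        OSCAddressPattern("/annotate/place"),
        text, position.x, position.y, position.z
    ))
}
```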


This video shows how easily the phone can be used as a digital interface tool.

It was exciting to finally make this because it felt like an appropriate use of my technology. When I was demoing to others, I frequently heard that it felt intuitive and natural to use.

One lesson I learned was that when multiple screens are present, it is important to know where you should be looking. I intend to add that feature soon!


Next Steps

Flesh out the controller UI to cohesively offer functions such as:
  • instantiating a primitive
  • manipulating material properties
  • manipulating the camera
  • entering different modes (annotation, drawing, selection, documentation, etc.)

Onboarding process:
  • Pairing your personal device as a controller
  • Choosing your own unique controller
  • Determining which hand you are using

Refine and Test:
  • User testing to see whether it is more intuitive than existing 3D software
  • Multiple rounds of iteration based on feedback


Current Learnings


This project is a lot of fun! I took a simple curiosity and built a project out of it, and I was so thankful to have the resources and environment to learn the skills I needed to manifest my ideas into interactive prototypes. 

I am learning that digital artifacts take on a heightened sense of reality when you activate more of the human body, allowing you to perceive, think about, and manipulate things more thoughtfully. I hope that this project can continue to mature and inspire future technologies and interaction paradigms as 3D interfaces become more widespread.