Hybrid Interaction Experiments

Timeframe:
Spring 2019
2 Weeks
This project highlights several interaction design experiments using a variety of hardware and software input/output methods. Through these experiments, I am learning some hard skills, but more importantly deepening my understanding of the nature of interactions and developing a sense of how I ought to design them.

I had the humbling opportunity to present and discuss this project as well as my Spatial Controller project directly to Jony Ive’s Interaction Architecture team at Apple.


Problem Space:


In our current technology paradigm, the input tools we use to interface with our digital spaces are largely limited to the keyboard, mouse, and trackpad. This is a problem because these tools don't always afford the kind of interaction a given function calls for, which limits the kinds of experiences we build.

This project is an ongoing series of explorations that start to imagine what new applications, functions, or experiences could be afforded by another computational input tool—our smartphones.


Exploration 1: Rotational Controller


In this exploration, I took single-axis rotation of an iPhone and mapped that motion to different applications in software. These interactions are designed for situations where you might not have the hardware to navigate our digital spaces, or would like an isolated input tool for specific and precise manipulation.

Navigation Tools—

These are interactions that I’ve designed to supplement or reimagine the way we interface with our everyday computer OS. We currently have two primary input devices (keyboard and mouse) to navigate our digital spaces, but these projects explore how our digital experiences could be heightened, focused, or simplified through a more physical, human input interaction.

These interactions map rotational movement to linear movement. I was initially concerned that this would be an incompatible pair, but it actually felt very natural, kind of like a dial.
Mapped rotational movement to location in a video.
This interaction allows the user to easily navigate a video or movie with relative precision. It is meant for environments like home movies, where we might want to interface with our videos without taking out a keyboard and mouse.
For more specific observation or study, there may be instances where it is important to view a video frame by frame. This interaction offers that capacity, and triggers haptic feedback at each frame change.
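The scrubbing logic behind these interactions is simple. The originals ran on the phone, but the core mapping can be sketched in Python; the frame rate, clip length, and degrees-per-pass constants here are hypothetical tuning values, not the ones I actually used:

```python
FPS = 24.0                 # assumed frame rate of the clip
DURATION = 120.0           # assumed clip length, in seconds
DEGREES_PER_PASS = 720.0   # two full turns scrub the whole clip (a tuning choice)

def rotation_to_time(angle_deg, duration=DURATION):
    """Map cumulative single-axis rotation to a playhead position, clamped to the clip."""
    t = (angle_deg / DEGREES_PER_PASS) * duration
    return min(max(t, 0.0), duration)

def frame_index(time_s, fps=FPS):
    """Quantize a playhead position to a whole frame number."""
    return int(time_s * fps)

_last_frame = None

def scrub_update(angle_deg):
    """Handle one rotation reading: return (time, frame, frame_changed).

    The caller seeks the video to `time` and fires a haptic tick whenever
    `frame_changed` is True, giving one tick per frame in frame-by-frame mode.
    """
    global _last_frame
    t = rotation_to_time(angle_deg)
    frame = frame_index(t)
    changed = frame != _last_frame and _last_frame is not None
    _last_frame = frame
    return t, frame, changed
```

Mapping cumulative rotation (rather than instantaneous angle) is what makes the phone feel like a dial: you can keep turning past a full revolution and the playhead keeps moving.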
Creative Tools—

The nature of the tools we use directly impacts the nature of the things we create with them. When we afford the input tool greater humanity and human touch, we eliminate that extra layer of abstraction and perhaps allow the created thing to be more inherently human.
Mapped rotational movement to the brightness of a photo in Adobe Lightroom.
I started to see how a physical granularity tool could be powerful as an input device in creative applications. For example, there is something inherently satisfying in having haptics tell you at every stop of exposure you change!
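The haptic-per-stop behavior boils down to quantizing the rotation into stops and ticking when the quantized value changes. A minimal sketch, assuming a hypothetical 30° of rotation per stop of exposure:

```python
DEGREES_PER_STOP = 30.0  # hypothetical: one stop of exposure per 30 degrees of turn

_last_stop = 0

def exposure_update(angle_deg):
    """Map cumulative rotation to an exposure adjustment; report stop crossings.

    Returns (ev, crossed): the caller applies `ev` to the photo and fires a
    haptic tick whenever `crossed` is True, i.e. at each whole-stop boundary.
    """
    global _last_stop
    ev = angle_deg / DEGREES_PER_STOP
    stop = int(ev)                 # whole stops turned so far
    crossed = stop != _last_stop
    _last_stop = stop
    return ev, crossed
```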


Learnings—

These experiments helped me extend my understanding of how we might interface with our computers more intuitively. I learned about the affordances of physical objects, and how they can help or hurt an interaction design. I also started to peripherally learn the nature of rotational input.


Exploration 2: Robotic Mirroring

This exploration lets you use your own device to remotely control a robot, upon which a variety of things could be mounted.

Iterations—
 

Initial functional prototype made with a servo and my OSC app
After the initial functional prototype, I decided to take it further and make a full pan-tilt system. I decided against using that same servo because of its limited rotational range and its heavy limitations when passing many values quickly, and chose IQ Motion Module motors instead.


Some developments with the form, considering the Arduino placement


For these prototypes, I designed the top cover to be removable and the front to be clear acrylic so that debugging the electronics would be easier. I planned to seal everything later on.


Functional wooden prototype.
Notice the panel mounts that make the model self-contained and easy to power!


This was the first working prototype. When I actually saw the robot copying my movement, I easily began to see the robot as an extension of my hand, and found myself manipulating its rotations extremely intuitively.

First working demo!
Not visible in the video, but I also programmed haptics to trigger when the phone is rotated past the robot’s reach.
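That out-of-reach haptic is essentially a clamp plus a comparison: the phone's yaw and pitch become the robot's pan and tilt targets, and whenever the requested pose gets clipped to the motors' range, the phone buzzes. A Python sketch of that logic, with hypothetical motor limits standing in for the real pan-tilt ranges:

```python
PAN_LIMIT = 170.0   # hypothetical pan range of the robot, +/- degrees
TILT_LIMIT = 80.0   # hypothetical tilt range, +/- degrees

def clamp(value, limit):
    """Restrict a value to [-limit, +limit]."""
    return max(-limit, min(limit, value))

def mirror(yaw_deg, pitch_deg):
    """Map phone yaw/pitch onto pan/tilt targets, clamped to the robot's reach.

    Returns the commanded angles plus a flag telling the phone to fire a
    haptic warning whenever the requested pose exceeds what the robot can do.
    """
    pan = clamp(yaw_deg, PAN_LIMIT)
    tilt = clamp(pitch_deg, TILT_LIMIT)
    out_of_reach = (pan != yaw_deg) or (tilt != pitch_deg)
    return pan, tilt, out_of_reach
```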
Lastly, I created a polished case with translucent white acrylic and covered the front to hide the electronic components.


Lessons—

This exploration helped me understand an approach to making robots more approachable and intuitive to interface with. Although it is not a far reach to type out angles for these different motors to go to, physical manipulation affords an intuition that lets you focus exactly on what the robot should be doing, not how it should do it.

Technically, I learned a lot more about communication protocols between 5+ entities, as well as some physical prototyping. In addition, I started to understand how thoughtful interactions, even for seemingly intimidating things like robots, can make tasks extremely approachable and intuitive.

I hope to take this project further eventually and add some attachments!


Exploration 3: Force Touch on Web

Experiments with manipulating 3D forms with force touch on a MacBook Pro
This was a quick and simple exploration of how the force touch input interaction could be mapped to something in the digital 3D world. I learned that it shouldn't be used for high-precision tasks, but rather for volumetric, mass-driven outputs.
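For context, Safari exposes trackpad pressure to web pages via `MouseEvent.webkitForce`, which reads roughly 1.0 at a normal click and about 2.0 at a full force click. A Python sketch of the kind of mass-driven mapping described above, normalizing that pressure into a scale factor for a 3D form; the exact ceiling and the gain are assumptions for illustration:

```python
# Rough webkitForce range: ~1.0 at a normal click, ~2.0 at a full force click.
FORCE_MIN, FORCE_MAX = 1.0, 2.0

def normalize_force(force):
    """Squash a raw force reading into [0, 1]."""
    f = (force - FORCE_MIN) / (FORCE_MAX - FORCE_MIN)
    return max(0.0, min(1.0, f))

def force_to_scale(force, base=1.0, gain=2.0):
    """Pressing harder inflates the 3D form: scale grows with normalized pressure."""
    return base + gain * normalize_force(force)
```

Because the output drives a volume rather than a precise position, the noisiness of the pressure signal reads as organic squish instead of jitter.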
