Hybrid Interaction Experiments

Timeframe: Spring 2019, 2 weeks
This project is a collection of interaction design experiments using a variety of hardware and software input/output methods. Through these experiments, I am learning hard skills, but more importantly I am deepening my understanding of the nature of interactions and developing a sense of how I ought to design them.

I had the humbling opportunity to present and discuss this project as well as my Spatial Controller project directly to Jony Ive’s Interaction Architecture team at Apple.


Problem Space


In our current technology paradigm, the input tools we use to interface with our digital spaces are largely limited to the keyboard, mouse, and trackpad. The problem is that these tools don’t always afford the kind of interaction appropriate for a given function, which limits the kinds of experiences we can build.

This project is an ongoing series of simple explorations that start to imagine what new applications, functions, or experiences could be afforded from different hybrid interactions.


Process

This exploration allows one to use their own device to remotely control a robot, upon which a variety of things could be mounted.

Iterations
 

Initial functional prototype made with a servo and my OSC app
After the initial functional prototype, I decided to take it further and make a full pan-tilt system. I decided against using that same servo because of its limited rotational range and its poor handling of rapid streams of position values. I used IQ Motion Module motors instead.
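The core of the control loop is a mapping from the phone’s orientation to clamped pan and tilt targets for the motors. Below is a minimal sketch of that mapping, assuming the OSC app streams yaw and pitch in degrees; the function name and the range limits are hypothetical stand-ins, not the actual values used in the prototype.

```python
def orientation_to_angles(yaw_deg, pitch_deg,
                          pan_range=(-90.0, 90.0),
                          tilt_range=(-45.0, 45.0)):
    """Map phone orientation (degrees, e.g. from incoming OSC messages)
    to pan/tilt motor targets, clamped to each axis's mechanical range."""
    def clamp(value, lo, hi):
        return max(lo, min(hi, value))

    pan = clamp(yaw_deg, *pan_range)
    tilt = clamp(pitch_deg, *tilt_range)
    return pan, tilt
```

In practice, each incoming OSC message would be fed through this function and the resulting targets sent to the motors, so the robot tracks the phone while never being commanded past its mechanical limits.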


Some developments with the form, considering the Arduino placement


For these prototypes, I designed the top cover to be removable and the front to be clear acrylic, so debugging the electronics would be easier. I planned to seal everything later on.


Functional wooden prototype.
Notice the panel mounts that make the model self-contained and easy to power!


This was the first working prototype. When I actually saw the robot copying my movement, I immediately began to see the robot as an extension of my hand, and found myself manipulating its rotations extremely intuitively.

First working demo!
Not visible in the video, but I also programmed haptics to trigger when the phone is rotated past the robot’s reach.
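The haptic behavior can be sketched as a clamp that also reports when the request went out of range; that flag is what would fire the phone’s haptics. This is a hypothetical reconstruction of the logic, not the project’s actual code, and the limit value is an assumed placeholder.

```python
def clamp_with_haptic(requested_deg, limit_deg=90.0):
    """Clamp a requested rotation to the robot's reachable range and
    flag when the phone was rotated past that reach (the cue for
    triggering haptic feedback on the phone)."""
    out_of_range = abs(requested_deg) > limit_deg
    clamped = max(-limit_deg, min(limit_deg, requested_deg))
    return clamped, out_of_range
```

This way the robot holds at its limit rather than stalling, while the user feels a tap the moment the phone crosses the boundary.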
Lastly, I created a polished case with translucent white acrylic and covered the front to hide the electronic components.


Learnings
This interaction exploration allowed me to understand an approach to making robots more approachable and intuitive to interface with. Although it is not difficult to type out target angles for the motors, physical manipulation affords an intuition that lets you focus on exactly what the robot should be doing, not how it should do it.

Technically, I learned a lot more about communication protocols between five or more entities, as well as some physical prototyping. In addition, I started to understand how thoughtful interactions, even for seemingly intimidating things like robots, can make tasks extremely approachable and intuitive.

I hope to take this project further eventually and add some attachments!
