
Cube Controller
Our input methods for manipulating 3D models are antiquated and far removed from the natural ways we handle physical objects. Intuitive gestures such as flipping an object over, stretching it, pinching it, or sliding it are lost in translation to a mouse or trackpad. The Cube Controller project asks: how might we augment the sensory experience of manipulating virtual three-dimensional objects?
I created a device prototype that enables some of these interactions through three feedback servos, which both sense and actuate movement along the three major axes (X, Y, Z). An onboard accelerometer and gyroscope sense the cube's movement in space, and the device can be recalibrated to align its orientation with its digital twin or visual representation.
Created in the hyperSense: Augmenting Human Experience in Environments course with Dina El-Zanfaly, as part of her larger research in the hyperSense: Embodied Computations lab.
Interaction design, Mechanical design, Physical computing, Prototyping, Human-computer interaction research






Prototyping process
I first linked the accelerometer/gyroscope readings to a 3D model, using an Arduino with Processing for real-time visualization of the model manipulation.


I then tested and validated the linear scaling and sensing system on a single isolated axis. This opens up a whole range of possible control schemes for the actuation, driven by user input, the class of the digital object, or other data.

