Exoskeleton Arm for Virtual Reality
For my senior design course, I prototyped an exoskeleton arm that interfaces with virtual reality. The arm was designed in SolidWorks, 3D printed, and controlled by an on-board 32-bit AVR microcontroller. It allowed the user to feel the shapes of virtual objects in real time through basic haptic feedback.
Role: Hardware & MCU Code (partner Kevin Wang)
Duration: Oct 2014 - Dec 2014
Design & Implementation
We explored a couple of options for mapping voltage input to force output on the fingers. With safety and repeatability in mind, we chose phototransistors to measure light intensity and servos to apply a pulling force to the fingers.
The hardware included a microcontroller, an LCD display, phototransistors, LED control, and servos. For better control, the ADC inputs from the phototransistors were conditioned with decoupling capacitors and a low-pass filter to remove noise.
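On top of the hardware filtering, readings like these are often smoothed further in firmware. A minimal sketch of one common approach, an integer exponential moving average (this helper is illustrative, not the original firmware):

```c
#include <stdint.h>

/* Hypothetical software smoother to complement the hardware RC filter:
 * an exponential moving average using only integer math, suitable for
 * a small MCU. Each call moves the filtered value 1/4 of the way
 * toward the new ADC sample. */
static uint16_t ema_filter(uint16_t prev, uint16_t sample)
{
    /* new = prev + (sample - prev) / 4; the cast keeps the
       difference signed so the filter tracks in both directions. */
    return (uint16_t)(prev + ((int16_t)(sample - prev) >> 2));
}
```

Called once per ADC conversion, this damps single-sample spikes without the RAM cost of a sample buffer.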
The microcontroller concurrently drove the LCD display, servo control, and LED control in real time. Light intensity was linearly mapped to distance to emulate the volume or contour of the virtual object. When the ADC reading crossed a set threshold, the controller returned the servos to a mapped position, producing haptic feedback for the user.
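The mapping and threshold logic above can be sketched roughly as follows. The constants and names here are assumptions for illustration (a 10-bit ADC, standard 1–2 ms servo pulses), not values from the original firmware:

```c
#include <stdint.h>

#define ADC_MAX        1023u   /* 10-bit ADC full scale (assumed) */
#define SERVO_MIN_US   1000u   /* pulse width with finger free (assumed) */
#define SERVO_MAX_US   2000u   /* pulse width at full pull (assumed) */
#define CONTACT_THRESH  700u   /* ADC level treated as "touching" (assumed) */

/* Linearly map a light-intensity reading to a servo pulse width:
   a brighter reading means the finger is closer to the virtual
   surface, so the servo pulls it further back. */
static uint16_t adc_to_servo_us(uint16_t adc)
{
    return (uint16_t)(SERVO_MIN_US +
           (uint32_t)adc * (SERVO_MAX_US - SERVO_MIN_US) / ADC_MAX);
}

/* Threshold test that would trigger the haptic-feedback response. */
static int contact_detected(uint16_t adc)
{
    return adc >= CONTACT_THRESH;
}
```

In the main loop, the filtered ADC value would feed `adc_to_servo_us()` each cycle, and `contact_detected()` would gate the haptic response.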
Given only two months to design and execute the project, we prioritized the core implementation. We achieved accurate and repeatable haptic feedback, but were limited to varying only the volume of spherical virtual objects. This project provided a thorough, culminating design experience spanning noise filtering, real-time processing, mechanical design, and overall system integration. Our next step would have been to vary the shape of virtual objects by adding LEDs for each axis.