MOONYOUNG'S RESEARCH
  • Home
  • Work Experience
    • KAIST Hubo Lab
    • iRobot
    • inTouch Technologies Inc.
  • Projects
    • Embedded Computer Vision for Robot Localization
    • Vision-based Robotic Arm
    • CUAUV
    • Exoskeleton Arm
    • Robotic Segway
    • Robotic Arm Manipulation
    • Multi-core Processor
    • Piezosensor on Bipedal Robot
  • Blog
Exoskeleton Arm for Virtual Reality

For my senior design course, I prototyped an exoskeleton arm that interfaces with virtual reality. The arm was designed in SolidWorks CAD software, 3D printed, and controlled by an on-board 32-bit AVR microcontroller. The exoskeleton allowed the user to feel the shapes of virtual objects in real time through basic haptic feedback.

Role: Hardware & MCU Code (partner Kevin Wang)
Duration: Oct 2014 - Dec 2014
Accomplishments:
  • Featured on Hackaday and the Electronics-Lab website
  • Checking off "Exoskeleton Arm" from the bucket list
  • Full website link



Design & Implementation

Mechanical Framework


We explored a couple of options for mapping voltage input to force output on the fingers. With safety and repeatability in mind, we chose phototransistors to measure light intensity and servos to provide a pulling force on the fingers.

Hardware


The hardware included a microcontroller, an LCD display, phototransistors, LED control, and servos. For better control, the ADC inputs from the phototransistors were conditioned with decoupling capacitors and a low-pass filter to remove noise.
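Alongside the hardware filtering, firmware can further smooth the phototransistor readings before they drive the servos. Below is a minimal, hedged sketch of a software moving-average filter in portable C; the function name, window size, and 10-bit ADC assumption are illustrative, not taken from the actual firmware.

```c
#include <stdint.h>

#define WINDOW 8  /* number of samples averaged (hypothetical choice) */

/* Circular buffer of the most recent raw ADC samples. */
static uint16_t buf[WINDOW];
static uint8_t  idx;
static uint32_t sum;

/* Smooth a raw 10-bit ADC reading with a running average.
 * Each call replaces the oldest sample with the newest one,
 * so the cost per sample is constant. */
uint16_t adc_smooth(uint16_t raw)
{
    sum -= buf[idx];           /* drop the oldest sample from the sum */
    buf[idx] = raw;            /* store the newest sample */
    sum += raw;
    idx = (idx + 1) % WINDOW;  /* advance the circular index */
    return (uint16_t)(sum / WINDOW);
}
```

On an AVR this would typically be called from the ADC conversion-complete interrupt, trading a few samples of latency for a steadier control signal.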

Control

The microcontroller concurrently ran the LCD display, servo control, and LED control tasks in real time. Light intensity was linearly mapped to distance in order to emulate the volume or contour of the virtual object. When the ADC reading crossed a set threshold, the controller drove the servos to the mapped position, producing haptic feedback for the user.
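The linear mapping and threshold logic described above can be sketched in a few lines of portable C. All constants here (10-bit ADC range, 1000–2000 µs servo pulse widths, the contact threshold) are assumptions for illustration, not values from the original code.

```c
#include <stdint.h>

#define ADC_MAX       1023  /* 10-bit ADC full scale (assumed) */
#define SERVO_MIN_US  1000  /* servo pulse width at rest (assumed) */
#define SERVO_MAX_US  2000  /* servo pulse width at full pull (assumed) */
#define CONTACT_ADC    700  /* light level treated as touching the surface (assumed) */

/* Linearly map measured light intensity to a servo pulse width,
 * so brighter light (closer to the virtual surface) pulls harder. */
uint16_t light_to_servo_us(uint16_t adc)
{
    uint32_t span = SERVO_MAX_US - SERVO_MIN_US;
    return (uint16_t)(SERVO_MIN_US + (uint32_t)adc * span / ADC_MAX);
}

/* Above the contact threshold, the controller holds the servo at the
 * mapped position to resist the finger -- the haptic feedback event. */
int in_contact(uint16_t adc)
{
    return adc >= CONTACT_ADC;
}
```

In the real firmware the pulse width would feed a hardware PWM timer, and `in_contact` would gate whether the servo tracks the finger freely or locks at the mapped position.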

Given only two months to design and execute the project, we prioritized the core implementation. We achieved accurate and repeatable haptic feedback, but were limited to varying only the volume of spherical virtual objects. This project provided a thorough, culminating design experience in noise filtering, real-time processing, mechanical design, and overall system integration. Our next step would have been to vary the shape of the virtual objects by adding LEDs for each axis.
Code on Github
3D on Thingiverse