KAIST Humanoid Research Lab
Role: Research Scientist (Robot Perception)
Duration: Feb 2017 - March 2020
Camera-Guided Footstep Planner for Bipedal Robot
Time: 02/2020~ongoing Role: Perception SW Although legged robots can traverse a greater variety of terrains than wheeled mobile robots, determining footstep positions for stable walking is challenging. In this work, I analyze the performance of popular visual sensors and develop perception software that generates a queue of feasible footstep poses for the walking motion controller. The main challenge is keeping the overall processing time within a 5 Hz update budget while handling visual data captured during dynamic movements.
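The footstep-queue idea can be illustrated with a minimal sketch. This is not the lab's actual planner; it assumes a precomputed 2D terrain cost map and greedily picks the cheapest reachable cell ahead of the support foot:

```python
import numpy as np

def next_footstep(cost_map, foot_xy, step_len=2, max_cost=0.5):
    """Pick the lowest-cost cell in a small window ahead of the support foot.

    cost_map -- 2D array of terrain cost (0 = flat, values near 1 = untraversable)
    foot_xy  -- current support-foot cell (row, col)
    Returns the chosen (row, col), or None if no cell is below max_cost.
    """
    r, c = foot_xy
    rows, cols = cost_map.shape
    best, best_cost = None, max_cost
    for dr in range(-1, 2):                 # small lateral window
        for dc in range(1, step_len + 1):   # forward reach
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and cost_map[nr, nc] < best_cost:
                best, best_cost = (nr, nc), cost_map[nr, nc]
    return best

def plan_queue(cost_map, start_xy, n_steps):
    """Greedily build a queue of footstep cells for the walking controller."""
    queue, foot = [], start_xy
    for _ in range(n_steps):
        nxt = next_footstep(cost_map, foot)
        if nxt is None:
            break
        queue.append(nxt)
        foot = nxt
    return queue
```

In the real system the cost map would be rebuilt from sensor data every cycle, which is where the 5 Hz budget bites.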
Mobile Manipulator for Stock & Disposal
Time: 06/2019~12/2019 Role: Hardware Development & Perception SW 3rd Place at World Robot Summit 2019 Competition, Japan Created a mobile base platform for RB5, a commercial 6-DOF robot arm developed by Rainbow Robotics. In addition, we created a ROS API for the RB5 manipulator, allowing seamless integration of sensor data, perception modules, and motion control.
With the new hardware platform integrated with ROS, I developed a SLAM & navigation module using a 2D LiDAR with an IMU. Improvements over the previous navigation stack include finer alignment using ArUco markers detected in camera data. The marker detection module was also adopted for object detection for manipulation. However, because object manipulation required high precision, I fused 3D point cloud data with the RGB marker detections, achieving calibrated millimeter-scale precision within a 1-meter detection range. With this platform, we competed in the World Robot Summit Stock & Disposal competition and placed 3rd (first-year entry).
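The RGB/depth fusion step can be sketched as back-projecting the 2D marker corners through the depth image into the camera frame. This is an illustrative simplification, not the competition code; the intrinsics below are hypothetical placeholders for calibrated values:

```python
import numpy as np

# Hypothetical pinhole intrinsics; real values come from camera calibration.
FX, FY, CX, CY = 600.0, 600.0, 320.0, 240.0

def backproject(u, v, depth_m):
    """Lift a pixel (u, v) with measured depth (meters) to a 3D camera-frame point."""
    x = (u - CX) * depth_m / FX
    y = (v - CY) * depth_m / FY
    return np.array([x, y, depth_m])

def marker_center_3d(corners_px, depth_map):
    """Fuse RGB marker detection with depth: back-project each corner and average.

    corners_px -- list of (u, v) pixel corners from the ArUco detector
    depth_map  -- dense depth image in meters, registered to the RGB image
    """
    pts = [backproject(u, v, depth_map[int(v), int(u)]) for u, v in corners_px]
    return np.mean(pts, axis=0)
```

Averaging over multiple depth-lifted corners is what pushes the marker localization toward millimeter scale at close range.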
Service Robot Platform Development
Time: 03/2018~12/2018 Role: SW Framework Dev & State Machine & Integration IROS 2019 Conference (Accepted) [PDF] RoboWorld 2018 Public Exhibition, Korea Created a completely new vision system for the Hubo platform (the DRC-winning robot) for autonomous navigation and object detection tasks. This entailed selecting a new sensor suite for our application, as well as heavily expanding the previous software framework to be compatible with ROS.
With the new hardware platform integrated with ROS, I led a collaboration between 4 KAIST labs to develop the intelligence for a service robot. The main software included object detection, SLAM, planning, and a state machine that invokes motion control. The overall platform was tested on a specific application: beverage delivery in a convenience-store setting. The robot demonstrated fast execution, delivering a drink in about 30 seconds, and robustness across 10 different objects.
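The state-machine layer can be sketched as a simple transition table. The states and events below are illustrative, not the actual KAIST software:

```python
# Hypothetical simplified state machine for a delivery task.
TRANSITIONS = {
    ("IDLE", "order_received"):   "NAVIGATE",
    ("NAVIGATE", "at_shelf"):     "DETECT",
    ("DETECT", "object_found"):   "GRASP",
    ("GRASP", "grasp_ok"):        "DELIVER",
    ("DELIVER", "handover_done"): "IDLE",
}

def step(state, event):
    """Advance the state machine; unrecognized events keep the current state."""
    return TRANSITIONS.get((state, event), state)
```

Each state would invoke the corresponding perception or motion module, with events fed back from their results.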
Automated 2D, 3D Data Collection for Object Pose Estimation
Time: 10/2019~Present Accurate and abundant training data improves deep neural network performance, but human annotation of 6D object poses is tedious. I wrote a Python script to automate data collection using a robotic arm with an RGBD camera. 2D & 3D observations of an object from multiple viewpoints are registered into a dense 3D reconstruction of the object. This enables an easier training and labeling process for various novel objects.
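The multi-view registration step can be sketched as follows: since the arm's forward kinematics gives a known camera-to-world transform at each viewpoint, per-view point clouds can be merged directly. A minimal sketch under that assumption:

```python
import numpy as np

def merge_views(views):
    """Register per-view point clouds into a common world frame.

    views -- list of (T, points) pairs, where T is the 4x4 camera-to-world
             transform (known from the arm's forward kinematics) and points
             is an (N, 3) array in the camera frame.
    """
    merged = []
    for T, pts in views:
        homog = np.hstack([pts, np.ones((len(pts), 1))])  # to homogeneous coords
        merged.append((homog @ T.T)[:, :3])               # apply rigid transform
    return np.vstack(merged)
```

In practice a refinement step (e.g. ICP) would follow to correct hand-eye calibration error before the dense reconstruction.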
Dynamic Path Planner and 2D Navigation
Time: 06/2019~10/2019 The 2018 service robot lacked collision avoidance and required a closed-off space for operation. Using Google Cartographer's SLAM (LiDAR & IMU based), a 2D octomap is generated, which enables collision-free path planning based on A*. A path to a user-specified goal is generated in 0.01 seconds, which the robot then uses to generate smooth motor velocity commands.
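The planning step is standard A* over the 2D occupancy map. A minimal 4-connected sketch with a Manhattan-distance heuristic (not the deployed planner):

```python
import heapq

def astar(grid, start, goal):
    """A* on a 2D occupancy grid (0 = free, 1 = occupied), 4-connected.

    Returns the path as a list of (row, col) cells, or None if unreachable.
    """
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # Manhattan heuristic
    open_set = [(h(start), 0, start)]
    parent = {start: None}
    g_cost = {start: 0}
    while open_set:
        _, g, cur = heapq.heappop(open_set)
        if cur == goal:                      # reconstruct path from parents
            path = []
            while cur is not None:
                path.append(cur)
                cur = parent[cur]
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cur[0] + dr, cur[1] + dc)
            if (0 <= nxt[0] < len(grid) and 0 <= nxt[1] < len(grid[0])
                    and grid[nxt[0]][nxt[1]] == 0
                    and g + 1 < g_cost.get(nxt, float("inf"))):
                g_cost[nxt] = g + 1
                parent[nxt] = cur
                heapq.heappush(open_set, (g + 1 + h(nxt), g + 1, nxt))
    return None
```

The resulting cell path would then be smoothed into velocity commands for the base.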
The previous robot software framework (PODO) was designed solely for real-time motion control. With the new vision system developed in ROS, a bridge between the ROS and PODO frameworks was required for tasks needing close integration of perception and action (navigation and manipulation). I developed a ROS API for the Hubo platform that abstracted the humanoid's motion controls (inverse kinematics, trajectory generation, etc.) and converted Hubo's motion requests into ROS-standard Action data types.
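The bridging idea can be sketched without ROS itself: a high-level motion request is validated and mapped onto an action-style goal that the real bridge would publish as a ROS Action. All names below (action topics, request types, fields) are hypothetical illustrations, not the actual Hubo API:

```python
def to_action_goal(request_type, **params):
    """Map a PODO-style motion request onto a generic action-style goal dict."""
    catalog = {
        "walk_to": {"action": "/hubo/walk",  "fields": ("x", "y", "yaw")},
        "grasp":   {"action": "/hubo/grasp", "fields": ("object_id",)},
    }
    spec = catalog[request_type]
    missing = [f for f in spec["fields"] if f not in params]
    if missing:
        raise ValueError(f"missing fields: {missing}")
    return {"action": spec["action"],
            "goal": {f: params[f] for f in spec["fields"]}}
```

The key design choice is the abstraction boundary: clients send goal-level requests, while inverse kinematics and trajectory generation stay hidden behind the bridge.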
Walking and Lifting a Box with Humanoid Platform
Time: 03/2018~06/2018 Simple footstep generation and box-lifting motion for DRC-Hubo. The detected box pose is mapped to walking velocity commands to approach the box; once within grasping range, an open-loop lifting motion is initiated. This project led to the initial effort of expanding the previous robot's software framework to integrate with ROS.
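Mapping the detected box pose to walking commands can be sketched as a simple proportional controller. Gains and the grasp-range threshold below are illustrative, not the values used on DRC-Hubo:

```python
import math

def walk_command(box_x, box_y, k_lin=0.5, k_ang=1.0, reach=0.6):
    """Map a detected box position (robot frame, meters) to a walking command.

    Returns (v_forward, w_turn, within_grasp_range); when the box is within
    reach, the robot stops walking and the lifting motion takes over.
    """
    dist = math.hypot(box_x, box_y)
    if dist < reach:
        return 0.0, 0.0, True              # stop; initiate open-loop lifting
    heading = math.atan2(box_y, box_x)     # bearing to the box
    return k_lin * dist, k_ang * heading, False
```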