Current Research

Knowledge Transfer and Task Modeling for Robot Manipulation Tasks

Robust methods for representing, generalizing, and sharing knowledge across different robotic systems and configurations are important in many domains of robotics research and application. In this project we present a framework for capturing robot capabilities and process specifications to simplify the sharing and reuse of knowledge between robots in manufacturing environments. A SysML model is developed that represents knowledge about system capabilities in the form of simple skills and skill primitives that can be reused in different situations or contexts. The videos shown here demonstrate how the model can be applied to different assembly tasks.

[Huckaby, et al., "Toward a Knowledge Transfer Framework for Process Abstraction in Manufacturing Robotics", ICML 2013 Workshop]
[Huckaby, et al., "A Taxonomic Framework for Task Modeling and Knowledge Transfer in Manufacturing Robotics", AAAI 2012 Workshop]
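The skill/skill-primitive decomposition described above can be sketched in code. The following is a hypothetical illustration of the structure only, not the actual SysML model; the primitive and skill names ("move", "Transport", etc.) are made up for the example:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass(frozen=True)
class SkillPrimitive:
    """An atomic, platform-level capability (e.g. move, grasp)."""
    name: str

@dataclass
class Skill:
    """A reusable capability composed of an ordered list of primitives."""
    name: str
    primitives: List[SkillPrimitive] = field(default_factory=list)

    def expand(self) -> List[str]:
        # Flatten the skill into its executable primitive sequence.
        return [p.name for p in self.primitives]

# A pick-and-place style skill built from shared, reusable primitives:
move = SkillPrimitive("move")
grasp = SkillPrimitive("grasp")
release = SkillPrimitive("release")

transport = Skill("Transport", [move, grasp, move, release])
print(transport.expand())  # ['move', 'grasp', 'move', 'release']
```

Because skills only reference primitives, swapping the primitive implementations is what allows the same skill definitions to be reused across tasks.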

SysML for Robotics

The Systems Modeling Language, or SysML, is a relatively recent language developed specifically to model the complex systems encountered in systems engineering. This project proposes the use of SysML for modeling robotic systems. SysML offers a number of advantages over other modeling options, including robust tools for model checking, verification, and automatic code generation. This work discusses SysML as an option for modeling general robotic systems, and applies it specifically to action and skill modeling of robots in manufacturing environments.

[Huckaby, et al., "A Case for SysML in Robotics", CASE 2014 - In Review]
[Huckaby, et al., "Modeling Robot Assembly Tasks in Manufacturing Using SysML", ISR 2014]

Knowledge Transfer Through Platform Abstraction

In this work we extend prior research on the SysML-based knowledge model for robots in manufacturing environments. While the knowledge model has previously been shown to abstract across different assembly tasks using the same manipulator, we now demonstrate that it can be used without modification to transfer knowledge across different robot platforms. Our demonstrator platform in this case is a Universal Robots UR-5 manipulator.

[Huckaby, et al., "Skill Abstraction in Robot Manufacturing Tasks", ISER 2014 - In Review]

Planning Using a Task Modeling Framework

In this project we present the idea that by using AI planning in concert with formal task modeling, the overhead associated with plan creation for complex tasks can be reduced. The proposed approach uses a SysML taxonomy to model the system capabilities and the process specification, and the PDDL planning language to find valid action sequences that satisfy the task objectives. This idea is applied to the manufacturing domain, and examples are shown modeling a multi-robot system in an automobile manufacturing environment.

[Huckaby, et al., "Planning with a Task Modeling Framework in Manufacturing Robotics", IROS 2013]
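As a rough illustration of how a modeled capability could be handed to a planner, the sketch below renders a PDDL action block from a skill description. The action name, parameters, and predicates here are assumptions for the example, not taken from the actual taxonomy:

```python
# Hypothetical mapping from a skill in the capability model to a PDDL
# :action block that an off-the-shelf planner can sequence.

def skill_to_pddl(name, params, preconditions, effects):
    """Render a single PDDL :action block from a skill description."""
    plist = " ".join(f"?{p} - {t}" for p, t in params)
    return (
        f"(:action {name}\n"
        f"  :parameters ({plist})\n"
        f"  :precondition (and {' '.join(preconditions)})\n"
        f"  :effect (and {' '.join(effects)}))"
    )

# Example: a "transport" skill moving a part to a station.
action = skill_to_pddl(
    "transport",
    [("r", "robot"), ("o", "part"), ("to", "station")],
    ["(holding ?r ?o)", "(reachable ?r ?to)"],
    ["(at ?o ?to)", "(not (holding ?r ?o))"],
)
print(action)
```

Generating the domain file from the model in this way keeps the planner's view of the world consistent with the capability taxonomy.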

Past Projects

Robotics for Biological Sampling

Currently, most work involving the collection and analysis of biological samples in the biomedical field is done manually, which is time-consuming and makes some tasks quite difficult. In this project, we proposed and implemented a robotic solution to this problem. Using an RGBD camera and point cloud visualization tools, we were able to identify specific points on the sample, have the robot autonomously take samples at those points, and analyze each sample using a mass spectrometer. This method allowed us to generate a topographical map of the sample, which could be useful for a number of applications (e.g., better-targeted cancer treatment).

[Bennett, et al., "Robotic Plasma Probe Ionization Mass Spectrometry (ROPPI-MS) of Non-Planar Surfaces", The Analyst 2014]
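The topographical-mapping step can be sketched as rasterizing the sample's point cloud into a grid of surface heights, from which probe target points on a non-planar surface can be chosen. The cell size and data below are assumptions for illustration, not the actual system's parameters:

```python
import numpy as np

def height_map(points, cell=0.005):
    """points: (N, 3) array in meters; returns a 2D grid of max heights."""
    xy = np.floor(points[:, :2] / cell).astype(int)
    xy -= xy.min(axis=0)                      # shift indices to start at 0
    grid = np.full(xy.max(axis=0) + 1, -np.inf)
    for (i, j), z in zip(xy, points[:, 2]):
        grid[i, j] = max(grid[i, j], z)       # keep highest point per cell
    return grid

# Three synthetic surface points over a 5 mm grid:
pts = np.array([[0.0, 0.0, 0.01], [0.004, 0.004, 0.02], [0.01, 0.0, 0.03]])
hm = height_map(pts)                          # 3x1 grid; empty cells stay -inf
```

Taking the maximum height per cell gives the outermost surface, which is what the probe needs to approach.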

Mobile Manipulation Using a Low-DOF Manipulator

This work explored the use of a low degree-of-freedom manipulator as a viable and cost-effective solution to the mobile manipulation problem in a home setting. The project utilized the Jeeves mobile manipulation platform. Experimental results showed that, with a single degree-of-freedom manipulator and the control scheme proposed and implemented on the Jeeves platform, we could achieve reliable and repeatable results for a number of varied household object types. Perception was performed using a 3D laser scanner to create a point cloud, from which tabletop objects could be segmented. Ten objects were selected to serve as representative categories of the general objects found in domestic environments.

[Huckaby, et al., "Mobile Manipulation in Domestic Environments Using a Low-DOF Manipulator", Technical Report]
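The tabletop segmentation step can be sketched as estimating the table plane from the dominant height in the cloud and keeping points sufficiently above it. This is a simplified stand-in for the actual pipeline; the bin size, margin, and synthetic cloud are assumptions:

```python
import numpy as np

def segment_objects(points, bin_size=0.01, margin=0.02):
    """points: (N, 3) array; returns the points lying above the table plane."""
    z = points[:, 2]
    # Histogram the heights: the fullest bin approximates the table surface.
    bins = np.floor(z / bin_size).astype(int)
    table_bin = np.bincount(bins - bins.min()).argmax() + bins.min()
    table_z = (table_bin + 0.5) * bin_size
    return points[z > table_z + margin]

# Synthetic scan: a noisy table at z ~ 0.70 m plus 20 object points above it.
rng = np.random.default_rng(0)
table = np.column_stack([rng.uniform(0, 1, (200, 2)),
                         rng.normal(0.70, 0.002, 200)])
objects = np.column_stack([rng.uniform(0, 1, (20, 2)),
                           rng.uniform(0.75, 0.85, 20)])
above = segment_objects(np.vstack([table, objects]))
```

Once the plane is removed, the remaining points can be clustered into individual object candidates for grasping.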

Networked Multi-Agent Control

This class project focused on the control of a simulated team of robots in a search and rescue operation. Controllers were designed to accommodate waypoint navigation of a team of up to twelve robots, team movement given limited communication distance and line of sight, navigation among complex and varied obstacles, ground coverage for search operations, split and merge operations, and achieving precise formations. In a competition measuring speed and accuracy, the controllers I designed for this project placed second in a class of 30+ students.
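The waypoint navigation behavior can be sketched as a speed-limited proportional controller for a point robot, with the team running one such loop per robot. The gains and limits here are assumptions for illustration, not the controllers used in the project:

```python
import numpy as np

def step_toward(pos, waypoint, gain=1.0, v_max=0.5, dt=0.1):
    """One control step for a point robot; returns the new position."""
    v = gain * (waypoint - pos)        # proportional velocity command
    speed = np.linalg.norm(v)
    if speed > v_max:
        v *= v_max / speed             # saturate at the velocity limit
    return pos + v * dt

# Drive a single robot to a waypoint, stopping inside a capture radius.
pos = np.array([0.0, 0.0])
goal = np.array([2.0, 0.0])
for _ in range(100):
    pos = step_toward(pos, goal)
    if np.linalg.norm(goal - pos) < 0.05:
        break
```

Switching to the next waypoint once inside the capture radius chains these steps into full-path navigation.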

GT@Home - RoboCup@Home Team

The Cognitive Robotics Lab at Georgia Tech participated in the 2010 RoboCup@Home competition in Singapore. The team, GT@Home, consisted of members from the lab and the mobile manipulation platform 'Jeeves'. Work was done to demonstrate navigation, SLAM in domestic environments, semantic scene understanding, object segmentation and recognition, person following, verbal human interaction, and mobile manipulation.

Dynamic Characterization of Light-Weight Robots

High-precision tasks are an important part of the manufacturing industry. With the recent availability of smaller, light-weight robot manipulators, the question arises whether it is feasible to use them in tasks that would otherwise be unsuitable for standard large industrial robots. For this to be possible, the light-weight robot would need to be stiff enough to meet safety constraints. To that end, this study used experimental modal analysis to determine the dynamic characteristics of small, light-weight manipulators and assess whether they could be used in such manufacturing tasks. Two light-weight robot manipulators were considered in the study: the KUKA Light-Weight Robot (LWR) and the KUKA KR5-sixx.

[Huckaby, et al., "Dynamic Characterization of KUKA Light-Weight Manipulators", Technical Report]
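The core idea of experimental modal analysis can be sketched on synthetic data: excite the structure, record the response, and read natural frequencies off the peaks of the response spectrum. The sample rate, mode frequency, and damping below are made-up values for the example:

```python
import numpy as np

fs = 1000.0                          # sample rate [Hz]
t = np.arange(0, 2.0, 1.0 / fs)
f_n, zeta = 12.0, 0.02               # synthetic mode: 12 Hz, 2% damping

# Decaying sinusoid standing in for a measured impulse response.
x = np.exp(-zeta * 2 * np.pi * f_n * t) * np.sin(2 * np.pi * f_n * t)

# The spectral peak locates the natural frequency of the mode.
spec = np.abs(np.fft.rfft(x))
freqs = np.fft.rfftfreq(len(x), 1.0 / fs)
peak = freqs[spec.argmax()]
print(f"estimated natural frequency: {peak:.1f} Hz")
```

In the actual study, an accelerometer response to a hammer impact would play the role of the synthetic signal, with one peak per structural mode.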

Georgia Tech PSA Design

I worked with several members of the lab to design the robot effects (physical set, robot choreography, etc.) for Georgia Tech's 2009 public service announcement (PSA). A new commercial is produced annually and is often broadcast along with televised sporting events. The PSA went on to win an Emmy Award for "Best Television Commercial Produced in the Southeast." The first video showcases the PSA itself, while the second, longer video shows more footage of the robot setup.

Semantic Perception for Mobile Manipulation

Personal service robots will need to understand semantic object relationships and task context in order to assist humans in their everyday lives. This project demonstrated a technique using keywords, spatial relationships, colors, and other contextual information to assist in mobile manipulation and object recognition tasks. Preliminary results using a mobile manipulation platform were also presented.

[Choi, et al., "Toward Semantic Perception for Mobile Manipulation", IROS 2009 Workshop]
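One way to picture the technique is as a weighted combination of keyword, color, and spatial-relation cues into a single match score per object candidate. The cue weights, field names, and candidates below are hypothetical, not the method from the paper:

```python
def context_score(candidate, query, weights=(0.5, 0.3, 0.2)):
    """Weighted sum of binary keyword, color, and spatial-relation matches."""
    w_kw, w_color, w_rel = weights
    kw = query["keyword"] in candidate["labels"]
    color = candidate.get("color") == query.get("color")
    rel = query.get("near") in candidate.get("nearby", [])
    return w_kw * kw + w_color * color + w_rel * rel

# "Find the red cup near the table" against two detected candidates:
query = {"keyword": "cup", "color": "red", "near": "table"}
candidates = [
    {"labels": ["bowl"], "color": "red", "nearby": ["shelf"]},
    {"labels": ["cup", "mug"], "color": "red", "nearby": ["table"]},
]
best = max(candidates, key=lambda c: context_score(c, query))
```

Combining cues this way lets weak individual signals (e.g. color alone) disambiguate between otherwise similar candidates.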

Simulation for Mobile Manipulation

This class project utilized the srLib rigid multi-body dynamics simulation package to develop two main components: a dynamically balancing mobile platform and a seven-link manipulator. To demonstrate the project, we simulated a mobile manipulator combining our dynamically balancing platform with our seven-link arm. We used this platform for a fetch-and-carry task, and were able to successfully move the platform, grasp an object, and deliver the object to a secondary location.
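The dynamic-balancing component can be sketched as a PD controller stabilizing a linearized inverted pendulum. The gains, pendulum parameters, and Euler integration below are assumptions for illustration, not the srLib simulation itself:

```python
g, l, dt = 9.81, 0.5, 0.001         # gravity, pendulum length, timestep
kp, kd = 60.0, 10.0                 # PD gains (hand-tuned for this sketch)

theta, omega = 0.2, 0.0             # initial tilt [rad] and tilt rate
for _ in range(5000):               # simulate 5 seconds
    u = -kp * theta - kd * omega    # stabilizing control acceleration
    alpha = (g / l) * theta + u     # linearized pendulum + control input
    omega += alpha * dt             # forward-Euler integration
    theta += omega * dt
print(f"final tilt: {theta:.4f} rad")
```

For the controller to stabilize the system, the proportional gain must exceed the destabilizing gravity term g/l; here 60.0 > 19.62, so the tilt decays to zero.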

National Robotics Roadmap

I spent the summer of 2008 helping Dr. Henrik Christensen administer a series of workshops designed to formulate a long-term research roadmap for robotics in the United States. Each workshop brought together leaders from both academia and industry in targeted areas (manufacturing, healthcare, service, emerging technologies) to discuss business drivers, missing competencies, and research directions over 5-, 10-, and 15-year time frames. The resulting roadmap was presented to Congress in 2009 and was one motivation for the National Robotics Initiative. The roadmap was updated in 2013 and presented again to Congress in October of that year.

DARPA Urban Challenge

I spent one year working on the BYU team competing in the DARPA Urban Challenge, a competition aimed at developing cars that can drive autonomously in an urban environment. I served as the undergraduate leader of the team's controls group, and was responsible for leading a group of students in designing and implementing the vehicle drive and correction control systems.