Robots are fundamentally different from traditional information processing systems. The physical capabilities of robots amplify the cost of software flaws from inconvenience to potentially severe property damage and personal injury. Traditional software development practices routinely, even expectedly, produce systems with numerous flaws. Thus, we must adopt more rigorous approaches to building safe and reliable robots.
We perform a visual analysis to extract a task description from a human demonstration, then transfer the resulting description to a robot.
The Motion Grammar is a formal model for robot control: the robot's behavior is specified as a context-free grammar whose tokens abstract sensor readings and whose productions select control actions. We demonstrate this method through the physical human-robot games of Yamakuzushi and Chess.
To produce correctly operating robotic systems, we need a way to modify the system dynamics to achieve the desired behavior. To help automate this derivation, we introduce the Motion Grammar Calculus, a set of rewrite rules for Context-Free Hybrid Systems based on the Motion Grammar.
Ach is a message-passing IPC library implemented over circular buffers in shared memory. It is well suited to real-time control because it does not suffer from head-of-line blocking: a reader can always retrieve the newest message rather than waiting behind a backlog of stale ones. The library has been formally verified with a SPIN model. More information is available on its wiki page.
Neil Dantam received an Academic Honors Diploma from the Indiana Academy for Science, Mathematics, and Humanities in May 2004. He studied Mechanical Engineering and Computer Science at Purdue University, receiving a Bachelor of Science in each and a minor in Economics in May 2008. While at Purdue, Neil also worked with Dr. Antony Hosking on multi-core garbage collection for Modula-3 and with Dr. Monika Ivantysynova and Dr. Peter Meckl on a control system for the Purdue Hydraulic Car.
Neil has worked as an intern at Delaware Machinery, ContactSul, Raytheon, C-SPAN Archives, Lincoln Laboratory, and iRobot Research, in the areas of web development, software engineering, and robotics.
In Fall 2008, Neil began his Ph.D. in Robotics at the Georgia Tech Humanoid Robotics Lab, directed by Prof. Mike Stilman. In 2010, Neil began developing the Motion Grammar, a new approach to robot perception, planning, and control that combines powerful results from modern control theory with language and automata theory to provide both an expressive programming representation and formal performance guarantees. Work is ongoing, with many exciting results to present.