Robotics

Dexterous Manipulation

Our engineers are lead designers on the NASA Robonaut humanoid robot project. This includes the design of Robonaut's manipulators, end effectors, computer vision system, and autonomy software. Robonaut requires precise kinematic control of its high degree-of-freedom manipulators, together with closed-loop control of those manipulators.

Robotic Dexterity
Dexterity allows a robot to place its manipulator in a wide variety of configurations. Because the number of possible configurations is enormous, choosing the right one is a difficult task. We develop software that automates control of even the most complex manipulators.
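
As a rough illustration of the configuration-selection problem, the sketch below (not TRACLabs' actual software) scores candidate joint configurations for a hypothetical 7-DOF arm and keeps the valid one requiring the least joint motion; the joint limits and selection criterion are assumptions made for the example.

```python
import numpy as np

# Hypothetical 7-DOF arm: joint limits in radians (lower, upper).
JOINT_LIMITS = np.array([[-2.9, 2.9]] * 7)

def within_limits(q, limits=JOINT_LIMITS):
    """True if every joint angle lies inside its limit range."""
    return np.all((q >= limits[:, 0]) & (q <= limits[:, 1]))

def select_configuration(candidates, q_current):
    """Pick the candidate joint vector that is valid and requires
    the least joint motion from the current configuration."""
    valid = [q for q in candidates if within_limits(q)]
    if not valid:
        raise ValueError("no candidate satisfies the joint limits")
    return min(valid, key=lambda q: np.linalg.norm(q - q_current))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    q_now = np.zeros(7)
    # Pretend an IK solver returned many equally valid solutions.
    ik_solutions = [rng.uniform(-3.2, 3.2, size=7) for _ in range(20)]
    best = select_configuration(ik_solutions, q_now)
    print("selected configuration:", np.round(best, 2))
```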

Mobile Manipulation
Our engineers work on a variety of NASA robots that combine mobility with dexterous manipulation. This includes coordinating the robot's planar degrees of freedom with its manipulation capabilities. Mobile manipulation requires careful integration of a variety of robotic hardware and software to accomplish sophisticated tasks.
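
The simplified sketch below illustrates one way a planner might split a planar reach between a mobile base and an arm; the fixed reach radius, planar-only base, and drive-then-reach strategy are assumptions made for illustration, not NASA or TRACLabs software.

```python
import math

ARM_REACH_M = 0.8  # assumed maximum arm reach from the base frame

def plan_base_then_arm(base_xy, target_xy, reach=ARM_REACH_M):
    """Split a planar reach between the mobile base and the arm:
    drive the base only as far as needed to bring the target
    within arm reach, then let the arm cover the remainder."""
    dx, dy = target_xy[0] - base_xy[0], target_xy[1] - base_xy[1]
    dist = math.hypot(dx, dy)
    if dist <= reach:
        return base_xy, (dx, dy)          # the arm alone can reach
    scale = (dist - reach) / dist         # fraction of the gap the base covers
    base_goal = (base_xy[0] + scale * dx, base_xy[1] + scale * dy)
    arm_offset = (target_xy[0] - base_goal[0], target_xy[1] - base_goal[1])
    return base_goal, arm_offset

if __name__ == "__main__":
    base_goal, arm_offset = plan_base_then_arm((0.0, 0.0), (2.0, 1.0))
    print("drive base to:", base_goal)
    print("then reach with arm offset:", arm_offset)
```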

Computer Vision

TRACLabs engineers have decades of experience in developing and fielding computer vision systems for robot applications. These vision systems have been used to track people, recognize their gestures, identify objects, return the six degree-of-freedom pose of objects, and perform terrain analysis for obstacle avoidance and path planning. Our computer vision software runs in real time using range images from LIDAR or stereo cameras.

Object Recognition
Object recognition is one of the most difficult computational problems for robots. The nearly infinite variety of objects in all kinds of poses makes recognizing objects from camera images difficult. TRACLabs researchers are tackling this problem by applying a sequence of specialized processing filters to 3D range data to extract object type, location, and orientation.
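
To give a flavor of such a filter pipeline, the toy example below runs a synthetic point cloud through a workspace crop, ground removal, and a centroid-plus-principal-axes pose estimate; the specific filters and the synthetic scene are illustrative stand-ins, not TRACLabs' vision stack.

```python
import numpy as np

def crop_workspace(points, bounds):
    """Keep only points inside an axis-aligned workspace box."""
    lo, hi = bounds
    mask = np.all((points >= lo) & (points <= hi), axis=1)
    return points[mask]

def remove_ground(points, ground_z=0.02):
    """Drop points at or below an assumed flat ground plane."""
    return points[points[:, 2] > ground_z]

def estimate_pose(points):
    """Estimate object location (centroid) and orientation
    (principal axes from PCA of the remaining points)."""
    centroid = points.mean(axis=0)
    _, _, axes = np.linalg.svd(points - centroid, full_matrices=False)
    return centroid, axes  # rows of axes are the principal directions

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # Synthetic scene: a flat floor plus one small box-shaped object.
    floor = rng.uniform([-1, -1, 0.0], [1, 1, 0.01], size=(2000, 3))
    box = rng.uniform([0.2, 0.1, 0.02], [0.4, 0.2, 0.15], size=(500, 3))
    cloud = np.vstack([floor, box])

    filtered = remove_ground(crop_workspace(cloud, ([-1, -1, 0], [1, 1, 1])))
    position, orientation = estimate_pose(filtered)
    print("object centroid:", np.round(position, 3))
    print("principal axes:\n", np.round(orientation, 2))
```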

Multi-Robot Coordination

For many applications, a robot team can accomplish tasks much more quickly and effectively than a single robot. As in human teams, tasks and roles are divided up and assigned to specialist robots, then executed concurrently to save time.

Multi-Robot Coordination
TRACLabs engineers develop frameworks that allow multiple robots with different capabilities to work together to solve complex problems quickly. These tasks require coordinating heterogeneous hardware and software systems to accomplish applications such as structure assembly or automobile manufacturing.

Playing Roles
Multiple robots must interact with one another and assume specific roles and responsibilities as members of a team. Our planning and execution software assigns roles to robots and coordinates their activities to accomplish these tasks. Just as with people, roles are assigned to robots based on their capabilities.
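
The sketch below shows one simple way roles might be matched to robots by capability; the robot names, capability sets, and greedy matching strategy are hypothetical and are not drawn from TRACLabs' planning and execution software.

```python
# Minimal sketch of capability-based role assignment (illustrative only).
ROBOTS = {
    "lifter_1": {"lift_heavy", "navigate"},
    "welder_1": {"weld", "navigate"},
    "scout_1": {"navigate", "map"},
}

ROLES = {
    "move_beam": {"lift_heavy"},
    "join_beam": {"weld"},
    "survey_site": {"map"},
}

def assign_roles(robots, roles):
    """Greedily match each role to an unassigned robot whose
    capabilities cover the role's requirements."""
    assignments, free = {}, dict(robots)
    for role, needed in roles.items():
        for name, caps in free.items():
            if needed <= caps:
                assignments[role] = name
                del free[name]
                break
        else:
            assignments[role] = None  # no capable robot left
    return assignments

if __name__ == "__main__":
    for role, robot in assign_roles(ROBOTS, ROLES).items():
        print(f"{role} -> {robot}")
```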

Human-Robot Interaction

Developing human-robot interaction software for NASA robots over the past decade has been a cornerstone of our work. In space and other restrictive environments, it is critical that astronauts can work seamlessly with their robotic counterparts.

A Spaceman's Best Friend
This area of robotics involves not only side-by-side human-robot interaction to perform joint tasks, but also remote interaction via an operator control unit (OCU). Adjustable autonomy, a term coined by TRACLabs engineers, allows for both teleoperation of robots and supervision of autonomous robots, as well as any mode in between.
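
As a schematic of how an OCU might arbitrate commands across autonomy levels, the sketch below selects or blends between operator and autonomous velocity commands; the level names and blending rule are assumptions, not TRACLabs' adjustable-autonomy implementation.

```python
from enum import Enum

class AutonomyLevel(Enum):
    TELEOPERATION = 0   # operator commands pass through directly
    SHARED = 1          # operator input blended with the autonomous plan
    SUPERVISED = 2      # robot acts autonomously; operator can override

def arbitrate(level, operator_cmd, autonomous_cmd, blend=0.5):
    """Choose or blend velocity commands based on the autonomy level.
    Commands are (linear, angular) velocity tuples."""
    if level is AutonomyLevel.TELEOPERATION:
        return operator_cmd
    if level is AutonomyLevel.SHARED:
        return tuple(blend * o + (1 - blend) * a
                     for o, a in zip(operator_cmd, autonomous_cmd))
    # SUPERVISED: follow the autonomous plan unless the operator intervenes.
    return operator_cmd if any(operator_cmd) else autonomous_cmd

if __name__ == "__main__":
    op, auto = (0.2, 0.0), (0.5, 0.1)
    for level in AutonomyLevel:
        print(level.name, "->", arbitrate(level, op, auto))
```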

Robots operating far from human operators (for example, on the Moon) need special software for summarizing their activities and presenting those summaries to operators. TRACLabs is developing such software under a NASA grant. This capability also applies to the operation of unmanned vehicles and drones in the military and private sectors.

For more information, please visit www.traclabs.com.