Sensorimotor Integration in Robots and Animals: Signals, Geometry and Mechanics

Noah J. Cowan, LIMBS Laboratory, Johns Hopkins University

Animals execute split-second maneuvers to avoid obstacles, catch prey, and evade predators, all while integrating information from thousands of sensors. Understanding how animal sensory systems achieve this extraordinary closed-loop performance, and designing sensor systems that allow robots to match it, require integration along three conceptual axes: spatiotemporal signal processing, geometry, and mechanics. Toward this long-term goal, I will describe several robotic and biological systems that highlight different combinations of these axes.

Signal processing and mechanics. Wall following in cockroaches and refuge tracking in weakly electric fish both reveal that the mechanics of a locomotor task are encoded at the earliest stages of neural processing.
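For the wall-following case, published work from this line of research found the behavior consistent with proportional-derivative (PD) feedback on antenna-measured distance to the wall. The toy model below illustrates that control structure only; the unicycle abstraction, gains, and speeds are illustrative assumptions, not fitted values from the talk:

```python
import numpy as np

def simulate_wall_following(d0=1.5, d_ref=1.0, v=1.0,
                            kp=2.0, kd=3.0, dt=0.01, steps=2000):
    """Planar unicycle running alongside a wall (the x-axis) at speed v.

    The steering rate is PD feedback on the lateral offset from the
    desired wall distance d_ref, mimicking antenna-based distance
    sensing. Gains kp, kd are illustrative, not fitted to cockroach data.
    """
    e, theta = d0 - d_ref, 0.0             # offset error and heading
    for _ in range(steps):
        e_dot = v * np.sin(theta)          # rate of change of the offset
        omega = -kp * e - kd * e_dot       # PD steering command
        theta += omega * dt                # explicit Euler integration
        e += e_dot * dt
    return e + d_ref                       # final distance to the wall

print(round(simulate_wall_following(), 3))  # prints 1.0
```

With positive gains the linearized error dynamics are a stable second-order system, so the simulated runner settles onto the commanded wall distance.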

Mechanics and geometry. Mapping rigid-body motions onto the camera image plane provides image-based generalized coordinates for the mechanical system.
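One way to see how image-plane measurements can serve as generalized coordinates: compose the rigid configuration with a pinhole projection of body-fixed feature points, and check that the resulting map is locally a diffeomorphism (full-rank Jacobian). The sketch below makes illustrative assumptions not taken from the talk: planar motion in a plane parallel to the image, a unit focal length, and an arbitrary two-point feature set.

```python
import numpy as np

def project(points, f=1.0):
    """Pinhole projection of 3-D points (one per row) onto the image
    plane: (u, v) = f * (x / z, y / z)."""
    pts = np.asarray(points, dtype=float)
    return f * pts[:, :2] / pts[:, 2:3]

def image_coordinates(q, body_points, f=1.0):
    """Map a planar rigid configuration q = (x, y, theta) -- translation
    parallel to the image plane, rotation about the optical axis -- to
    the stacked image coordinates of the body's feature points.

    Where this map has a full-rank Jacobian, the image coordinates are
    legitimate (local) generalized coordinates for the mechanical system.
    """
    x, y, theta = q
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[c, -s, 0.0],
                  [s,  c, 0.0],
                  [0.0, 0.0, 1.0]])
    world = body_points @ R.T + np.array([x, y, 0.0])
    return project(world, f).ravel()

# two body-fixed feature points at depth z = 2 (arbitrary example)
pts = np.array([[0.5, 0.0, 2.0], [-0.5, 0.3, 2.0]])
print(image_coordinates(np.array([0.1, -0.2, 0.3]), pts))
```

A finite-difference check confirms the 4-by-3 Jacobian of this map has rank 3 at a generic configuration, matching the three degrees of freedom of the planar rigid body.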

Geometry and signal processing. Spatial sampling kernels provide featureless "hooks" for sensor-based control using natural images.
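The core idea is that the controller's measurement can be an inner product of the raw image with a spatial kernel, so no corners, edges, or other features need to be extracted. A minimal sketch under stated assumptions (an isotropic Gaussian kernel, a synthetic blob image standing in for a natural image, and gradient ascent as the recentering rule, none of which are specified in the abstract):

```python
import numpy as np

def gaussian_kernel(shape, center, sigma=5.0):
    """Isotropic Gaussian spatial sampling kernel on the pixel grid."""
    yy, xx = np.mgrid[0:shape[0], 0:shape[1]]
    return np.exp(-((xx - center[0]) ** 2 + (yy - center[1]) ** 2)
                  / (2.0 * sigma ** 2))

def kernel_measurement(image, center, sigma=5.0):
    """'Featureless' measurement: the inner product of the raw image
    with a sampling kernel placed at `center`."""
    return float(np.sum(gaussian_kernel(image.shape, center, sigma) * image))

def track(image, center, sigma=5.0, step=0.5, iters=60):
    """Recenter the kernel by gradient ascent on the measurement,
    estimating the gradient with central finite differences."""
    c = np.asarray(center, dtype=float)
    for _ in range(iters):
        g = np.array([kernel_measurement(image, c + 0.5 * e, sigma)
                      - kernel_measurement(image, c - 0.5 * e, sigma)
                      for e in np.eye(2)])
        c += step * g
    return c

# synthetic stand-in for a natural image: one bright blob at (40, 25)
img = gaussian_kernel((64, 64), (40, 25), sigma=5.0)
print(track(img, (30.0, 30.0)))  # converges near (40, 25)
```

Because the measurement varies smoothly with the kernel placement, the same scalar signal that requires no feature extraction also supplies a usable feedback gradient.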


Noah J. Cowan received the B.S. degree from the Ohio State University, Columbus, in 1995, and the M.S. and Ph.D. degrees from the University of Michigan, Ann Arbor, in 1997 and 2001, respectively, all in electrical engineering. He was a postdoctoral fellow in the PolyPEDAL laboratory at the University of California, Berkeley, before joining the Johns Hopkins University faculty in 2003, where he is now an Assistant Professor in the Department of Mechanical Engineering. Prof. Cowan directs the Locomotion In Mechanical and Biological Systems (LIMBS) Laboratory; his research interests include sensor-based control in robotics and biology, as well as medical robotics.
