Charles M Higgins

Associate Professor, Neuroscience
Associate Professor, Neuroscience - GIDP
Associate Professor, Applied Mathematics - GIDP
Associate Professor, Electrical and Computer Engineering
Associate Professor, Entomology / Insect Science - GIDP
Associate Professor, BIO5 Institute
Contact
(520) 621-6604

Research Interest

Charles Higgins, PhD, is an Associate Professor in the Department of Neuroscience with a dual appointment in Electrical Engineering at the University of Arizona, where he also leads the Higgins Lab. Though he started his career as an electrical engineer, his fascination with the natural world has led him to study insect vision and visual processing while working to meld the worlds of robotics and biology. His research ranges from software simulations of brain circuits to interfacing live insect brains with robots, but his driving interest remains building truly intelligent machines.

Dr. Higgins' lab conducts research in areas ranging from computational neuroscience to biologically inspired engineering. The unifying goal of these projects is to understand the representations and computational architectures used by biological systems. The projects are conducted in close collaboration with neurobiology laboratories that perform anatomical, electrophysiological, and histological studies, mostly in insects.

More than three years ago, he captured news headlines when he and his lab team demonstrated a robot they built that was guided by the brain and eyes of a moth. The moth, immobilized inside a plastic tube, was mounted on a six-inch-tall wheeled robot: when the moth moved its eyes to the right, the robot turned in that direction, demonstrating brain-machine interaction. While the demonstration was effective, the methodology made it difficult to keep the electrodes attached to the moth's brain while the robot was in motion, a problem that has since led him to focus his work on another insect species.

Publications

Rivera-Alvidrez, Z., Lin, I., & Higgins, C. M. (2011). A neuronally based model of contrast gain adaptation in fly motion vision. Visual Neuroscience, 28(5).

Motion-sensitive neurons in the visual systems of many species, including humans, exhibit a depression of motion responses immediately after being exposed to rapidly moving images. This motion adaptation has been extensively studied in flies, but a neuronal mechanism that explains the most prominent component of adaptation, which occurs regardless of the direction of motion of the visual stimulus, has yet to be proposed. We identify a neuronal mechanism, namely frequency-dependent synaptic depression, which explains a number of the features of adaptation in mammalian motion-sensitive neurons and use it to model fly motion adaptation. While synaptic depression has been studied mainly in spiking cells, we use the same principles to develop a simple model for depression in a graded synapse. By incorporating this synaptic model into a neuronally based model for elementary motion detection, along with the implementation of a center-surround spatial band-pass filtering stage that mimics the interactions among a subset of visual neurons, we show that we can predict with remarkable success most of the qualitative features of adaptation observed in electrophysiological experiments. Our results support the idea that diverse species share common computational principles for processing visual motion and suggest that such principles could be neuronally implemented in very similar ways.
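The frequency-dependent synaptic depression invoked here can be illustrated with a few lines of code. The sketch below is a generic resource-depletion model of a depressing graded synapse (Tsodyks-Markram style dynamics); the parameter names and values are illustrative assumptions, not the paper's actual model.

```python
# Minimal sketch of frequency-dependent synaptic depression in a graded
# synapse (generic resource-depletion dynamics; illustrative only).
# A pool of synaptic resources `r` recovers toward 1 with time constant
# tau_rec and is consumed in proportion to the graded presynaptic activity.

def depressing_synapse(presyn, dt=1e-3, tau_rec=0.5, use=0.5):
    """Return the depressed synaptic output for a presynaptic signal.

    presyn : iterable of presynaptic activities in [0, 1], one per time step
    """
    r = 1.0          # available synaptic resources (fraction)
    out = []
    for a in presyn:
        release = use * a * r                    # transmitter released now
        r += dt * (1.0 - r) / tau_rec - release  # recovery minus depletion
        r = max(0.0, min(1.0, r))
        out.append(release)
    return out

# A sustained strong input depresses the response over time:
resp = depressing_synapse([1.0] * 2000)
# the initial response is large; the steady-state response is much smaller
```

Sustained high-frequency drive depletes the resource pool faster than it recovers, so the output sags, which is the qualitative signature of the adaptation described above.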

Higgins, C. M., & Pant, V. (2004). A biomimetic VLSI sensor for visual tracking of small moving targets. IEEE Transactions on Circuits and Systems I: Regular Papers, 51(12), 2384-2394.

Abstract:

Taking inspiration from the visual system of the fly, we describe and characterize a monolithic analog very large-scale integration sensor, which produces control signals appropriate for the guidance of an autonomous robot to visually track a small moving target. This sensor is specifically designed to allow such tracking even from a moving imaging platform which experiences complex background optical flow patterns. Based on relative visual motion of the target and background, the computational model implemented by this sensor emphasizes any small-field motion which is inconsistent with the wide-field background motion. © 2004 IEEE.
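The small-field/wide-field opponency described above can be caricatured in software. The sketch below is a drastically simplified, one-dimensional software analogue (not the chip's analog circuitry): it suppresses local motion that is consistent with the wide-field background estimate, so that a small target moving against the background stands out.

```python
import numpy as np

def small_field_emphasis(local_motion):
    """Emphasize small-field motion inconsistent with the background.

    local_motion : 1D array of local motion estimates across the visual field.
    The wide-field background motion is estimated as the mean; the response
    is the deviation of each local estimate from that background.
    """
    wide_field = np.mean(local_motion)         # background motion estimate
    return np.abs(local_motion - wide_field)   # inconsistency with background

# A single small target moving against a uniformly drifting background:
field = np.full(16, 1.0)   # background optical flow from self-motion
field[7] = -2.0            # small target moving oppositely
resp = small_field_emphasis(field)
# the response peaks at the target's position (index 7)
```

Because the background term is subtracted before rectification, a uniformly moving scene (as seen from a moving platform) produces little response, while the target's inconsistent motion is emphasized.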

Higgins, C. M., & Koch, C. (2000). Modular multi-chip neuromorphic architecture for real-time visual motion processing. Analog Integrated Circuits and Signal Processing, 24(3), 195-211.

Abstract:

The extent of pixel-parallel focal plane image processing is limited by pixel area and imager fill factor. In this paper, we describe a novel multi-chip neuromorphic VLSI visual motion processing system which combines analog circuitry with an asynchronous digital interchip communications protocol to allow more complex pixel-parallel motion processing than is possible in the focal plane. This multi-chip system retains the primary advantages of focal plane neuromorphic image processors: low-power consumption, continuous-time operation, and small size. The two basic VLSI building blocks are a photosensitive sender chip which incorporates a 2D imager array and transmits the position of moving spatial edges, and a receiver chip which computes a 2D optical flow vector field from the edge information. The elementary two-chip motion processing system consisting of a single sender and receiver is first characterized. Subsequently, two three-chip motion processing systems are described. The first three-chip system uses two sender chips to compute the presence of motion only at a particular stereoscopic depth from the imagers. The second three-chip system uses two receivers to simultaneously compute a linear and polar topographic mapping of the image plane, resulting in information about image translation, rotation, and expansion. These three-chip systems demonstrate the modularity and flexibility of the multi-chip neuromorphic approach.
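The sender/receiver division of labor can be sketched in a few lines. The toy below is an illustrative 1-D software analogue of the idea, not the chips' actual asynchronous address-event protocol: the "sender" emits (position, time) events as an edge crosses pixels, and the "receiver" estimates velocity from the time of flight between adjacent pixels.

```python
# Toy 1-D sketch of the sender/receiver architecture (illustrative only;
# the real system uses an asynchronous digital interchip AER protocol).

def sender_events(edge_positions, times):
    """Sender: emit (pixel position, timestamp) events for a moving edge."""
    return list(zip(edge_positions, times))

def receiver_velocity(events):
    """Receiver: estimate 1D velocity (pixels/second) from the last two
    edge events via time of flight between adjacent pixels."""
    (x0, t0), (x1, t1) = events[-2:]
    return (x1 - x0) / (t1 - t0)

# An edge crosses pixel 3 at t=0 s and pixel 4 at t=0.05 s:
events = sender_events([3, 4], [0.00, 0.05])
v = receiver_velocity(events)   # 1 pixel / 0.05 s = 20 pixels/s
```

Separating imaging (sender) from motion computation (receiver) is what frees the receiver's pixel-parallel circuitry from the fill-factor constraints of the focal plane, as described above.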

Melano, T., & Higgins, C. M. (2005). The neuronal basis of direction selectivity in lobula plate tangential cells. Neurocomputing, 65-66(SPEC. ISS.), 153-159.

Abstract:

Using a neuronally based computational model of the fly's visual elementary motion detection (EMD) system, the effects of picrotoxin, a GABA receptor antagonist, were modeled to investigate the role of various GABAergic cells in direction selectivity. By comparing the results of our simulation of an anatomically correct model to previously published electrophysiological results, this study supports the hypothesis that EMD outputs integrated into tangential cells are weakly directional, although the tangential cells themselves respond to moving stimuli in a strongly directional manner. © 2004 Published by Elsevier B.V.
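The elementary motion detector (EMD) underlying this model is the classic Hassenstein-Reichardt correlator. The sketch below is a minimal software version (parameters and signals are illustrative, not the paper's simulation): each photoreceptor signal is correlated with a delayed copy of its neighbor, and the opponent subtraction yields a direction-selective output.

```python
import math

def reichardt_emd(a, b, delay):
    """Hassenstein-Reichardt elementary motion detector.

    Output at each step: a(t-delay)*b(t) - a(t)*b(t-delay);
    positive for motion from receptor A toward receptor B.
    """
    return [a[t - delay] * b[t] - a[t] * b[t - delay]
            for t in range(delay, len(a))]

# Rightward motion: receptor B sees the same sinusoid as A, `delay` steps later.
n, delay = 200, 5
a = [math.sin(0.2 * t) for t in range(n)]
b = [math.sin(0.2 * (t - delay)) for t in range(n)]
rightward = sum(reichardt_emd(a, b, delay))   # preferred direction: positive
leftward = sum(reichardt_emd(b, a, delay))    # null direction: negative
```

A single correlator of this kind is only weakly directional for many stimuli; the abstract's point is that strong direction selectivity in tangential cells emerges downstream, shaped in part by GABAergic inhibition.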

Pant, V., & Higgins, C. M. (2012). Tracking improves performance of biological collision avoidance models. Biological Cybernetics, 106(4-5), 307-322.

PMID: 22744199

Abstract:

Collision avoidance models derived from the study of insect brains do not perform universally well in practical collision scenarios, although the insects themselves may perform well in similar situations. In this article, we present a detailed simulation analysis of two well-known collision avoidance models and illustrate their limitations. In doing so, we present a novel continuous-time implementation of a neuronally based collision avoidance model. We then show that visual tracking can improve the performance of these models by allowing a relative computation of the distance between the obstacle and the observer. We compare the results of simulations of the two models with and without tracking to show how tracking improves the ability of the model to detect an imminent collision. We present an implementation of one of these models processing imagery from a camera to show how it performs in real-world scenarios. These results suggest that insects may track looming objects with their gaze. © The Author(s) 2012.
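The looming cue at the heart of such collision avoidance models can be illustrated with the classic tau (time-to-collision) approximation: for an approaching object, remaining time to contact is roughly the retinal angular size divided by its rate of growth. The sketch below is illustrative (the object size, speed, and sampling interval are assumed values, not from the paper).

```python
import math

def angular_size(radius, distance):
    """Angle (radians) subtended on the retina by an object of given radius."""
    return 2.0 * math.atan(radius / distance)

def time_to_collision(theta, theta_dot):
    """Tau approximation: remaining time to contact ~ theta / (d theta / dt)."""
    return theta / theta_dot

# Object of radius 0.5 m approaching at 10 m/s, sampled 0.1 s apart:
d0, v, dt = 20.0, 10.0, 0.1
th0 = angular_size(0.5, d0)            # angular size at first sample
th1 = angular_size(0.5, d0 - v * dt)   # angular size 0.1 s later
tau = time_to_collision(th1, (th1 - th0) / dt)   # close to the true 1.9 s left
```

Note that tau requires no knowledge of the object's absolute size or distance, which is exactly why purely visual estimates degrade in cluttered real-world scenes and why tracking the looming object, as the abstract argues, helps recover relative distance information.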