Researchers unveil neuromorphic exposure control system to improve machine vision in extreme lighting environments
- Date:
- March 4, 2025
- Source:
- The University of Hong Kong
- Summary:
- A research team has recently developed a groundbreaking neuromorphic exposure control (NEC) system that revolutionizes machine vision under extreme lighting variations. This biologically inspired system mimics human peripheral vision to achieve unprecedented speed and robustness in dynamic perception environments.
A research team led by Professor Jia Pan and Professor Yifan Evan Peng from the Department of Computer Science and the Department of Electrical & Electronic Engineering under the Faculty of Engineering at the University of Hong Kong (HKU), in collaboration with researchers at the Australian National University, has recently developed a groundbreaking neuromorphic exposure control (NEC) system that revolutionizes machine vision under extreme lighting variations. Published in Nature Communications, this biologically inspired system mimics human peripheral vision to achieve unprecedented speed and robustness in dynamic perception environments.
Traditional automatic exposure (AE) systems rely on iterative image feedback, creating a chicken-and-egg dilemma that fails under sudden brightness shifts (e.g., tunnels, glare). The NEC system solves this by integrating event cameras -- sensors that capture per-pixel brightness changes as asynchronous "events" -- with a novel Trilinear Event Double Integral (TEDI) algorithm. The approach processes 130 million events per second on a single CPU, enabling edge deployment.
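The TEDI algorithm itself is detailed in the paper; as a loose illustration of the general idea only (not the authors' method), the sketch below shows how a feed-forward controller could use an event stream to adjust exposure without waiting for image feedback. Each event signals a fixed step in log-intensity, so summing event polarities approximates the scene's brightness change, and exposure can be scaled inversely before the next frame is captured. All names and the contrast-threshold value here are illustrative assumptions.

```python
import math
from dataclasses import dataclass


@dataclass
class Event:
    """One event-camera event: a pixel whose log-brightness crossed a threshold."""
    x: int
    y: int
    t: float       # timestamp in seconds
    polarity: int  # +1 = got brighter, -1 = got darker


def estimate_log_intensity_change(events, contrast_threshold=0.2):
    """Approximate the total log-brightness change of the scene.

    Each event corresponds to a fixed log-intensity step of size
    `contrast_threshold` (an assumed sensor parameter), so summing
    signed polarities recovers an estimate of the overall change.
    """
    return contrast_threshold * sum(e.polarity for e in events)


def update_exposure(current_exposure, events, contrast_threshold=0.2):
    """Feed-forward exposure update: no iterative image feedback needed.

    If the scene brightened (positive change), shorten the exposure
    time by the corresponding factor in linear-intensity space.
    """
    delta_log_i = estimate_log_intensity_change(events, contrast_threshold)
    return current_exposure * math.exp(-delta_log_i)


# Example: a burst of positive events (e.g., exiting a tunnel into sunlight)
# should immediately shorten the exposure.
burst = [Event(x=0, y=0, t=0.001 * i, polarity=+1) for i in range(10)]
new_exposure = update_exposure(0.01, burst)
```

This captures only the feed-forward character of event-driven exposure control; the real system fuses event streams with physical light metrics and handles asynchronous, per-pixel timing, which this toy example omits.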
"Like how our pupils instantly adapt to light, NEC mimics biological synergy between retinal pathways," explained Mr. Shijie Lin, the first-author of the article. "By fusing event streams with physical light metrics, we bypass traditional bottlenecks to deliver lighting-agnostic vision."
In tests, the team validated NEC across mission-critical scenarios:
- Autonomous Driving: Improved detection accuracy (mAP +47.3%) when vehicles exit tunnels into blinding sunlight.
- Augmented Reality (AR): Achieved 11% higher pose estimation (PCK) for hand tracking under surgical lights.
- 3D Reconstruction: Enabled continuous SLAM in overexposed environments where conventional methods fail.
- Medical AR Assistance: Maintained clear intraoperative visualization despite dynamic spotlight adjustments.
Professor Jia Pan said, "This breakthrough represents a significant leap in machine vision by bridging the gap between biological principles and computational efficiency. The NEC system not only addresses the limitations of traditional exposure control but also paves the way for more adaptive and resilient vision systems in real-world applications, from autonomous vehicles to medical robotics."
Professor Evan Y. Peng commented, "Our collaborative work has been instrumental in pushing the boundaries of neuromorphic engineering. By leveraging event-based sensing and bio-inspired algorithms, we've created a system that is not only faster but also more robust under extreme conditions. This is a testament to the power of interdisciplinary research in solving diverse and complex engineering challenges."
In the long term, the NEC paradigm offers a novel event-frame processing scheme that reduces the processing burden of high-resolution events and images and incorporates biologically plausible principles into the low-level control of machine eyes. This opens new avenues for camera design, system control, and downstream algorithms. The team's success in embodying neuromorphic synergy in various systems is a milestone that can inspire many optical, image, and neuromorphic processing pipelines, with direct economic and practical implications for industry.
Story Source:
Materials provided by The University of Hong Kong. Note: Content may be edited for style and length.
Journal Reference:
- Shijie Lin, Guangze Zheng, Ziwei Wang, Ruihua Han, Wanli Xing, Zeqing Zhang, Yifan Peng, Jia Pan. Embodied neuromorphic synergy for lighting-robust machine vision to see in extreme bright. Nature Communications, 2024; 15 (1) DOI: 10.1038/s41467-024-54789-8