Inspired by the way the human retina works, NILEQ neuromorphic cameras don't capture a series of images, but instead track changes in brightness at the sensor's individual pixels. This generates far less data and operates at much higher speeds than a conventional camera. These cameras are ideal for environments without reliable Global Navigation Satellite Systems (GNSS) coverage and where processing power is limited.
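To illustrate the principle, here is a minimal sketch of how an event-based sensor differs from a frame camera: each pixel fires an event only when its log-brightness changes past a threshold, so a static scene produces almost no data. The function name, threshold value, and toy scene below are illustrative assumptions, not NILEQ's actual sensor model.

```python
import numpy as np

def events_from_frames(frames, threshold=0.2):
    """Emit (t, x, y, polarity) events wherever per-pixel log-brightness
    has changed by more than `threshold` since that pixel last fired.
    A simplified model of an event (neuromorphic) sensor."""
    eps = 1e-3
    ref = np.log(frames[0] + eps)            # per-pixel reference level
    events = []
    for t, frame in enumerate(frames[1:], start=1):
        logi = np.log(frame + eps)
        diff = logi - ref
        ys, xs = np.nonzero(np.abs(diff) >= threshold)
        for x, y in zip(xs, ys):
            events.append((t, x, y, 1 if diff[y, x] > 0 else -1))
            ref[y, x] = logi[y, x]           # reset reference where a pixel fired
    return events

# Static 4x4 scene with a single brightening pixel: only that pixel fires,
# whereas a frame camera would re-transmit all 16 pixels every frame.
frames = np.full((3, 4, 4), 0.5)
frames[1:, 2, 1] = 0.9                       # brightness step at (x=1, y=2)
evts = events_from_frames(frames)
```

The pixel fires once at the brightness step and then stays silent, which is the source of the data-rate and speed advantage described above.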
Powerful Partnership: Neuromorphic Cameras and IMUs
Combining Inertial Measurement Units (IMUs) with neuromorphic camera data through sensor fusion creates a robust navigation system capable of operating in environments where GNSS is unavailable. The technology tracks precise “terrain fingerprints” by analyzing brightness changes and comparing them against pre-loaded satellite imagery databases to correct IMU drift. The approach offers significant benefits: minimal computational overhead, low power consumption, and the ability to navigate in challenging environments.
From Earth to Space: Expanding Navigation Horizons
Currently developed for drone navigation, this technology shows immense promise for space exploration. Potential applications include aerial navigation on the Moon and Mars, where GNSS is not reliably available. The challenge for planetary surface navigation is converting aerial satellite imagery into a format that rovers can use to identify surface landmarks.
Chris Shaw of Advanced Navigation states: “This approach of using the neuromorphic camera alongside low-cost, inexpensive inertial sensors, provides a big cost and size benefit.” The companies are planning to start flight trials of the neuromorphic navigation system later this year, with the goal of getting the product into customers' hands by the middle of 2025.
Use of Neuromorphic Cameras for Navigation in Environments without Reliable GNSS
