๐—จ๐˜€๐—ฒ ๐—ผ๐—ณ ๐—ก๐—ฒ๐˜‚๐—ฟ๐—ผ๐—บ๐—ผ๐—ฟ๐—ฝ๐—ต๐—ถ๐—ฐ ๐—–๐—ฎ๐—บ๐—ฒ๐—ฟ๐—ฎ๐˜€ ๐—ณ๐—ผ๐—ฟ ๐—ก๐—ฎ๐˜ƒ๐—ถ๐—ด๐—ฎ๐˜๐—ถ๐—ผ๐—ป ๐—ถ๐—ป ๐—˜๐—ป๐˜ƒ๐—ถ๐—ฟ๐—ผ๐—ป๐—บ๐—ฒ๐—ป๐˜๐˜€ ๐˜„๐—ถ๐˜๐—ต๐—ผ๐˜‚๐˜ ๐—ฟ๐—ฒ๐—น๐—ถ๐—ฎ๐—ฏ๐—น๐—ฒ ๐—š๐—ก๐—ฆ๐—ฆ

Inspired by the way the human retina works, NILEQ neuromorphic cameras don't capture a series of images, but instead track changes in brightness across the sensor's individual pixels. This generates far less data and operates at much higher speeds than a conventional camera. These cameras are ideal for environments without reliable Global Navigation Satellite Systems (GNSS) and where processing power is limited.
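The per-pixel behaviour described above can be illustrated with a minimal sketch. The function below is a simplified simulation (not NILEQ's actual sensor pipeline): it compares two ordinary frames in log intensity and emits an event only for pixels whose brightness change exceeds a contrast threshold, which is the core idea behind an event-based sensor. The threshold value is an arbitrary assumption for illustration.

```python
import numpy as np

def generate_events(prev_frame, curr_frame, threshold=0.2):
    """Emit (x, y, polarity) events where the per-pixel log-brightness
    change exceeds a contrast threshold -- a toy model of how an
    event (neuromorphic) camera responds only to change."""
    # Work in log intensity: event sensors respond to relative change.
    log_prev = np.log1p(prev_frame.astype(float))
    log_curr = np.log1p(curr_frame.astype(float))
    diff = log_curr - log_prev

    ys, xs = np.nonzero(np.abs(diff) > threshold)
    polarities = np.sign(diff[ys, xs]).astype(int)  # +1 brighter, -1 darker
    return list(zip(xs.tolist(), ys.tolist(), polarities.tolist()))

# A static scene produces no events; only the changed pixel fires.
a = np.full((4, 4), 100, dtype=np.uint8)
b = a.copy()
b[1, 2] = 200  # one pixel brightens
print(generate_events(a, a))  # []
print(generate_events(a, b))  # [(2, 1, 1)]
```

Note how an unchanged scene yields no output at all, which is why this style of sensing produces so much less data than frame capture.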

๐—” ๐—ฃ๐—ผ๐˜„๐—ฒ๐—ฟ๐—ณ๐˜‚๐—น ๐—ฃ๐—ฎ๐—ฟ๐˜๐—ป๐—ฒ๐—ฟ๐˜€๐—ต๐—ถ๐—ฝ: ๐—ก๐—ฒ๐˜‚๐—ฟ๐—ผ๐—บ๐—ผ๐—ฟ๐—ฝ๐—ต๐—ถ๐—ฐ ๐—–๐—ฎ๐—บ๐—ฒ๐—ฟ๐—ฎ๐˜€ ๐—ฎ๐—ป๐—ฑ ๐—œ๐— ๐—จ๐˜€

Combining Inertial Measurement Units (IMUs) with neuromorphic camera data through sensor fusion creates a robust navigation system capable of operating in environments where GNSS is unavailable. The technology tracks precise “terrain fingerprints” by analyzing brightness changes and comparing them against pre-loaded satellite imagery databases to correct IMU drift. The approach offers significant benefits: minimal computational overhead, low power consumption, and the ability to navigate in challenging environments.
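The drift-correction loop can be sketched in one dimension. This is a hedged illustration, not the companies' actual algorithm: the IMU integrates a slightly biased velocity and so drifts, while an occasional absolute position fix (standing in for a terrain-fingerprint match against the satellite imagery database) pulls the estimate back via a simple complementary-filter blend. The gain and bias values are arbitrary assumptions.

```python
def fuse_position(imu_estimate, fix, gain=0.3):
    """Complementary-filter blend: nudge the drifting IMU estimate
    toward an absolute position fix when one is available."""
    if fix is None:
        return imu_estimate
    return imu_estimate + gain * (fix - imu_estimate)

# 1-D dead reckoning with a constant velocity bias that drifts the
# estimate, corrected by occasional fixes (terrain-match stand-ins).
dt, bias = 1.0, 0.05
true_pos, est = 0.0, 0.0
for step in range(1, 11):
    true_vel = 1.0
    true_pos += true_vel * dt
    est += (true_vel + bias) * dt              # IMU integrates biased velocity
    fix = true_pos if step % 5 == 0 else None  # a match every 5th step
    est = fuse_position(est, fix)

# Without fixes the error would grow to 0.5; corrections keep it smaller.
print(abs(est - true_pos))
```

A production system would use a Kalman filter over full 3-D state rather than this scalar blend, but the principle is the same: bounded-error absolute fixes cap the unbounded drift of pure inertial integration.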

๐—™๐—ฟ๐—ผ๐—บ ๐—˜๐—ฎ๐—ฟ๐˜๐—ต ๐˜๐—ผ ๐—ฆ๐—ฝ๐—ฎ๐—ฐ๐—ฒ: ๐—˜๐˜…๐—ฝ๐—ฎ๐—ป๐—ฑ๐—ถ๐—ป๐—ด ๐—ก๐—ฎ๐˜ƒ๐—ถ๐—ด๐—ฎ๐˜๐—ถ๐—ผ๐—ป ๐—›๐—ผ๐—ฟ๐—ถ๐˜‡๐—ผ๐—ป๐˜€

Currently developed for drone navigation, this technology shows immense promise for space exploration. Potential applications include aerial navigation on the Moon and Mars, where GNSS is not reliably available. The challenge for planetary surface navigation is converting aerial satellite imagery into a format that rovers can use to identify surface landmarks.

Chris Shaw of Advanced Navigation states: “This approach of using the neuromorphic camera alongside low-cost, inexpensive inertial sensors, provides a big cost and size benefit.” The companies are planning to start flight trials of the neuromorphic navigation system later this year, with the goal of getting the product into customers' hands by the middle of 2025.
