System aspects of a bionic eyeglass



In spite of the impressive advances related to retinal prostheses, they are not expected to become available soon with realistic performance that would help blind persons navigate. In our new project, we are designing a Bionic Eyeglass that provides wearable TeraOps-range visual computing power to assist visually impaired people in their daily life. In this paper the system aspects are explained. There are three different types of situations (home, office, street) and a few standard image flows (with some auditory information). The basic tasks are indoor and outdoor events defined by blind people. Two types of cellular wave computing algorithms are used: general-purpose spatio-temporal event detection built from previously developed analogic subroutines, and a recently developed multi-channel mammalian retinal model followed by a classifier. Typical indoor and outdoor event detection processes are considered.

In spite of the impressive advances related to retinal prostheses, they are not expected to become available soon with realistic performance that would help blind or visually impaired persons navigate in their everyday needs. In our new project, we are designing a Bionic Eyeglass that provides wearable TeraOps-range visual computing power to support them in their daily life. The presented system differs from existing topographic classification techniques in the intensive multi-channel retina-like preprocessing of the input flow, as well as in the specific semantic embedding technique. The system is designed and implemented using the Cellular Wave Computing principle and the adaptive Cellular Nonlinear Network (CNN) Universal Machine architecture.
There is a strong biological motivation behind building a multi-channel adaptive algorithmic framework. It has long been known that the mammalian visual system processes the world through a set of separate spatio-temporal channels, and some outer retinal effects can be represented using the CNN Universal Machine [1]. However, the striking new result is that the organization of these channels begins already in the retina, where a vertical interaction across many parallel stack representations can be identified [4].
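As a rough, non-authoritative illustration of such multi-channel preprocessing, the Python sketch below splits a grayscale frame into parallel ON, OFF and transient channels using simple center-surround and frame-difference operators. The channel definitions, the function name and the use of NumPy/SciPy are assumptions made for illustration only; in the project these channels are realized by the CNN-based retinal model [7], not by the filters shown here.

import numpy as np
from scipy.ndimage import gaussian_filter

def retina_like_channels(prev_frame, frame):
    # Illustrative decomposition of one grayscale frame into a few parallel
    # spatio-temporal channels (ON / OFF / transient). This is only a sketch
    # of the multi-channel idea, not the retinal model of the paper.
    center = gaussian_filter(frame, sigma=1.0)       # narrow "center" blur
    surround = gaussian_filter(frame, sigma=4.0)     # wide "surround" blur
    dog = center - surround                          # center-surround contrast

    on_channel = np.maximum(dog, 0.0)                # ON: locally brighter regions
    off_channel = np.maximum(-dog, 0.0)              # OFF: locally darker regions

    # Transient (change) channel from the temporal difference of two frames
    transient = np.abs(gaussian_filter(frame - prev_frame, sigma=1.0))

    return {"on": on_channel, "off": off_channel, "transient": transient}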
Our Bionic Eyeglass differs substantially from any other device made for visually impaired people, since it is based on
• a cellular visual microprocessor family developed via the CNN Universal Machine principle with unprecedented computing power on a ~1 cm² silicon chip with ~1 W dissipated power,
• a dual visual input architecture (the Bi-i [5]), with its software technology [6] and system implementation based on the above type of microprocessors,
• a multi-channel mammalian retinal model [7] based on the recently discovered retinal operation and implemented in real time on the Bi-i, and
• the cellular wave computing algorithms combining topographic and non-topographic multimodal sensory flows (a dataflow sketch is given after this list).
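A minimal sketch of the resulting dataflow, under stated assumptions, is given below: topographic channel maps (e.g., the output of a retina-like preprocessing step) are reduced to a small feature vector, combined with non-topographic features such as auditory cues, and passed through a linear classifier that scores the predefined indoor/outdoor events. The fusion rule, feature choices, function name and classifier form are illustrative assumptions; in the actual system these stages run as analogic cellular wave algorithms on the Bi-i.

import numpy as np

def event_scores(channel_maps, audio_features, weights):
    # Illustrative fusion of topographic and non-topographic flows:
    # each channel map contributes simple summary statistics, audio-derived
    # features are appended, and a linear classifier produces one score per
    # predefined event class. The real system implements these stages as
    # cellular wave (CNN) algorithms, not as the NumPy code shown here.
    topo = [m.mean() for m in channel_maps.values()]   # per-channel mean activity
    topo += [m.std() for m in channel_maps.values()]   # per-channel activity spread
    features = np.concatenate([np.asarray(topo), np.asarray(audio_features)])
    return weights @ features                          # one score per event class

For example, the dictionary produced by a retina-like preprocessing step could be passed directly as channel_maps, and the event class with the highest score would be reported to the user.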
