The automaker has worked with chip maker Nvidia to create an “automobile-computer model” that attempts to mimic the way the human brain processes new information.
The autonomous A7, known as ‘Jack,’ served as a demonstrator for the artificial intelligence algorithms, which are programmed into the tablet-sized zFAS driver-assistance controller that serves as the brain managing its autonomous systems.
“The Audi processor … analyzes every frame of video that comes in, and it senses edges which it groups into shapes,” the company notes. “It learns that the shapes are objects, then learns to differentiate those objects.”
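To make that idea concrete, here is a minimal sketch, not Audi’s actual pipeline, of the edge-to-shape-to-object hierarchy the company describes, written as a tiny convolutional network in Python with PyTorch. The layer sizes and the ten-category output are illustrative assumptions.

```python
# Illustrative sketch only: a tiny convolutional network whose early layers
# respond to edges, whose later layers group them into shapes, and whose
# classifier head differentiates objects. Sizes and class count are assumptions.
import torch
import torch.nn as nn

class TinyObjectNet(nn.Module):
    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1),   # early layer: edge-like filters
            nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1),  # deeper layer: edges grouped into shapes
            nn.ReLU(),
            nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
            nn.Linear(32, num_classes),                   # head: differentiates object categories
        )

    def forward(self, frame: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(frame))

# Score one incoming video frame (batch of 1, RGB, 224x224 pixels).
frame = torch.rand(1, 3, 224, 224)
logits = TinyObjectNet()(frame)
print(logits.shape)  # torch.Size([1, 10])
```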
Importantly, the visual data collected by the cameras and interpreted by the computer is logged into a database. That information is then used to refine and advance the technology, improving object recognition as the vehicle accumulates additional miles.
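A minimal sketch of what such a logging loop could look like is below, assuming a local SQLite store; the schema, field names, and sample record are hypothetical, not details Audi has published.

```python
# Illustrative sketch only: appending per-frame recognition results to a local
# database so they can later be used to refine the model. Schema is assumed.
import sqlite3
import time

conn = sqlite3.connect("drive_log.db")
conn.execute(
    """CREATE TABLE IF NOT EXISTS detections (
           ts REAL,            -- capture timestamp
           frame_id INTEGER,   -- index of the video frame
           label TEXT,         -- recognized object category
           confidence REAL     -- model confidence for the label
       )"""
)

def log_detection(frame_id: int, label: str, confidence: float) -> None:
    """Record one recognized object for later analysis and retraining."""
    conn.execute(
        "INSERT INTO detections VALUES (?, ?, ?, ?)",
        (time.time(), frame_id, label, confidence),
    )
    conn.commit()

log_detection(frame_id=42, label="pedestrian", confidence=0.93)
```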
The whole process involves terabytes of data and must run in near real time, which is why it relies on an integrated Nvidia Tegra processor on board rather than sending the information to the cloud.
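A rough back-of-the-envelope check shows why the work stays on board; the 30 fps camera rate and 100 ms cellular round trip below are illustrative assumptions, not figures from Audi.

```python
# Back-of-the-envelope latency budget; frame rate and round-trip time are assumed.
CAMERA_FPS = 30
frame_budget_ms = 1000 / CAMERA_FPS   # roughly 33 ms to analyze each frame
cloud_round_trip_ms = 100             # assumed cellular round trip, before any processing

print(f"per-frame budget: {frame_budget_ms:.1f} ms")
print(f"cloud round trip alone: {cloud_round_trip_ms} ms "
      f"({cloud_round_trip_ms / frame_budget_ms:.1f}x the budget)")
```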
Audi suggests the advancements in machine learning will help autonomous technology quickly progress from prototype to production.