This week, we paid a visit to Audi's Electronics Research Laboratory (ERL) in Northern California. Or rather, it paid a visit to us, bringing its Audi Urban Intelligence Assist demo vehicle to San Francisco and taking me for a spin.
The Audi Urban Intelligence Assist (AUIA) vehicle is, essentially, an Audi A6 3.0T that has been outfitted as a test bed for a variety of technologies being developed by Audi ERL and its partners at the University of California at Berkeley, the University of California at San Diego, and the Center for Advanced Transportation Technologies at USC. During my ride, a variety of new technologies were demonstrated, including Driver Attention Guard, Intelligent Merge/Lane Assist, and predictive modeling for traffic, parking, and driver behavior.
Driver Attention Guard
Using an array of cameras hidden in plain sight around the AUIA vehicle's cabin, the Driver Attention Guard system, developed by researchers at UC San Diego, watches the driver's head and face to discern whether he or she is watching the road. On a closed course with an Audi ERL researcher in the driver's seat, I was shown how the cameras could detect when the driver was looking away from the road.
Where some driver attention monitoring systems would simply sound an alert or flash a light, the Audi system worked more proactively when it detected an inattentive driver, activating the vehicle's adaptive cruise control and lane-keeping assistance systems and taking over control of the vehicle until the driver's attention returned to the road. In this state, the AUIA vehicle was able to maintain a safe following distance from a lead vehicle and stay within the marked lane lines.
The Driver Attention Guard system also disables the accelerator pedal, preventing the driver from overriding the adaptive cruise system until his or her attention returns to the task of driving. Even with the driver pressing the gas pedal while looking away, the car would not close in on the lead vehicle, and the system was even able to bring the car to a complete stop when the lead vehicle stopped.
I think the tech could also be useful for disabled drivers. There's no reason why a system designed to protect someone who's rubbernecking or dealing with rowdy children couldn't also notice that, for example, an epileptic driver is suffering a seizure or an older driver has been incapacitated by a heart attack, and safely bring the car to a stop.
Intelligent Merge/Lane Assist
An evolution of blind-spot monitoring, Intelligent Merge/Lane Assist adds forward- and rear-aiming LiDAR scanners to the Audi's sensor package, allowing it to detect not only when a car is in the blind spot, but also whether an approaching vehicle (or one being approached) is entering the space in the adjacent lane. Because the LiDAR sensors also measure how fast those vehicles are closing, the Merge/Lane Assist computers can make continuously adjusting predictions about where they will be over the next 6 seconds and make recommendations for drivers preparing to change lanes.
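To make the 6-second look-ahead concrete, here is a minimal sketch. Audi hasn't disclosed its actual prediction model; this assumes the simplest possible case, constant-velocity extrapolation from the LiDAR's range and closing-speed measurements for a single vehicle in the adjacent lane, and every name and number here is illustrative.

```python
# Hypothetical sketch of a 6-second look-ahead. Assumes constant-velocity
# extrapolation; the real system would continuously re-run and adjust
# these predictions as new sensor readings arrive.

def predict_gap(gap_m, closing_speed_mps, horizon_s=6.0, step_s=1.0):
    """Predict the longitudinal gap to a neighboring vehicle at each
    time step out to horizon_s seconds, assuming constant closing speed."""
    steps = int(horizon_s / step_s)
    return [(t * step_s, gap_m - closing_speed_mps * t * step_s)
            for t in range(1, steps + 1)]

# A car 30 m away closing at 5 m/s reaches us right at the 6-second horizon.
print(predict_gap(30.0, 5.0))
# [(1.0, 25.0), (2.0, 20.0), (3.0, 15.0), (4.0, 10.0), (5.0, 5.0), (6.0, 0.0)]
```

In practice a production system would re-run a model like this many times per second, which is why the article describes the predictions as continuously adjusting.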
So, as the driver touches the capacitive turn signal stalk in preparation for signaling a lane change, a color-coded graphic appears in the head-up display indicating whether or not it's safe to proceed. A green graphic means all clear, yellow means hold off a bit, and red means no-go. The graphic also shows your current speed and a forward- or rearward-facing arrow indicating what speed would be optimal for a safe change. So, if you're doing 45 mph on an on-ramp, the system may show a yellow icon recommending you speed up to 60 mph while merging to avoid a collision. Or, if you're trying to squeeze in behind a truck to exit the highway, the system may indicate green, but also recommend decelerating to match the truck's speed.
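The color logic described above could be sketched as simple thresholds on the predicted gap. To be clear, Audi hasn't published its decision rules; the function name, thresholds, and speeds below are all hypothetical, chosen only to reproduce the green/yellow/red behavior the demo showed.

```python
# Hypothetical sketch of the HUD color logic. Thresholds are invented;
# the idea is that the color reflects the predicted gap in the target
# lane, and the recommended speed aims to match surrounding traffic.

def merge_advice(predicted_gap_m, own_speed_mph, other_speed_mph,
                 safe_gap_m=20.0, marginal_gap_m=10.0):
    """Return (color, recommended_speed_mph) for the HUD graphic."""
    if predicted_gap_m >= safe_gap_m:
        # All clear; still suggest matching the other vehicle's speed.
        return ("green", other_speed_mph)
    if predicted_gap_m >= marginal_gap_m:
        # Marginal: adjust speed to open the gap before merging.
        return ("yellow", other_speed_mph)
    return ("red", own_speed_mph)  # no-go: hold your lane and speed

# The 45-mph on-ramp example from the text: a marginal gap with traffic
# doing 60 mph yields a yellow icon and a recommendation to speed up.
print(merge_advice(12.0, 45, 60))  # ('yellow', 60)
```

The same function covers the truck example: a comfortable gap with slower traffic ahead returns green with a lower recommended speed.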
Of course, any good driver will want to peek over their shoulder to double-check the system, so the Intelligent Merge/Lane Assist system also lights up color-coded LEDs in the wing mirrors to match the recommendations indicated in the HUD.
Predictive navigation and parking
Researchers at USC's Center for Advanced Transportation Technologies demonstrated technologies that could allow a future Audi vehicle to, in effect, see into the future. By analyzing historical trends, current conditions, and scheduled events, the navigation software was able to predict traffic trends with pretty decent accuracy.
By looking ahead, the system aims to prevent those situations where your GPS indicates all-clear at the beginning of your commute, but fails to account for the rush hour that's about to start in 20 minutes or the ballgame that's going to be ending in an hour. It's also able to predict where parking will be available when you reach your destination 20 minutes from now, instead of telling you what's open now, when you're nowhere near. Researchers claimed 97 percent accuracy for the system's parking availability predictions 10 minutes into the future and 91 percent accuracy 20 minutes out. Not bad.
The system, as demonstrated, started on a smartphone app, where the destination could be searched for and selected before even reaching the vehicle. Once in the car, the smartphone was tapped to an NFC pad on the center console where the destination was transferred to the dashboard and one final ETA calculation was made to predict what parking would be available in the area and navigate there. During our demo, we navigated to a block where two parking spaces should have been available, but found none. To be fair, a guy in a Mazda3 was parked across two spaces, proving that even the highest tech is no match for crappy San Francisco drivers. With a button tap or two, we were navigating to another available space around the corner.
When the trip was over and parking had been found, the same phone was again tapped to the NFC pad to trigger walking directions to the final destination. When asked why near-field communication was chosen over the more universal Bluetooth, the researchers reminded us that this tech is still in the early testing phases, with many details subject to change if or when it finally reaches a production car.
In addition to smarter routing, Audi ERL also demonstrated a more natural navigation mode that used conversational speech cues and landmarks when giving turn-by-turn directions. For example, when approaching a turn, the system would say "Turn right at the Old Navy" or "Turn left before the red brick building" while highlighting the landmark on the dashboard's 3D map.
A car that learns about you
The demonstrated technology that intrigued me the most was perhaps the most difficult to demonstrate. Researchers from USC's Center for Advanced Transportation Technologies showed off software that enables your car to learn about you, the driver, and adjust its systems based on what it observes.
For example, after watching a driver's brake and throttle applications for about an hour, Audi's software was able to anticipate, nearly perfectly, that driver's behavior when following a lead vehicle. It knew whether you were a vigilant or relaxed driver, aggressive or casual, and what your reaction times were. That sounds a bit scary, but what can be done with that driver profile is pretty interesting.
Let's imagine that you and I both drive Audi A8s equipped with forward collision monitoring systems. In most cars, the engineers program baseline settings for these systems that are good enough for everyone. This means that, for some drivers, the forward collision alert system will be too sensitive, but for others, not nearly sensitive enough.
With Audi's software on board, the system could learn that I have a slower reaction time than you do but tend to follow lead cars more closely and program the forward alert system to be more assertive in warning me that I'm approaching the car ahead too quickly. You, being an alert driver with quicker reflexes, might get a bit more leeway before the klaxons begin to sound. Both of us would get a system that kept us safe while not being too annoying, without us having to adjust a single setting.
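One way such a personalized warning could work is to scale the alert distance with the driver's measured reaction time. None of the numbers or formulas below come from Audi; this is just a textbook stopping-distance calculation, offered as a sketch of how a learned profile might feed a forward collision warning.

```python
# Hypothetical sketch: warning distance = distance covered during the
# driver's reaction time + braking distance + a fixed safety margin.
# The 6 m/s^2 deceleration and 5 m margin are illustrative assumptions.

def warning_distance_m(speed_mps, reaction_time_s, decel_mps2=6.0,
                       margin_m=5.0):
    """Distance at which to sound a forward collision warning."""
    reaction_dist = speed_mps * reaction_time_s       # still closing in
    braking_dist = speed_mps ** 2 / (2 * decel_mps2)  # while braking
    return reaction_dist + braking_dist + margin_m

# At 25 m/s (~56 mph), a driver profiled with a 1.5 s reaction time gets
# warned noticeably earlier than one profiled at 0.8 s.
slow = warning_distance_m(25.0, 1.5)   # ~94.6 m
quick = warning_distance_m(25.0, 0.8)  # ~77.1 m
```

The point of the example is the shape of the idea, not the numbers: the learned profile replaces a one-size-fits-all threshold with one tuned to each driver.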
The profile could also inform the navigation system's estimates of when to leave for a trip, factoring in a driver's preferences for parking price, willingness to walk, and tolerance for traffic. A more hesitant driver might be given a bit more wiggle room by the Intelligent Merge/Lane Assist system. The system could even be smart enough to consider environmental factors, such as rain or nighttime driving habits, to further inform these decisions.
Today, these tech demonstrations required a trunk full of computers to operate -- and often a laptop-toting researcher in the passenger seat -- but could find themselves miniaturized, centralized in the cloud, and implemented in vehicles within the next decade. To my eye, they already appear ready for prime time. However, while Audi was happy to share these technologies with me and to hint at even more advanced features down the road, none of the researchers was willing to comment on which features would make it to production or when. We'll just be keeping our eyes peeled.