Off-the-shelf equipment turns cars into networked intelligent vehicles
Researchers at EPFL are working to improve the reliability and fault tolerance of intelligent driving systems through cooperative data sharing between vehicles.
Vehicles are increasingly being outfitted with features that enhance intelligence and data collection, such as cameras, Light Detection and Ranging (LIDAR) sensors, and navigation and mapping systems.
Researchers at EPFL are working to improve the reliability and fault tolerance of these intelligent systems by combining the data they gather with that from other vehicles.
The team hopes that sharing this data will extend vehicles' capabilities, for example by widening the effective field of view of a car driving behind another car.
Using simulators and road tests, the team has developed a flexible software framework for networking intelligent vehicles so that they can interact.
“Today, intelligent vehicle development is focused on two main issues, the level of autonomy and the level of cooperation,” says Alcherio Martinoli, who heads EPFL’s Distributed Intelligent Systems and Algorithms Laboratory (DISAL).
Over the past several years, his team has been working on cooperation issues, which have yet to garner much attention from the automotive industry. As part of his PhD thesis, Milos Vasic has developed cooperative perception algorithms, which extend an intelligent vehicle’s situational awareness by fusing data from onboard sensors with data provided by cooperative vehicles nearby.
The researchers used the cooperative perception algorithms as the basis for the software framework. Cooperative perception means that an intelligent vehicle can combine its own data with that of another vehicle – such as the one it wants to overtake, which has a wider field of view. In this way, the decision whether or not to overtake can be made safely.
According to Vasic, “cooperative perception makes overtaking safer and more fluid.”
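The overtaking scenario above can be sketched in a few lines. This is a hypothetical, highly simplified illustration of the idea, not the EPFL framework itself: the follower fuses its own obstacle detections with those shared by the car ahead, then checks that the oncoming lane is clear. The function names, coordinate convention, and thresholds are all illustrative assumptions.

```python
# Hypothetical sketch of a cooperative overtaking check. Positions are
# (x, y) in metres in a common road-aligned frame: x forward along the
# road, y lateral (the oncoming lane here is y in [-3.5, 0.0]).

def fuse_detections(own, shared):
    """Combine the follower's detections with those shared by the lead car."""
    return own + shared

def safe_to_overtake(own, shared, clear_distance=150.0, lane_y=(-3.5, 0.0)):
    """Allow overtaking only if no fused detection lies in the oncoming
    lane within the required clear distance ahead."""
    for x, y in fuse_detections(own, shared):
        if 0.0 < x < clear_distance and lane_y[0] <= y <= lane_y[1]:
            return False  # oncoming lane occupied within the gap needed
    return True

# The follower's own view is blocked (it sees nothing), but the lead car
# reports an oncoming vehicle 80 m ahead in the opposite lane.
print(safe_to_overtake(own=[], shared=[(80.0, -1.75)]))  # False
```

The point of the sketch is that the decision changes only because of the shared detection: with its own sensors alone, the follower would have judged the lane clear.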
With help from Groupe PSA, the team took two Citroen C-Zero electric cars and retrofitted them with a Mobileye camera, an accurate localization system, a router to enable Wi-Fi communication, a computer to run the software and an external battery to power everything. Using off-the-shelf equipment, the team was able to turn standard cars into intelligent vehicles.
One of the difficulties in fusing data from the two vehicles involved relative localization: the cars needed to know precisely where they were in relation to each other as well as to objects in the vicinity.
If a single pedestrian does not appear in exactly the same spot to both cars, there is a risk that, together, they will see two figures instead of one.
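The duplicate-pedestrian problem is a data-association problem. A minimal sketch, assuming a simple distance gate rather than whatever association method the researchers actually used: detections from the two cars that fall within a localization-error threshold are treated as the same object and averaged; the 1 m gate is an illustrative value.

```python
import math

def merge_detections(dets_a, dets_b, gate=1.0):
    """Merge two lists of (x, y) detections in metres, expressed in a
    common frame. Pairs closer than `gate` are assumed to be the same
    object and are averaged; the rest are kept as distinct objects."""
    merged = list(dets_a)
    for bx, by in dets_b:
        for i, (mx, my) in enumerate(merged):
            if math.hypot(bx - mx, by - my) < gate:
                merged[i] = ((mx + bx) / 2, (my + by) / 2)  # same object
                break
        else:
            merged.append((bx, by))  # genuinely new object
    return merged

# Both cars see the same pedestrian, but their position estimates
# disagree by roughly 40 cm; the gate collapses them into one track.
print(merge_detections([(10.0, 2.0)], [(10.3, 2.2)]))
```

Without the gate (or with a localization error larger than it), the fused picture would contain two pedestrians where there is one, which is exactly the failure mode described above.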
This is where other intelligent features come into play. By using other signals, particularly those provided by the LIDAR sensors and cameras, the researchers were able to correct flaws in the navigation system and adjust their algorithms accordingly.
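One way such a correction can work, sketched here as a hypothetical simplification of what the article describes: if both cars perceive the same landmark, the disagreement between the perception-based and navigation-based relative positions exposes the navigation error. All names and numbers below are illustrative assumptions, not the researchers' actual method.

```python
def relative_offset_correction(landmark_in_a, landmark_in_b, nav_a, nav_b):
    """Both cars observe the same landmark in a common orientation frame.
    The difference of the two observations gives a perception-based
    estimate of car B's position relative to car A; comparing it with
    the navigation-based estimate yields the error to subtract."""
    rel_perception = (landmark_in_a[0] - landmark_in_b[0],
                      landmark_in_a[1] - landmark_in_b[1])
    rel_nav = (nav_b[0] - nav_a[0], nav_b[1] - nav_a[1])
    return (rel_nav[0] - rel_perception[0], rel_nav[1] - rel_perception[1])

# Car A sees the landmark 15 m ahead; car B, 10 m ahead of A, sees it
# 5 m ahead. Navigation claims B is 10.8 m ahead, so the estimated
# navigation error is about 0.8 m along the road.
print(relative_offset_correction((15.0, 0.0), (5.0, 0.0),
                                 (0.0, 0.0), (10.8, 0.0)))
```

In practice this has to run continuously and in real time while both vehicles move, which is the difficulty the next sentence points to.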
This exercise was all the more challenging as the data had to be processed in real time while the vehicles were in motion.
Tests involved only two vehicles; the longer-term goal, however, is to create a network between multiple vehicles as well as with the roadway infrastructure.
Cooperative networks of this sort could eventually be used to optimize a vehicle’s trajectory, save energy and improve traffic flows.
The team is aware that many questions remain unanswered. For one, there is the issue of liability in the case of an accident. Determining liability, which already involves the owner, car manufacturer and software designer or supplier, becomes even more complicated when vehicles cooperate.
“The answers to these issues will play a key role in determining whether autonomous vehicles are accepted,” adds Martinoli.