Vision Revolution Drives Down the Cost of Autonomous Vehicles
Fully autonomous computer-based systems that relieve individuals from the burden of driving are likely to become commonplace in vehicles over the next few decades. Currently, however, such vehicles are expensive to manufacture, mainly due to the complexity and cost of the active safety and driving hardware that must be integrated into them to ensure that they will be safe on the road.
Now, a small Hungarian research team at AdasWorks (Budapest, Hungary) -- the computer vision research arm of Kishonti Informatics (Budapest, Hungary) -- has developed a new system that looks set to change all that.
While current autonomous vehicles employ a plethora of radar, ultrasound, and vision-based systems, as well as prerecorded 3D laser-scan maps of the road, to determine the direction and speed of a vehicle and the conditions of the environment surrounding it, the AdasWorks solution relies solely on capturing visual data from a single front-facing camera. The image data is then processed on a commercially available application processor running the company's own AdasWorks image processing software.
Figure 2. AdasWorks and ThyssenKrupp Presta Hungary demonstrate a complete autonomous control system for a Mercedes-Benz C200 saloon
To highlight the effectiveness of its system, AdasWorks recently teamed up with ThyssenKrupp Presta Hungary (Budapest, Hungary) to demonstrate how the AdasWorks software could be deployed as part of a complete autonomous control system for a Mercedes-Benz C200 saloon. Working together, the companies proved that such a system could be built at a tenth of the cost of current systems, making it commercially feasible to install in mass-market cars.
Figure 3. Point Grey Flea3 USB 3.0 camera with USB 3.0 Micro-B and GPIO connectors
After evaluating a number of cameras for use in the application, the engineers at AdasWorks eventually decided to employ the Flea3 (FL3-U3-13E4C-C) 1.3 MPixel camera from Point Grey (Richmond, BC, Canada), which was mounted to the front of the Mercedes-Benz C200 saloon. Images captured by the camera were then transferred over a USB 3.0 interface to a small electronic control unit built around an NVIDIA Tegra K1 application processor running the AdasWorks software.
"The Flea3 camera was an ideal solution for the demonstration because it enabled the application processor to trigger the camera to capture time-stamped 1280 x 800 color images at variable frame rates and to transfer those images over the USB 3.0 interface directly to the processor board," says Zoltan Prohaszka, computer vision expert and chief scientist of AdasWorks.
The AdasWorks software running on the NVIDIA Tegra K1 processes the image data captured by the front-facing Flea3 camera to detect the center of the road and, based on its visible length and curvature, computes the trajectory of the vehicle and determines the optimum speed at which it should be travelling.
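The article does not disclose how AdasWorks maps visible road length and curvature to a target speed, but a common approach is to cap both lateral acceleration on the curve and the distance needed to stop within the visible road. A minimal sketch of that idea, with all limits chosen as illustrative assumptions:

```python
import math

def target_speed(curvature, visible_length, a_lat_max=2.0, a_brake=3.0, v_max=30.0):
    """Illustrative curvature-limited target speed in m/s (not the
    AdasWorks algorithm; limits are assumed values).

    curvature      -- road curvature in 1/m (0 for a straight road)
    visible_length -- visible road length ahead in m
    """
    # Lateral-acceleration limit: a_lat = v^2 * curvature => v = sqrt(a_lat_max / curvature)
    v_curve = math.sqrt(a_lat_max / curvature) if curvature > 1e-6 else v_max
    # Stopping-distance limit: v^2 = 2 * a_brake * d
    v_stop = math.sqrt(2.0 * a_brake * visible_length)
    return min(v_curve, v_stop, v_max)

# Gentle bend (100 m radius) with 80 m of visible road:
print(target_speed(curvature=0.01, visible_length=80.0))
```

On a straight road with a long view, the function simply returns the speed cap; tighter curves or shorter sight lines reduce the target speed accordingly.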
The software also ascertains whether the car is in danger of colliding with a vehicle in front of it and whether it has departed from the lane in which it is travelling. A lane detection algorithm detects both the type and color of the lane markings in the driving and neighboring lanes, while a car detection algorithm detects road vehicles and estimates their distance from the car using measurements from multiple frames captured by the camera.
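The article does not detail how distance is recovered from a single camera. One textbook way to get a per-frame estimate (which could then be refined across multiple frames) is the flat-road pinhole model: the image row where a detected vehicle meets the road determines its distance, given the camera's focal length and mounting height. A sketch under those assumptions, with made-up parameter values:

```python
def distance_from_ground_contact(y_pixel, f_pixels, cam_height_m, horizon_y):
    """Estimate distance (m) to a vehicle from the image row of its
    ground-contact point, assuming a flat road and a calibrated camera
    (pinhole model; illustrative, not AdasWorks' actual algorithm).

    y_pixel      -- image row of the vehicle's ground contact (grows downward)
    f_pixels     -- focal length expressed in pixels
    cam_height_m -- camera mounting height above the road
    horizon_y    -- image row of the horizon line
    """
    dy = y_pixel - horizon_y          # pixels below the horizon
    if dy <= 0:
        return float("inf")           # at or above the horizon: no ground contact
    return f_pixels * cam_height_m / dy

# Example: 1.3 m camera height, ~1000 px focal length, contact 100 px below horizon
print(distance_from_ground_contact(y_pixel=500, f_pixels=1000.0,
                                   cam_height_m=1.3, horizon_y=400))  # prints 13.0
```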
Figure 4. The AdasWorks software on the NVIDIA Tegra K1 processes Flea3 image data to detect the center of the road.
In addition, a pedestrian detection algorithm can recognize pedestrians and the way in which they are moving relative to the path of the vehicle to determine whether or not they are in danger of being struck.
Controlling The Car
Based on the analysis of the scene in front of the vehicle, the AdasWorks software extracts the trajectory -- the position of the vehicle over time -- from the image data and computes the amount of acceleration or braking that should be applied to keep the vehicle at a safe speed. This data is then transferred over an Ethernet interface to a PowerPC-based MicroAutoBox system from dSPACE (Paderborn, Germany).
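The article does not specify the longitudinal control law. The simplest scheme that matches the description -- turning a speed error into separate acceleration and brake commands -- is a saturated proportional controller, sketched here with assumed gains and limits:

```python
def longitudinal_command(v_current, v_target, kp=0.5, a_max=2.0, b_max=4.0):
    """Return (accel, brake) commands in m/s^2 from a speed error.

    A simplified stand-in for the actual control law, which the article
    does not detail; kp, a_max and b_max are assumed values.
    """
    a = kp * (v_target - v_current)   # proportional response to speed error
    if a >= 0.0:
        return min(a, a_max), 0.0     # below target speed: accelerate
    return 0.0, min(-a, b_max)        # above target speed: brake

# Travelling at 20 m/s with a 15 m/s target: brake at 2.5 m/s^2
print(longitudinal_command(20.0, 15.0))  # prints (0.0, 2.5)
```

A production controller would add integral action, rate limiting, and smooth hand-over between the two pedals, but the structure -- one command path per actuator -- is the same.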
ThyssenKrupp Presta Hungary developed a state-of-the-art trajectory controller algorithm that runs on the dSPACE hardware and calculates the steering angle from the actual position of the vehicle and the received trajectory points. The dSPACE system also acts as a communication interface for the vehicle, converting the digital steering, acceleration, and brake output values computed on the application processor into signals that control electromechanical actuators within the vehicle. These are used to rotate the steering column, move the accelerator pedal, and apply a pneumatic braking system.
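ThyssenKrupp Presta's controller is proprietary, but one classic way to turn a trajectory point into a steering angle is the pure-pursuit method: fit a circular arc from the vehicle to a look-ahead point on the trajectory and steer along it using a bicycle model. A sketch under those assumptions (the wheelbase is an assumed C-Class figure, not a disclosed parameter):

```python
import math

def pure_pursuit_steering(target_x, target_y, wheelbase=2.84):
    """Pure-pursuit steering angle (radians) toward a trajectory point
    given in the vehicle frame (x forward, y left). Illustrative only;
    not the ThyssenKrupp Presta controller. Wheelbase is assumed.
    """
    ld2 = target_x**2 + target_y**2          # squared look-ahead distance
    if ld2 == 0.0:
        return 0.0                           # target at the vehicle: hold straight
    curvature = 2.0 * target_y / ld2         # arc through origin and target point
    return math.atan(wheelbase * curvature)  # bicycle-model steering angle

# Trajectory point 10 m ahead and 1 m to the left:
print(math.degrees(pure_pursuit_steering(10.0, 1.0)))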
On Track Testing
Once the system had been installed in the Mercedes-Benz C200 saloon, the vehicle was shipped to the Hungaroring Formula 1 motor-racing circuit in Hungary. There, the computer vision algorithms running on the application processor were optimized based on an analysis of video recordings of the car's behavior around the track, and the effectiveness of the trajectory and speed control systems in the vehicle was evaluated. Once this optimization was complete, the fully autonomous capability of the system was demonstrated: the vehicle successfully navigated the circuit without any human intervention.
Now that the feasibility of the system has been demonstrated, the AdasWorks solution is likely to prove an attractive proposition to Tier 1 OEMs considering building semi-autonomous systems that automate many driving operations while keeping a driver in ultimate control of the vehicle. Vehicles equipped with such semi-autonomous systems will pave the way for the eventual deployment of fully automated driverless systems, where the new AdasWorks solution looks set to play an equally important role.