The whole idea behind autonomous cars, driven by sensor- and camera-based computer systems, is that they will make road traffic safer. According to the market research institute Emnid, 41% of Germans believe that driverless cars can accomplish this goal. Accordingly, demand for advanced driver assistance systems is growing sharply in most automotive categories.
By 2025, autonomous driving is likely to be commercialized. Drivers will then be able to relax and leave the monitoring of the roadway and the control of the vehicle to the system, as full vehicle control will be handled by efficient microcontrollers and highly innovative, precise sensors. Drivers will be free to use their smartphone, write e-mails, watch a video, or enjoy a meal or a drink. This not only improves driving comfort but also offers significant time savings.
Entirely new services will also arise in the commercial sector: a rental car can, for example, drive autonomously to the user at a specified time, rather than the user having to go to the vehicle. Or a moving van returns unmanned to the rental company after finishing its task.
Apart from increased comfort, driverless cars also deliver decisive advantages in two vital areas: safety and environmental protection.
According to the German Federal Statistical Office, there were over 2.5 million road traffic accidents in 2015, 300,000 of which resulted in personal injuries. Advanced driver assistance systems (ADAS) will help to reduce this figure, as they progressively eliminate human errors caused, for example, by strong emotions, fatigue, or distraction.
If assistance systems are complemented with car-to-x and car-to-car communication, CO2 emissions can be decreased considerably. Route-related information, e.g. on current traffic jams, traffic light status, and road works, enables proactive and efficient driving behavior. Needless acceleration toward a red light or inefficient detours around a traffic jam will then be a thing of the past. The EU limit of 95 g/km of CO2 from 2020 will then be achievable.
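The effect of such route-related information can be illustrated with a small, hypothetical "green-light glide" calculation: given the distance to the next red light and the time until it turns green, the car can coast at a constant speed instead of accelerating and then braking. The function name and parameters below are illustrative, not part of any real car-to-x API.

```python
def glide_speed(distance_m: float, time_to_green_s: float,
                speed_limit_mps: float) -> float:
    """Constant speed that reaches the traffic light just as it turns green.

    Returns the recommended coasting speed in m/s, capped at the speed
    limit. Hypothetical helper for illustration only.
    """
    if time_to_green_s <= 0:
        return speed_limit_mps  # light is already green: proceed normally
    return min(distance_m / time_to_green_s, speed_limit_mps)


# 200 m to the light, green in 20 s: glide at 10 m/s instead of
# racing up at the 50 km/h limit (~13.9 m/s) and braking to a stop.
recommended = glide_speed(200.0, 20.0, 13.9)
```

Avoiding the stop-and-go cycle is exactly where the CO2 saving comes from: kinetic energy that would be destroyed in the brakes is never built up in the first place.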
From ADAS to Autonomous Driving
The latest models are already fitted with a multitude of assistance systems. However, drivers still retain full control over their vehicles. According to the Society of Automotive Engineers (SAE) classification for autonomous driving, this corresponds to SAE Level 2, "partial automation": depending on the driving mode, electronic systems take over parts of the dynamic driving task, while the driver monitors the environment and must be ready to intervene at any time. An example is the automatic park pilot system, which moves the car into a suitably sized parking spot. This system is now offered in almost all luxury-class cars.
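For reference, the SAE J3016 levels can be summarized in a few lines of Python; the lookup table reflects the published level names, while the helper function is purely illustrative:

```python
# SAE J3016 levels of driving automation (summary of the published names)
SAE_LEVELS = {
    0: "No Automation",
    1: "Driver Assistance",
    2: "Partial Automation",      # e.g. park pilot plus lane keeping
    3: "Conditional Automation",  # system drives; driver responds to requests
    4: "High Automation",
    5: "Full Automation",
}


def driver_must_monitor(level: int) -> bool:
    """At Levels 0-2 the human driver must monitor the environment;
    from Level 3 upward, the system performs that monitoring itself."""
    return level <= 2
```

The key boundary for the systems discussed in this article is between Levels 2 and 3: it marks the point where responsibility for monitoring the road passes from driver to machine.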
Autonomous driving involves replicating the human senses of hearing and vision, as well as the way the brain processes data. These tasks are handled by highly innovative, very precise sensors and electronic control units (ECUs). Both the systems and their individual components must meet the most stringent functional safety requirements (ISO 26262). Some examples of current assistance systems illustrate how this works:
Lane Departure Warning System or Steering Pilot
A steering pilot assists the driver with lateral control and helps to keep the vehicle in the middle of the lane when the road is straight and bends are moderate. This type of steering pilot has, for example, been incorporated in the new Mercedes-Benz E-Class (model series 213).
The standard prerequisites for this technology are an angle sensor that reads the current steering angle and a camera system that records the environment. Using the lane markings detected by the camera, the ECU establishes a target course along the middle of the lane and transmits this data to the steering system. The system can then respond actively by intervening in the steering, or passively by issuing a haptic or acoustic warning signal.
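The target-course computation described above can be sketched as follows, assuming the camera delivers each lane marking as a small polynomial of lateral offset over longitudinal distance (a common representation in lane-detection systems; the exact interface of any real ECU will differ):

```python
def lane_center(left_coeffs: list, right_coeffs: list) -> list:
    """Target course as the midpoint of the two detected lane markings.

    Each marking is a polynomial y(x) = c0 + c1*x + c2*x^2 giving the
    lateral offset (m) at longitudinal distance x (m). The center line
    is simply the coefficient-wise average. Illustrative sketch only.
    """
    return [(l + r) / 2 for l, r in zip(left_coeffs, right_coeffs)]


def lateral_error(center_coeffs: list, x: float = 0.0) -> float:
    """Lateral deviation of the vehicle (assumed at y = 0) from the
    lane center, evaluated at distance x ahead of the vehicle."""
    return sum(c * x**i for i, c in enumerate(center_coeffs))


# Markings 1.8 m to either side of the vehicle: the center is dead ahead.
center = lane_center([1.8, 0.0, 0.0], [-1.8, 0.0, 0.0])
```

The lateral error is what the steering system acts on: a small value triggers nothing, a growing value triggers either a corrective steering torque or a warning, depending on the configured mode.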
Advanced camera systems use stereo cameras, comprising two high-resolution CMOS mono-cameras mounted in a housing behind the windshield, approximately 20 cm apart. While a mono-camera can only estimate distances, the stereo version measures both the distance to an object and its height above the road surface. At distances between 20 and 30 m, it can determine the distance to an object to within 20 to 30 cm.
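The stereo camera's distance measurement follows the classic pinhole relation Z = f·B/d, where f is the focal length in pixels, B the baseline between the two cameras, and d the disparity between the matched image points. A minimal sketch with illustrative numbers, not the parameters of any specific camera:

```python
def stereo_depth(focal_px: float, baseline_m: float,
                 disparity_px: float) -> float:
    """Depth from stereo disparity: Z = f * B / d.

    focal_px:     focal length expressed in pixels (illustrative value)
    baseline_m:   distance between the two camera centers in meters
    disparity_px: horizontal shift of the object between both images
    """
    if disparity_px <= 0:
        raise ValueError("zero disparity: object at infinity or bad match")
    return focal_px * baseline_m / disparity_px


# With f = 1000 px and a 0.2 m baseline, an 8 px disparity puts the
# object 25 m ahead.
depth = stereo_depth(1000.0, 0.2, 8.0)
```

The relation also explains why accuracy degrades with distance: depth is inversely proportional to disparity, so at long range a sub-pixel matching error translates into a much larger depth error than at short range.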
The distance assistant uses a radar system to measure the distance and relative speed to vehicles on the road ahead. Long-range radar systems of this kind, with a range of up to 250 m, operate at approximately 76 to 77 GHz and are also used for braking, parking, and distance warning functions. Fusing the radar data with the stereo camera images further improves object and distance detection.
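Radar range and relative speed follow directly from the echo delay and the Doppler shift. A minimal sketch using the textbook formulas R = c·t/2 and v = f_d·c/(2·f_c); the carrier frequency and sample values are illustrative:

```python
C = 299_792_458.0  # speed of light in m/s


def radar_range(round_trip_s: float) -> float:
    """Range from the echo's round-trip time: R = c * t / 2.
    The factor 2 accounts for the signal traveling out and back."""
    return C * round_trip_s / 2


def relative_speed(doppler_hz: float, carrier_hz: float) -> float:
    """Relative speed from the Doppler shift: v = f_d * c / (2 * f_c).
    Positive values indicate a closing (approaching) target."""
    return doppler_hz * C / (2 * carrier_hz)


# A 1 microsecond round trip corresponds to a target ~150 m ahead;
# a 5 kHz Doppler shift at 76.5 GHz to roughly 9.8 m/s closing speed.
r = radar_range(1e-6)
v = relative_speed(5000.0, 76.5e9)
```

In practice automotive radars use frequency-modulated continuous-wave (FMCW) signals rather than discrete pulses, but the range and speed they report reduce to these same two relations.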
The AEC-Q100 qualified STRADA431 from STMicroelectronics is, for example, well suited as a 24 GHz radar sensor thanks to its compact design in a QFN package with a footprint of 6 × 6 mm². Its 3.3 V supply voltage, on-chip power sensor, and additional temperature sensor enable the design of an ASIL-B-compliant system.
Cruise control is of paramount importance for autonomous driving. Since the driver is not actively involved in the driving process, acceleration and deceleration must happen automatically. To this end, sensors measure the current speed, and the opening of the throttle valve is regulated accordingly. The rotational speed sensor TLE4941plusC from Infineon was developed as a wheel-speed sensor for ABS and ESP; one sensor is installed on each of the four wheels. Designed as a differential Hall-effect sensor, the TLE4941plusC not only provides accurate values but is also highly robust.
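The wheel-speed measurement itself is simple: the differential Hall sensor emits one pulse per passing tooth of an encoder ring, so vehicle speed follows from the pulse frequency. The tooth count and tire size below are illustrative assumptions, not TLE4941plusC specifics:

```python
import math


def wheel_speed_mps(pulse_hz: float, teeth: int,
                    tire_diameter_m: float) -> float:
    """Vehicle speed from a wheel-speed sensor's pulse frequency.

    One pulse per encoder tooth means wheel revolutions per second
    equal pulse_hz / teeth; multiplying by the rolling circumference
    gives the speed over ground. Illustrative sketch only.
    """
    circumference = math.pi * tire_diameter_m
    return pulse_hz / teeth * circumference


# 480 Hz on a 48-tooth ring with a 0.6 m tire: 10 rev/s, ~18.85 m/s.
speed = wheel_speed_mps(480.0, 48, 0.6)
```

Comparing the four per-wheel readings is also how ABS and ESP detect wheel slip: a wheel turning markedly slower or faster than the others is locking or spinning.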
The Infineon sensor offers ESD protection up to 12 kV and is qualified for automotive applications in accordance with AEC-Q. It operates from a supply voltage of up to 20 V, which an internal voltage regulator converts to 3 V.
Many challenges must still be overcome to realize more advanced ADAS and, ultimately, autonomous driving. One arises from the trend away from local and toward central processing of sensor data. This has the advantage that data from the sensors of all subsystems is available to the processing ECU and can be used for several functions. The sensor modules then perform only sensing and data transmission, without any processing or decision-making, which eliminates data losses due to pre-processing or compression in the sensor module. Consequently, the sensor modules can become smaller, more energy efficient, and more cost effective.
This, however, requires a "super ECU". Taking all the required sensors into consideration, tier 1 supplier Continental AG estimates that approximately 1 Gb of data will be produced each minute. For safety-relevant systems, this data has to be processed in real time. At present, no computing unit can do this safely.
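Continental's figure translates into a demanding sustained data rate, as a quick back-of-the-envelope calculation shows (assuming 1 Gb here means 10^9 bits):

```python
def sensor_data_budget(gbit_per_min: float = 1.0):
    """Convert a per-minute sensor data volume into sustained rates.

    Returns (bits per second, gigabytes per hour). Assumes 1 Gb = 1e9
    bits, as a rough reading of the cited Continental estimate.
    """
    bit_per_s = gbit_per_min * 1e9 / 60        # sustained throughput
    gbyte_per_h = bit_per_s * 3600 / 8 / 1e9   # storage per driving hour
    return bit_per_s, gbyte_per_h


# 1 Gb/min is ~16.7 Mbit/s sustained, or 7.5 GB per hour of driving.
bps, gb_per_hour = sensor_data_budget(1.0)
```

The real-time requirement is what makes this hard: it is not the raw volume but the guarantee that every safety-relevant frame is processed within a bounded deadline.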
Guaranteeing safety and security is another challenge, as both functional safety and data security must be ensured. The issue of data security must be solved by the time car-to-x communication is implemented in a vehicle at the latest, since security gaps give hackers the chance to log into the car, read out data, stall the car via remote control, or take control of it entirely.
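One standard building block for securing such communication is message authentication, so that a receiver can reject forged or tampered frames. A minimal sketch using HMAC-SHA256 from the Python standard library; real car-to-x security additionally involves certificates, key distribution, and replay protection, none of which this covers:

```python
import hashlib
import hmac

TAG_LEN = 32  # HMAC-SHA256 produces a 32-byte authentication tag


def sign_message(key: bytes, payload: bytes) -> bytes:
    """Append an HMAC-SHA256 tag so receivers can verify authenticity."""
    return payload + hmac.new(key, payload, hashlib.sha256).digest()


def verify_message(key: bytes, frame: bytes):
    """Return the payload if the tag is valid, otherwise None.

    Uses a constant-time comparison to avoid timing side channels.
    """
    payload, tag = frame[:-TAG_LEN], frame[-TAG_LEN:]
    expected = hmac.new(key, payload, hashlib.sha256).digest()
    return payload if hmac.compare_digest(tag, expected) else None


# A frame whose payload was altered in transit fails verification.
key = b"0123456789abcdef"  # illustrative pre-shared key
frame = sign_message(key, b"brake:soft")
```

Authentication addresses the attack the paragraph describes: without a valid key, an attacker cannot inject commands that the vehicle will accept as genuine.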
Component manufacturers, distributors and suppliers, and vehicle manufacturers are working together to solve these challenges, so that nothing stops users from reading messages, chatting, or working while traveling by car.
This information has been sourced, reviewed and adapted from materials provided by Rutronik Elektronische Bauelemente GmbH.
For more information on this source, please visit Rutronik Elektronische Bauelemente GmbH.