An architecture of the modern age
Autonomous driving with the Highway Pilot
Autonomous driving is the trend and buzzword of an eventful modern age. With the architecture for the Highway Pilot, Cognizant Mobility becomes a supplier of the future.
Autonomous driving is regarded as the future model of an eventful modern age. In our mind's eye we see electric and, of course, flying cars navigating accident-free at various altitudes through canyons of reflecting skyscraper landscapes. Real, intelligent, completely autonomous driving, in total safety. The possibilities? Endless. Perfect last-mile transport, whether for the customer in the store or the package to the front door. The facts? They must remain on the ground for now, just like the average driver. The flying car as a mass product is and will remain, in the short and long term, like the pumpkin in Cinderella: fantasy.
That the driver will become obsolete at the wheel of a fully autonomous Level 5 vehicle, however, is a concrete dream of the future whose opening chords are already audible. Audi, Mercedes, Tesla: conductors of an industrial symphony whose true actors all too often play in the background, literally. Self-driving buses are already in active test operation; the applause goes to the manufacturers. But like the melody of progress, the future of driving is also built on a basic architecture, without which hands must remain on the steering wheel and the driver in the seat. What do the building blocks of autonomous driving look like? What is a Highway Pilot? And who provides the architecture on which such building blocks can stand?
The long road to autonomy begins with considerations
Although the current state of the art shows promising tests of future mobility technology at Levels 4 and 5, today's series-production vehicles are on the road at Level 2 of autonomous driving. Level 1 is available in practically every car today: cruise control, electronic stability control, automatic transmission, rain sensor, cornering lights, and anti-lock braking systems. Assistance systems that often perform their sometimes more, sometimes less necessary service unnoticed. Level 2 already takes over permanent control of longitudinal or lateral guidance, but not both. In concrete terms: the vehicle controls the speed and the distance to the vehicle in front, brakes when necessary, and accelerates when the A.I. says so. However, the lane change must be carried out by the driver, with a hand remaining on the wheel at all times. Or the intelligent lane-change assistant glides elegantly into the adjacent lane, while braking and accelerating remain the responsibility of a human foot. The situation is similar with the often-cited parking system: some luxury-class vehicles are already able to assess a gap and navigate into it.
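The division of labor at Level 2 can be illustrated with a deliberately simplified sketch. Everything here, the function name, the gains, the comfort limits, is an illustrative assumption, not a production controller: the point is only that the system regulates speed and following distance, while lateral decisions stay with the driver.

```python
def longitudinal_command(ego_speed, lead_speed, gap, set_speed,
                         time_headway=1.8, k_gap=0.2, k_speed=0.5):
    """Toy Level-2 longitudinal controller (illustrative only).

    Keeps a time-based following distance to the lead vehicle,
    otherwise tracks the driver's set speed. Returns a requested
    acceleration in m/s^2. Lateral guidance (lane changes) is
    deliberately absent: at Level 2, that remains the driver's job.
    """
    desired_gap = ego_speed * time_headway           # meters
    gap_error = gap - desired_gap                    # positive = too far back
    speed_error = min(set_speed, lead_speed) - ego_speed
    accel = k_gap * gap_error + k_speed * speed_error
    return max(-3.0, min(1.5, accel))                # comfort limits

# Ego at 30 m/s, slower lead vehicle only 30 m ahead: brake.
print(longitudinal_command(30.0, 25.0, 30.0, 33.0))  # → -3.0
# Free road far ahead: accelerate gently toward the set speed.
print(longitudinal_command(30.0, 35.0, 200.0, 33.0))  # → 1.5
```

The clipping in the last line stands in for the comfort and safety envelope a real system would enforce far more carefully.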
ADAS (Advanced Driver Assistance Systems) is the name for what makes Level 2 a Level 2. This is not yet genuine autonomous driving. After all, we want to be able to let go both longitudinally and laterally, and soon. The step in this direction, the mastering of the next act, is complex. A new generation of environment sensors must communicate with each other and with the vehicle. Every eventuality has to be considered - but how can this innocuous-sounding step be mastered? The data that comes from the streets is almost immeasurable. Thousands, millions of traffic situations must be processed. It is impossible to program a new function for each one of them.
In A Nutshell
- Model-based Architecture Development
- ADAS Driving Functions
- FAS Functions
- Neural Networks
- Perception & Prediction
- Environment Model
- Virtual Safeguarding
- PTC Modeler
Meanwhile, the stage is reset, and a new player enters the scene: deep learning. For some, glorified statistics; for others, mere brute-force heuristics. But for automotive developers, it is what the industry so urgently seeks: potential. Before the function comes the concept. What does a logical architecture look like on which the information from neural learning can be based? How do we enable the technical transformation?
Because yes, it cannot be denied: the orchestra of quiet but important sounds? That is Cognizant Mobility, which ensures that a Highway Pilot can become reality and, completely detached (...), accelerates and changes lanes.
How the environment shapes autonomous driving - and Cognizant Mobility is setting the pace
When our eye receives a stimulus, our optic nerves transmit the information to our brain in less than a tenth of a second. The brain processes the received data and helps us make a decision. Brake or accelerate? Turn the wheel? Let the driver next to us merge, because it helps them and does not slow us down? To reach the next level of autonomous driving, an artificial Highway Pilot must be able to make these decisions. Neural learning helps us in development to collect data in bulk. The result: a black box full of raw data, whose evaluation and resulting decision paths often remain unclear. What is needed is an environment model: a coherent model, a brain that can recognize the events occurring around the vehicle and convert them into actions, i.e. autonomous driving functions. It can interpret the raw input and derive decisions from it: Is the captured video input a shadow from above, or an obstacle on the road? Small differences that require intelligent assessments.
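The shadow-versus-obstacle question can be made concrete with a minimal sketch. The class names, fields, and thresholds below are invented for illustration, not part of any real perception stack: the idea is simply that an environment model cross-checks a camera detection against a second modality before deriving an action.

```python
from dataclasses import dataclass

@dataclass
class CameraDetection:
    """A dark patch found in the video input (illustrative)."""
    darkness: float       # 0..1, contrast against the road surface

@dataclass
class LidarReturn:
    """Range measurement at the same location (illustrative)."""
    height_m: float       # measured height above the road surface

def classify(cam: CameraDetection, lidar: LidarReturn,
             dark_threshold=0.5, height_threshold=0.10):
    """Toy environment-model rule: a dark patch with no measurable
    height is treated as a shadow and ignored; a patch with height
    is treated as an obstacle that demands a reaction."""
    if cam.darkness < dark_threshold:
        return "clear road"       # nothing noteworthy in the image
    if lidar.height_m >= height_threshold:
        return "obstacle"         # brake or evade
    return "shadow"               # keep driving

print(classify(CameraDetection(0.9), LidarReturn(0.02)))  # shadow
print(classify(CameraDetection(0.9), LidarReturn(0.45)))  # obstacle
```

A real environment model replaces these hand-set thresholds with learned perception and probabilistic fusion, but the structural point survives: the decision is derived from an interpreted model of the surroundings, not from the raw pixels alone.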