Wednesday, October 31, 2018, 07:30 a.m. Two big trucks and several cars and vans are rolling onto the Daimler testing grounds in the German town of Immendingen, which is located northwest of Lake Constance.
Dense clouds loom above us, the thermometer indicates below-zero temperatures, and a thin coating of frost has covered the test track overnight.
The conditions are not ideal for carrying out today’s plan: shooting a video for the TV science show “Galileo” about autonomous driving. The video will feature two “sensitive” concept vehicles — in other words, vehicles that have not yet been developed to the point of series production: the smart Vision EQ fortwo and the Vision URBANETIC.
Today the weather is freezing, and the roads on the testing grounds are as smooth as glass. And they expect us to show on “Galileo” how well Daimler is already doing in the field of autonomous driving — under these conditions? Challenge accepted. Warmly bundled up in caps and scarves, we meet the rest of our colleagues. Each of them is involved with the topic of autonomous driving in a different way.
In addition to the experts for the two show cars, the Vision URBANETIC and the smart Vision EQ fortwo, the safety specialists for the next-generation driver assistance systems are also there — not to mention the four-person film crew that “Galileo” has commissioned to produce today’s video. It’s an ambitious schedule, but the road surface has to be de-iced before we can start. In the meantime, the initial camera settings are arranged. And then we’re ready to go.
Sequence 1: Partially automated driving is already possible today
We begin by asking the safety specialists some questions. They point out that cars can already be programmed and remotely controlled today, which provides possible scenarios for the further development of automotive safety systems. To illustrate this, they have an E-Class “steer itself” along an obstacle course marked out by pylons standing at 25-meter intervals. The second part of this sequence is a staged drive by two cars, one behind the other. The vehicle in front suddenly swerves to avoid an obstacle directly in its path. The car behind it has to react quickly to the swerve and performs an autonomous emergency braking maneuver. Of course, in this case the obstacle in question is a “soft crash target” — a standardized soft target used for training purposes.
Our constant companion today is “Galileo” host Vincent Dehler. In the video, Vincent is to experience at first hand how an A-Class autonomously makes an emergency braking maneuver in order to avoid a sudden obstacle that the driver has failed to notice. The maneuver can protect the car’s occupants as well as other road users from serious accidents.
So much for the automated systems that we can already experience today. And what will the future look like? By now, a few rays of sunshine have broken through the cloud cover, and the repeated braking maneuvers of the A-Class have melted and dried the ice on the roadway. It’s time to take a look at the two concept cars.
Sequence 2: Two candidates from the future of mobility with autonomous driving
The smart Vision EQ fortwo kicks off the sequence as it slowly rolls off the trailer. A few years from now, it will be able to do that on its own, but for now it’s being remotely controlled by one of its developers, who is with us today. After settling onto the asphalt, it opens its wing-like doors above the rear axle and greets Vincent and the film crew by displaying the message “Hello, Galileo” across its black panel grille — a display area at the front between the headlights. On the day of the shooting, this option for personalizing the car merely makes us smile, but it actually serves a very practical purpose.
That’s because the smart Vision EQ fortwo represents a vision of future urban mobility as an individualized and extremely flexible system of local public transportation. The smart Vision EQ fortwo is the perfect vehicle for carsharing, because it picks up its passengers directly at the place of their choosing. Once the vehicle has paired with the user’s smartphone, the display signals that this is “their” vehicle. The car is powered by an electric motor, and it drives autonomously, without pedals or a steering wheel.
For Vincent, making the first drive around the testing grounds without having his own hands on the steering wheel is quite a new experience. However, the drive gives him an idea of how an autonomously driving smart can one day make mobility easier for its two occupants.
A bus ride into the future in the Vision URBANETIC
When the Vision URBANETIC is standing next to the smart of the future, it looks like a gentle giant. It’s got room for ten people — enough for Vincent, the cameraman, and our experts, who answer questions during the drive in this concept vehicle.
The special thing about the Vision URBANETIC is that it can transport not only people but also goods. That way it can fulfill the needs of cities, companies from various sectors, travelers, and commuters. “How can it do that?” asks Vincent. It’s a good question.
Here’s the answer: The concept is based on an autonomously driving, electrically powered chassis that can carry various swap bodies for transporting either passengers or goods. As a ride-sharing vehicle, the Vision URBANETIC can carry up to twelve passengers, while the cargo module provides capacity for up to ten standard European pallets. The vehicle is just over five meters long, and its integrated IT systems can analyze supply and demand in real time. In the future, the overall system will thus be able to send autonomously driving fleets of Vision URBANETIC vans to their urban destinations as needed.
At the moment, the van on the testing grounds in Immendingen is the only one in existence, but in the future this concept could be used to shorten waiting and delivery times and avoid congestion, for example. The overall system can recognize things such as a group of people standing in a certain area. It can thus specifically send vehicles to these points in order to directly make a needed pickup — flexibly, without fixed routes or rigid schedules.
Sequence 3: The “eyes” of tomorrow’s vehicle
Speaking of schedules, the practical part of the shoot, which features the two show cars, is now in the can, but the shooting day is far from over. Together with the film team, we return to the Sindelfingen plant, which is our next shooting location. Here we find out how important it is for an autonomously driving car to be aware of its entire surroundings and to understand what it perceives. This is achieved by means of deep learning, image understanding, and the fusion of camera and lidar data. The neural network of an autonomous vehicle is trained on a large number of example images of cars and pedestrians.
Does this mean that a camera can learn and understand things? Yes, it does. Semantic segmentation — the pixel-by-pixel classification of everything the camera and the sensors see — is a key component of autonomous driving. In this process, the camera functions much as the driver’s eyes normally do. It must be able to reliably recognize and classify objects and human beings at any time of day and under all weather conditions.
Today’s state of segmentation rests on major advances made during the development process. For example, the technology’s current range is between 50 and 500 meters, depending on the specification. In addition, the vehicle can see spatially, thanks to a stereo camera with two lenses.
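The spatial vision of a two-lens stereo camera rests on a simple geometric principle: the farther away an object is, the smaller the horizontal offset (disparity) between its positions in the left and right images. A minimal sketch of this standard pinhole-stereo relation follows; the focal length and baseline values are illustrative assumptions, not specifications of the actual camera described above.

```python
# Hypothetical stereo parameters -- illustrative values only,
# not specifications of the vehicle's actual camera.
focal_length_px = 1200.0  # focal length, expressed in pixels
baseline_m = 0.30         # distance between the two lenses, in meters

def depth_from_disparity(disparity_px: float) -> float:
    """Classic pinhole stereo relation: depth = focal_length * baseline / disparity."""
    return focal_length_px * baseline_m / disparity_px

# With these assumed parameters, a disparity of 7.2 pixels
# corresponds to an object about 50 meters away.
print(depth_from_disparity(7.2))
```

The inverse relationship also explains why range depends on the specification: resolving distant objects requires measuring ever-smaller disparities, which in turn demands higher resolution or a wider baseline.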
This is how the vehicle recognizes pedestrians, bike riders, other vehicles, streets, vegetation, and much more. All of the “visual” information received is registered and evaluated by a computer in the vehicle, and each object class is marked with its own color. This color coding helps the developers and makes the process easier to follow.
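The color marking described above can be sketched as a simple lookup: the segmentation network outputs a class ID for every pixel, and a palette turns that map into a colored overlay for the developers. The class IDs and the palette below are hypothetical, chosen only to mirror the coloring mentioned in this article (pedestrians red, vehicles blue).

```python
import numpy as np

# Hypothetical class-to-color palette, mirroring the color coding
# described in the text; real systems use many more classes.
PALETTE = {
    0: (128, 128, 128),  # road: gray
    1: (255, 0, 0),      # pedestrian: red
    2: (0, 0, 255),      # vehicle: blue
    3: (0, 255, 0),      # vegetation: green
}

def colorize(class_map: np.ndarray) -> np.ndarray:
    """Turn an H x W map of per-pixel class IDs into an H x W x 3 RGB overlay."""
    overlay = np.zeros((*class_map.shape, 3), dtype=np.uint8)
    for class_id, color in PALETTE.items():
        overlay[class_map == class_id] = color
    return overlay

# Tiny 2x2 example "image": road, pedestrian, vehicle, vegetation.
demo = np.array([[0, 1], [2, 3]])
print(colorize(demo)[0, 1])  # the pedestrian pixel is rendered red
```

In a real vehicle this overlay would be blended with the live camera image, which is exactly the kind of colorful view described on the in-car monitor below.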
Of course we can’t resist a chance to experience this technology live, for our own sake and for the video’s viewers. After arriving in Sindelfingen, we therefore climb into the back seat of a test vehicle based on the current S-Class, buckle our seatbelts, and hit the road. This is what we see: Passers-by are indicated in red and other vehicles in blue, for example.
The play of colors can easily be seen on a monitor in the interior of the S-Class, which is being used to develop this technology in practice. After a drive through the rush-hour traffic in Sindelfingen, it’s clear to us that this vehicle is driving in a very colorful world. The camera easily recognizes bike riders crossing the street at speed, as well as pedestrians standing between parked cars and waiting to cross, even when the pedestrians are mostly hidden.
And speaking of the evening rush hour, the sun is setting as we end our day of shooting. The effort required to shoot a video of just a few minutes is quickly forgotten. And you can see the results here.