Whenever we talk about automated or autonomous driving, there’s one topic that particularly unsettles people: How do we know what these automated vehicles are doing?
Whom do we look at as we’re crossing the street? Will they stop for us? Have they seen us?
For the first time, other road users have no opportunity to communicate with a fellow human being and thus to predict how that person will react. Solving this problem will be a decisive factor in whether these vehicles are accepted. At the heart of this issue is empathy, because mutual empathy is the crucial element of successful communication.
I discussed this issue with a variety of experts at a Future Insight Talk in our living room in Berlin.
What is empathy?
It’s basically impossible to share another person’s feelings. We can’t really have the same feelings or put ourselves in the other person’s shoes. People who have never been in love dismiss heartache as silly. People who don’t have any contact with animals don’t understand why stroking a cat’s fur is a pleasure. And the odd sounds that young parents make to their babies are still a mystery to many people who don’t have kids.
One current hypothesis about the function of empathy assumes that we imagine other people being within us, but that we at the same time experience our own feelings and then transfer them to these imaginary others. Thus we can sense other people’s intentions — as long as we have already experienced such intentions ourselves. Our sensory apparatus performs this function constantly. It seeks out living things, because living things behave in ways that elicit reactions.
Many people feel uncomfortable in the presence of spiders or snakes because they can’t identify with these creatures, which can move in all directions. By contrast, we find it easy to identify with birds. Every child can imitate the way birds fly. A bird’s flight has a direction, it has a certain rhythm, and it seems possible to tell whether it’s strong or lazy.
We realize clearly how important the function of empathy is when we move through a pedestrian zone. People basically don’t walk smack into each other. If that’s about to happen, and both people then dodge in the same direction, both of them have to laugh. For a short moment they seem to wake up from a trance: their “perception process,” the subconscious automatic behavior, has briefly ceased, and they’ve been tickled awake. In terms of overall mobility, this function of empathy is responsible for the harmonious flow of traffic, the low rate of traffic accidents, and the rhythm of a city.
It’s impossible not to communicate
The first axiom of communication according to Paul Watzlawick (“One cannot not communicate”) points to the reflexive character of the perception process. It’s a fleeting process that is fed by continuous mutual feedback. In cybernetics, the science of controlling and regulating machines, living organisms, and social organizations, the result of this process — such as a flow of traffic — is called emergence.
A new structure is created — in this case, a rhythm that nobody dictates to the others. The hectic rhythm of New York differs from the more leisurely one of Berlin. However, emergence cannot be recognized in individual actions, and it cannot be traced back to simple relationships between cause and effect.
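The idea of emergence can be sketched in a few lines of code. The following toy model is purely illustrative and not part of the research described here; the agent count, the ring arrangement, and the coupling strength are arbitrary assumptions. Each simulated pedestrian repeatedly nudges its walking pace toward that of its immediate neighbors, and a shared rhythm appears without any central control:

```python
import random

random.seed(0)

# Toy model of emergence: 20 "pedestrians", each with an individual
# walking pace (steps per second). Nobody dictates a common rhythm.
paces = [random.uniform(1.0, 2.0) for _ in range(20)]

def spread(values):
    """Difference between the fastest and the slowest pace."""
    return max(values) - min(values)

initial_spread = spread(paces)

# Each agent repeatedly nudges its pace halfway toward the average of
# its two neighbors (the group is arranged in a ring) -- purely local
# feedback, with no view of the whole group.
for _ in range(200):
    n = len(paces)
    paces = [
        paces[i]
        + 0.5 * ((paces[(i - 1) % n] + paces[(i + 1) % n]) / 2 - paces[i])
        for i in range(n)
    ]

final_spread = spread(paces)

# A shared rhythm has emerged: the paces have converged, although no
# individual interaction aimed at a global result.
assert final_spread < 0.05 * initial_spread
```

The emergent rhythm (the common pace) is visible only at the level of the group; no single update step contains it, which mirrors the point that emergence cannot be read off from individual actions.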
We try to sense them
What happens when self-driving cars are inserted into this activity of ours? Our empathy fails to connect, because it cannot sense a living being. After all, a self-driving car is a robot, even though it’s a very complex one that does not operate as predictably as a coffee maker.
Initially our empathy may even produce false assumptions, such as the comment made by a passerby about the first outing of the self-driving S-Class: “It’s driving like an old lady.” An autonomously driving car may even be dismissed as uninteresting because it’s categorized as a thing that is clearly not alive. An even more dangerous possibility is that an involuntary glance at the driver leads to a false assumption, because the person sitting in the driver’s seat is not actually responsible for the car’s movements.
A cooperative car
That’s why we are doing research with a “cooperative” car that sends signals to our perception process. The aim is to help people learn how to deal with the unaccustomed “semi-alive” nature of self-driving vehicles. We want self-driving vehicles to be almost as easily predictable and assessable as conventionally driven vehicles or pedestrians. In recent years, Vera Schmidt has been able to use show cars to demonstrate the development of the Human Machine Interface (HMI) on vehicle exteriors. When Renzo Piano designed the Mercedes-Benz Design Center in 1998, the specialist area of HMI did not yet exist.
As with most research topics today, a variety of perspectives are needed to resolve these issues. A policy of closed factory gates is no longer in keeping with the times. Jasmin Eichler, our head of research, has said:
The open network is playing a pioneering role, because pioneering work combines a wide spectrum of strengths.
That’s why some of the people we invited to collaborate with us didn’t seem to have anything to do with cars at first glance:
See Like a Pony (SLAP)
In our attempt to define empathy — the feeling of an unspoken understanding — the main actors are three ponies that form a small herd together with Sabine Engelhardt. Sabine is normally responsible for the fragrancing of vehicle interiors, but this fall her focus was on the long treks she has been taking with her ponies for many years. The small herd was equipped with cameras that recorded the spontaneous interactions, or perception processes, within it.
An understanding of the basic function of empathy is helpful for developing the concept of a cooperative vehicle — for example, with regard to signaling by means of movements. We implemented the knowledge we gained by equipping the cooperative vehicle with a “waking-up” process.
Carola Zwick is a professor at the Weissensee Academy of Art and the owner of a design studio in Berlin. One of her specialties is designing by means of 3D printing: she prints out her concepts and ideas directly and then experiments with them, so that 3D printing becomes part of the creative process itself. As a result, her study produced objects that can be touched and tried out. How can you show that a self-driving vehicle has not perceived something? This is a tricky and paradoxical question, because the vehicle of course doesn’t know what it hasn’t perceived. Zwick represented a process of sensing, like the waving tentacles of a sea anemone, that is obviously searching for something but not finding it.
Manga and anime films animate inanimate objects in 2D. That’s the special skill of these films, as the CEO of Polygon Pictures, Shuzo John Shiota, explained at a CyberArts Conference at Ars Electronica. We will have to come to terms with self-driving cars in our daily lives — they’re simply different.
What could motivate us more effectively than a manga smile on a car?
Summary: Human First
Maya Ganesh is a scientist whom we work with on the topics of ethics, artificial intelligence, and sustainability. She provided a framework for the discussion by pointing to the effects of technology: The use of technology shapes us. A person who talks too long to Amazon’s Alexa in tones of command will get used to this way of speaking.
In the process of designing self-driving vehicles, we have a deeper responsibility for their conception and design than we did back when we were only doing product design.
In order to make the principle of “human beings first” a reality, we therefore have to investigate what it means to be a human being, in what direction this principle should develop, and what resources we create for this process at Daimler.