Interacting with an autonomous car has been quite a topic for many in the industry lately. But how do we design the gradual transformation from a shuttle service with a driver to a driverless shuttle? Taking our moovel Flex ride-pooling pilot as a use case, MA student Raphael Zimmermann developed the “Amp,” an external interface prototype for the autonomous vehicle.
More and more communication between humans is being replaced by human-machine interaction. At moovel this is naturally a hot topic across all aspects of mobility, because in the future we will need new ways of interacting with autonomous vehicles. We are especially interested in what interactions with autonomous ride-pooling services will look like: how will external and internal communication work when more and more of the driver’s or operator’s tasks are handled by the vehicle itself?
One key challenge for us concerns moovel Flex, our shuttle pilot project in Stuttgart and part of the moovel-on-demand platform: how will the future autonomous vehicle find us, pick us up and take us where we want to go? Launched in Stuttgart at the end of 2017, this add-on to public transport offers mobility-on-demand as a service. Its pooling method enables cheaper rides for all passengers. Pick-up locations, however, are typically dynamic stops along the road, so-called virtual stops.
The interior of the future autonomous vehicle (for now our moovel Flex shuttle) with the Amp. This concept screen handles exterior and interior communication between car and passenger. Upon entering the car, the system tracks the passenger and greets him or her in a highly visible font.
Master of Arts in Design student Raphael Zimmermann of the University of Applied Sciences in Potsdam worked on a use case including user journeys and touchpoints, and developed a “multi widget” prototype: a round screen on the windshield of the (soon to be autonomous) shuttle, or potentially any car that can be integrated into a fleet of (self-driving) shuttle vehicles. He documented the result in his thesis, “How to interact with autonomous machines – a concept for communicating between humans and autonomous vehicles.” (For more, please see the PDF download below.)
Raphael took on the challenge of finding out which external and internal interfaces are needed for human-autonomous communication, looking into signage that had to work intuitively. He developed the first prototype at the Fraunhofer Institute for Industrial Engineering IAO in Stuttgart and had test participants evaluate it. Finally, he asked test passengers to use a moovel Flex shuttle specially fitted with the “multi widget” prototype. This use case was taken to the streets of Stuttgart, with a driver still steering the car.
One challenge many unmarked shuttle services face already today is how the car (or rather its driver) and the passenger recognise each other at the pick-up location. Ride-hailing service Uber, for example, lets its drivers use an LED sign stuck on the windshield, while Lyft drivers can display a branded sleeve on theirs. A similar challenge for the future autonomous vehicle will be the technical inaccuracy in pinpointing the virtual stop. The radius in which car and passenger have to find each other can be up to 50 metres, and the search is time-critical if the shuttle is to stay on schedule. The same 50-metre radius is therefore the distance limit within which the passenger has to be able to spot the signage of his or her booked autonomous car.
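The 50-metre rendezvous radius described above can be sketched as a simple proximity check on the GPS fixes of passenger and shuttle. This is a minimal illustration, not part of the thesis: the function names and the idea of triggering the recognition display this way are assumptions.

```python
import math

PICKUP_RADIUS_M = 50  # radius in which car and passenger must find each other

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two GPS fixes, in metres."""
    r = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def within_pickup_radius(passenger_fix, vehicle_fix, radius_m=PICKUP_RADIUS_M):
    """True once passenger and shuttle are close enough to start visual recognition."""
    return haversine_m(*passenger_fix, *vehicle_fix) <= radius_m
```

In practice a service would also factor in GPS error and the time budget for the stop, but the core question stays the same: is the passenger inside the 50-metre window where the Amp’s signage must be readable?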
Raphael Zimmermann created a concept of how to spot the Amp screen from afar.
MA student Raphael Zimmermann designed the “multi widget” to serve as a communication add-on for autonomous vehicles. It should have a predefined size and position on the car to steer the passenger’s attention to one central point. As a neutral information layer it should be highly visible and recognisable, and it should be attachable to the front of any kind of car model.
Raphael named it “Amp,” short for “amplifier,” as it is meant to boost the communicative abilities of the autonomous car. He decided the Amp should be a round screen for universal use. Lastly, it should take the place of the driver as the central steering and controlling entity, and so it sits right on the windshield.
Once the passenger spots the car, he waves or gives a sign so that the system can recognise him.
How should the interaction evolve? For the actual test ride Raphael created a series of touchpoints: as soon as the booked shuttle has sent the passenger the exact pick-up location, the passenger can choose a colour, a symbol or even a personal sketch to be displayed on the Amp screen for recognition. The moment the passenger spots the vehicle and approaches, the Amp screen displays a moving circle resembling an eye, showing that it has “spotted” the passenger (or rather his mobile) too. When the passenger raises his arm as if to flag down a shuttle, the car mirrors the gesture: the first feedback from the car. As the passenger comes closer, the Amp screen displays a welcome message in a large, easily readable font (“Hey, are you ready?”). The passenger then answers with a loud “Yes.” This confirmation tells the Amp to open the car’s doors automatically.
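The touchpoint sequence above is essentially a small state machine: each observed event advances the Amp to its next display mode. The following sketch is our own illustration of that flow; the state and event names are hypothetical, not taken from the thesis.

```python
from enum import Enum, auto

class AmpState(Enum):
    SHOW_RECOGNITION_SIGN = auto()  # passenger's chosen colour/symbol/sketch
    EYE_TRACKING = auto()           # moving "eye" circle: passenger (mobile) spotted
    MIRROR_GESTURE = auto()         # mirror the raised-arm gesture back
    WELCOME_MESSAGE = auto()        # "Hey, are you ready?" in a large font
    DOORS_OPEN = auto()             # spoken "Yes" confirmed, doors open

# Each touchpoint from the test ride becomes one transition.
TRANSITIONS = {
    (AmpState.SHOW_RECOGNITION_SIGN, "passenger_spotted"): AmpState.EYE_TRACKING,
    (AmpState.EYE_TRACKING, "arm_raised"): AmpState.MIRROR_GESTURE,
    (AmpState.MIRROR_GESTURE, "passenger_close"): AmpState.WELCOME_MESSAGE,
    (AmpState.WELCOME_MESSAGE, "voice_yes"): AmpState.DOORS_OPEN,
}

def step(state, event):
    """Advance the Amp one touchpoint; events that don't apply are ignored."""
    return TRANSITIONS.get((state, event), state)
```

Modelling the flow this way makes the design decision explicit: out-of-order events (a shout before the car has spotted the passenger, say) simply leave the Amp in its current state instead of confusing it.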
Inside the shuttle, the Amp screen serves as a trip companion: it displays the route, gives information about possible delays and notifies passengers shortly before arrival.
How do you spot your shuttle? The MA thesis describes a sign system for the Amp screen with different modes. To find the shuttle at the pick-up spot, the passenger can choose certain colours or symbols, or even a personal sketch, to be displayed on the screen.
For the last part of his thesis, a trial took place in a real-life environment. Using our moovel Flex on-demand shuttle pilot in Stuttgart, the Amp prototype was attached to a moovel car. While the shuttle was still steered by a driver, he or she did not talk to the passengers at all; communication was handled entirely by the Amp prototype, which took over its first responsibilities from the driver.
As an example of the touchpoints, the Amp screen shows the waiting time upon arrival at the pick-up spot. Displayed in bright colours and with high contrast, it is visible in all weather conditions.
What actually happened when a car with a screen in the driver’s seat drove around the city? Without further knowledge, bystanders did not understand what was going on; the screen was not recognised intuitively. The passenger, however, had no trouble spotting and finding the booked shuttle. It will be up to the services to define a way of interacting with their customers by carefully designing the transition process.
Sign system: final sign system with different modes, e.g. time at pick-up location, personal welcome greeting, interior car communication with en route information, time to destination, etc.
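The sign-system modes listed above amount to a mapping from the Amp’s current mode and trip context to what the screen renders. A tiny sketch of that mapping, purely illustrative (the mode names and message wording here are our own assumptions):

```python
def render_sign(mode, **ctx):
    """Return the text the Amp screen would show for a given sign-system mode."""
    modes = {
        # exterior: waiting time shown at the pick-up location
        "pickup_wait": lambda: f"Departing in {ctx['minutes']} min",
        # exterior: personal welcome greeting as the passenger approaches
        "welcome": lambda: f"Hey {ctx['name']}, are you ready?",
        # interior: en route information, time to destination
        "en_route": lambda: f"{ctx['minutes']} min to destination",
    }
    return modes[mode]()
```

Keeping every mode behind one render function mirrors the thesis’s idea of the Amp as a single neutral information layer: the same screen, exterior or interior, always speaks through one predictable channel.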
What did Raphael Zimmermann learn? First, his “service icon” was highly visible and could be spotted from afar. A visual connection between car and passenger could be established, which passengers experienced as pleasant. The closer the shuttle came, the more personal this new interaction between vehicle and guest became. The Amp screen managed to be the centre of interaction: passengers seemed to recognise and accept the car as their communication partner.
As a next step, Raphael suggests, more realistic technical components should be used. For example, a screen with high brightness and contrast suited to outdoor use could improve the depiction of the signs significantly.