EXPERIENCE DESIGN
HUMAN-MACHINE INTERFACE
PEDESTRIAN SAFETY
AutoVision
Spring 2024 Studio Project Sponsored by Cognizant
Collaborators:
Leila Kazemzadeh
Adam Whitney
Cynthia Zhou
Role:
Interviews & Data Collection
Research Synthesis
Framework Analysis - Observations, Need Hierarchies, Mindmaps
After Effects Animations
Concept Rendering
INTRODUCTION
Today, drivers are essential for operating vehicles and managing safety and legal responsibilities. As autonomous technology advances, however, the role of the driver will diminish. And although AVs will eventually be far safer and more effective drivers than we are today, they have always struggled with one thing: dealing with human error.
Driver roles will be redefined by autonomous vehicles.
THE PROBLEM
The pedestrian experience has been overlooked with the advent of autonomous vehicles. During our user research, many interviewees expressed that, regardless of any improvements to AV technology, they were far more worried about being in the vicinity of a driverless car than about a distracted human driver. This presents us with a key contradiction.
Human drivers can be inexperienced, inattentive, or under the influence, and as AV technology matures it will behave more predictably and safely than they do. Despite this, pedestrians have no clear way to read an AV's intent the way they would a human driver's, which leads to fear and uncertainty and adds to the poor perception of AVs.
Our Design Goal
Promote safe pedestrian behavior by creating trust between the pedestrian and the vehicle: communicate vehicle intent through an implicit interaction that supports pedestrian behavior rather than dictating it.
THE CONCEPT
AutoVision is an external human-machine interface for autonomous vehicles. It bridges the perception gap between vehicle and pedestrian as the driver is no longer present. It is multi-modal, with a visual as well as an audio element.
Concept Video
Concept Specifications
AutoVision utilizes pre-existing e-ink technology from the BMW iVision Dee for its visual interface. The scales map across the surface of the vehicle in customizable configurations. Additionally, the scales have light piping, which allows the animations to be front-lit. This prevents the visuals from being too bright while still allowing visibility in low-light scenarios.
The audio portion of the interface integrates into the car-horn system, playing an indicator tone at half the frequency of the horn. This keeps the indicator from being overwhelming or being confused with the actual car horn.
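As a rough illustration of the audio spec, the sketch below generates a sine tone at half an assumed horn frequency. The 440 Hz horn fundamental is a hypothetical placeholder, not a figure from the project:

```python
import math

HORN_HZ = 440.0              # assumed car-horn fundamental (hypothetical value)
INDICATOR_HZ = HORN_HZ / 2   # half the horn frequency, per the concept spec
SAMPLE_RATE = 44_100         # standard audio sample rate

def indicator_samples(duration_s: float) -> list[float]:
    """Generate a sine tone at the indicator frequency."""
    n = int(SAMPLE_RATE * duration_s)
    return [math.sin(2 * math.pi * INDICATOR_HZ * t / SAMPLE_RATE)
            for t in range(n)]

samples = indicator_samples(0.5)  # half a second of the indicator tone
```

In practice the indicator would likely be a shaped pattern rather than a bare sine, but halving the fundamental is what separates it perceptually from the horn.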
CONCEPT DEVELOPMENT
Research Phase
To better understand the relationship between pedestrians and cars, as well as the specific challenges they face after the disappearance of the driver's role, we conducted observations, interviews, and secondary research.
2 Observational Studies · 6 Secondary Research Articles · 11 User Interviews
Interview Demographics
Key Research Insights
Our interviewees were generally distrustful of AV technology as a whole. This connects to an important finding from our secondary research: studies show a significant correlation between a pedestrian's level of fear and their tendency to move dangerously or unpredictably in front of a moving vehicle, especially among children.
Our group created a hierarchical framework to address the needs of our key user group, the pedestrian, and used it to heavily inform our concept development. It draws from the sentiments and anecdotes shared in our interviews, as well as information on autonomous vehicle interfacing from our secondary research.
Hierarchy
Aspirational Journey Map
Our aspirational journey map outlines what should be communicated between the AV and the pedestrian, as well as how that communication would ideally affect pedestrian behavior.
Ideation Phase
The preliminary concept our group arrived at after two rounds of concept critiques involved a passive animated element on the exterior of the vehicle.
Concept
The interface features three key states, plus a short transition state between the alert state and the tracking state. This additional state was identified as a necessity during user testing, to further reassure certain users that it was safe to cross.
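The state sequence above could be sketched as a small state machine. Note the state names other than "alert" and "tracking" are assumptions for illustration; the source does not name them:

```python
from enum import Enum, auto

class IndicatorState(Enum):
    # Names besides ALERT and TRACKING are assumed, not from the project.
    IDLE = auto()        # default state, no pedestrian engaged
    ALERT = auto()       # pedestrian detected near the vehicle
    TRANSITION = auto()  # short reassurance state added after user testing
    TRACKING = auto()    # actively tracking a crossing pedestrian

# Allowed transitions, with the short transition state inserted
# between ALERT and TRACKING as identified during testing.
TRANSITIONS = {
    IndicatorState.IDLE: {IndicatorState.ALERT},
    IndicatorState.ALERT: {IndicatorState.TRANSITION, IndicatorState.IDLE},
    IndicatorState.TRANSITION: {IndicatorState.TRACKING},
    IndicatorState.TRACKING: {IndicatorState.IDLE},
}

def step(current: IndicatorState, nxt: IndicatorState) -> IndicatorState:
    """Advance the interface, rejecting transitions the design disallows."""
    if nxt not in TRANSITIONS[current]:
        raise ValueError(f"illegal transition {current.name} -> {nxt.name}")
    return nxt
```

Encoding the transition table makes the testing insight explicit: the interface can never jump straight from alert to tracking without the reassurance step.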
PROTOTYPE TESTING
Objectives
Hotspot Testing
7 Testers · 2 Scenarios · 5 Rounds
To determine where on the AV our interface would make the most impact, we conducted a test in which users acted as pedestrians, crossing in front of and beside a moving car, and recorded where their gazes most often rested.
Simulated Driverless AV Scenarios: Tracing paper covered the front of the vehicle in a way that obscured the driver from the users while still allowing the driver visibility during the test.
Eye Tracking Glasses: Users wore these glasses throughout the test in order to track and record the direction of their gaze as they crossed in front of the car.
Results
We found that users primarily looked at the hood, grill, and above the front tire on the side where they started crossing, regardless of whether they were crossing in front of the car or walking beside it.
Animation Testing
6 Testers · 3 Indicator States · 4 Animation Types
Animations of varying size, dynamism, and directionality were tested with users via a paper car prototype with a projector inside. The projector mapped our animations onto the outside of the car using the program MadMapper, allowing us to act out crossing scenarios that included our animated signifiers and determine which were most effective.
Animation Sample
MADMAPPER Animation Software
Projector Setup inside “AV”
Audio Testing
5 Testers · 3 Indicator States · 3 Sound Patterns
Audio indicators were tested similarly to the animations, using a paper car and a speaker to act out pedestrian crossing scenarios with users. We observed how our “pedestrians” responded to various sound patterns and how they interpreted the signals, to determine the most appropriate audio cue for each stage of our interface. We also played traffic and rain white noise to test how well the audio cues cut through background sound.
Testing Setup
Audio Testing States
Results
General attitudes across the testing rounds, covering both the animation testing and the audio testing.
Final Design Decisions
IMPLEMENTATION
Style Guide
The system allows manufacturers to apply the polygons to fit the unique contours of their vehicles, so long as they cover the hotspots identified during testing. The colors for the indicator animation can also be selected by the manufacturer, provided they meet a contrast ratio of at least 3:1.
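The 3:1 threshold matches the contrast-ratio definition used in accessibility guidelines such as WCAG for non-text graphics. Assuming that definition applies here, a manufacturer's color pair could be validated with a check like this sketch, which computes relative luminance from sRGB values:

```python
def _linear(c8: int) -> float:
    """Linearize one 8-bit sRGB channel (WCAG formula)."""
    c = c8 / 255
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def luminance(rgb: tuple[int, int, int]) -> float:
    """Relative luminance of an sRGB color."""
    r, g, b = (_linear(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg: tuple[int, int, int], bg: tuple[int, int, int]) -> float:
    """(L_lighter + 0.05) / (L_darker + 0.05), ranging from 1:1 to 21:1."""
    lighter, darker = sorted((luminance(fg), luminance(bg)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)

def passes(fg, bg, minimum: float = 3.0) -> bool:
    """Check a color pair against the style guide's 3:1 minimum."""
    return contrast_ratio(fg, bg) >= minimum
```

For example, pure black on pure white yields the maximum 21:1 ratio, while a pale gray on white falls well short of 3:1 and would be rejected.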