How everyday products are shaped by social bias

July 26, 2021

Two UM-Dearborn human-centered design experts break down why designing products that don’t leave some people on the sidelines takes a more empathetic approach to engineering.

A young woman in a dress and work boots sits on a riding lawnmower, flanked by an older man who is explaining how to use it.
Products like riding lawnmowers often contain engineering quirks that bias them toward a target demographic. In this photo, Nichole Becker gets ready to take hers for a spin with some helpful tips from neighbor Mike Pemberton.

Earlier this summer, my girlfriend, Nichole, and I purchased our first riding lawnmower. It’s really more Nichole’s mower: I have a bias toward hand tools, and Nichole was the one with the foresight to see that a push mower wasn’t adequate to maintain the wild parts of the 7-acre farm we just purchased. With some help from a neighbor, we had the shiny yellow 50-inch rig off the pallet and ready to roll about an hour after delivery. But within a few minutes of firing up the engine, her excitement turned to frustration: About every 20 seconds, the mower blades would turn off and the motor would sputter, seemingly for no reason.

At least half the new things we’ve purchased for our farm don’t seem to work out of the box, so at first, we thought we were just continuing our streak of bad luck. But then our neighbor, who used to run a mowing business, had an interesting diagnosis: “I think she might be too light,” he said. He explained there is a safety sensor under the seat that automatically turns off the blades and engine if you hop off (or fall off) the seat. Was it possible little bumps in the terrain might be lifting her off the seat just enough to trigger the sensor? I have about 35 pounds on her, so as an experiment, I took it for a spin. No issues. When she called the company to see if there was an adjustment for the sensor, they told her there wasn’t and that since “your husband” can use it, they didn’t consider it a manufacturer’s defect worthy of a return. (We did confirm with them that the mower doesn’t advertise a minimum operator weight). Nichole also polled other women farmers and quickly found out this frustration wasn’t hers alone.

Headshots of Assistant Professors DeLean Tolbert Smith and Georges Ayoub
DeLean Tolbert Smith and Georges Ayoub

UM-Dearborn Assistant Professors DeLean Tolbert Smith and Georges Ayoub, who both teach in the undergraduate human-centered engineering design program, say this is a pretty classic example of how bias can impact product design. They say in this case, the company has likely judged that its customer base is mostly men, who on average weigh 30 pounds more than women, and they’ve designed the safety sensor around that customer, to the frustration of many outside the target group. Indeed, the world of consumer products is full of such problems. In her introductory engineering course, Smith teaches her students about automatic soap dispensers whose sensors wouldn’t activate for people with darker skin tones; or the Ford Windstar — a minivan that initially flopped because its male-led design team failed to include features for women with children, its target audience. Ayoub’s go-to teaching example: autonomous vehicles (AVs) whose cameras don’t reliably recognize darker-skinned people as pedestrians — a design problem that transcends mere inconvenience.

Smith and Ayoub say the sources of such bias are varied. In Ayoub’s AV example, biased machine learning algorithms were the culprit (a topic unto itself that UM-Dearborn artificial intelligence experts Marouane Kessentini and Birhanu Eshete helped us explore a few weeks ago). But human beings are often the more immediate source, especially implicit or hidden biases that aren't conscious but can still have real consequences. Smith says bias can creep in during the testing phase, for example, if a company doesn’t include a broad enough range of people. Sometimes a product may reflect systemic norms in a society, like an aisle full of supposedly skin-tone bandages that only blend with lighter skin. Sometimes a company might unknowingly neglect an important user audience altogether. Other times, as in the case of the Ford Windstar, a design team may simply fail to adequately understand its target audience. In that case, Ford actually redesigned the Windstar minivan with the help of 30 women engineers to remedy its initial shortcomings. After Ford’s success, Smith says, other companies began to include more women engineers and non-engineers on their design and advisory teams, which paid off in numerous ways, including some of the highest government crash-test ratings.

When she teaches this topic, Smith says her students often think the solution to these bias-based challenges is to only design “general products” that work for almost anybody. “But what we teach is that you have to have a thorough understanding of who you’re designing for and make sure you’re thinking beyond your own range of experiences,” Smith says. “Designing a product for a specific group of people is totally OK,” as is the case, for example, when you’re creating an adaptive technology for people with a certain disability. “So I think what we’re trying to do is help students recognize how essential it is to have empathy for whomever you’re designing for and collect a diverse enough set of experiences so you can do it well.”

In fact, in the context of the human-centered engineering design program, Ayoub and Smith consider empathy a technical engineering skill that’s just as essential and teachable as calculus or physics. “To be honest, someone who lacks empathy is probably not going to be a successful designer,” Ayoub says. “And empathy is a feeling, and feelings cannot be taught only through books. It needs to be taught through connections and examples and topics that involve students directly in talking with and listening to users. Knowing how to do that well can actually be quite technical,” Ayoub says.

Smith says embedding this process of listening to users’ needs more broadly in the manufacturing culture could also help reduce bias, especially over a product’s lifetime. “In engineering, we have well-established processes for improving products when it comes to making them cheaper or lighter or more efficient to produce,” Smith says. “But we don’t always take the same iterative approach to understanding users’ experiences of products, and that can change as society evolves.” A perfect example, Ayoub says: the riding lawnmower. When they were designed decades ago, they were probably used almost exclusively by men. “But come on, to tell someone it’s OK because your ‘husband’ can do the mowing for you? You’re not following what is happening in the world. Women ride lawnmowers. They participate in this kind of work on farms. So the design could and should be modified.”

Nichole did learn to master the lawnmower, by the way — with a little amateur reengineering. We found that jamming a little piece of cardboard up under the safety sensor forced it farther up into the seat, leaving it less sensitive but still fully operational. As long as she rides it slowly, keeps her body straight up over the seat sensor, and lets her body move with the bumps like she’s riding a horse, it does the job. It’s a series of hacks she can live with, but should she really have to?

###

Story by Lou Blouin. If you’re interested in learning more about the undergraduate degree in Human-Centered Engineering Design, check out the official program page. We also have an excellent master’s program. Or if you’re a member of the media and would like to interview Assistant Professors DeLean Tolbert Smith or Georges Ayoub on this topic, please give us a shout at UMDearborn-News@umich.edu.
