College of Business faculty research how people balance technological benefits and privacy risks
The first-of-its-kind research study, which was published in the Journal of Business Research, seeks to better understand where the privacy line is when it comes to new technology.
When augmented reality smart glasses — like Google Glass and HoloLens — hit the mainstream market, many thought they were seeing the next big trend in technology.
For example, shortly after its 2013 release, Google Glass was forecast to sell more than 20 million pairs by 2018. Instead, the public overwhelmingly rejected it.
Information Systems Management Professor Jun He wondered why this emerging technology with its benefits — hands-free constant access to the internet and your apps — wasn’t accepted by the average consumer.
So He — along with College of Business colleague Professor Young Ro and former COB colleague Philipp Rauschnabel — decided to find out.
The Journal of Business Research recently published the findings of their nearly three-year research study. In the research process, the faculty members developed a theoretical model to assess augmented reality smart glasses (ARSG) usage — something businesses might find helpful during the development stage of a product to help gauge its viability.
“It’s important to understand privacy concerns about these technologies before they are created and marketed to the public. After understanding concerns, you need to alleviate them through a plan to educate people on the new technology or — if needed — a product design change,” said He, who specializes in technology acceptance. “We created a model so the tech developer can better understand the consumer.” In the Google Glass example, He said product changes like a way to shut off the camera or an in-use recording indicator light may have helped adoption.
Using the tested model, He said they found results showing something unexpected: Subjects — they surveyed nearly 300 people — didn’t want to wear smart glasses because they were worried about invading the privacy of others.
He said they were OK with giving up their own privacy to get the comforts and benefits technology brings. But with the smart glasses’ built-in always-on camera, subjects responded that it wasn’t right to have someone else’s actions recorded without consent.
“Humans are social creatures. We don’t want to live in isolation and we care about society and how people see us,” he said. “When people know they are being recorded — it’s just not a natural thing — their behavior changes. And the user can pick up on this change. This leaves both sides feeling uncomfortable. The easy solution is to take off the glasses and avoid the technology.”
He said their research began prior to the Facebook/Cambridge Analytica scandal. But the reaction to that public privacy concern reinforced what the team was discovering.
“As we were learning about why ARSGs weren’t desirable on a mainstream level, the news about the Cambridge Analytica data leak came out. Reading and listening to what people were saying, there was more concern about their friends’ or families’ information than their own,” He said of the 2018 news, in which it was found that the information of up to 87 million Facebook users and their Facebook friends was harvested by the now-defunct political consulting firm. “Even as accepting as people have been with technology adoption and information sharing, there is a privacy line that people don’t want crossed.”
He said the study — the first of its kind — shows the varying degrees that people are willing to give up privacy for new tech.
“It’s exciting to find answers to something you didn’t understand and to discover a new phenomenon,” he said. “We have many digital gadgets that actively and passively collect our data and we allow it to happen. But there is a line and it’s important for companies to realize that this line does exist and figure out where it is prior to developing the technology of tomorrow.”