There’s a big problem brewing in tech companies today. We now have products and services that leverage Cloud-based, ‘always-on’ sensing and recording of user data. Much of this data can be helpful, but how it is captured, stored, shared, and leaked can result in abuse. Amazon, for example, sells roughly two dozen types of domestic devices, many of which record more data than any other company’s products. Yet Amazon’s product development strategy seems to de-prioritize privacy, treating it as a feature rather than as a core user experience design strategy.
With the proliferation of data gathering and aggregation, product designers are under pressure from growing consumer concern over privacy. In 2022, 74% of Americans ranked privacy as a core value (Pew). How consumers interact with and experience product privacy and security has a massive impact on user adoption and feelings of safety. Rather than resisting privacy and security as a user experience requirement, designers ought to embrace it and champion the higher standards of legal and ethical rights found in legislation such as the GDPR.
The history of technology adoption and public relations shows that companies that disrespect user privacy (e.g., Facebook) are hurt in the long run by bad publicity and user defection. Yet products are still being built, from VR headsets to smart devices, with privacy and safety as an afterthought. In this Miniclass, we will look at how users perceive privacy and security, and how product designers can build trust, safety, and security into their product strategy using guidelines for trust, privacy, and security in UX design.
Topics we will cover in the Designing for trust, privacy and security in UX Miniclass:
A brief but deep dive into a topic, led by Frank Spillers. Each session provides an orientation to the key points of a topic. Sessions are FREE to members of the UX Inner Circle.