If there were ever a time for Design and Ethics to come together, that time is now.

Katerina Karagianni

The world of ethics is a complicated one. It’s full of hard questions, moral gray zones and difficult choices. And it’s a world that UX and product designers rarely have the time or energy to enter. Even worse, we mostly assume that ethical concerns don’t really apply to us. After all, what we do can only be making the world a better place, right?

The digital landscape we have been designing is looking more and more like this…

Hardly the innocent technologically advanced utopia many of us envisioned when we started our careers, right?

If there were ever a time for Design and Ethics to come together, that time is now. I believe we are at a crucial crossroads, where certain trends, market forces and technological developments are converging to make a serious discussion of ethics in the tech design industry imperative. Here’s why:

Attention Economy

Photo by Jamie Street on Unsplash

We are living in what has been dubbed the Extractive Attention Economy, where the most valuable currency is our attention. As the Center for Humane Technology, a world-class team of concerned former tech insiders, explains:

“Today’s tech platforms are caught in a race to the bottom of the brain stem to extract human attention. It’s a race we’re all losing.”

As an industry, we’ve mastered the art of Influence and Persuasion; we’ve written books on how to design habit-forming products and become obsessed with seemingly innocent buzzwords like “seducing”, “engaging”, “captivating” and, of course, the shameless “addictive”.

The result: digital addiction, social isolation, the breakdown of truth, mental health issues, polarization, a proven decline in our ability to focus, learning and socializing difficulties for children… to name a few. It seems we need to slow down and rethink many of our strategies from a moral standpoint.

Data & Privacy

Photo by Markus Spiske on Unsplash

We’ve heard it before: “Data is the new oil”. For the first time in history, an unimaginable amount of personal and behavioural data is available, while advancements in infrastructure and big data mean that it is now possible for that information to be processed and used in meaningful ways. The effects are all around us, from retargeting, hyper-personalization and predictive technologies, to steering electoral campaigns and political manipulation.

There are all sorts of moral and political questions concerning our civil rights and the ownership of data, but even at the most basic level, designers have both the power and the responsibility to create transparent interfaces: explaining what data is collected and how it is used, and offering users clear control over those choices. Despite the GDPR’s clear requirement of explicit, informed consent, a recent study found that most cookie notifications are either meaningless or manipulative, nudging users toward consent via dark patterns and other unethical techniques.

Digital Disruption

Photo by Thought Catalog on Unsplash

The products, games and services we’ve been designing are no longer confined to the digital world. In fact, there is no digital vs physical divide anymore. What we create in the digital space has profound implications and very real effects in the actual world we live in. It affects people, families, communities and whole countries.

Look at the disruption created in their respective fields by companies like Amazon, Lyft and Uber. Take Airbnb, which started with the humble mantra of “belonging anywhere” and has found itself at the centre of various controversies caused by the service’s negative impact on local communities. It’s evident that the Airbnb experience has been designed and optimized for the host and guest, but not for other stakeholders like neighbours or locals. These parameters need to be included in the service or product design process, and it’s our job as Design Thinking advocates and facilitators of UX processes to make sure they are.

AI & Emerging Tech

Photo by Andy Kelly on Unsplash

And finally, the challenge to end all challenges: the rise of Artificial Intelligence. Machines will soon have to make moral decisions themselves, and we, as experience designers, will play a role in designing those choice architectures. But, as the Moral Machine experiment clearly demonstrates, ethics and moral choices are not universal, so it seems we will be forced into these difficult discussions about morality and ethics, because technology is already demanding them.

AI and self-driving cars aside, most of the emerging technologies being developed today seem to come packaged with ethical dilemmas. Predictive and anticipatory technology, face recognition, emotion detection, hyper-personalization… it all operates in that ethical gray zone. Think about Amazon’s 2018 patent application, which describes how Alexa could detect a user’s physical or even mental illness from a change in their voice. How comfortable would you be with that technology in your house?

It seems that even if we are not ready to answer those questions ourselves, the rapid development of technology is forcing us to at least pose them and begin an open and honest discussion.

If you empathize with any of these concerns, you might find yourself asking just that…

Even if you are not a part of the design teams at Apple, Instagram, Twitter or Google, even if you are working on smaller-scale, local projects, your choices still have an impact. They still matter.

We have an obligation as a community to use our powers for good and do the right thing, or at the very least, discuss what that right thing might be. But, while we get our heads around all these newfound concerns, while we invest some time in reading up on ethical principles and until we are ready as a community to codify and infuse our design processes with frameworks and ethical checkpoints, here are 5 simple strategies to consider in our day-to-day work.

1. Focus on Usability over Persuasion

Move away from Persuasion and back to good old Usability, where the only agenda is to make things easy and intuitive. Focus on aligning with users’ mental models, structuring content in a meaningful way, making interfaces and tasks easy and removing obstacles. When we concentrate on Usability, we have the chance to apply everything we know about the limitations of the human mind for good, helping users overcome those limitations rather than exploiting them for our own benefit.

2. Don’t ignore Accessibility

This is probably the most ignored principle in UX, at least in my experience. We are outraged by the sight of a car parked in a disabled parking spot, but don’t seem to have the same reaction to a website that shows absolutely no consideration for people with disabilities. I wonder why that is? For our part, ensuring that the basics are covered is the least we can do: colours, contrast, text size, assistive-technology support, descriptive text and so on.
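One of those basics, colour contrast, is entirely checkable in code. Here is a minimal sketch of the WCAG 2.1 contrast-ratio formula (the hex colours in the usage note are illustrative assumptions, not values from this article):

```python
def relative_luminance(hex_colour: str) -> float:
    """Relative luminance of an sRGB colour, per the WCAG 2.1 definition."""
    hex_colour = hex_colour.lstrip("#")
    channels = [int(hex_colour[i:i + 2], 16) / 255 for i in (0, 2, 4)]
    # Linearise each sRGB channel before applying the luminance weights.
    linear = [c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
              for c in channels]
    return 0.2126 * linear[0] + 0.7152 * linear[1] + 0.0722 * linear[2]


def contrast_ratio(fg: str, bg: str) -> float:
    """Contrast ratio between two colours, from 1:1 (identical) to 21:1."""
    lighter, darker = sorted(
        (relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)


def passes_aa(fg: str, bg: str, large_text: bool = False) -> bool:
    """WCAG AA requires a ratio of 4.5:1 for normal text, 3:1 for large text."""
    return contrast_ratio(fg, bg) >= (3.0 if large_text else 4.5)
```

For example, black on white scores the maximum 21:1, while the light grey `#777777` on white falls just short of the 4.5:1 AA threshold for normal text, which is exactly the kind of quietly inaccessible choice this check catches.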

3. Change priorities, one buzzword at a time

As a community, we need to redefine our priorities, rethink our strategies and replace our buzzwords. Stop talking about engaging, seductive and captivating designs; start designing for calm, balanced, healthy experiences. We should invest in long-term relationships, giving our users room to breathe and trusting that they will return if we really are of value to them. Redefining KPIs and success metrics is not a small task, nor one that rests solely with designers, even product designers as opposed to just UI designers. But I believe in the power of words, and how we say things often ends up shaping how we behave.

4. Widen the stakeholder net

I love this expression by Cenydd Bowles. We’ve focused for so long on user-centred design that we may not have noticed how narrow and individualistic it is. When considering a user’s needs, motivations and fears, we should also consider that person’s community: how our product affects the people around them, society and the environment. All aspects and stakeholders should be accounted for during the design process.

5. If it’s bordering on dark patterns, don’t do it

Use your persuasive superpowers carefully. If a technique is even bordering on a dark pattern, don’t do it. If you’re not sure whether what you are doing counts as a dark pattern, double-check. Remember: it’s called nudging, not coercing, tricking or manipulating.

And finally, don’t stop asking the hard questions. It’s true that we live in complicated and pluralistic times, so there are no easy answers.

Let’s at least try to ask the right questions.