Keeping people at the heart of product development


By Davina Baum and Jessica Metro

As content strategists, researchers and product designers at Facebook, we aim to create experiences that are clear, consistent and compassionate through both the language and the design. Everything from the tone we use to the controls we provide helps to make products across Facebook that are more thoughtful, more human and better for the people using them.

We work together to consider the entire user experience, paying close attention to both the product design and the content strategy. The two must coexist in a way that ensures that the products we make actually solve the problems people are facing, that the user experience is intuitive and easy to use, and that the visual design feels familiar and engaging.

Across Facebook’s family of apps, our teams are dedicated to designing for well-being. We take on topics like suicide prevention, including tools and resources for people at risk and their concerned family and friends; memorialization, to preserve the wishes of people who’ve passed away and support the bereaved; and harassment.

As we learn how people engage with our products, key insights guide our decisions: launching new products and features, evolving what's already in market, sunsetting certain features, and identifying and building teams around new opportunities to create better experiences for people.

Back in 2017, our team explored the ways people experience harassment on Facebook, especially through receiving unwanted messages. As a result of these explorations, we built the capability to ignore a conversation in Messenger. Let’s look at the content and product design process through the lens of that product. Our team sees it as a process in four stages — understand, design, gather feedback, build — with some fluidity between each stage as we go.

We start each project by doing the work to understand the problem. This helps us recognize people’s needs and uncover opportunities to improve the products we build. The core product team consists of a product manager, data scientist, content strategist, product designers, researchers and engineers.

In the case of the work on harassment, the team first aligned on the goal to explore whether our harassment-mitigation tools were serving people’s needs. To understand what people were experiencing, we looked at foundational research insights and survey responses. We also dove into the data to understand behavior at scale and find patterns.

To narrow our focus and scope, we condensed all of this information into “people problems,” which are concise explanations of the issues people are facing. The team discussed and debated which ones to address, combining what we knew from existing data and research — as well as engineering constraints — to define the biggest opportunities.

People problems and opportunities surfaced during a team sprint

Let’s take a step back here, because in many cases, and on our team in particular, “people problems” are serious, real-world problems. A people problem might be, “I want to stop someone I’ve just met online from sending me inappropriate images on Messenger.” These scenarios, reframed as people problems, allow us to abstract and generalize the problem so that it’s solvable — but individuals can’t be abstracted and generalized. So we never lose sight of the people who are facing these problems in the real world — where safety and reputation may be at risk.

As a result of our work to understand the problems, we decided that building additional tools to help address harassment over Messenger was the biggest opportunity. Specifically, we heard that the concept of blocking someone on Messenger or Facebook felt extreme in response to repetitive badgering, or for people who knew the harasser in person. While the block feature did not explicitly declare to the person that they'd been blocked, there were indicators a savvy user could pick up on. Even for the person doing the blocking, the action might feel rude. And in cases of harassment, blocking someone on Facebook can prevent you from being able to report them, because they're hidden from you.

We started to think of our harassment-mitigation tools on Messenger along a spectrum: from muting, which merely turns off notifications, to blocking messages, to blocking on Facebook. But between muting and blocking messages, there was a hole we sought to fill. We wondered, how might the capability to “hide” a conversation work?

Spectrum of harassment-mitigation tools

In broad strokes, we outlined the user experience: The person being hidden could continue to send messages without knowing that their messages were going unseen, and the person hiding the conversation could proceed without being aware of annoying or harassing messages. If the recipient needed to seek out the conversation, they could find it easily and unhide it if necessary. They could also block someone they had hidden.

We had extensive conversations about product functionality, from the very specific to the very broad. We discussed questions like:

  • Where would people expect to find the option to hide a conversation?
  • How do we communicate what will happen without overwhelming someone?
  • How should the feature balance feeling lightweight with feeling robust and satisfying?
  • What exactly does the person who’s being ignored see?
  • What is the visual affordance to indicate the feature is on or off?
  • Where do we surface the ability to escalate to a block?
  • How will this work across Android, iOS, mobile web, desktop web and all the other platforms we support?

Whiteboard wireframes mapping out the flow

While answering these questions, we sketched out design solutions, starting wide and narrowing to what might have the greatest impact. We created mid-fidelity mocks to get a better sense of how the top solution would fit in the existing product, explored content options in context and visualized the design details. We talked a lot amongst ourselves and with our content and design colleagues to get feedback and ensure that we were thinking deeply from the perspective of people experiencing these problems. To flesh out the interaction design and make our ideas feel like a real product, we eventually created high-fidelity prototypes.

High-fidelity interface explorations

How could we deepen our understanding of this problem space? We needed to do in-person research. Data plus existing research on harassment got us part of the way to understanding the problem. To ensure our solutions truly meet people's needs, our content strategy and design teams rely heavily on research — and that often takes us away from our offices and into the world. In our case, we're based in California, so our team planned a trip to India to get a first-hand understanding of how people who live in a culture different from our own experience harassment on Facebook and Messenger.

Our team — including two researchers, a product manager, two product designers, a content strategist and an engineer — traveled to Delhi and Ghaziabad, a smaller city near Delhi. We conducted focus groups (in separate groups for men and women) and in-home interviews, speaking with a wide range of people. To get a sense of how they understand and use the current blocking options, we showed them the existing tools for blocking messages, receiving message requests, and turning on the ability to review the posts you're tagged in. We then sought their feedback on prototypes for new ideas, including the feature to hide conversations and a promotion to raise awareness about message blocking.

Early prototypes used for feedback in research

In doing this research, we wanted to understand both the perspective of someone who’s being harassed or bothered on Messenger, and that of a person who’s repeatedly trying to contact someone. People in our focus groups talked about seeing Facebook as a way to connect with anyone in the world; there’s a real curiosity and interest in meeting new people online. Both men and women said that they accept any and all friend requests, no matter how remote the connection. We also heard from women who told us how easily and often the line can be crossed from friendly hellos to unwanted contact and harassment. We got a strong sense of the pervasive ways that harassment plays out in the real world, in all sorts of interactions.

We spoke to participants who said they had experienced a lot of harassment on Facebook, so we knew we'd hear difficult stories. But we also heard strength and courage — as well as a deep familiarity with the tools at hand. Men and women understood the functionality to block both a message and a person on Facebook, and they had no issues with blocking someone who was bothering them. "I just blocked him" was a refrain we heard over and over. But we also confirmed that in certain situations, there's a need to stop hearing from someone who's sending harassing messages without explicitly blocking them, because doing so could have serious real-world consequences.

This is where the work becomes really, really hard. Harassment affects millions of people every day — well beyond Messenger. We know that a button on Messenger will not take away the memory of a harassment experience, and it won’t mean that something upsetting won’t happen again. Unwanted and harassing messages have no place on our platform. To help combat this behavior, we work to adjust and augment the tools that someone might be using.

In the early stages of this project, we had been referring to the functionality as “hide.” We learned on this research trip that people were already familiar with the capability to hide a comment on Facebook, which didn’t align with the idea of hiding an entire conversation with someone on Messenger. So, after much discussion and exploration of options, we shifted our language to “ignore.”

As with any product, we collaborated with our engineering colleagues who were building the back-end functionality and front-end interface. We needed to ensure that the product we designed was technically feasible, while still being intuitive to use. Once we launched the ignore functionality, we ran surveys and analyzed data to understand how it was (and is) performing. We found that it was quickly adopted and coexisted harmoniously with blocking and other harassment-mitigation tools. However, we also needed to account for group messages, so we iterated and did another round of usability research to ensure that the built product was intuitive, valuable and easy to understand.

Final prototype for the feature to ignore messages

As product designers, content strategists and researchers, we leverage one another's skills to deeply understand the problems people are facing, lead the team's efforts to design solutions for those problems, show prototypes to real people for their feedback, and work collaboratively with engineers to bring the final product experience to life.

But our design work does not stop there. Testing, learning, iterating and sometimes deciding to sunset products and features are all part of the journey to getting it right for the people who use our products. This work has been one step along that journey to determine the best way to help people avoid harassment on Messenger. We're proud to have the opportunity to continue learning and designing for well-being, keeping people at the center of our work each day.