But what does the term “at-risk” mean? At Jigsaw, we consider “at-risk” situations to be those in which bad actors or circumstances generate exceptional risk for people. Some people face at-risk situations because of who they are (a minority), what they do (an activist or journalist), where they live (a conflict zone or abusive household), or what they share (someone who promotes a particular ideology on social media). Over the course of our digital lives, many of us, including many of you reading this post, will face at-risk situations at some point, if we don’t already.

Designing for people in at-risk situations is hard. While we try our best to address the needs and preferences of the vast majority of users, none of us has the capacity to personally witness the full diversity of the world’s situations, so there will always be contexts that designers can’t foresee. For example, unless you’ve had experience on the ground, it can be difficult to imagine the experience of someone using your product in a conflict zone. And the default setting that works for 99% of people may not be the preferred one for the remaining 1%, so it can be impossible to know when research and design are “done.” These intrinsic limitations make it harder to serve users in at-risk situations.

Designing for the at-risk user

While the typical company designs for the vast majority of users, at Jigsaw we work to understand the needs of people in at-risk situations and build tools that address some of the specific threats they face. For example, we have worked to provide access to the open internet, protect against DDoS attacks, and help detect phishing attacks. We meet with activists, journalists, and other frequently targeted groups, and we collaborate with partners at Google to develop frameworks for understanding what at-risk users need and how designers can accommodate those needs.

When designing for at-risk situations, the first step is to map out the goals, stress levels, and expectations of users throughout their at-risk journey. Knowing the stages of the user journey can inform the design of a piece of technology.

Prevention

We often call the first stage of the user journey Prevention: what people do in early moments to minimize their risk down the road. For example, we have met people from high-crime parts of Mexico who worry that sharing their location can increase their risk of being victimized. Technology can help in this context by making it easy to configure settings that disable any tracking or sharing of their location.

For designers, anything that could affect privacy and security, including settings, app permission prompts, and terms of service, is an opportunity to help people navigate risky, real-world situations. Most people aren’t particularly stressed by the choices they make at this stage of the journey; they just want to understand the possible consequences of their decisions, so there’s often time to carefully weigh tradeoffs.