Respect Diversity and Privacy with Data Minimization

Rebecca // June 11

Do your company’s DEI initiatives include building a diverse customer base? Do you want to create a community where more people feel welcome and included? Do you want your product features to enable equitable experiences for your user base? Are you a bit lost on how to start? Today I’ll explain how privacy features help your community welcome diverse members.

Your community may be your customers or your audience. You want to include people in your community because of relevant interests or priorities, and you don’t want to exclude people for irrelevant characteristics or traits. One way to set up a community where people feel welcome is to give people control over the data they share with you.

I’ll give an example. When I was a programmer, I contributed to open-source repositories. I was aware of how few women there were in that community. I didn’t want my gendered name to be a factor in code feedback, so I frequently used a nickname. At the time, I didn’t want to get extra attention, positive or negative. I wanted community members to focus on my code, not on my membership in a minority group. It does not matter whether I was right or wrong about gender bias in the open-source community. My individual experiences being a woman in tech impacted my behavior and decisions.

Luckily, I had the freedom to use a pseudonym instead of my full name, and I was more comfortable joining in and contributing to the community. I had control over the information I shared.

People want to choose what to share

People want to control what is known about them. They may want control over whether others know their gender, religion, weight, marital status, and more. It can be difficult to know what information people will consider private. However, people are typically sensitive about information that makes them vulnerable to harm. People who have lived experiences with discrimination, racism, or unfair bias may be particularly careful with such sensitive information.

One person might consider a question about age or ethnicity to be personal. Another might want to share that information with pride. You likely can’t know in advance what any particular person considers private. Furthermore, if people from a particular group are not represented in your company, you may have a blind spot about what information they consider private.

Privacy laws define some types of sensitive information (such as religion and racial or ethnic origin), and they vary by jurisdiction. They are one way to inform yourself about what people consider sensitive, but they do not describe everything everyone in your community will consider private. You can go beyond the legal minimum by building features that give control back to the individual.

Data minimization to the rescue

One way to give people control is to practice data minimization. Don’t force people to share information about their minority status if they don’t want to. When you require too much information to join your community, you take away the individual’s control. Only request data that is relevant and necessary for your community.
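To make this concrete, here is a minimal sketch (in TypeScript, with hypothetical field names) of what a data-minimized signup model can look like: only the fields needed to create an account are required, and demographic details are simply not requested at this stage.

```typescript
// Hypothetical signup model for a small community site.
// Only the fields needed to create an account are collected;
// demographic details are deliberately not requested here.
interface SignupRequest {
  displayName: string; // whatever name the member chooses, real name or pseudonym
  email: string;       // needed to confirm the account and send updates
  // Note what is absent: gender, age, ethnicity, marital status.
}

function validateSignup(req: SignupRequest): string[] {
  const errors: string[] = [];
  if (req.displayName.trim().length === 0) {
    errors.push("Please choose a display name (a pseudonym is fine).");
  }
  if (!/^[^@\s]+@[^@\s]+\.[^@\s]+$/.test(req.email)) {
    errors.push("Please provide a valid email address.");
  }
  return errors;
}
```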

If your community has different tiers, you may also have different tiers of data collection. For example, a financial advisory company may not need to know whether someone is married or divorced when they sign up for a simple newsletter, but marital status may become relevant if that person signs up for specific coaching on retirement. Ask for that data only when it is relevant to the services provided, and not before.
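As a sketch of how tiered collection might work (again TypeScript, with hypothetical tier and field names), each service tier can declare the fields it genuinely needs, so sensitive questions are asked only when someone signs up for the service that requires them.

```typescript
// Hypothetical tiered data collection: each tier declares the fields it
// actually needs, and the form requests only those fields, only when the
// member signs up for that tier.
type Field = "email" | "displayName" | "maritalStatus" | "targetRetirementAge";

const requiredFieldsByTier: Record<string, Field[]> = {
  newsletter: ["email"], // a simple newsletter needs nothing more
  retirementCoaching: ["email", "displayName", "maritalStatus", "targetRetirementAge"],
};

function fieldsToRequest(tier: string, alreadyCollected: Set<Field>): Field[] {
  // Ask only for the fields this tier needs and that we do not already hold.
  return (requiredFieldsByTier[tier] ?? []).filter((f) => !alreadyCollected.has(f));
}

// Example: a newsletter subscriber who later signs up for retirement coaching
// is asked about marital status only at that point, not at first contact.
console.log(fieldsToRequest("retirementCoaching", new Set<Field>(["email"])));
// -> ["displayName", "maritalStatus", "targetRetirementAge"]
```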

There are additional benefits of data minimization. First, it helps you comply with privacy laws and regulations by collecting only what you need and what your community agrees to share. Second, it reduces your risk if a data breach occurs and your customer data is exposed. The less sensitive information you hold about your community, the less mess you have to clean up. According to the US National Cyber Security Alliance, 60% of small businesses that face a data breach fail within half a year. Give your company a chance to be a survivor by storing less risky data.

Summary: Ask only for what you need, when you need it

Don’t leave people outside your community because you required them to reveal something they didn’t want to share. Minimize the data you require.

Bonus section: Some hard decisions

This article’s goal is to help small businesses begin to think about how diversity efforts and privacy features dovetail to provide mutual benefits. I haven’t addressed all the tough questions that can arise for data-driven online B2C companies when they think about data minimization, so I describe them briefly here so you can plan for those discussions if needed.

Preventing Abuse: There can be a trade-off between collecting information to prevent abuse and letting users share minimal data. For example, some community owners believe that internet nicknames encourage bad behavior, while people who post under their full names are likely to act as if their mother is watching. Whether that holds depends on the context of your community and the other controls you have in place. Privacy engineers and usable-security experts can help you balance the user features and engineering decisions that protect people.

Measuring Diversity: Measuring diversity means collecting data about it. If you search online for “privacy and diversity” today, most results describe the two as a conflict. Many companies with DEI programs want to track minority status to see whether they are meeting hiring and retention goals, but that means collecting and storing this sensitive data. A similar tension arises when measuring fairness in Machine Learning and Artificial Intelligence: you don’t want to build models on biased datasets, but how do you know whether your dataset is unfair to a minority group without that sensitive data? Once again, privacy engineers (consultants or full-time employees) can help you weigh the options that work for your company.

About the Author

Dr. Rebecca Balebako builds data protection and trust into software products. As a certified privacy professional (CIPP/E, CIPT, Fellow of Information Privacy), ex-Googler, and ex-RANDite, she has helped multiple organizations improve their responsible AI and ML programs.

