Privacy Atrophy

Rebecca // August 4

Great privacy foundations may not be enough to ensure lasting privacy protections

Privacy by design is valuable because it integrates privacy into a system's foundations rather than tacking it on at the end of the design process. However, this week I'll address one reason why great privacy foundations may not be enough to ensure lasting privacy protections in software systems.

What causes Privacy Atrophy?

Even a system designed with privacy from the get-go can develop privacy vulnerabilities later. Privacy flaws can arise over time in released products even when the product design hasn't changed, because the privacy landscape around the product has. I call this "privacy atrophy," since it is a gradual degradation of privacy protections. Privacy atrophy in released software systems has three main causes.

The social or cultural context can change. For example, there is now greater awareness of the sensitivity of fields such as "sex" and "gender." These are no longer seen as static over time, and an official "sex" data point on a person might not match that person's own self-identification.

New privacy vulnerabilities are being discovered, both by privacy researchers and, unfortunately, by malicious hackers. For example, our definition of "identifiable" has changed as creative attacks have re-identified data sets that were considered "anonymized." Removing strong identifiers may have been acceptable when releasing datasets a decade ago; it certainly doesn't fly today.
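The classic linkage attack behind many of these re-identifications can be sketched in a few lines: records stripped of names still carry quasi-identifiers (such as ZIP code, birthdate, and sex) that can be joined against a public dataset. All records, field names, and values below are invented for illustration.

```python
# Sketch of a linkage (re-identification) attack on a "de-identified" dataset.
# All data here is invented for illustration.

# "Anonymized" records: names removed, but quasi-identifiers remain.
medical = [
    {"zip": "02138", "birthdate": "1945-07-21", "sex": "F", "diagnosis": "flu"},
    {"zip": "02139", "birthdate": "1962-03-02", "sex": "M", "diagnosis": "asthma"},
]

# A public dataset (e.g. a voter roll) with names AND the same quasi-identifiers.
voters = [
    {"name": "Jane Doe", "zip": "02138", "birthdate": "1945-07-21", "sex": "F"},
    {"name": "John Roe", "zip": "02139", "birthdate": "1962-03-02", "sex": "M"},
]

QUASI_IDS = ("zip", "birthdate", "sex")

def reidentify(anon_rows, public_rows):
    """Join the two datasets on quasi-identifiers, re-attaching names."""
    index = {tuple(v[q] for q in QUASI_IDS): v["name"] for v in public_rows}
    matches = []
    for row in anon_rows:
        key = tuple(row[q] for q in QUASI_IDS)
        if key in index:
            matches.append({"name": index[key], **row})
    return matches

for m in reidentify(medical, voters):
    print(m["name"], "->", m["diagnosis"])  # each "anonymous" record gets a name back
```

No field in the medical dataset is identifying on its own; it is the combination of quasi-identifiers, plus an auxiliary dataset that didn't exist (or wasn't considered) at release time, that breaks the anonymization.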

The technology around your product changes. Fundamental changes to the OS, browser, or integrations can undermine the original privacy design. For example, Chrome has announced changes to how it handles cookies, and websites will have to update their cookie mechanisms.
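As a hedged sketch of what such an update can look like, assume Chrome's SameSite default change: cookies set without a SameSite attribute are treated as SameSite=Lax, so cookies that must work in a cross-site context have to declare SameSite=None and Secure explicitly. The cookie name and the helper function below are hypothetical, not from any particular codebase.

```python
# Sketch: checking whether a Set-Cookie header relies on the OLD browser
# default and would break for cross-site use under Chrome's SameSite change.
# Cookie values and the helper name are invented for illustration.

def needs_update_for_cross_site(set_cookie_header: str) -> bool:
    """True if a cross-site cookie would be restricted under the new default."""
    attrs = [part.strip().lower() for part in set_cookie_header.split(";")]
    has_samesite_none = "samesite=none" in attrs
    has_secure = "secure" in attrs
    # Cross-site cookies now need BOTH SameSite=None and Secure.
    return not (has_samesite_none and has_secure)

old = "sessionid=abc123; Path=/"                         # relied on the old default
new = "sessionid=abc123; Path=/; SameSite=None; Secure"  # explicit for cross-site use

print(needs_update_for_cross_site(old))  # True
print(needs_update_for_cross_site(new))  # False
```

Nothing in the website's own design changed here; a browser policy shift alone is what forces the privacy-relevant update, which is exactly the atrophy pattern described above.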

Does privacy atrophy affect your system? It depends on how old your production system is and how sensitive your data is. Even well-designed systems need to be reviewed over time. If you built your system with privacy by design, remediation will certainly be easier. If you build your systems with privacy remediation by design, you are ahead of the game.

About the Author Rebecca

Dr. Rebecca Balebako builds data protection and trust into software products. As a certified privacy professional (CIPP/E, CIPT, Fellow of Information Privacy), ex-Googler, and ex-RANDite, she has helped multiple organizations improve their responsible AI and ML programs.
