Innovation Update

Insights without exposure

Understanding ‘differential privacy’ in information security



How can fraud examiners and legal professionals meet compliance standards while keeping individuals’ and organizations’ data private? “Differential privacy” is a mathematical approach to protecting data that proponents claim works far better than traditional sanitizing or anonymizing methods.

Over the past two years as a columnist for Fraud Magazine, I’ve enjoyed introducing new ideas, innovations and technologies to you, my colleagues. That’s why I’m excited to devote this edition of “Innovation Update” to the concept of “differential privacy.”

Differential privacy constrains the algorithms that publish aggregate statistics, so organizations can share insights from private, sensitive data internally or with third parties without exposing the underlying records. The concept isn’t new; mathematicians, cryptographers and academics have been discussing it for more than a decade. However, companies are now commercializing it for global fraud examinations and proactive compliance monitoring.
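The formal guarantee is worth spelling out. A randomized algorithm M is “ε-differentially private” if, for any two databases D and D′ that differ in a single individual’s record, and for any set S of possible outputs, Pr[M(D) ∈ S] ≤ e^ε × Pr[M(D′) ∈ S]. In plain English, the published results look nearly identical whether or not any one person’s data is included, so no single record can be teased back out of them.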

In September 2019, Google released the open-source version of the differential privacy library it uses in some of its products, such as Maps, according to Emil Protalinski’s article, “Google open-sources its differential privacy library,” VentureBeat, Sept. 5, 2019.

“Differential privacy limits the algorithms used to publish aggregate information about a statistical database,” Protalinski writes. “Whether you are a city planner, small business owner or software developer, chances are you want to gain insights from the data of your citizens, customers or users. But you don’t want to lose their trust in the process. Differentially private data analysis enables organizations to learn from the majority of their data without allowing any single individual’s data to be distinguished or re-identified.”
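To make Protalinski’s description concrete, here’s a minimal sketch of the Laplace mechanism, the textbook way to answer a counting query with differential privacy. This isn’t code from Google’s library; the function name, the example data and the epsilon value are illustrative assumptions.

```python
import numpy as np

def private_count(records, predicate, epsilon=0.5):
    """Answer a counting query with the Laplace mechanism.

    A count has sensitivity 1: adding or removing any one person's
    record changes the true answer by at most 1, so Laplace noise
    with scale sensitivity/epsilon satisfies epsilon-differential
    privacy.
    """
    true_count = sum(1 for record in records if predicate(record))
    sensitivity = 1.0
    noise = np.random.laplace(loc=0.0, scale=sensitivity / epsilon)
    return true_count + noise

# Hypothetical fraud-review data: count flagged transactions without
# letting the published number pinpoint any single customer.
transactions = [{"amount": a, "flagged": a > 9000}
                for a in (500, 9500, 12000, 300)]
print(private_count(transactions, lambda t: t["flagged"]))
```

The single tuning knob is epsilon: smaller values add more noise and give a stronger privacy guarantee. Each published statistic also spends part of an overall “privacy budget,” which is why production systems track how many queries an analyst has already run against the data.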


