No surprise: Privacy is a big deal.
If your organization is storing data about people, privacy should be a big deal to you. This is especially true if you’re storing data about people in the European Union (EU), in light of the General Data Protection Regulation (GDPR), which goes into effect May 25, 2018.
GDPR is designed to strengthen and unify data protection for all individuals within the EU and give control back to citizens over their personal data. Under GDPR, organizations in breach can be fined up to four percent of their annual global turnover (revenue) or €20 million (whichever is greater).
This is the maximum fine, which can be imposed for the most serious infringements, like not having sufficient customer consent to process data.
And it’s per infringement.
So what precisely does it mean to have data that aren’t properly anonymized?
The Look of Data that Can Incur Hefty Fines
Weak anonymization algorithms are one way of violating user privacy.
Remember Where’s Waldo? In the books, Waldo is hidden among a large crowd, and we are invited to pore over the pages, scanning for his trademark red-and-white-striped shirt, bobble hat and glasses. Knowing what to look for makes it slightly easier, although the books introduce red herrings to make Waldo more difficult to spot.
Imagine if an entire Where’s Waldo? illustration contained a mass of people all wearing dull green, surrounded by dull green landmarks, and Waldo in his trademark red.
He’d be really easy to spot.
A GDPR infringement occurs if somebody can determine a person’s identity through data, even if anonymization algorithms are in place. Best intentions don’t matter.
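That "Waldo in a green crowd" failure mode shows up in real datasets as re-identification: stripping names is not anonymization if quasi-identifiers still single a person out. A minimal sketch in Python, using an entirely hypothetical dataset, illustrates how an attacker who knows a few mundane facts about someone can recover a "de-identified" record:

```python
# Hypothetical records with names already removed. ZIP code, birth year
# and gender remain as quasi-identifiers.
records = [
    {"zip": "60601", "birth_year": 1984, "gender": "F", "diagnosis": "flu"},
    {"zip": "60601", "birth_year": 1984, "gender": "M", "diagnosis": "asthma"},
    {"zip": "60602", "birth_year": 1991, "gender": "F", "diagnosis": "flu"},
]

def matches(record, zip_code, birth_year, gender):
    """True if a record matches the quasi-identifiers the attacker knows."""
    return (record["zip"] == zip_code
            and record["birth_year"] == birth_year
            and record["gender"] == gender)

# An attacker who knows a neighbor's ZIP, birth year and gender
# finds exactly one matching record -- the person is re-identified,
# and the "anonymized" diagnosis is theirs.
candidates = [r for r in records if matches(r, "60601", 1984, "F")]
assert len(candidates) == 1
```

This is why weak anonymization can still be an infringement: the test is whether someone *can* be identified, not whether names were removed.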
One of the best methodologies your organization can institute to comply with GDPR is to adopt a Privacy by Design approach to your systems.
Privacy by Design: Where Outcomes, Not Intentions, Matter
Privacy by Design is an approach to systems engineering that takes privacy into account throughout the whole engineering process.
It’s not about data protection per se.
Rather, the system is engineered so that it collects and exposes so little personal data that there is little left to protect.
The root principle is delivering the service without making the client identifiable or recognizable.
Three examples of Privacy by Design include:
- Dynamic Host Configuration Protocol (DHCP). With DHCP, a server maintains a pool of IP addresses and temporarily assigns (“leases”) one to each device. Because the address comes from a shared pool rather than being tied to a person, it doesn’t leak personal identifiers about the person using the device.
- Global Positioning System (GPS). A GPS receiver doesn’t transmit any data; it computes your position locally from signals broadcast by GPS satellites whose positions are known. It can provide your geographic location without leaking your identity or whereabouts to anyone.
- Radio-Frequency Identification (RFID). As it pertains to the Internet of Things (IoT), RFID can act as the bridge between the physical and digital world. The RFID tag is preregistered with the host system to establish identification. Then the tag communicates only by broadcasting its ID.
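The RFID pattern, in particular, can be sketched as an opaque-token design: the tag broadcasts only a random ID, and the mapping from that ID to a real identity lives solely inside the pre-registered host system. The class and names below are illustrative assumptions, not a real RFID API:

```python
import secrets

class HostSystem:
    """Hypothetical host that pre-registers tags and keeps the
    ID-to-identity mapping private."""

    def __init__(self):
        self._registry = {}  # opaque tag ID -> internal record

    def register(self, owner_name):
        # The tag ID is a random token; it carries no personal data.
        tag_id = secrets.token_hex(8)
        self._registry[tag_id] = owner_name
        return tag_id

    def resolve(self, tag_id):
        # Only the host can map a broadcast ID back to an identity.
        return self._registry.get(tag_id)

host = HostSystem()
tag = host.register("Alice")

# Anything sniffing the broadcast sees only the random hex token:
assert "Alice" not in tag
# The host alone resolves it:
assert host.resolve(tag) == "Alice"
```

The privacy property comes from where the lookup table lives: an eavesdropper on the broadcast learns a meaningless token, never the identity behind it.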
Zero-knowledge proof is one way you can implement Privacy by Design. It is a means of establishing proof by using something other than personal identifiers.
For example, a gambling website may use a Facebook sign-in, obtaining proof of age from Facebook instead of collecting the user’s birthdate itself.
In another example, a risqué game in the 1980s might ask questions about baseball players that only an older audience would know. Of course, the questions didn’t prevent a baseball-prodigy youngster from gaining access.
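The Facebook sign-in example can be sketched in a few lines. This is not a cryptographic zero-knowledge proof — it just illustrates the underlying idea, with hypothetical names: the relying site receives a single yes/no attestation, while the birthdate never leaves the identity provider:

```python
from datetime import date

class IdentityProvider:
    """Hypothetical provider (playing the role of Facebook) that holds
    birthdates and releases only a boolean attestation."""

    def __init__(self, birthdates):
        self._birthdates = birthdates  # user -> date, held only here

    def attest_over_18(self, user, today):
        born = self._birthdates[user]
        # Standard age calculation: subtract a year if the birthday
        # hasn't occurred yet this year.
        age = (today.year - born.year
               - ((today.month, today.day) < (born.month, born.day)))
        return age >= 18  # a single bit leaves the provider

provider = IdentityProvider({"alice": date(1990, 6, 1)})
assert provider.attest_over_18("alice", date(2018, 5, 25)) is True
```

The gambling site learns "over 18: yes" and nothing else — no name, no birthdate — which is the Privacy by Design outcome the sign-in flow is after.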
How Privacy by Design is achieved depends on the application, the technologies and the chosen approach. Daugherty can anonymize your data to protect you from penalties under GDPR. We can also analyze it to identify opportunities in the data for your organization to expand. Contact us today.