Forgetting personal data and revoking consent under the GDPR: Challenges and proposed solutions by Eugenia Politou, Efthimios Alepis, Constantinos Patsakis (Oxford Academic, Journal of Cybersecurity, Volume 4, Issue 1, 1 January 2018)
The notion of consent revocation, or withdrawal, has also been brought to light recently, with many arguing for a right to revoke consent and for a more user-friendly and personalized consent mechanism [71–72]. Indeed, when individuals are given the opportunity to grant consent to the use of their personal information as a primary means of exercising their autonomy and protecting their privacy, it is only logical that a corresponding option should exist to withdraw or revoke that consent, or to make subsequent changes to it [73, 18]. The principle of consent withdrawal within the Human Computer Interaction (HCI) context has been studied in many ethical research projects, with Benford et al. [74] underlining that in many cases it may be difficult to fully withdraw in practice, because balancing consent, withdrawal and privacy is a very demanding management task. Whitley [18] argues further that, since the revocation of consent can mean a variety of different things depending on the circumstances and the purposes for which the data are being held, it is helpful to differentiate between revoking “the right to hold” personal data and revoking “the right to use” personal data for particular purposes. Revoking the right to hold might be implemented by marking a particular record as no longer “being live”, or it may require the deletion of records and, in extreme cases, deleting data from backups and physically grinding the hard disks. In addition, providing auditable, privacy-friendly proof of when and how the revocation has been achieved is a challenge both technologically and legally [18]. For instance, the advancement towards privacy-enabled networks and infrastructures puzzles some academics [75], who fear that the very mechanisms put in place to protect the privacy of data (such as de-identification) may make it very difficult to trace and remove data derived from individuals, and thus to allow participants to withdraw their consent completely and be forgotten. In such situations, as Kaye [75] underscores, it may only be possible to prohibit the entry of new information and samples into the system. Apart from these practical difficulties, there are also economic and public-good arguments against allowing absolute withdrawal. For instance, in the bio-banking field, complete withdrawal could lead to the wastage of resources invested in bio-repositories [75–76], whereas the practice of archiving qualitative research data for substantive secondary analysis can be significantly challenged by a revocation mechanism for withdrawing consent [77]. Due to these far-reaching consequences, many academics and legal experts question the concept of consent withdrawal.
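To make Whitley’s distinction between revoking “the right to hold” and “the right to use” a bit more concrete, here is a minimal, hypothetical sketch of a consent registry that implements both, plus an audit log as a stand-in for the “auditable proof of compliance” the paper mentions. None of this code comes from the paper; the class and method names (`ConsentRegistry`, `revoke`, `may_use`, etc.) are my own illustrative assumptions, and real deletion (including backups) would have to happen downstream of a marker like this.

```python
# Hypothetical sketch of Whitley's two revocation types; names are illustrative only.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from enum import Enum


class RevocationType(Enum):
    RIGHT_TO_USE = "right_to_use"    # data kept, but excluded from a given purpose
    RIGHT_TO_HOLD = "right_to_hold"  # record marked "no longer live"; deletion follows downstream


@dataclass
class ConsentRecord:
    subject_id: str
    purposes: set[str]   # purposes the data subject has consented to
    live: bool = True    # False once the right to hold has been revoked


@dataclass
class ConsentRegistry:
    records: dict[str, ConsentRecord] = field(default_factory=dict)
    audit_log: list[dict] = field(default_factory=list)  # rough stand-in for auditable proof of compliance

    def revoke(self, subject_id: str, kind: RevocationType, purpose: str | None = None) -> None:
        record = self.records[subject_id]
        if kind is RevocationType.RIGHT_TO_USE:
            # Revoking the right to use: keep the data, drop the specific purpose.
            record.purposes.discard(purpose)
        else:
            # Revoking the right to hold: mark the record as no longer "being live".
            record.live = False
            record.purposes.clear()
        self.audit_log.append({
            "subject": subject_id,
            "action": kind.value,
            "purpose": purpose,
            "timestamp": datetime.now(timezone.utc).isoformat(),
        })

    def may_use(self, subject_id: str, purpose: str) -> bool:
        record = self.records.get(subject_id)
        return bool(record and record.live and purpose in record.purposes)
```

Even this toy version shows where the hard parts sit: the registry can stop new uses immediately, but tracing and erasing already de-identified or backed-up copies, which is exactly the difficulty Kaye raises, is outside anything a marker flag can do.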

This morning, I ventured out to Rojo’s Roastery for a cappuccino. There wasn’t much activity on Palmer Square.

I am enjoying experimenting with Fuji Film Simulation recipes for my Fuji X-T2. This one, from Ritchie Roesch, is a simulation of Ilford HP5 Plus. This image is straight out of camera (SOOC).

It's Time for Action on Privacy, Says Apple's CEO Tim Cook by Tim Cook (Time)
One of the biggest challenges in protecting privacy is that many of the violations are invisible. For example, you might have bought a product from an online retailer—something most of us have done. But what the retailer doesn’t tell you is that it then turned around and sold or transferred information about your purchase to a “data broker”—a company that exists purely to collect your information, package it and sell it to yet another buyer.

The trail disappears before you even know there is a trail. Right now, all of these secondary markets for your information exist in a shadow economy that’s largely unchecked—out of sight of consumers, regulators and lawmakers.

Let’s be clear: you never signed up for that. We think every user should have the chance to say, “Wait a minute. That’s my information that you’re selling, and I didn’t consent.”
Image from Flickr.

Woot! Best thing I’ve read all month! I fully support this. The EU GDPR document was not a riveting read, but it is a punch to the groin for data aggregators and brokers. And it put in place fines of up to 4% of global annual revenue for violations. We need similar legislation here.