The Increasing Complexity of Digital Data and the Identity of Images

Monday 21 September, 2020

GDPR - General Data Protection Regulation - was like a rather bad day at the office, and most people don’t want to be reminded of that particularly intense and challenging period in their data protection story. But who can afford to take their eye off the ball as digital multimedia carries information to innumerable locations at the touch of a button?

Data that is clearly data, as in the printed word, is easy to identify and link to a particular person (or not), so the rules governing it are relatively simple to understand. Images, however, are not always so easy to corral or interpret.

The importance of images

The internet has brought an explosion in the use of images: technology makes it easy to capture a picture, and digital media makes it easy to publish one. Images online have become more important than the text they enhance; readers scanning a screen will take in the image before the words and may not bother to read some, or any, of the text at all.

Images and GDPR

The rule of thumb with GDPR is that photographs are not considered ‘sensitive personal data’ if they are not used for the purpose of identifying an individual or processing personal data. There is some ambiguity here, however, and each picture has to be interpreted and evaluated on its merits. Under Article 9 (Processing of Special Categories of Personal Data), the GDPR does cover photography that is not exclusively private and from which an individual may be recognised.

What is pseudo photography?

Pseudo-photography is just one example of where technology has raced miles ahead of the legislation, which is struggling to catch up.

A pseudo-photograph is a new technological medium: an image which, to all intents and purposes, appears to be a photograph but which has been artificially created. It could contain the essence of a real photo of a person, event or landscape that has been digitally manipulated or enhanced to some degree, or it could be created entirely by software using artificial intelligence. The intent is usually to create something that looks like a real photograph, so that the person, event or scene captured in the shot is presented as genuine to those who view it.

Depending on the circumstances of the case, pseudo-photography could constitute a misuse of personal data under the Data Protection Act. Use of a pseudo-photograph could also be deemed a breach of an individual’s human rights under the European Convention on Human Rights (ECHR) or similar legislation applicable in the broader international community. But if the ‘person’ in the image isn’t real, but a construct of multiple different people’s faces combined to create a convincing picture of someone who doesn’t actually exist, does the pseudo-photograph as a medium fall foul of data protection laws at all?

There is a suggestion that pseudo-photography may be created to deceive, hence the new regulations in France. From September 2017, it became illegal in France to digitally enhance an image without stating on the picture that it had been airbrushed. “Photographie retouchée” is now mandatory in those situations, so that people realise the image they are looking at is not entirely real, though it is probably still real enough to identify the individual model. The aim is to stop presenting women with physically unobtainable and unrealistic bodies and to make some dent in the eating-disorder statistics: France has one of the highest rates of anorexia in the world. Interestingly, the new rule in France, a country obsessed with thinness, does not cover dyed hair or blemishes that may also have been removed by editing software. Are they next? When is an image so altered that it is no longer representative of the individual, and its use no longer a potential breach of their rights under GDPR?

The implications for human identity

Artificial intelligence is on the move and is now able to create completely realistic images of people who simply don’t exist. This is pseudo-photography: the image of a person that is totally computer-generated. The pictures are produced using a Generative Adversarial Network, or GAN, a system made up of two competing networks, a generator and a discriminator. This AI has previously been used successfully to create 3D images of new developments, so you can see how a new building will look in the landscape it is planned for and even take a walk around its interior.

Crucially, this technology begins the process by mapping a real face, then throws another one into the mix before finishing with a random sprinkling of human detail such as spots, skin pores, freckles and ethnic features. Although the final result is wholly fictional, the courts may need to decide whether using original faces or features, even only slightly, is wrong per se if not attributed. It is a moot point whether that ‘person’ is fictional if parts of their identity have been borrowed from people who do exist.
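For readers curious about the mechanics, the generator-versus-discriminator idea can be sketched in a deliberately simplified way. The toy functions below are invented purely for illustration (a real GAN uses two neural networks trained against each other by gradient descent); they show only the shape of the adversarial loop, in which the generator proposes fakes and the discriminator judges whether each one looks "real":

```python
import random

# Illustrative sketch only: a real GAN replaces these toy functions with
# neural networks whose parameters are updated so that the generator gets
# better at fooling the discriminator, and the discriminator gets better
# at spotting fakes.

def generator(noise):
    # Hypothetical generator: turns random noise into a candidate "fake" sample.
    return noise * 2.0

def discriminator(sample, threshold=1.0):
    # Hypothetical discriminator: returns True if it judges the sample "real".
    return sample > threshold

def adversarial_round(trials=1000, seed=42):
    """Count how often the generator's fakes fool the discriminator."""
    rng = random.Random(seed)
    fooled = 0
    for _ in range(trials):
        fake = generator(rng.random())   # generator proposes a sample
        if discriminator(fake):          # discriminator judges it
            fooled += 1                  # the fake passed as "real"
    return fooled / trials

print(f"Fraction of fakes accepted as real: {adversarial_round():.2f}")
```

In a genuine system this judging step feeds back into training on both sides, which is why GAN outputs become progressively harder to distinguish from photographs of real people.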

The unintended consequences of online publication

The way information and images are shared is another consequence of the digital age, and it turns the concept of ownership on its head. The legal framework was originally devised and written around the idea of property as something tangible and defined, with a material presence, and of ownership as usually relying on physical possession. The manipulation and transfer of data is something the law could never have envisaged or anticipated in the pre-internet age. Data protection law has caught up with the status of personal data as property, but cyberspace has propagated information like never before.

Information can remain ostensibly within our ambit but still be spread around the digital globe at the touch of a button and with incredible momentum. Where it ends up and who uses it seems not to be within our grasp. The old concepts of control and ownership have been exploded and even the idea of a simple image is now open to question.

Here at the Civil Service College, we also offer expert-led training on Information Rights - GDPR, FOI & Data Protection in partnership with Naomi Korn Associates, which may also be of interest.

