The EU General Data Protection Regulation (GDPR), which becomes effective on 25 May 2018, will have a major impact on the data protection policies of organisations and on how they process personal data. Entertainment and media companies will be among the industries most affected. Companies in this sector increasingly build their business models around collecting personal customer data to tailor their propositions and create a competitive advantage. The main purposes of the GDPR are to raise data protection standards in and outside the EU and to deliver greater legal harmonisation of data protection regulations within the region. This should make it easier for individuals to understand how their data is being used and to raise complaints, even if they are not in the country where their data is located.
The GDPR was adopted in May 2016, after which organisations were given a two-year grace period to become compliant with the new regulations. As of 25 May 2018, the GDPR will be fully enforceable within all EU member states.
With the enforcement of the GDPR, organisations will be obliged to gain and demonstrate insight into the various personal data processing operations within their organisations. The GDPR also requires companies to implement the privacy by design and privacy by default principles, for example by performing Privacy Impact Assessments when introducing or developing new products and systems. Finally, the GDPR introduces substantially higher penalties of up to 4% of an organisation’s global annual turnover or €20 million (whichever is greater).
Across Europe, several studies show that preparations for the GDPR are underway in most organisations, but many are struggling with the implementation of the required processes and procedures. It is clear to most organisations that the GDPR requirements involve much more than improving IT security standards and procedures.
Telco and E&M companies typically have access to a lot of personal data. They are continuously exploring how they can monetise this data and develop new products and platforms to share it, making the most of continuously evolving technologies. In some cases they inadvertently or deliberately cross, or balance on, privacy boundaries. With privacy legislation catching up, we see an increasing number of fines imposed within the European Union, including recent ones for Facebook and Google. With the enhanced enforcement powers of national data protection authorities under the GDPR, we expect the number and level of fines to increase further. In many countries the local data protection authorities have already indicated that after May 2018 they will most certainly make use of their increased power to impose fines.
Business revenue models are increasingly based on developing personalised profiles that describe and predict our (future) behaviours, preferences and sentiments (profiling). To extract value from the trillions of bits of online personal data, large-scale data analysis techniques are commonly used to reveal previously unexplored relationships between structured and unstructured data and to discover connections that could not be detected before.
This form of data processing constitutes ‘profiling’ and relies to a large extent on personal data. Under the GDPR, personal data is defined as information that allows an individual to be identified, either directly or indirectly. Examples include obvious data such as name, address or telephone number, but also data such as IP address, location data, genetic data and behavioural data.
Over the past decades, technology has developed rapidly, allowing organisations to gather personal data at large scale and analyse it for a variety of purposes. Profiling of personal data can help organisations draw conclusions about individuals and act upon them. Such conclusions may for instance relate to targeting individuals for marketing purposes or price differentiation, but they can also be used to exclude individuals as (potential) customers.
Behavioural targeting makes it possible to take informed, targeted decisions. It involves identifying consumers’ browsing habits, such as the web pages they visit, the time they spend there, the links they click, the search queries they perform, and everything they subscribe to, watch, listen to, read, buy, like and share. It is a form of monitoring individuals that allows organisations to anticipate the desires of current customers and use this information for marketing decisions, on the assumption that the needs of existing customers will closely match those of future customers. As such, it increases the effectiveness of future marketing and advertising campaigns. For companies with a web store, behavioural targeting also includes studying the browsing and purchasing habits of previous customers, which helps to predict what customers are likely to buy in the future. When properly used and interpreted, behavioural targeting supports effective advertising by ensuring that only those customers with a real interest in the company’s services and products are targeted.
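The mechanics described above can be sketched in a few lines of code. The following is a deliberately minimal toy model, not a real targeting system: the event types, weights and categories are invented for illustration only.

```python
from collections import Counter

# Hypothetical weights: how strongly each browsing action is assumed
# to signal interest in a content category (illustrative values only).
EVENT_WEIGHTS = {"view": 1, "search": 2, "subscribe": 3, "purchase": 5}

def interest_profile(events):
    """Aggregate weighted browsing events into a per-category
    interest score -- a toy model of behavioural targeting."""
    scores = Counter()
    for category, action in events:
        scores[category] += EVENT_WEIGHTS.get(action, 0)
    return dict(scores)

def top_interest(events):
    """Return the category a marketing campaign would target first."""
    scores = interest_profile(events)
    return max(scores, key=scores.get) if scores else None

events = [
    ("drama", "view"), ("drama", "search"), ("drama", "purchase"),
    ("sports", "view"), ("sports", "view"),
]
print(top_interest(events))  # → drama
```

Even this trivial sketch already constitutes profiling in the GDPR sense: it evaluates a personal aspect (preferences) of an individual by automated means.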
There are risks attached to behavioural targeting, profiling and automated decision-making, especially when used at large scale. In addition to cyber security issues such as data leakage, risks for individuals include discrimination, de-individualisation and stereotyping. Often the right to privacy is invoked to mitigate these risks. The GDPR, in an effort to define and enhance the rights of data subjects to control their personal data, contains many restrictions on automated data processing – and on decisions based upon such processing – to the extent that they can be characterised as profiling.
Given the rapid technological developments in the area of profiling, it has been questioned whether current data protection laws and the right to informational privacy and data protection provide an adequate level of protection and are effective in balancing different interests when it comes to profiling. The GDPR attempts to address those concerns.
The intention of the GDPR is not to make existing business models obsolete. The objective is to design an international framework that companies must respect when developing and executing business models. This implies that the vast majority of existing business models can continue operating in their current form, provided that certain conditions are met.
In the GDPR profiling is defined as:
“any form of automated processing of personal data consisting of using those data to evaluate certain personal aspects relating to a natural person, in particular to analyse or predict aspects concerning that natural person’s performance at work, economic situation, health, personal preferences, interests, reliability, behaviour, location or movements.”
As such, profiling is an automated form of processing of personal data carried out with the purpose of evaluating personal aspects of a natural person.
The “monitoring of an individual’s behaviour” is further explained in the GDPR:
“In order to determine whether a processing activity can be considered to ‘monitor the behaviour’ of data subjects, it should be ascertained whether individuals are tracked on the internet with data processing techniques which consist of profiling an individual, particularly in order to take decisions concerning her or him or for analysing or predicting her or his personal preferences, behaviours and attitudes.”
Based on Article 22 of the GDPR, individuals have the right not to be subject to a decision based solely on automated processing which produces legal effects concerning them or similarly significantly affects them. This is generally interpreted to mean that individuals have the right to object to any such form of processing of personal data. This right to object does not exist if the processing is authorised by law or regulation, is necessary for the performance of a contract with the individual, or is based on the individual’s explicit consent.
The GDPR furthermore requires that an organisation which engages in profiling based on automated processing must implement “suitable measures” to safeguard the rights of the individuals concerned. Individuals must in all cases be specifically informed about any profiling activities. In particular, individuals should be allowed to express their point of view, to obtain further information about the decision that has been reached on the basis of the automated processing, and to contest that decision. This is generally regarded as the right of individuals to request a ‘second opinion’ by way of human intervention. In practice, this means that organisations are obliged to explain on which information their decision has been based, without solely relying on, or referring to, the automated decision-making process.
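One way to picture these safeguards is as a workflow in which every automated decision carries its own explanation and can be replaced by a human review. The example below is a hypothetical sketch, not legal or engineering guidance: the credit-check scenario, threshold and field names are all invented.

```python
from dataclasses import dataclass

@dataclass
class Decision:
    outcome: str       # e.g. "approved" / "rejected"
    automated: bool    # produced solely by automated processing?
    explanation: str   # information the decision was based on

def automated_credit_check(score, threshold=600):
    """Toy automated decision: approve or reject on a score alone."""
    outcome = "approved" if score >= threshold else "rejected"
    return Decision(outcome, automated=True,
                    explanation=f"score {score} vs threshold {threshold}")

def contest(decision, human_outcome, human_explanation):
    """Article 22-style safeguard: the individual contests the
    automated decision and a human review supersedes it."""
    return Decision(human_outcome, automated=False,
                    explanation=human_explanation)

first = automated_credit_check(550)
final = contest(first, "approved", "manual review: stable income history")
print(first.outcome, "->", final.outcome)  # rejected -> approved
```

The point of the sketch is structural: the explanation travels with the decision (so the organisation can show what it was based on), and the human-review path produces a decision that is no longer solely automated.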
The use of sensitive personal data (i.e. data about sexual orientation, race, political beliefs or health) for automated decision-making purposes is generally not allowed under the GDPR, unless based on explicit consent or necessary for reasons of public interest. In those situations, suitable measures to safeguard the individual’s rights, freedoms and legitimate interests must be in place.
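This rule amounts to a gate in front of the processing pipeline. The snippet below is a simplified illustration of that gate; the category names are taken from the examples above, and real lawful grounds under the GDPR are broader and more nuanced than these two flags.

```python
# Special categories mentioned in the text above (non-exhaustive).
SPECIAL_CATEGORIES = {
    "health", "race", "political_beliefs", "sexual_orientation",
}

def may_use_for_automated_decision(field, explicit_consent=False,
                                   public_interest=False):
    """Toy gate: special-category data may feed automated decisions
    only with explicit consent or a public-interest ground
    (simplified; the GDPR's actual conditions are more nuanced)."""
    if field not in SPECIAL_CATEGORIES:
        return True
    return explicit_consent or public_interest

print(may_use_for_automated_decision("postcode"))                       # True
print(may_use_for_automated_decision("health"))                         # False
print(may_use_for_automated_decision("health", explicit_consent=True))  # True
```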
Most people agree that the updated privacy legislation in the GDPR is needed. Rapid technological developments, however, allow organisations to use and reuse data sets at an increasingly rapid pace and at a larger scale. As a result of the continuous use and reuse of data, their characteristics, structure and value may change, and it will often no longer be possible to trace them back to the source. Individuals (the data subjects) and owners of data sets may easily lose track of the use and whereabouts of their personal data. This makes it much more difficult, and in some cases practically impossible, to enforce the rights and comply with the obligations of the GDPR.
The GDPR may thus already be partly obsolete before it becomes enforceable, failing to provide individuals with the desired increased protection of their privacy under all circumstances. It shows that the speed of technological development far exceeds that of any legislative process.