
EU legislation for data protection compliance - ensures that data stored about users in the EU is collected and kept in a compliant manner. Huge potential cost for breaching the regulation (up to 4% of global turnover). Its purpose is to harmonise the EU's laws on data sovereignty.

Requirements are covered in seven directives, including the right to be forgotten.

In the UK the DPA (Data Protection Act) will be expanded to ensure compliance with the GDPR. Other nations have their own local equivalents of the DPA, which should likewise be updated.

The GDPR requires that explicit consent be granted. For example, Amazon can only use data for the purpose of selling things to the user, not to sell the data on to aggregators (without explicit consent to do so).

How does anonymised data work under the GDPR? At what point does data stop being PII (personally identifiable information), so that such consent no longer needs to be granted? How does this affect companies which use aggregate data to draw conclusions about populations, e.g. Cambridge Analytica? The consent deal is struck between the SP (service provider) and the customer (end user); so long as the customer consents, the SP is able to resell to whomever it chooses. That third party has no further obligations to the origin of the data.
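As a concrete illustration of the aggregation question, here is a minimal Python sketch of one common approach: publish only population-level counts and suppress any group smaller than a threshold, since small groups can re-identify the individuals inside them. The record fields and the k=5 threshold are illustrative assumptions, not anything the GDPR itself specifies.

```python
from collections import Counter

K_THRESHOLD = 5  # minimum group size before a bucket may be reported

def aggregate_by_region(purchases):
    """purchases: iterable of (user_id, region, product) tuples.
    The user_id is discarded; only group-level counts survive."""
    counts = Counter((region, product) for _, region, product in purchases)
    # Drop buckets too small to publish: a tiny bucket can single out
    # the individuals inside it, pulling the data back towards PII.
    return {bucket: n for bucket, n in counts.items() if n >= K_THRESHOLD}

purchases = [
    ("u1", "UK", "book"), ("u2", "UK", "book"), ("u3", "UK", "book"),
    ("u4", "UK", "book"), ("u5", "UK", "book"), ("u6", "DE", "lamp"),
]
print(aggregate_by_region(purchases))  # {('UK', 'book'): 5}
```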

EULAs/T&Cs (end user license agreements / terms and conditions) are not enough. Consent can be given before, during (in-line) or after collection. For example, if a company collected information on an individual years ago, and now acquires a new company which can utilise that data in a new way, additional consent should be sought from the user to ensure compliance with the GDPR.

The GDPR says to respect the user and the relationship with them. Consent becomes the basis of the trust relationship between the customer and the SP.

The GDPR specifies what behaviour is required, but not how it should be achieved - it remains technology agnostic. However, some level of technology-specific language may be appropriate - consider the mention of pre-checked checkboxes in the EU Commerce Directive.

For a lot of companies, what a specific user purchases is no longer directly important; it is the trend analysis of that data that matters. AIs are being trained on trading information to produce predictions of future purchases, and this Big Brother-style surveillance carries a 'creepy factor'. It gives consumers the willies, and the GDPR is attempting to mitigate those concerns.

Some companies have to supply information they have gathered in order to operate - consider eBay's APIs, which allow individuals to look at listing prices, etc.

GDPR has escape clauses for the police / government agencies.

Consider: is the access log of a webserver owned by the server's operator, or by the individuals who connected to the server and on whose behalf each line was written? Is an IP address PII? Are MAC addresses PII?
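One way an operator might hedge the IP-as-PII question is to pseudonymise addresses before the log is stored. Below is a minimal Python sketch using a keyed hash; the key handling and log format are illustrative assumptions, and whether this is legally sufficient is exactly the open question above.

```python
import hmac
import hashlib

SECRET_KEY = b"rotate-me-regularly"  # held separately from the logs

def pseudonymise_ip(ip: str) -> str:
    """Replace a raw IP with a keyed hash; without the key, the
    stored value cannot be reversed back to the address."""
    return hmac.new(SECRET_KEY, ip.encode(), hashlib.sha256).hexdigest()[:16]

line = '203.0.113.7 - - [10/Oct/2017:13:55:36 +0000] "GET / HTTP/1.1" 200'
ip, rest = line.split(" ", 1)
print(pseudonymise_ip(ip), rest)
```

Keeping the key lets the operator correlate repeat visitors for abuse detection while destroying the key later severs the link to the individual.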

How does the GDPR apply to ID best practices? Reduce the amount of data gathered by the SP to the minimum set necessary to perform the action being advertised, and use the ID platform as a single view across the userbase. This is especially important in large corporations formed through acquisitions, which must have a clear view of their userbase across many possibly discrete systems. This view needs to be managed, and ForgeRock supplies technology to help with this situation.

Note the difference between primary and secondary data retainers. For example, the BBC may store historical data on a user's browsing habits, hashed and kept anonymised. However, the data used in the production system is available for the user to trim (e.g. remove previously watched episodes) if they wish. The important part is to reduce the 'creepy factor'.
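A minimal Python sketch of that primary/secondary split, assuming a salted hash is acceptable anonymisation for the analytics copy (an assumption, not a GDPR ruling); the class and field names are illustrative:

```python
import hashlib

ANALYTICS_SALT = b"per-deployment-salt"

class ViewingHistory:
    def __init__(self):
        self.production = {}   # user_id -> list of episodes (user-trimmable)
        self.analytics = []    # (hashed_user, episode) rows, anonymised

    def record(self, user_id: str, episode: str):
        # Primary store keeps the raw, user-visible history.
        self.production.setdefault(user_id, []).append(episode)
        # Secondary store only ever sees a salted hash of the identifier.
        hashed = hashlib.sha256(ANALYTICS_SALT + user_id.encode()).hexdigest()
        self.analytics.append((hashed, episode))

    def trim(self, user_id: str, episode: str):
        # The user can remove items from the production view...
        self.production[user_id].remove(episode)
        # ...while the anonymised analytics rows are retained for trends.
```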

What is the expectation of the GDPR with respect to unstructured data? It depends what you're doing with it - aggregating is usually fine if it's anonymised. However, it still can't be used to message individuals with sales rhetoric, etc., unless those specific individuals have previously given consent to such action.

Don't just consider the potential negatives of the GDPR; look at the positives too - giving users this degree of consent gives them genuine control, not just an excuse to say "I don't want to give you this data" without reasoning.

Questions remain about what form consent will take - a process? A consent form? A pop-up?

Questions also remain about how enforcement will occur: will it be PCI DSS audit style, or a DPS "when you mess up we come get you" approach?

Will we end up moving towards a standard model of conditions, presented from the user's point of view? For example, "I [the user] am happy to give you [the SP] X, but not Y". Does this imply a necessary move towards more granular identity attribute sharing - for example, accessing a site that requires age verification by simply sending an attribute from your profile, signed by the govt., indicating you're over 18?
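A minimal sketch of what such granular attribute sharing could look like, assuming the third-party `cryptography` package and an entirely illustrative claim format; a real scheme would also need freshness, subject binding and revocation:

```python
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

issuer_key = Ed25519PrivateKey.generate()   # held by the government issuer
issuer_public = issuer_key.public_key()     # published for verifiers

# The issuer signs a single attribute, not the whole profile.
claim = b"subject=alice;attr=over_18;value=true"
signature = issuer_key.sign(claim)          # handed to the user once

def site_accepts(claim: bytes, signature: bytes) -> bool:
    """The age-gated site checks the issuer's signature, learning only
    the over_18 attribute - never the user's date of birth."""
    try:
        issuer_public.verify(signature, claim)
        return claim.endswith(b"value=true")
    except InvalidSignature:
        return False

print(site_accepts(claim, signature))  # True
```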

In the old world, things used to be forgotten. That no longer happens in the new digital world.
