“Put humans first” warning on data use

Putting mankind first in an age of mass data collection and machine learning is vital, according to a new report from the UK’s Royal Society and British Academy.

Report cover – ‘Data management and use: Governance in the 21st century’.

The report warns that there is an urgent need for a new governing body to oversee all data use and management. Its authors fear that unethical use of private information will lead to public relations disasters that could slam the brakes on the data-driven technology industry, so they recommend an independent authority to monitor data use, with powers to act if needed.

Small print

Cooley’s Mark Deem

An alternative way to achieve greater public awareness of data governance is proposed by technology law specialist Mark Deem of Cooley, the international law practice. Deem notes that most people willingly give away their personal details several times a day. They agree to the terms and conditions attached to a new app, subscription or software upgrade. With the Future Intelligence team, he has demonstrated that the terms and conditions – the ‘small print’ – attached to use of personal information are largely ignored.

In an experiment in London, Deem and the FI team showed people ticking the box to accept ‘Ts and Cs’ that gave away their first-born child to a free public wifi provider.

Mark Deem proposes an innovative solution: a standardised privacy clause.

This simple form of words could be clearly flagged up and internationally recognised. Perhaps, Deem suggests, it could offer different levels of protection such as bronze, silver or gold. Individual products might include variations or additions to the standard formula, but the basic terms and conditions could be quickly read and readily understood.

Researchers cited by the Royal Society report show that our carefree acceptance of terms and conditions that give away personal rights to commercial companies does not extend to the public sphere. Governments collect, retain and use personal data but do not enjoy universal public trust. The Royal Society/British Academy report does not even mention Edward Snowden’s 2013 revelations about secret surveillance of citizens. Instead it cites historical examples, showing how the fires and fallout at the Windscale power station in the UK (later renamed Sellafield), at Three Mile Island in the USA, and at Ukraine’s Chernobyl plant damaged public confidence in atomic power generation.

In the same way, the report’s authors say, nineteenth-century farmers resisted the advent of the railways because the noise would “stop hens laying”. The potential benefits from sharing National Health Service patients’ records were lost in 2015 because of fears that they would not be anonymised and suspicions about plans to share data with insurance companies. And British identity cards were introduced, and then abolished, on two occasions during the twentieth century, showing the enduring tension between citizens’ rights to privacy and governments’ need to keep track of them.
Even in today’s dangerous times, the RS/BA report notes that in an Ipsos GlobalTrends 2016 survey, half of the Britons surveyed found it unacceptable for government to monitor their communications without consent, even in the context of the immediate threat of a terrorist attack; only one third found it acceptable.

“Human flourishing”
The two august bodies describe their goal not as “human primacy” (the way Future Intelligence framed the basis for digital ethics in our 2014 research study) but rather as “flourishing” – the notion that humans must not only survive but also thrive in the information age. They see this as an over-arching principle that covers the other pillars of good governance:

  • protect individual and collective rights and interests
  • ensure that trade-offs affected by data management and data use are made transparently, accountably and inclusively
  • seek out good practices and learn from success and failure
  • enhance existing democratic governance

Infographic by British Academy/Royal Society

Some of the report’s authors express concern about specific areas of data use that are open to abuse. For example, data collected from thousands of individual personal fitness trackers logs the number of steps taken each day, the amount of physical activity and sleep patterns. This could be used to identify conditions such as depression or obesity. Each individual may have given permission for his or her personal data to be logged – but who can permit the wider use for diagnostic or insurance purposes? Roger Taylor, who chairs the Open Public Services Network of the Royal Society for the encouragement of the Arts (RSA), notes in his contribution:

“People want data to be used to target information more effectively. They do not want it used for nuisance marketing. This cannot be effectively policed by control of the usage of data because nuisance marketing is no more than the poor execution of targeted marketing.”

Taylor’s proposed solution: scientific researchers should have access to the widest possible pools of data – though this would have to be squared with the commercial companies who are also trying to use that data to monetise their online businesses.

The gap between the haves and the have-nots is an issue underlined by Professor Sabina Leonelli of the University of Exeter. She concludes that since not everyone is equally involved in data collection, processing and use, there is a risk of exacerbating digital divides.

To reconcile all the tensions and perspectives expressed in the report, the authors propose a new body to monitor data collection and use, with oversight of existing watchdogs such as the General Medical Council, the Information Commissioner’s Office and eleven other national scrutiny systems. One suggestion is that it should report directly to Parliament. The authors also insist that the new body should have an adequate and constant supply of independent funding.