Tech companies have grown up – we must treat them like adults
By Jennifer Baker
Europe’s data protection regime is widely regarded as the best in the world. The man charged with advising the EU’s lawmakers on privacy, the European Data Protection Supervisor, Giovanni Buttarelli, spoke to Jennifer Baker for the Good Technology Collective.
What is the biggest threat to our digital rights at the moment?
The biggest threat is that people, human beings, get reduced to the status of online robots working unconsciously for unaccountable companies and the state surveillance apparatus. Devices, web-based services and apps have been deliberately designed to inculcate addiction in their users, in order to maximize the amount of attention given and personal information disclosed.
Most people are not aware of what is happening when they are connected, which is now almost constant. We know about toys that listen to children, personal assistants which record activities in the most intimate places, smartphones which monitor movements even when they are switched off.
The dominant business model for web-based services and IoT devices is to monetize attention and data in exchange for providing “free services.” Look at statistics from the OECD and recent articles in the Economist and FT — there has been an extraordinary concentration of market and informational power into just a few hands. The result is that the five biggest companies in terms of market capitalization are all tech companies. And their size dwarfs even the size of the big oil companies whose place they have taken.
Their power over our lives is remarkable. Tech platforms are the gateway and highway for social, cultural and commercial interactions. In some countries they even seek to be the gateway to the Internet itself.
This was not the vision of the founders of the world wide web.
I know this because I have spoken in person to people like Tim Berners-Lee. There are similar trends now in the field of AI and machine learning. Now we have the ‘founders’ of machine learning, like Yoshua Bengio, saying that centralization of talent and algorithmic power is “dangerous for democracy.”
EU commentators can be too focused on the EU and their “frenemies” in the United States. We have to engage with what is happening in China, with the Social Credit “trust” system as a tool for potential social coercion. We have to engage with India, where the Aadhaar biometric identification system has become effectively compulsory for a billion people who need to participate in civic and commercial life, but where the Supreme Court last month also affirmed data privacy to be a fundamental right.
We have to engage with the 121 states which have adopted or are going to adopt data privacy laws, to a large extent based on or inspired by Europe’s approach in the Council of Europe Convention 108 and the EU data protection framework.
European countries are now a minority among those with data protection laws. But if you look at a map of the world, the countries lacking data protection tend to be poorer. We have to engage with these countries. We cannot allow privacy to be the privilege of those who can afford it.
This is not to be alarmist. But these trends have much wider consequences than simply data protection or even privacy.
Where do you see the biggest challenges in the next 20-30 years?
Try to remember life back in 1987 or 1997 and our predictions of the future. Blade Runner in 1982 predicted urban life in 2019 consisting of flying cars and bioengineered “replicant” humans.
We could not have predicted that everyone would be carrying in their pockets a device as powerful as mainframe computers in the early 1980s.
The speed of change has been intoxicating, but there is always ‘broken glass’ on the carpet, there are always casualties in revolutions, and the ‘digital dividend’ has not been evenly distributed. Most people do not feel empowered by their online experience. For a decade, companies that have grown enormous thanks to the quantity of personal data they have accumulated have claimed privacy is dead. But the champions of this movement have spent billions to safeguard the privacy of their personal lives and the confidentiality of their corporate ones.
So the challenge will be to make these digital innovations sustainable for the decades to come. We have to safeguard the interests of individuals and groups; they cannot simply be surrendered to Schumpeterian forces. We need to decentralize the Internet and give people more choice: the freedom to express themselves without the fear of being constantly monitored and monetized.
Data Protection Authorities (DPAs) have a crucial role. If they don’t take the lead, I do not know who can. They have to be properly resourced, tech savvy, and trust each other. The EU’s General Data Protection Regulation (GDPR) will only succeed if the consistency mechanism and the one-stop shop are successful.
Should lawmakers attempt to stay ahead of the technology curve, or is a wait and see approach better?
We need to be always thinking ahead.
Laws take time — the GDPR has taken 10 years between the Commission’s review of the directive and full entry into force in 2018. More than 10 years when you consider the potential delegated acts and implementing acts, codes of conduct, guidance from the European Data Protection Board, etc.
These need to be good laws which step in to correct market failures and genuinely safeguard individual rights and interests. But sometimes you don’t need new laws, you just need to enforce the existing ones better.
What impact do you think the current data explosion will have on competition law?
Antitrust rules are in the spotlight right now. We have a very active and committed Commissioner for competition, who is showing great courage.
The German competition authority is also taking the initiative to show that a dominant digital company may abuse its power by imposing unfair terms and conditions and data use policies on its customers.
We certainly need much more cooperation between regulators in different sectors. Our digital clearinghouse initiative is meant to provide a forum for competition authorities, consumer enforcers and DPAs to meet, exchange notes, and learn lessons together.
There are also some areas of harm, such as connected IoT products, which consumer groups like BEUC worry will escape proper regulation. Regulators must become more conversant with technology and get out of their silos.
What future – even highly speculative technology – are you most excited about in the next 30-50 years?
I am not a futurologist. I am an Italian judge “of a certain generation.”
I want to see technology deployed to maximize scope for free expression and safeguarding of intimate spaces. Not many people are talking about it yet in the public policy space, but the shift to quantum computing is likely to present a brand new paradigm for all of us in the next 10-15 years. I was in California last year and one of the leading quantum engineers predicted it will be like the dawn of the Internet. One possible application will be securing communications.
To what extent can we expect ordinary citizens to understand the tech behind the everyday apps they use?
People are entitled to be informed about any activity which has an impact on them. Terms and conditions and data use policies have been traditionally drafted to protect companies from legal challenges, not to inform and protect the individual. There are now some beautifully presented policies available, “dashboards,” etc.
But they are more significant for what they do not tell you — for example, what exactly will these controllers do with your data? How can you opt out of your data being used, shared and stored by the company?
But transparency is not enough. We cannot place the burden on ordinary people to understand complicated big data analytics. Accountability is more important — complying with the rules and being able to demonstrate compliance with the rules. And DPAs have to be accessible to citizens. GDPR Article 80 provides for advocacy organizations to defend groups of individuals and their rights — we need to test this.
To what degree should we expect tech companies to “self regulate?”
Tech companies have largely been left to self-regulate over the last 20 years. They are mature and powerful players now and should expect to attract the attention of regulators. For data protection, let’s treat them like adults. They are clearly within the scope of the GDPR and they must now develop a culture of accountability.
Soon, I hope, they will be required under the new ePrivacy Regulation to adjust their business models to avoid monitoring of communications. But there are other tools, like antitrust and merger control. Regulators need to look at the longer-term consequences for consumer welfare of proposed concentrations in the markets before deciding whether to allow a merger.
What are the big ethical challenges you foresee with machine learning, AI, etc?
Companies and governments are beginning to take advantage of technological developments related to the Internet of Things, big data, robotics and artificial intelligence. These are no longer abstract concepts, they are a reality that we must accept and react to.
Though these developments can bring many benefits for individuals and society, these benefits depend on ensuring that our values, based on a common respect for human dignity, remain a core component of innovation.
AI, biotech, virtual and enhanced reality: all these developments will question core notions of what it means to be human, to make rational, autonomous decisions, and to be accountable for the actions of machines. Come to the 2018 International Privacy Commissioners’ Conference in Brussels next year, as these questions will form the theme of the event.