LONDON – Apple’s new watch keeps track of your health. Google Now gathers the information needed to compute the ideal time for you to leave for the airport. Amazon tells you the books you want, the groceries you need, the films you will like – and sells you the tablet that enables you to order them and more. Your lights turn on when you get close to home, and your house adjusts to your choice of ambient temperature.
This amalgamation of digital services and hardware is designed to make our lives easier, and there is no doubt that it does. But have we stopped asking fundamental questions, both of ourselves and of the companies we entrust to do all of these things? Have we given sufficient consideration to the potential cost of all of this comfort and ease, and asked ourselves whether the price is worth it?
Every time we add a new device, we give away a little piece of ourselves. We often do this with very little knowledge about who is getting it, much less whether we share their ethics and values. We may have a superficial check-box understanding of what the companies behind this convenience do with our data; but, beyond the marketing, the actual people running these organizations are faceless and nameless. We know little about them, but they sure know a lot about us.
The idea that companies could know where we are, what we have watched, or the content of our medical records was anathema a generation ago. The vast array of details that defined a person was widely distributed. The bank knew a bit, the doctor knew a bit, the tax authority knew a bit, but they did not all talk to one another. Now Apple and Google know it all and store it in one handy place. That is great for convenience, but not so great if they decide to use that information in ways to which we have not explicitly agreed.
And we have reason to call into question companies’ judgment in using that data. The backlash to the news that Facebook used people’s news feeds to test whether what they viewed could alter their moods was proof of that. I do not recall checking a box to say that that was okay. Recently, hackers misappropriated photos sent via Snapchat, a service used primarily by young people that promises auto-deletion of all files upon viewing.
Likewise, health-care data were always considered private, so that patients would be open and honest with health-care professionals. As the lines between health care and technology businesses become hazy, some manufacturers of “wearables” and the software that runs on them are lobbying to have their products exempted from being considered medical devices – and thus from regulatory requirements for reliability and data protection.
Privacy is only one part of a larger discussion around data ownership and data monopoly, security, and competition. It is also about control and destiny: about choosing, proactively, how our data are used and how we use our own data.
More mature firms have phased in formal protocols, with ethics officers, risk committees, and other structures that oversee how data are collected and used, though not always successfully (indeed, they often depend on trial and error). Small new companies may have neither such protocols nor the people – for example, independent board members – to impose them. If serious ethical lapses occur, many consumers will no longer use the service, regardless of how promising the business model is.
We like new applications and try them out, handing over access to our Facebook or Twitter accounts without much thought about the migration of our personal data from big companies with some modicum of oversight to small companies without rigorous structures and limits. Consumers believe or expect that someone somewhere is keeping an eye on this, but who exactly would that be?
In Europe, legislation to protect personal data is not comprehensive, and much of the rest of the world lacks even rudimentary safeguards. Having explored this issue with legislators in several countries over the past couple of months, I have found it abundantly clear that many do not have a full grasp of the myriad issues that need to be considered. It is a difficult subject to address, and doing so is impeded by lobbying efforts and incomplete information.
In the short term, young companies should view ethics not as a marketing gimmick, but as a core concern. All organizations should invest in ethics officers or some sort of review process involving people who can assess all of the implications of a great-sounding idea. Legislators need to educate themselves – and the public – and exercise more oversight. For example, just as many countries did with car seatbelts a generation ago, a public-safety campaign could be paired with legislation to explain and promote two-step verification.
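For readers unfamiliar with the mechanics, two-step verification pairs a password with a short-lived one-time code, typically generated on the user's phone. A minimal sketch of the standard time-based scheme (TOTP, RFC 6238, built on the RFC 4226 HMAC construction) looks like this; it is an illustration, not a production implementation:

```python
import hmac
import hashlib
import struct
import time

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """RFC 4226: HMAC-SHA1 over the counter, dynamically truncated to N digits."""
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F  # low nibble of last byte picks the truncation offset
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(secret: bytes, timestamp=None, step: int = 30, digits: int = 6) -> str:
    """RFC 6238: the counter is simply the Unix time divided into 30-second steps."""
    t = int(time.time()) if timestamp is None else timestamp
    return hotp(secret, t // step, digits)
```

Because the server and the phone derive the same code independently from a shared secret and the clock, a stolen password alone is no longer enough to log in.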
In the longer term, as we rightly move toward universal Internet access, we need to ask: How much of ourselves are we willing to give away? What happens when sharing becomes mandatory – when giving access to a personal Facebook account is a job requirement, and health services are withheld unless a patient submits their historical Fitbit data?
If that is the future we want, we should stride toward it with full awareness and a sense of purpose, not meander carelessly until we fall into a hole, look up, and wonder how we got there.
Lucy P. Marcus is CEO of Marcus Venture Consulting.
Copyright: Project Syndicate, 2014.