To understand what’s
at stake in the battle between Apple and the F.B.I. over
cracking open a terrorist’s smartphone, it helps to be able to predict the
future of the tech industry.
For that, here’s one
bet you’ll never lose money on: Digital technology always grows hungrier for
more personal information, and we users nearly always accede to its demands.
Today’s smartphones hold a lot of personal data — your correspondence, your photos,
your location, your dignity. But tomorrow’s devices, many of which already
exist in rudimentary form, will hold a lot more.
Consider all the
technologies we think we want — not just better and more useful phones, but
cars that drive themselves, smart assistants you control through voice and
household appliances that you can monitor and manage from afar. Many will have
cameras, microphones and sensors gathering more data, backed by ever more
sophisticated data-mining efforts to make sense of it all. Everyday devices will be
recording and analyzing your every utterance and action.
This gets to why tech
companies, not to mention us users, should fear the repercussions of the Apple case. Law enforcement officials and their
supporters argue that when armed with a valid court order, the cops should
never be locked out of any device that might be important in an investigation.
But if Apple is forced
to break its own security to get inside a phone that it had promised users was
inviolable, the supposed safety of the always-watching future starts to fall
apart. If every device can monitor you, and if each can be tapped by law
enforcement officials under court order, can anyone ever have a truly private
conversation? Are we building a world in which there’s no longer any room for
keeping secrets?
“This case can’t be a
one-time deal,” said Neil Richards, a professor at the Washington University
School of Law. “This is about the future.”
Mr. Richards is the
author of “Intellectual Privacy,” a book that examines the dangers of a society
in which technology and law conspire to eliminate the possibility of thinking without fear of
surveillance. He argues that intellectual creativity depends on a
baseline measure of privacy, and that privacy is being eroded by cameras,
microphones and sensors we’re all voluntarily surrounding ourselves with.
“If we care about free
expression, we have to care about the ways in which we come up with interesting
things to say in the first place,” he said. “And if we are always monitored,
always watched, always recorded, we’re going to be much more reluctant to
experiment with controversial, eccentric, weird, ‘deviant’ ideas — and most of
the ideas that we care about deeply were once highly controversial.”
Mr. Richards might sound alarmist, especially to those who
believe the F.B.I.’s argument that its request for
Apple to hack into one phone is limited to this special circumstance.
“The particular legal issue is actually quite narrow,”
James B. Comey Jr., the director of the F.B.I., wrote in a blog post on Sunday. “We
simply want the chance, with a search warrant, to try to guess the terrorist’s
passcode without the phone essentially self-destructing and without it taking a
decade to guess correctly. That’s it.”
But civil liberties activists say they’d have an easier
time believing Mr. Comey’s assurances if there weren’t a long history of the
government relying on legal cases based on old technology to decide how to handle
newer technologies. Courts in the 1960s and 1970s created rules for the
wiretapping of analog phone calls; those rulings were later applied as the basis for mass surveillance
of the Internet.
“By and large you get very little constitutional protection
for data housed by third parties, and that’s mostly a result of a Supreme Court
case from the 1970s — before email, before search engines, before social
networks,” said Chris Soghoian, the principal technologist at the American
Civil Liberties Union.
Mr. Soghoian pointed out that the government had already tried
to turn connected devices into surveillance machines. In a mob case more than a
decade ago, the F.B.I. asked a company that made an in-dash roadside assistance
device — something like OnStar, which uses a cellular phone to connect to an
operator in case of an emergency — to secretly record the private conversations
of people inside a car. A court ruled against the F.B.I.’s request,
but only on the very narrow grounds that bugging the car would have interfered
with the proper functioning of the in-dash device.
“The court left open the door to surveillance as long as
the primary function of the device was intact,” Mr. Soghoian said. “So as long
as Amazon Echo can tell you what temperature it is or can still play music,
that case seems to suggest that the government might be able to force Amazon to
spy on you.”
Mr. Soghoian was referring to Amazon’s handy digital assistant, a device
that is constantly listening to your household conversations to try to offer
you friendly help. The Echo listens for a keyword — “Alexa!” — which prompts it
to start streaming your voice to Amazon’s servers to decipher your request.
Amazon, which declined to comment on how the Apple case might affect Echo
users’ privacy, has said it is not constantly recording people’s voices, and
that it keeps voice recordings only to help the system learn to better
understand you.
But the Apple case threatens to undermine
those promises. If a court can get Apple to hack into an iPhone, why couldn’t
it also force Amazon to change the Echo’s security model so the Echo can record
everything you say? Mr. Soghoian believes the Apple case could set that precedent.
“What we really need for the Internet of Things to not turn
into the Internet of Surveillance is a clear ruling that says that the
companies we’re inviting into our homes and bedrooms cannot be conscripted to
turn their products into roving bugs for the F.B.I.,” he said.
Some readers may argue for a simpler solution to this
problem: Opt out of the technologies that could be made to spy on you. Don’t
buy the Amazon Echo. Don’t put cameras in your house. Don’t use a thermostat
that connects to the Internet and can monitor when you’re home and when you’re
not.
There’s some merit to these arguments, but technology has a
way of worming its way into our lives without many of us making a conscious
choice to let it in. Smartphones and personal computers were once an
indulgence; then, as more people began to use them, they became inescapable.
The Internet of Things will follow a similar path.
Employers and insurance companies may require you to wear health-tracking
devices. It may become impossible to find cars without cameras and sensors.
Refrigerators may come with cameras inside, whether you like it or not.
“From a historical perspective, we’re entering into a very
new era,” said Jennifer Granick, director of civil liberties at the Stanford
Center for Internet and Society. Not long ago, we were living in a world in
which surveillance was difficult. “In the past, you and I would have a
conversation in person. No record would be made; nobody would have access to
it. I wrote things on paper; I burned them in my fireplace. They were gone
forever.”
But in the absence of technical and legal protections,
technology is upturning those presumptions.
“Now we have a surveillance-enabled world,” Ms. Granick
said. “It’s cheap, and it’s easy. The question that society has to ask is, Is
that what we really want?”