Most people talk about self-driving cars with breathless excitement — after all, think about the possibilities. No more worrying about people driving under the influence. No concerns about a hit-and-run involving a distracted, texting 17-year-old. No more stressful commutes filled with honking and traffic. The list goes on and on.
A land populated by self-driving cars sounds like one step closer to utopia, except that widespread adoption of driverless cars has at least one serious downside. Beyond the mass disruption of multiple industries, including (but not limited to) the auto, taxi, and trucking industries, really getting the most out of this new technology requires that everyday consumers give up a serious amount of their privacy.
For much of America, getting your driver's license at 16 represents newfound freedom. No more asking your parents to drive you places; instead, you can get in your car and go wherever you'd like. But with self-driving cars, that freedom is curtailed. Instead of driving around on a whim, you either have to call a self-driving taxi or tell your car exactly where you want it to go, which means no more scenic route. And every time you tell your car your destination, that information is stored somewhere and mined for patterns in your behavior so that the car can be more receptive to your needs.
The purpose of a self-driving car is not just to get you from point A to point B. As with any other machine learning or artificial intelligence technology, self-driving cars will eventually evolve to anticipate our needs, and that evolution requires massive amounts of data on individuals. Given the vast number of vehicles on the road, for personal as well as commercial use, the amount of data that self-driving cars will collect is staggering. Google's self-driving car, for example, gathers 750 megabytes of data per second, and that's just from its sensors.
Tesla recently asked drivers of its cars for permission to collect more data in order to improve its vehicles' self-driving capabilities. Tesla would collect this data via videos taken using the cars' external cameras. This is in addition to the data the company already collects or has access to, including a vehicle's location, performance, condition, and service history. As Pat Clawson, chief executive officer of the data security firm Blancco Technology Group, notes: "You have car owners and renters syncing millions of data points (contact details, music apps, maps and so much more data) from their smartphones directly onto the dashboards… And when you move to driverless cars, the types and amounts of data stored [increase] significantly." As a result, privacy is a huge issue, and one that neither companies nor governments are currently doing much to address.
There are undoubtedly those who will say that giving up personal information is worth it if the benefit is large enough. In the case of self-driving cars, the benefit is clear: fewer accidents, less traffic, more efficiency. But using them also puts your data firmly in the hands of several big players: Google, Uber, Tesla, and Ford, among others. Depending on the type of information they collect, you could end up surrendering far more personal information than you bargained for. Tesla, for example, says that it may access personal settings in its vehicles (e.g., navigation and browsing histories, contact lists, and even radio listening history) while performing remote diagnostic functions, although the number of employees with access to such sensitive information is limited.
When Hurricane Irma came barreling down on Florida, a Tesla owner asked the company to unlock the full capacity of his car's battery so that he could escape the path of the storm. Tesla was able to do this remotely, for him and all other customers affected by the storm, thanks to the software installed in that particular model. While this particular over-the-air update proved enormously useful, what if the next one isn't as benevolent? More importantly, will users even know when a company is accessing their private information?
The issue at hand is not that driverless or self-driving cars are collecting data per se; rather, it's that users (or passengers) may not be aware of what types of information are being gathered, or how that information is being used. According to Tesla's website, the company reserves the right to "share information [they] collect with [their] service providers and business partners, with other third parties" authorized by the owner, and "with other third parties when required by law, and in other circumstances." The very fact that there are people at these companies who can tell where every car is at any given time is, frankly, off-putting. And a recent report that a federal advisory board focused specifically on self-driving car technology has been all but disbanded doesn't inspire much confidence, either.
Ultimately, self-driving cars will happen; that's a foregone conclusion. But how much transparency will consumers demand of car companies with respect to data usage? That story has yet to be told. We're at an inflection point, and only time will tell how many questions consumers ask as they hand over large swaths of data to corporate America.