Four Strategies for Protecting User Data

You probably do not have sinister plans for collecting consumers' data, but when deploying a customer-facing IoT system, you need to carefully plan how, when, where and why data is collected and stored.
Published: February 18, 2016

Two lawyers walk into a conference room full of technologists.

No, that is not the beginning of a bad joke. It’s what happened last week at the IoT Nexus conference in San Francisco.

Laura Berger

Laura Berger, an attorney in the Federal Trade Commission’s Division of Privacy and Identity Protection, and Gail Gottehrer, a partner with the firm Axinn, Veltrop & Harkrider LLP, appeared on a panel designed to help companies that are deploying Internet of Things technologies to, as the panel’s title put it, “align privacy requirements with business needs.”

Early in the session, Berger asked any attorneys in the room to raise their hands. Not a single one of the approximately 50 attendees in the conference hall at San Francisco’s Kabuki Hotel moved a finger. She said that did not surprise her: companies deploying IoT technologies that capture and store consumer data often fail to engage their legal team in determining what information will be collected, for how long and for what purpose, as well as how the firm will inform consumers about this data collection and provide options for opting in or out. That is partly why she is currently working on 50 different court cases involving privacy protections and consumer data, she added.

Gottehrer reminded the audience that they should be thinking critically not just about consumer data collection, but also about data collected on employees. Her firm represents businesses that have faced litigation brought by workers who allege that they were fired because of data their employer collected that tracked their whereabouts.

Say, for example, a company uses telematics or geofencing to monitor a fleet of trucks. The firm might not take into account where the driver goes during a lunch break, but if that driver is later fired, he could allege that the employer terminated him because, say, he visited an AIDS clinic or attended a political rally. “Oftentimes,” she said, “the employee was fired for poor performance and the company did not even realize it was tracking location data when the driver was off-duty and never planned to use it.” In these cases, “the company has to prove a negative, which is very hard to do.”
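
One practical way to limit that exposure is to discard location samples recorded outside scheduled working hours before they ever reach storage, so the company never holds off-duty data it has no plan to use. The Python sketch below is only an illustration of that idea; the record fields, shift window and function names are assumptions, not part of any system discussed on the panel.

```python
from datetime import datetime, time
from typing import Optional

# Hypothetical on-duty window; a real system would pull each driver's
# schedule from dispatch records rather than hard-coding it.
SHIFT_START = time(7, 0)
SHIFT_END = time(17, 0)

def filter_on_duty(sample: dict) -> Optional[dict]:
    """Return a telematics sample only if it was recorded on duty.

    Off-duty samples are dropped at ingestion, so location data the
    company never intended to use is never stored.
    """
    recorded_at = datetime.fromisoformat(sample["timestamp"])
    if SHIFT_START <= recorded_at.time() <= SHIFT_END:
        return sample
    return None  # discard without logging the coordinates

# Example: a sample captured during an evening break never reaches storage.
sample = {"vehicle_id": "truck-42",
          "timestamp": "2016-02-18T19:30:00",
          "lat": 37.785, "lon": -122.431}
kept = filter_on_duty(sample)
if kept is not None:
    print("store", kept)
else:
    print("dropped off-duty sample")
```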

During the course of the panel, Berger and Gottehrer offered a number of best practices that businesses integrating IoT technologies into their products or operations ought to heed.

Be careful about the types of data you collect. What you can collect and what you should collect are not always the same things. Say you make a smart-home device and, in order to set up a user account and send alerts to the user, request an e-mail address. You could also request that users send you their e-mail account passwords, and a surprisingly high percentage of users might comply, not realizing how vulnerable this makes them in the event that your corporate database is hacked. There is no benefit to storing users’ e-mail passwords (unless you plan on hacking into their accounts) and plenty of risk in doing so, Berger said, so businesses shouldn’t collect them in the first place. “We tell people to think about the potential for data collection to harm consumers,” she said, adding that as a rule of thumb, the more consumer data you collect, the bigger your burden is to store it responsibly and securely.
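
In practice, that advice often amounts to allowlisting the handful of fields a product actually needs at the point of ingestion and dropping everything else. The sketch below is a hypothetical illustration for the smart-home example above; the field names and the `create_account` function are assumptions, not drawn from any product discussed at the panel.

```python
# Fields the smart-home service actually needs in order to create an
# account and send alerts; anything else (for example, the password to
# a user's own e-mail account) is never accepted or stored.
ALLOWED_SIGNUP_FIELDS = {"email", "display_name", "alert_preferences"}

def create_account(raw_payload: dict) -> dict:
    """Build an account record from a signup request, keeping only the
    fields the service has a stated reason to collect."""
    record = {k: v for k, v in raw_payload.items()
              if k in ALLOWED_SIGNUP_FIELDS}
    if "email" not in record:
        raise ValueError("an e-mail address is required to send alerts")
    return record

# Even if a client submits extra data, it never reaches the database.
request = {"email": "user@example.com",
           "display_name": "Pat",
           "email_password": "hunter2"}  # unnecessary and risky to keep
print(create_account(request))
# {'email': 'user@example.com', 'display_name': 'Pat'}
```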

The panelists advised consulting with your firm’s legal counsel, as well as your marketing, engineering and IT departments, to obtain input on what information an IoT-based product or service should and should not collect. The marketing department might push for data that can be sold to a third party, for example, even if it has no relevance to the product or service you are selling. Ask the attorneys whether doing so would present any legal problems. Ask the IT staff about the data storage and security implications, and about how they would store this ancillary data. “Most companies do not have designs to harm consumer privacy,” Gottehrer explained, “but they might say, ‘We want to collect data for marketing’—yet they have no idea of the legal implications [of doing so].”

It is also important, they said, to securely delete consumer data as soon as its value has been realized. “Unless you’re a drug company that needs to keep data for a long-term study, there is likely no value in storing consumer data for a long time,” Gottehrer said. “For most companies, data that is more than a year old is probably not valuable. So look at that older stuff, or stuff that you didn’t plan to collect, and get rid of it.” An important caveat here, of course, is that if your firm is being sued, that information must be retained for discovery. In fact, that is another reason to store only essential data of ongoing value, because once a legal action is taken against your firm based on the consumer data it collects, you will need to pay your lawyers to review all stored data. And that, Gottehrer added, can get very expensive.
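
Expressed in code, a retention rule like the one Gottehrer describes is usually just a scheduled job that deletes anything past its useful life, with an explicit exception for records preserved for discovery. The following Python sketch assumes a SQLite table with a `collected_at` timestamp and a `legal_hold` flag; the schema and names are illustrative only.

```python
import sqlite3
from datetime import datetime, timedelta

RETENTION_DAYS = 365  # "data that is more than a year old is probably not valuable"

def purge_expired(conn: sqlite3.Connection) -> int:
    """Delete consumer records older than the retention window, except
    those preserved for discovery in pending litigation."""
    cutoff = (datetime.utcnow() - timedelta(days=RETENTION_DAYS)).isoformat()
    cur = conn.execute(
        "DELETE FROM consumer_data "
        "WHERE collected_at < ? AND legal_hold = 0",
        (cutoff,),
    )
    conn.commit()
    return cur.rowcount

# Illustrative schema and run.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE consumer_data ("
             "id INTEGER PRIMARY KEY, collected_at TEXT, legal_hold INTEGER)")
conn.execute("INSERT INTO consumer_data VALUES (1, '2014-01-01T00:00:00', 0)")
conn.execute("INSERT INTO consumer_data VALUES (2, '2014-01-01T00:00:00', 1)")
print(purge_expired(conn), "expired record(s) deleted")  # 1; the held record stays
```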

Gail Gottehrer

Make data collection easy for consumers to understand, and stick to the End User License Agreement (EULA). The panelists admitted that few consumers ever carefully read the EULA—and whatever data collection it specifies—before accepting the license and using the product. Regardless, the EULA is your main opportunity to convey your data-collection practices and obtain consumer buy-in. But the language must be clear, and any changes in the way your company collects or handles data must also be made clear to users through policy updates.

“The FTC’s operating statute bars unfair or deceptive practices,” Berger reminded attendees. “In [terms of consumer] privacy, that means if you handle consumer information in a way that is different than the way you told them you would, or if you do something material with their information, that can violate that primary statute.”

To emphasize the importance of clear language, Berger noted that the FTC brought a case against a company whose smartphone flashlight app collected users’ location data. “They did disclose this [data collection] in the EULA, but it was buried,” she said. EULA policies, she added, need to lay out data uses that “a reasonable consumer could expect and understand.” Otherwise, “that is when you get in hot water.”

Build smart, secure data architecture from the ground up. It’s not difficult to find examples of IoT products that have been exploited due to weak security protocols. Berger referenced a case the FTC brought against TRENDnet, the manufacturer of an IP-based video camera that streamed captured video to the Internet, because the company had failed to build adequate software security into the device. No password was required, so all one had to do to watch a camera’s stream was append its IP address to a commonly known URL. Hackers who did so posted links to the live feeds of nearly 700 of the cameras.
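
The underlying failure was an unauthenticated endpoint: anyone who knew the URL pattern could view the stream. A minimal guard, sketched below with Python’s standard-library HTTP server, is to reject any request that does not carry valid credentials before a single frame is served. The handler, realm and credential check are illustrative assumptions, not TRENDnet’s actual firmware.

```python
import base64
from http.server import BaseHTTPRequestHandler, HTTPServer

# Illustrative only: a real device would use a per-device secret set
# during setup (never a shared default) and would serve over TLS.
EXPECTED = base64.b64encode(b"owner:correct-horse-battery").decode()

class CameraHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        auth = self.headers.get("Authorization", "")
        if auth != f"Basic {EXPECTED}":
            # No credentials, no video: the step the exposed cameras skipped.
            self.send_response(401)
            self.send_header("WWW-Authenticate", 'Basic realm="camera"')
            self.end_headers()
            return
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(b"(video stream would start here)")

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), CameraHandler).serve_forever()
```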

Make security an incentive. Gottehrer noted that insecure video cameras and toys are a major concern for consumers; toymaker VTech, for example, failed to secure the names, e-mail addresses, passwords and home addresses of 4.8 million parents who had purchased connected toys for their children. But when companies fail to build adequate privacy protections into consumer-facing IoT products and services, the harm can ripple well beyond those customers. She pointed to sectors such as health care and transportation, in which the IoT could enable very important or even life-saving applications. “The fear,” she said, “is that litigation [based on privacy infringements] could hold companies back” from developing such connected products and services.
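
On the stored-credentials side of a breach like VTech’s, the baseline protection is never keeping passwords in a recoverable form. The sketch below uses scrypt from Python’s standard library with a per-user salt; the function names are illustrative, and a production system would typically rely on a maintained password-hashing library rather than this hand-rolled example.

```python
import hashlib
import hmac
import os

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Return (salt, digest) so only a one-way hash is ever stored."""
    salt = os.urandom(16)
    digest = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    """Recompute the hash and compare in constant time."""
    candidate = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return hmac.compare_digest(candidate, digest)

# Example: the plaintext password is never written anywhere.
salt, digest = hash_password("parents-chosen-password")
print(verify_password("parents-chosen-password", salt, digest))  # True
print(verify_password("guess", salt, digest))                    # False
```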

Gottehrer, therefore, suggested that companies deploying or considering IoT-based products and services need to focus on employee education. Many businesses already devote resources to training employees to identify incoming data security threats, she noted, such as e-mail phishing. Employees should also be cognizant of the privacy and security implications of the products and services they sell. She even suggested that bonuses or performance reviews should reward employees for taking actions to address these issues.

In the end, privacy laws exist to protect consumers, even when they fail to take common-sense steps to protect themselves. “Data protection is important even in these times, when people share shockingly private information on their Facebook pages,” Gottehrer stated. “But if you have [personal data about consumers] and illegally share it, the impact on your company can be shocking as well.”