From cars to watches, IoT technology is having a profound impact on how products work and how consumers interact with them. Yet, security and privacy experts have been raising red flags for years, concerned that these devices will open the door to data thieves and degrade civil liberties.
At the State of the Net conference, held yesterday in Washington, D.C., the Federal Trade Commission (FTC) issued two reports about Internet of Things technologies. Both assessments focus on the security and privacy issues surrounding consumer-facing applications and products (as opposed to those used at places of business), and are written for companies that are deploying these products and services. The main report, titled “The Internet of Things: Privacy and Security in a Connected World,” summarizes the recommendations that resulted from a day-long, same-named workshop that the FTC conducted in November 2013. The second, “Careful Connections: Building Security in the Internet of Things,” is meant to serve as a guidebook for companies developing consumer-facing products.
The main report reviews key privacy and security issues that could harm consumers, such as hackers exploiting weak security protections to gain access to private data transmitted to or from IoT devices, or infiltrating large numbers of vulnerable IoT devices and marshaling them to execute a denial-of-service attack. The report notes that “denial of service attacks are more effective when the attacker has more devices under his or her control; as IoT devices proliferate, vulnerabilities could enable these attackers to assemble large numbers of devices to use in such attacks.”
Another area of concern is that IoT devices could serve as a pathway to physically harm consumers. At the November 2013 FTC workshop, Tadayoshi Kohno, an associate professor in the University of Washington’s Department of Computer Science, described how he was able to access settings on an Internet-connected insulin pump and cause it to malfunction; another participant had found a security hole in a vehicle’s telematics system that would allow someone to remotely tinker with the car’s braking system.
With respect to privacy concerns, the report notes that while some risks—such as the collection of sensitive personal information or geolocation information—already exist in “traditional Internet and mobile commerce,” these are potentially magnified by IoT devices automatically collecting “personal information, habits, locations, and physical conditions over time, which may allow an entity that has not directly collected sensitive information to infer [such information].” Workshop participants also expressed concern that without proper safeguards, insurers or potential employers could hypothetically access an individual’s personal data—sleep patterns, level of exercise, or even mood, for instance—collected through a consumer’s wearable IoT devices, and consider that information when making decisions regarding whether or not to issue that person an insurance policy or offer him or her a job.
Recommendations
The second half of the report focuses on debate around whether a set of conventional privacy guidelines, known as the Fair Information Practice Principles (FIPPs), is appropriate when designing security and privacy measures for IoT devices. The FIPPs state that companies should provide notice to consumers when personal data is collected, offer a choice to opt in or out of such data collection, provide access to this information, and ensure accuracy, data minimization, security, and accountability. A number of U.S. regulations and policy frameworks use the FIPPs or borrow from them, including the Health Insurance Portability and Accountability Act (HIPAA) and the Consumer Privacy Bill of Rights. Some workshop participants argued against using data minimization, notice, and choice as tools for protecting consumer privacy in the IoT.
Data minimization refers to a company restricting the amount of personal data it collects from its users, the idea being that collecting less data leaves less data for the company to safeguard. But some IoT industry pundits claim that such a practice could stifle innovation by limiting the applications a company could develop from the data it collects.
In terms of providing notice to consumers whenever data is collected, as well as managing their opt-in or opt-out preferences, some industry watchers involved in the FTC workshop expressed dismay at how difficult or impossible this would be, since many IoT devices come without a digital display or any other means of conveying information to a user via text or audio.
Addressing these types of concerns, the report notes, will likely require new approaches to the data-security best practices that evolved through Internet- and smartphone-based consumer applications. For example, companies might demarcate each piece of data they collect by attaching digital markers (“tags”) to the information that would dictate how each piece of data could and could not be used (both by the collector and by third parties).
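For illustration only, the tagging idea described above could be sketched as metadata carried alongside each datum and checked before any use. This is a minimal, hypothetical sketch, not an implementation from the report; all names (`UsageTag`, `TaggedDatum`, `permitted`) are invented for the example.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class UsageTag:
    """Hypothetical usage-policy marker attached to a collected datum."""
    allowed_uses: frozenset            # e.g. {"diagnostics", "billing"}
    share_with_third_parties: bool = False

@dataclass
class TaggedDatum:
    """A collected value bundled with the tag that governs its use."""
    value: object
    tag: UsageTag

def permitted(datum: TaggedDatum, purpose: str, third_party: bool = False) -> bool:
    """Check a requested use against the datum's tag before releasing it."""
    if third_party and not datum.tag.share_with_third_parties:
        return False
    return purpose in datum.tag.allowed_uses

# A heart-rate reading tagged for diagnostic use only, with no third-party sharing.
reading = TaggedDatum(72, UsageTag(frozenset({"diagnostics"})))
```

Under this sketch, `permitted(reading, "diagnostics")` would succeed, while a marketing use or any third-party request would be refused by the tag rather than by ad hoc policy.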
While noting that security and privacy risks are real and growing as IoT devices proliferate—scores of consumer products linked to the Internet have entered the marketplace since the FTC’s 2013 workshop—the agency said it does not believe these risks should be addressed through legislative action at this time. However, the commission’s staff does use the report to urge Congress to enact “general technology-neutral data security legislation” and “broad-based (as opposed to IoT-specific) privacy legislation.”
In addition, the report calls for companies to self-regulate and adopt best practices, such as building security features into products starting at the earliest design stages, developing some form of data-minimization practice, and offering a reliable means of upgrading products with security patches after those products reach the marketplace. The companion report, “Careful Connections: Building Security in the Internet of Things,” goes into further detail regarding best practices.
Reactions
The staff report’s findings are consistent with public statements made by FTC chairwoman Edith Ramirez during the recent Consumer Electronics Show, and those in an article that FTC commissioner Terrell McSweeny wrote for the technology news site Re/Code. But Maureen K. Ohlhausen—one of Ramirez’s and McSweeny’s fellow commissioners (the FTC has five politically appointed commissioners, with no more than three from the same party)—issued a statement, echoed in a number of tweets she also sent on Tuesday, in which she argued against the report’s recommendations for enacting broad privacy legislation. Specifically, she noted that the FTC already has the authority to enforce requirements that companies obtain consumers’ opt-in consent before collecting personal data. Ohlhausen described the report’s recommendation that companies use data-minimization tools, albeit with flexibility, as “overly prescriptive,” and wrote, “The report, without examining costs or benefits, encourages companies to delete valuable data—primarily to avoid hypothetical future harms.”
In her statement and on Twitter, Ohlhausen called the report a “missed opportunity” to delve into the tensions between the FIPPs and the IoT, particularly with regard to data minimization. Ohlhausen said the staff report’s recommendations followed the precautionary principle (decision-making through which an organization attempts to mitigate harm by considering how a product or service could negatively impact users), whereas she subscribes to the concept of “permissionless innovation”—as expressed by Adam Thierer, a senior research fellow at George Mason University’s Mercatus Center—in which companies are encouraged to develop new products unabated “unless a compelling case can be made that a new invention will bring serious harm to individuals.”