The ongoing collision of big data and the internet of things raises whole new concerns about the security, privacy, and fairness of personal data, says Julie Brill, a member of the Federal Trade Commission.
Brill spoke earlier this month at the Cyber Security and Privacy Summit hosted by Washington State Gov. Jay Inslee.
“The data from connected devices will be deeply personal, and big data analytics will make the data more readily actionable,” said Brill. “Some of these devices will handle deeply sensitive information about our health, our homes, and our families. Some will be linked to our financial accounts, and some to our email accounts.”
However, she added that people won’t change much.
“We as individuals will remain roughly the same. We will not suddenly become capable of keeping track of dozens or hundreds of streams of our data, peering into the depths of algorithmic decision-making engines, or spotting security flaws in the countless devices and pieces of software that will surround us,” she warned.
Faced with a world of uncertainty about which devices are safe and whether they are getting a fair shake in the big data world, Brill continued, “consumers could use some help.”
Major inroads possible into our lives
This rapidly evolving environment raises issues that have yet to be resolved. Brill divided the issues into the three areas of security, privacy, and fairness:
“Because these connected devices are linked to the physical world, device security also is a top concern,” she said. To wit:
• No armor. Among the 90% of connected devices that collect personal information, 70% transmit that data without encryption.
• No expertise or recognition. Traditional goods manufacturers may not have the expertise, or even realize they need such expertise, to secure their new devices.
• Cheap as dirt. Many connected devices will be inexpensive and essentially disposable.
• Just because the plug fits … Security vulnerabilities may be hidden deep in the code that runs an app or device, and they may not become apparent until the device is connected to an environment for which it wasn’t designed.
“All of these factors point to the need to take an all-hands-on-deck approach to data security, with security researchers playing an important role in bringing security flaws to light,” Brill said.
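The unencrypted-transmission problem above has a well-understood remedy: devices can send telemetry over TLS rather than plaintext sockets. As a minimal sketch (the helper name is hypothetical, not anything Brill or the FTC prescribes), Python's standard `ssl` module can build a context that encrypts traffic and verifies the server's identity:

```python
import ssl

def secure_context() -> ssl.SSLContext:
    """Hypothetical helper: a TLS context a connected device could use
    so telemetry is encrypted in transit instead of sent in the clear."""
    ctx = ssl.create_default_context()            # loads system CA certificates
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # refuse legacy protocol versions
    ctx.check_hostname = True                     # verify the server's identity
    ctx.verify_mode = ssl.CERT_REQUIRED           # reject unverified certificates
    return ctx

ctx = secure_context()
print(ctx.verify_mode == ssl.CERT_REQUIRED)  # → True
```

A socket wrapped with this context (`ctx.wrap_socket(...)`) would carry the same sensor readings as a plain socket, but encrypted and authenticated, which addresses the "no armor" gap directly.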
Turning to privacy, Brill said: “Consumers want to know—and should be able to easily find out—what information companies are collecting, where they’re sending it, and how they’re using it.” She said that information plays an important part in consumers’ decisions about whether to use digital products and services in the first place.
However, obstacles have emerged:
• Didn’t know they were watching. Many companies, including data brokers, ad networks, and analytics firms, handle consumer data while operating in the background, out of consumers’ view.
• Devices give no clues. Many connected devices do not have a user interface to present information to consumers about data collection.
• Queries not answered. Questions have arisen about who should receive disclosures about data collection and use practices, how consumers or innocent bystanders would know when a device is recording images or audio, and how the collected data will be secured.
Brill said that manufacturers of connected devices should recognize that providing transparency will require some creative thinking.
“Visual and auditory cues, and immersive apps and websites should be employed to describe to consumers, in a meaningful and relatively simple way, the nature of the information being collected … and provide consumers with choices,” Brill said.
The fairness concerns center on data brokers, firms that assemble individual profiles of consumers from various sources for use in marketing.
On such firms specifically, Brill said that “while this kind of information can be used for relatively benign purposes, or even in ways that will enhance financial inclusion, this kind of information has also been used to harm vulnerable consumers.”
Again, pairing big data with the internet of things creates new concerns in this area:
• Credit scores used beyond credit world. Scores such as credit scores now inform decisions well beyond mortgages, including whether a prospective employer extends a job offer and whether insurance companies charge higher premiums on auto or homeowners insurance.
• Scores grown outside the regulatory zone. Many new types of scores are now used to make eligibility determinations covered by the Fair Credit Reporting Act, yet they have not been subject to the same scrutiny that Congress and federal agencies have brought to bear on traditional credit scores.
• It all happens in a black box. Scoring algorithms and other forms of big data analytics rely on statistical models and data system designs that few on the outside understand in detail.
“This suggests that testing the effects of big data analytics may be a promising way to go,” Brill said, adding that “companies using scoring models should themselves do more to determine whether their own data analytics result in unfair, unethical, or discriminatory effects on consumers.”
In summary, she said: “For now, the rapid changes in big data analytics and the internet of things have made it difficult to meet some of these expectations in practice. The key point, however, is that these are the enduring expectations of consumers, rather than relics of a simpler world.”