Press "Enter" to skip to content

Can they do that? Consumer rights in the age of Cambridge Analytica

Walter Smith

Last year, two Stanford researchers announced that their automated process could analyze facial images and predict the subjects’ sexual orientation far more accurately than human observers making the same judgment. The discovery gives me pause, given the dystopian possibilities it suggests.

Sure, artificial intelligence like this could be used by an advertiser to present customized messaging perfectly suited to the viewer’s demographic, but it could also be used by an authoritarian state to profile and target citizens in disturbing ways. Nearly everyone has benefited in some way from the increased ease and convenience of living in the digital age. But where does that leave consumers when technology becomes so sophisticated that it can discover our most intimate traits, whether we like it or not?

Laws to protect consumers are insufficient

Consumer protection laws prevent a range of abuses, but as business practices and technology evolve, new gray areas in the law naturally emerge. In one sensational example, recent news reports suggest the British company Cambridge Analytica misused Facebook data to develop psychometric profiles of millions of American consumers without their consent. Logging in to even one of those users’ accounts without authorization would likely be a crime, but surreptitiously “harvesting” the data visible to a user’s friends is not necessarily the same as hacking.

In the coming years, legislators, regulators, and the courts will increasingly be asked to strike the proper balance between individual privacy interests and the possibilities that high technology creates for making use of data about individuals, particularly in industries like social networking and social media. Decisions resolving these conflicts will likely be framed by reference to long-standing legal authority as well as laws of more recent vintage. For both the regulated community and individual consumers, more specific rules would help clarify what constitutes an invasion of consumer privacy.

Unfair or deceptive? Unclear

One of the most familiar touchstones of consumer rights in the United States is the Federal Trade Commission Act, which since 1914 has prohibited “unfair or deceptive acts or practices in or affecting commerce.” Similar laws have been adopted in nearly every state, and both government regulators and attorneys representing consumers and businesses can cite the FTC Act and decisions under it to argue about what is unfair or deceptive.

As technology changes, this general framework has remained remarkably adaptable, allowing the FTC to regulate, for example, the privacy policies of major companies like Facebook. Whatever the state of technology, misleading one’s consumers can generally expose a company to liability under the “deceptiveness” prong of the law. To comply with that prong, a website operator’s privacy policy, user agreements, and public communications should be truthful about how the operator uses the consumer data stored on its servers. But the deceptiveness prong focuses on whether misstatements or omissions make the website deceptive; it does not necessarily help define what realm, if any, of consumer privacy should be free from intrusion.

The “unfairness” prong of the FTC Act is more expansive, but it requires the agency to draw lines, through rulemaking, between business practices that are merely unpopular with consumers and those that are truly unfair. In assessing unfairness, the agency asks whether a practice substantially injures consumers, without offsetting benefits, in a way consumers cannot reasonably avoid. The prong’s ambiguity can lead to contradictory decisions about whether particular practices are permissible, and because it requires a separate analysis for each challenged practice, it is no substitute for a more comprehensive set of rules defining consumers’ rights to privacy.

The need to balance privacy with innovation

Specific legislation enacted since the FTC Act has addressed many consumer privacy concerns, for example by prohibiting “hacking,” or unauthorized access to most computers (at the federal level), and by prohibiting computer spyware and requiring data-breach notification to affected consumers (here in Washington). Perhaps no jurisdiction goes as far in protecting consumer privacy as the European Union, where consumers enjoy the “right to be forgotten”: by law, consumers may request that links be removed from search engine results when the linked information is inaccurate, inadequate, irrelevant, or excessive. These diverse approaches show there are many possible ways to balance privacy concerns with ongoing technological innovation.

A social network consumers’ bill of rights

While changing technology opens up ever-new possibilities for exploiting data about consumers, we should keep asking ourselves whether we know what our rights are, and whether we are satisfied with the regulatory framework that applies to this changing social and economic arena. The ability to harvest information we think is visible only to our friends, and the possibility of discovering intimate details about individuals by analyzing their data or even their profile photos, create new challenges. If society is uncomfortable with the possibilities these technologies create, perhaps a social network consumers’ bill of rights is needed, one requiring that consumers be better informed and give full consent before intimate details of their lives and identities are gleaned from their social network data.

Walter Smith is an attorney in Olympia who works on consumer protection issues. His firm’s website is http://www.smithdietrich.com, and you can reach him by email at walter@smithdietrich.com.

