Non-Consensual User Data Tests Violate Fundamental Rights

On Thursday, Sept. 4, OKCupid founder Christian Rudder wrote an op-ed piece in the Wall Street Journal addressing the dating site’s use of personal user information for testing purposes. The announcement of these tests was met with outrage in July, when the company’s blog post about the user tests went viral and many publications quoted Rudder: “But guess what, everybody: if you use the Internet, you’re the subject of hundreds of experiments at any given time, on every site. That’s how websites work.”

The tongue-in-cheek post, written after the announcement of Facebook’s algorithm experiment on users, was likely meant to attract buzz and controversy. The extent of that controversy, though, was unforeseen by OKCupid, much as it was by Facebook after its own announcement, and it has since raised privacy and transparency concerns for users of both platforms and, more broadly, of all digital platforms.

For most users, the issue wasn’t the testing itself but, rather, the lack of consent to being a test subject. Terms and conditions agreements, though few people read them, typically include caveats about the use and collection of user data, with language such as “research” and/or “enhance the services we offer.” This language often provides legal loopholes for what the public sees as “non-consensual” testing.

And indeed, what these companies are doing ultimately betters their services. After all, Facebook users who are happier after using Facebook will likely be more engaged with the platform, and OKCupid users who find their perfect match within a shorter timeframe are likely to be happier with their online dating experience. At the end of the day, higher engagement with a platform and more word-of-mouth success stories add revenue to the bottom line. In other words, platform manipulation using personal data points is akin to A/B testing, in which companies hypothesize and then test which online experiences entice more users to reach the end of a conversion funnel.
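Mechanically, a bare-bones A/B test is easy to sketch. The snippet below is a minimal illustration in Python, assuming a hypothetical experiment name, deterministic hash-based bucketing and a toy event log; none of this reflects OKCupid’s or Facebook’s actual tooling.

```python
import hashlib

def assign_variant(user_id: str, experiment: str) -> str:
    """Deterministically bucket a user into variant 'A' or 'B'."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

def conversion_rate(events: list[dict], variant: str) -> float:
    """Fraction of users in a variant who completed the conversion funnel."""
    group = [e for e in events if e["variant"] == variant]
    return sum(e["converted"] for e in group) / len(group) if group else 0.0

# Hypothetical event log: one record per user exposed to the experiment.
events = [
    {"user": "u1", "variant": assign_variant("u1", "match_algo"), "converted": True},
    {"user": "u2", "variant": assign_variant("u2", "match_algo"), "converted": False},
    {"user": "u3", "variant": assign_variant("u3", "match_algo"), "converted": True},
]

print("A:", conversion_rate(events, "A"), "B:", conversion_rate(events, "B"))
```

A real test would, of course, add a statistical significance check before declaring a winning variant.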

This Is Bigger Than A/B Testing

All of this is well and good, except for one small concern: human experimentation without consent. 

In the era of big data, each and every one of us leaves behind a digital footprint that can be collected and put to use by data brokers. The data we create is threefold, presenting itself as first-, second- or third-party data. First-party data is data we knowingly and willingly give to a company (e.g., your name and email address when you sign up for a newsletter). Second-party data is data we have given on a first-party basis to a site (Facebook, for instance) and knowingly share with another company (you do this when you use social login on websites, as the social authentication prompt will alert you). Third-party data is scraped data that you never explicitly give anyone permission to collect; but because the information exists, data brokers collect it and anonymize it for legal purposes.
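To make the third-party case concrete, here is a minimal sketch of the kind of salted-hash pseudonymization a broker might apply to scraped records before resale. The field names and approach are illustrative assumptions, not any broker’s actual pipeline; strictly speaking, salted hashing is pseudonymization rather than true anonymization.

```python
import hashlib
import secrets

# Assumed per-dataset salt; a real pipeline would manage this securely.
SALT = secrets.token_hex(16)

def pseudonymize(record: dict) -> dict:
    """Replace direct identifiers with salted hashes; keep non-identifying fields."""
    out = dict(record)
    for field in ("name", "email"):  # hypothetical identifier fields
        if field in out:
            out[field] = hashlib.sha256((SALT + out[field]).encode()).hexdigest()[:16]
    return out

scraped = {"name": "Jane Doe", "email": "jane@example.com",
           "city": "Austin", "age_range": "25-34"}
print(pseudonymize(scraped))
```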

Now, avoiding big data is nearly impossible, and the opportunity for additional players to enter the data-broker game multiplies every time a new wearable is created. For the most part, this is good news. Companies like Nike, for instance, can use data collected from your FuelBand to provide health insights to your doctor. Or companies like Jawbone can give us really cool infographics about how an earthquake affected the sleep cycles of users.

Better yet, big data can help brands create more engaging and authentic experiences based on their users’ preferences, demographics, geography and more.

Of course, all of the benefits that come along with big data do not change the fact that data collection currently is not, though it always should be, an explicitly understood exchange between a digital platform and a user.

See, while the notion of big data and data rights is certainly taking hold across the globe, the vast majority of users are still unaware of exactly when, why and how their data is being collected.

This is not the fault of users. It is the fault of brands that collect data unethically, jeopardizing user trust and ultimately breaching users’ data rights.

“There are many ethical questions yet to address, and I’ve learned firsthand that communication between websites and their users needs to improve,” Rudder wrote in the Wall Street Journal op-ed. “How do we continue to wring meaningful findings out of intensely personal data while protecting the privacy of the individual? How do we ensure that users are never at risk? What is permissible, and what should always be off-limits?”

The answers to these questions, though, aren’t difficult and shouldn’t take, as Rudder suggests, years to perfect. Sure, actual policy and law around big data and user data rights will indeed take years; in fact, they will likely take a massive misuse case of the Supreme Court variety. But companies championing the people in this big data movement would be wise to abide by a set of best practices, one that dictates transparency in how user data is collected and used.

It might sound cliché, but what big data needs is the enforcement of the Golden Rule: treat my data as you would want yours treated. For most people, that means the ability to consent to a study, or to opt out of collection entirely if they so choose.
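In code, that opt-in rule is simple to state. The sketch below shows a hypothetical consent gate in Python; the consent store and function names are assumptions for illustration, not any platform’s real API.

```python
from enum import Enum

class Consent(Enum):
    OPTED_IN = "opted_in"    # user explicitly agreed to research studies
    OPTED_OUT = "opted_out"  # user refused collection beyond the core service
    UNKNOWN = "unknown"      # user was never asked

# Hypothetical consent store keyed by user ID.
consent_store: dict[str, Consent] = {"u1": Consent.OPTED_IN, "u2": Consent.OPTED_OUT}

def may_enroll_in_experiment(user_id: str) -> bool:
    """Only users who explicitly opted in are eligible; silence is not consent."""
    return consent_store.get(user_id, Consent.UNKNOWN) is Consent.OPTED_IN

print(may_enroll_in_experiment("u1"))  # True
print(may_enroll_in_experiment("u3"))  # False: never asked, so never enrolled
```

The design choice worth noting is the default: an unknown user falls through to “not enrolled,” so consent must be granted affirmatively rather than assumed.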

Because in the end, all any of us want is the autonomy to choose for ourselves. Big data might be able to serve us more relevant options than ever, but it is still our right to exercise our personal agency. Any company that takes that away from users is violating what we at Umbel see as a fundamental right.
