Best Practices | Big Data | Business Intelligence | Data Management

Non-Consensual User Data Tests Violate Fundamental Rights

Tracey Wallace
Last updated: 2014/09/09 at 6:00 AM
7 Min Read

On Thursday, Sept. 4, OKCupid founder Christian Rudder wrote an op-ed piece in the Wall Street Journal addressing the dating site’s use of personal user information for testing purposes. The announcement of these tests was met with outrage in July, when the company’s blog post about the user tests went viral and many publications quoted Rudder: “But guess what, everybody: if you use the Internet, you’re the subject of hundreds of experiments at any given time, on every site. That’s how websites work.”

The tongue-in-cheek post, written after the announcement of Facebook’s algorithm experiment on users, was likely meant to attract buzz and controversy. The extent of that controversy, however, was unforeseen by OKCupid, much as it was by Facebook after its announcement, and it has since raised privacy and transparency concerns for users of both platforms and, to a larger extent, of all digital platforms.

For most users, the issue wasn’t the testing itself but, rather, the lack of consent around being a test subject. Typically, though few people read them, terms and conditions agreements include caveats about the use and collection of user data, with language such as “research” and “enhance the services we offer.” That language often provides legal loopholes for what the public sees as “non-consensual” testing.

And indeed, what these companies are doing is ultimately bettering their services. After all, Facebook users who are happier after using Facebook will likely be more engaged with the platform, and OKCupid users who find their perfect match within a shorter timeframe are likely to be happier with their online dating experience. At the end of the day, higher engagement with a platform and more word-of-mouth success stories add revenue to the bottom line. In other words, platform manipulation using personal data points is akin to A/B testing, in which companies hypothesize and subsequently test which online experiences will entice users to reach the end of a conversion funnel more often.
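
For readers unfamiliar with the mechanics, a minimal A/B test sketch is shown below. The bucketing scheme, feature names and conversion figures are hypothetical; it only illustrates deterministic assignment and a simple two-proportion comparison, not any platform’s actual experimentation pipeline.

```python
import hashlib
from math import sqrt

def assign_variant(user_id: str) -> str:
    """Deterministically bucket a user into the control or treatment experience."""
    bucket = int(hashlib.md5(user_id.encode()).hexdigest(), 16) % 2
    return "treatment" if bucket else "control"

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Z-statistic for the difference between two observed conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical results: did the redesigned sign-up flow convert better?
z = two_proportion_z(conv_a=480, n_a=10_000, conv_b=545, n_b=10_000)
print(f"z = {z:.2f}")  # |z| > 1.96 suggests significance at roughly the 95% level
```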

This Is Bigger Than A/B Testing

All of this is well and good, except for one small concern: human experimentation without consent. 

In the era of big data, each and every one of us leaves behind a digital footprint that can be collected and put to use by data brokers. The data we create is threefold, presenting itself as first-, second- or third-party data. First-party data is data we knowingly and willingly give to a company (e.g., your name and email address when you sign up for a newsletter). Second-party data is data we have given on a first-party basis to one site (Facebook, for instance) and knowingly share with another company; you do this when you use social login on websites, as the social authentication prompt will tell you. Third-party data is scraped data that you do not explicitly give anyone permission to collect, but because the information exists, data brokers collect it and anonymize it for legal purposes.
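
To make that taxonomy concrete, here is a small sketch that tags records with their provenance. The class names, field names and example records are hypothetical; the point is only to show how a platform or broker might distinguish the three classes before deciding what needs anonymizing.

```python
from dataclasses import dataclass
from enum import Enum

class Provenance(Enum):
    FIRST_PARTY = "given directly and knowingly to the collecting site"
    SECOND_PARTY = "first-party data knowingly shared with another company (e.g. social login)"
    THIRD_PARTY = "scraped or brokered data collected without explicit permission"

@dataclass
class DataPoint:
    subject: str            # whom the data describes
    attribute: str          # what was recorded
    provenance: Provenance  # how it was obtained

records = [
    DataPoint("user_123", "email address (newsletter sign-up)", Provenance.FIRST_PARTY),
    DataPoint("user_123", "profile name (social login)", Provenance.SECOND_PARTY),
    DataPoint("user_123", "browsing history (broker segment)", Provenance.THIRD_PARTY),
]

# Only the third-party record would typically be anonymized before resale.
for r in records:
    print(f"{r.attribute}: {r.provenance.name}")
```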

Now, avoiding big data is nearly impossible, and the opportunity for additional players to enter the data broker game multiplies every time a new wearable is created. For the most part, this is good news. Companies like Nike, for instance, can use data collected from your FuelBand to provide health insights to your doctor. Or companies like Jawbone can give us really cool infographics about how an earthquake affected the sleep cycles of its users.

Better yet, big data can help brands create more engaging and authentic experiences for their users, based on their users’ preferences, demographics, geography and more. 

Of course, none of the benefits that come along with big data mean that data collection currently is, or can be excused from ever becoming, an explicitly understood exchange between a digital platform and a user.

See, while the notion of big data and data rights is certainly taking hold across the globe, the vast majority of users are still unaware of exactly when, why and how their data is being collected.

This is not the fault of users. It is the fault of brands that unethically collect data, jeopardize user trust and ultimately breach user data rights.

“There are many ethical questions yet to address, and I’ve learned firsthand that communication between websites and their users needs to improve,” Rudder wrote in the Wall Street Journal op-ed. “How do we continue to wring meaningful findings out of intensely personal data while protecting the privacy of the individual? How do we ensure that users are never at risk? What is permissible, and what should always be off-limits?”

The answers to these questions, though, aren’t difficult and shouldn’t take, as Rudder suggests, years to perfect. Sure, actual policy and law around big data and user data rights will take years; in fact, it will likely take a massive misuse case of the Supreme Court variety. But companies championing the people in this big data movement would be wise to abide by a set of best practices, one that dictates transparency about how user data is collected and used.

It might sound cliché, but what big data needs is the enforcement of the Golden Rule: treat my data as you would want yours treated. For most people, that means the ability to consent to a study, or to opt out of collection entirely if they so choose.
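
In practice, that Golden Rule boils down to a consent gate in front of any experiment enrollment or collection step. The sketch below is a hypothetical illustration of such a gate, assuming a simple per-user preference store; it is not any real platform’s API.

```python
from typing import Optional

# Hypothetical per-user privacy preferences, keyed by user id.
# None means the user was never asked, which should be treated as no consent.
consent_store = {
    "user_123": {"experiments": True, "data_collection": True},
    "user_456": {"experiments": False, "data_collection": True},
    "user_789": None,
}

def may_enroll_in_experiment(user_id: str) -> bool:
    """Enroll a user only if they have explicitly consented to experiments."""
    prefs: Optional[dict] = consent_store.get(user_id)
    return bool(prefs and prefs.get("experiments"))

for uid in ("user_123", "user_456", "user_789"):
    print(uid, "->", "enroll" if may_enroll_in_experiment(uid) else "exclude")
```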

Because in the end, all any of us want is the autonomy to choose for ourselves. Big data might be able to serve us more relevant options than ever, but it is still our right to exercise our personal agency. Any company that takes that away from users is violating what we at Umbel see as a fundamental right.
