CERN: A Case Study in “Big Science” Data Management

tkorte
Last updated: 2013/10/29 at 8:00 AM

The European Organization for Nuclear Research (CERN) became a poster child for organizing and processing enormous quantities of data during its search for the Higgs boson and other physics investigations, but scaling up is not the only data hurdle the organization has had to clear. In its efforts to maximize the long-tail value of its data, it has had to create roadmaps for preserving and releasing public data. In the same way that CERN’s scalable processing and storage innovations can be applied in industry, its data preservation and dissemination efforts can serve as examples for other “big science” projects.

Data Preservation

CERN’s data preservation imperative arises from two factors. First, data re-use is a crucial component of particle physics. A 2012 special report from the Data Preservation in High Energy Physics (DPHEP) study group notes that important research can arise from data sets more than a decade after their collection (see “Other examples of long-term data re-use”). The same report notes that long-term analysis accounts for 5-10% of the total scientific output of high energy physics collaborations, and a survey of over a thousand high energy physicists showed that 70% regarded data preservation as “very important” or “crucial” to their work.

The second factor is that, in general, particle collision data is collected only once. As accelerators grow more powerful, they tend to collect data at energy levels never before achievable, so data from older accelerators risks being lost forever if it is not adequately preserved. Scientists cannot hope to reap the benefits of running new analyses on old data if this one-of-a-kind data disappears.

In light of these two factors, CERN has been proactive about preserving data for future re-use, even as it remains plagued by non-standardized, ad hoc formats and highly complex data models. CERN was a co-founding organization of the Alliance for Permanent Access, a European data preservation working group. Internal efforts are underway as well, although these have not yet been fully implemented. In a 2011 presentation, the computing coordinator of the ATLAS (A Toroidal LHC Apparatus) experiment at CERN, one of the detectors that contributed measurements to the Higgs boson finding, stated that he was confident that technological progress would enable preservation of large quantities of data in the medium term. The collider’s Compact Muon Solenoid (CMS) experiment, the other detector involved in the Higgs finding, has made progress as well, announcing that it would begin implementing a new metadata collection tool this month and mentioning an emerging initiative to guarantee “bit-level data preservation” in the future.
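
The phrase “bit-level data preservation” typically means guaranteeing that archived files remain byte-for-byte identical over the years: checksums are recorded when data enters the archive and re-verified on a schedule, so silent corruption can be caught and repaired from a replica. As a minimal sketch of the idea in Python (illustrative only, not CERN’s actual tooling; the manifest format and function names are hypothetical):

```python
import hashlib
import json
from pathlib import Path

def sha256_of(path: Path, chunk_size: int = 1 << 20) -> str:
    """Stream the file through SHA-256 so large archives never sit fully in memory."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def build_manifest(archive_dir: Path, manifest_path: Path) -> None:
    """At ingest time, record a checksum for every file in the archive."""
    manifest = {
        str(p.relative_to(archive_dir)): sha256_of(p)
        for p in sorted(archive_dir.rglob("*"))
        if p.is_file()
    }
    manifest_path.write_text(json.dumps(manifest, indent=2))

def verify_manifest(archive_dir: Path, manifest_path: Path) -> list[str]:
    """On each scheduled audit, re-hash every file and return the paths that
    no longer match (bit rot, truncation, or loss), so they can be restored."""
    manifest = json.loads(manifest_path.read_text())
    return [
        rel for rel, expected in manifest.items()
        if not (archive_dir / rel).is_file()
        or sha256_of(archive_dir / rel) != expected
    ]
```

At CERN’s scale such audits would run against tape libraries and replicated disk copies rather than a single directory tree, but the invariant being checked is the same.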

These efforts may benefit from government nudging, including writing data preservation rules into grants and providing funding for data management staff. The DPHEP report found that a staffing increase of 2-3 full-time employees leads to a significant improvement in the ability to implement long-term data preservation systems. This might seem like a drop in the bucket for a large, multinational effort such as CERN, but individual experiment teams might be more inclined to spend general allocations on detector systems than on practices such as data preservation that benefit the greater community.

Open Data

At present, CERN gives only a select few unaffiliated scientists access to raw data from the Large Hadron Collider, but the organization has expressed a commitment to open data access in the longer term. The CMS experiment leads current open data efforts, having approved a data preservation, re-use, and open access policy in 2012 and issued guidance on implementing the policy earlier this month. Public data releases will take place yearly, during Large Hadron Collider machine shutdowns, and data will be released three years after collection to ensure that CERN and affiliated physicists have access before it goes public. In addition, where special software is required to read the data, that software will be released as open source.
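
To make the release schedule concrete: a data set becomes eligible for publication three years after collection, and the actual release is then batched into the next yearly shutdown. A minimal sketch of that eligibility test (the three-year embargo comes from the policy above; the function itself is hypothetical):

```python
from datetime import date

EMBARGO_YEARS = 3  # per the CMS open-access policy described above

def eligible_for_release(collected: date, today: date) -> bool:
    """True once the embargo anniversary of the collection date has passed.
    Comparing (month, day) tuples avoids constructing an invalid date
    (e.g. February 29 in a non-leap year)."""
    years_elapsed = today.year - collected.year
    if years_elapsed != EMBARGO_YEARS:
        return years_elapsed > EMBARGO_YEARS
    return (today.month, today.day) >= (collected.month, collected.day)

# Data collected in mid-2012 becomes publishable in mid-2015 and would then
# go out with the next scheduled machine shutdown.
print(eligible_for_release(date(2012, 6, 1), date(2015, 7, 1)))  # True
print(eligible_for_release(date(2012, 6, 1), date(2014, 7, 1)))  # False
```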

The ATLAS experiment has also issued preliminary guidelines, although these are much less comprehensive, providing data access for unaffiliated physicists only insofar as they are involved with a particular research project.

To the extent that CERN’s physics data is produced from publicly funded experiments, the organization should strive to make it available and accessible to the public. In the future, member governments could mandate open releases; by encouraging the open availability of taxpayer-funded physics data, governments can maximize the data’s re-use value and encourage more meaningful collaboration throughout the scientific community.
