Recap of the Government Big Data Forum of 26 Jan 2011

On 26 January 2011, government IT professionals, federally focused systems integrators, and IT vendors met at the Government Big Data Forum for a dialog on shared challenges around overwhelming data.

This event, sponsored by Carahsoft, included presentations, discussions, panels and a tech expo all focused on the issue of “Big Data” in the federal enterprise.

Attendees heard from speakers including Dawn Meyerriecks, Deputy Director of National Intelligence for Acquisition and Technology; Dr. Aaron Drew of DoD’s Business Transformation Office; Kirit Amin, CIO of the Bureau of Consular Affairs at the US Department of State; and Tim Schmidt, CTO of the Department of Transportation. We also heard from some of the IT industry’s greatest technology firms, and throughout we encouraged direct dialog with participants through question-and-answer discussions.

Results are still being analyzed, and we will be posting more in the coming days at http://ctovision.com. But here are some general takeaways on the topic:

  • Participants came to the forum because we share common challenges with overwhelming data: massive volumes today, and even more coming in the future.
  • As in the commercial sector, government use of the term Big Data is not consistent among all users, and frankly we probably need a variety of approaches to the concept. Agency missions vary, and so do data requirements; approaches to massive-scale data will vary as well.
  • That said, we heard some good, probably useful descriptions of “Big Data” at this forum. One participant described Big Data as “The data that is too big for you to currently deal with.” Another called it “The data that cannot be analyzed, yet.” A third called it “Datasets that have grown so large they are awkward to deal with.”
  • There are no simple answers to overwhelming data challenges, but one thing is clear: the answer is not “less data.” The reason we have so much data is that we seek to support missions and serve citizens. We need more analysis and understanding, and we need smart data solutions. And all indications are that we will need increasing amounts of data.
  • We need more automated tools, more advanced models, more distributed analysis and processing capabilities, and a better treatment of analysis as a discipline. These are not one-time-fix needs. We need a long term focus on Big Data. One speaker called this the “decade of analysis.”
  • New approaches to Big Data missions in the federal space may include more social media, more crowdsourcing, and better-designed Big Data systems.
  • Designing for sense-making means treating the design of Big Data systems as a discipline.
  • This concept of Big Data as a discipline in the federal space was apparent throughout the day. Many participants in the forum called for a focus on Big Data as a discipline worthy of systems engineering and a career path all its own. And the community is calling out for enhanced bodies of knowledge in this domain.
  • Big Data issues are not only about sense-making. There are also issues of data movement, data storage, data deduplication, and backup and recovery (a minimal sketch of deduplication follows this list).
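
To make one of those infrastructure issues concrete, here is a minimal sketch of block-level data deduplication: split a stream into chunks, hash each chunk, and store each unique chunk only once. This is an illustrative example, not anything presented at the forum; the fixed 4 KB chunk size and the in-memory dictionary store are assumptions, and production systems typically use variable-size (content-defined) chunking and a persistent chunk index.

```python
import hashlib

CHUNK_SIZE = 4096  # assumed fixed block size; real systems often use content-defined chunking


def deduplicate(stream: bytes):
    """Split a byte stream into fixed-size chunks, storing each unique chunk once.

    Returns (store, recipe): a dict mapping chunk hash -> chunk bytes, and the
    ordered list of hashes needed to reconstruct the original stream.
    """
    store = {}
    recipe = []
    for offset in range(0, len(stream), CHUNK_SIZE):
        chunk = stream[offset:offset + CHUNK_SIZE]
        digest = hashlib.sha256(chunk).hexdigest()
        store.setdefault(digest, chunk)  # duplicate chunks are stored only once
        recipe.append(digest)
    return store, recipe


def reconstruct(store, recipe):
    """Rebuild the original byte stream from the chunk store and the recipe."""
    return b"".join(store[digest] for digest in recipe)


if __name__ == "__main__":
    # Highly redundant input: 150 chunks, only 2 of them distinct.
    data = b"A" * CHUNK_SIZE * 100 + b"B" * CHUNK_SIZE * 50
    store, recipe = deduplicate(data)
    assert reconstruct(store, recipe) == data
    print(f"original: {len(data):,} bytes; unique chunk data stored: "
          f"{sum(len(c) for c in store.values()):,} bytes")
```

The point of the sketch is the trade-off it illustrates: deduplication reduces storage and data movement for redundant data, at the cost of hashing every chunk and maintaining an index, which is exactly the kind of systems-engineering decision forum participants were describing.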

Concluding Thought: Building workable Big Data systems requires more than just disciplined plans. It requires execution. Community dialog and enhanced sharing of lessons learned and best practices can contribute to that execution.

Presentations from the event will be made available soon, and we will provide more analysis in coming posts in our new Big Data category.
