From GPS to Genomes: “Big Data” Expanding Its Reach

The future of computing is at an exciting, if ambiguous, crossroads, with some of the biggest players in the industry, Intel, AMD, and NVIDIA among them, starting to stray from the beaten path of relentlessly pursuing the time-honored tradition of Moore's law: doubling transistor density (and, in effect, halving the cost per transistor) roughly every 18 months to two years.

While huge question marks still hang over the hardware side of computing, there's little doubt that the software side is blazing down the path of Big Data, with juggernauts of enterprise computing, including Oracle, Microsoft, and IBM, racing for dominance in this emerging field. Hardly a day goes by without the phrase "Big Data" appearing somewhere in the news, which prompts the layperson's question: "What is big data, anyway?"

What is “Big Data”?

Big data is an all-encompassing term referring both to immensely large, complex data sets and to the methods being devised to analyze them. Analyzing this data yields valuable information for companies, government agencies, and service providers around the world. Put simply, big data is all about taking very large and otherwise useless chunks of raw data and squeezing valuable information out of them. Fields where big data is being employed include the indexing and rapid recall of search-engine results, predictive financial analytics, meteorology, and genome analysis.

Take GoGPS, for example: its GPS tracking solutions let logistics companies do fine-grained tracking and routing of ground-shipping fleets. The same fleet-tracking technology that was previously economically viable only in air traffic control is now at the fingertips of smaller shipping companies, making it easier to plan, track, and fulfill delivery of large shipments. By pairing that tracking with big data storage and analysis, GoGPS can also predict shipment arrival times, map optimal routes, and monitor fuel consumption and driver logging times for compliance.
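To make this concrete, here is a minimal sketch (not GoGPS's actual system; all names, coordinates, and figures are hypothetical) of the kind of arithmetic such a service might run over a stream of GPS pings: it totals the distance covered so far and projects an arrival time from the truck's average speed.

```python
import math
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Ping:
    """A single GPS report from a truck (all fields hypothetical)."""
    timestamp: datetime
    lat: float
    lon: float

def haversine_km(a: Ping, b: Ping) -> float:
    """Great-circle distance between two pings, in kilometres."""
    r = 6371.0  # mean Earth radius, km
    phi1, phi2 = math.radians(a.lat), math.radians(b.lat)
    dphi = math.radians(b.lat - a.lat)
    dlam = math.radians(b.lon - a.lon)
    h = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(h))

def estimate_arrival(pings: list[Ping], km_remaining: float) -> datetime:
    """Project an arrival time from the average speed over the pings so far."""
    total_km = sum(haversine_km(p, q) for p, q in zip(pings, pings[1:]))
    elapsed_h = (pings[-1].timestamp - pings[0].timestamp).total_seconds() / 3600
    avg_kmh = total_km / elapsed_h
    return pings[-1].timestamp + timedelta(hours=km_remaining / avg_kmh)

# Example: three hourly pings along a route, with 120 km still to go.
route = [
    Ping(datetime(2016, 5, 1, 8, 0), 40.7128, -74.0060),
    Ping(datetime(2016, 5, 1, 9, 0), 40.9000, -74.5000),
    Ping(datetime(2016, 5, 1, 10, 0), 41.1000, -75.0000),
]
print("ETA:", estimate_arrival(route, km_remaining=120.0))
```

A real fleet tracker would fold in road networks, traffic, and historical patterns; the point here is only that each individual ping is a tiny, useless datum until it is aggregated.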

Data storage and analysis are hardly new, though; the groundwork was laid well before the dawn of the big data era. Database provider Oracle, for example, was founded in 1977. So what changed in the run-up to the current popularization of big data?

How Big Data became as “Big” as it is

Although information was being stored, analyzed, and recalled long before the era of big data, the data was nowhere near as intricate or exhaustive as it is today. Three things changed to bring about the era of big data:

Storing information became cheaper:

According to a research paper by data scientists Martin Hilbert and Priscila López, published in Science in 2011, 2002 marked the true coming of the digital age: it was the year in which digital media finally surpassed analog media in worldwide storage capacity. Since then, the cost of storing information digitally has only fallen, making it viable for companies never to throw out data or otherwise reallocate storage space. In other words, virtually all information that is created can now be permanently stored, ushering in a whole new age of analytics over historical data that simply wasn't possible before.

Means of acquiring information became richer:

With the rise of the smartphone in the 2000s, humanity gained an unprecedentedly cheap means of gathering information: cameras, microphones, radio-frequency identifiers, and wireless network sensors. This newfound wealth of sensing, together with loosely defined privacy legislation, lets companies gather user information with a high level of precision. From the websites they browse to the places they've been, users inadvertently share all of this every time they post to a social network or even unlock their smartphones.

Curation of digital content became more commonplace:

Before the era of big data, only those with technical know-how could curate and publish information on the internet; the bare minimum requirement was an understanding of HTML. With the coming of Web 2.0 this all changed, as users with no coding knowledge were now able to create and post content on social media. That content, together with the wealth of metadata appended to it, such as where a picture was taken or the model of the device that captured it, opened the door to a whole new era of data analysis.
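To illustrate how much analyzable metadata rides along with a single photo, the following sketch uses the Pillow imaging library to pull EXIF tags, camera make and model, timestamp, and (when the device recorded them) GPS coordinates, out of an image file. The file name is hypothetical.

```python
from PIL import Image, ExifTags  # Pillow: pip install Pillow

def photo_metadata(path: str) -> dict:
    """Return a photo's EXIF tags keyed by their standard names."""
    exif = Image.open(path).getexif()
    meta = {ExifTags.TAGS.get(tag_id, tag_id): value
            for tag_id, value in exif.items()}
    # GPS data lives in its own sub-IFD (tag 0x8825); empty if not recorded.
    gps_ifd = exif.get_ifd(0x8825)
    meta["GPS"] = {ExifTags.GPSTAGS.get(tag_id, tag_id): value
                   for tag_id, value in gps_ifd.items()}
    return meta

meta = photo_metadata("vacation.jpg")  # hypothetical file
print(meta.get("Make"), meta.get("Model"), meta.get("DateTime"), meta["GPS"])
```

Every field in that dictionary is something the uploader rarely intends to share, which is precisely what made Web 2.0 content such a rich vein for analysts.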

In Closing: Where Big Data is headed

Experts are divided on where big data is headed, since it is still such an emergent field. The term itself remains loosely defined, and even more loosely regulated, encompassing nearly any form of manipulation of large data sets.

The result is an industry-wide dissonance in which techniques are constantly rediscovered and reinvented for lack of communication. Kudos to sites like SmartDataCollective and Datafloq, which strive to change that by bringing data scientists together with industry stakeholders, matchmaking the two into unlikely partnerships that yield genuinely innovative solutions.
