“Weather is the original big data application,” Bryson Koehler, the CIO of The Weather Company, told the source late last year. “When mainframes first came about, one of the first applications was a weather forecasting model.”
Now, The Weather Company’s Storage Utility Network, or SUN, pulls in about 20 terabytes of data per day. An updated report on the project from the news source revealed that SUN collects as many as 2.25 billion data points from around the world every three minutes. Yet questions remain unanswered and issues unresolved.
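To put those figures in perspective, a quick back-of-envelope calculation shows what they imply about sustained throughput. This is an illustrative sketch based only on the numbers above, assuming decimal units (1 TB = 10^12 bytes) and a uniform ingest rate; none of the derived figures come from The Weather Company itself.

```python
# Back-of-envelope throughput implied by the reported figures.
# Assumptions: 1 TB = 10**12 bytes, uniform ingest across the day.

TB = 10**12
SECONDS_PER_DAY = 24 * 60 * 60

daily_bytes = 20 * TB                       # ~20 TB ingested per day
records_per_interval = 2.25e9               # data points per 3-minute window
intervals_per_day = SECONDS_PER_DAY // 180  # 480 three-minute windows a day

records_per_day = records_per_interval * intervals_per_day
avg_bytes_per_sec = daily_bytes / SECONDS_PER_DAY
avg_bytes_per_record = daily_bytes / records_per_day

print(f"{avg_bytes_per_sec / 1e6:.0f} MB/s sustained ingest")  # ~231 MB/s
print(f"{records_per_day:.2e} records per day")                # ~1.08e+12
print(f"{avg_bytes_per_record:.0f} bytes per record")          # ~19 bytes
```

If the two reported numbers are taken at face value, the system averages hundreds of megabytes per second and over a trillion records a day, with each data point averaging only a few bytes, which helps explain why tuning the analysis pipeline remains an open problem.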
Deciding how to handle big data
Adding 20 terabytes of information to a cloud platform every day means that data analysis tools are crucial for sorting and interpreting it all. The Weather Company still needs a strategy for updating its generated predictions and aligning its analysis tools to best evaluate all the various data points.
“We’re still on the learning curve on how to best tune the system, how we monitor and how we respond when things go wrong,” Koehler explained to the source in a follow-up interview.
Merging the cloud and big data
SiliconANGLE reported that finding ways to properly link cloud services and analytics is an industry-wide effort. The main advantage of cloud services is that data can be uploaded from virtually anywhere to one central storage area. Many companies are developing software and processes to enhance data analysis in the cloud, making it easier for businesses to get real-time feedback, whether they are examining weather patterns or determining how to market new products to their customer base. The overarching message is clear, however: big data and the cloud are a perfect match. It is only a matter of time until the kinks are ironed out and real-time analytics can be pulled directly from a cloud hosting service.