Three Ways Big Data Is Revamping Manufacturing Processes


From eCommerce and healthcare to fintech and sports, the modern world runs on big data. The stats echo this idea: by 2020, the global big data market is estimated to exceed the $57 billion mark.

The manufacturing industry, with its competitive pressures and quest for higher productivity, is also ready to join the big data boom. But can this technology deliver on the factory floor? To find out, let’s delve into the major benefits of big data in manufacturing.

Betting on big data to reduce downtime

Downtime is an absolute nightmare for every industrial sector given the losses such stoppages cause. Manufacturers deal with an average of 800 hours of downtime annually, which translates into productivity losses of 5 to 20 percent.

However, there’s no time for glass-half-empty moods. General Electric (GE), for example, shared its best practices at the ‘Minds and Machines Europe’ event in London. Jeff Immelt, GE’s former CEO, revealed how a combination of technologies is fostering change for the company across different sectors, such as healthcare, energy, and aviation.

According to Immelt, big data analytics coupled with materials science and “intelligent machines” equipped with sensor technology can harness the power of industrial data in real time, bringing substantial benefits.

As a result, the corporation managed to automate its manufacturing processes, optimize performance, and eliminate downtime by predicting when a machine or a certain component will fail. And $45 billion in yearly revenue is tangible proof of this success.
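To make the idea of predicting failures concrete, here is a minimal sketch of the kind of rule such a system might start from: watch a rolling average of a sensor reading and flag the machine before it breaks. The readings, the threshold, and the window size below are invented for illustration; this is not GE’s actual model.

```python
# Hypothetical vibration readings (mm/s) from one machine, sampled hourly.
readings = [2.1, 2.2, 2.0, 2.3, 2.6, 3.0, 3.5, 4.1]

ALERT_THRESHOLD = 3.0  # assumed level above which wear is considered likely

def needs_maintenance(samples, window=3, threshold=ALERT_THRESHOLD):
    """Flag the machine when the rolling mean of the last `window`
    readings crosses the alert threshold."""
    recent = samples[-window:]
    return sum(recent) / len(recent) > threshold

print(needs_maintenance(readings))  # True: the trend has crossed 3.0 mm/s
```

Real predictive-maintenance systems replace this fixed threshold with models trained on historical failure data, but the workflow is the same: monitor, score, and schedule the repair before the stoppage happens.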

Here’s how remote monitoring and early diagnosis of problems work in aviation: a set of sensors connected to gas engines captures data every 30 seconds.

Then Apache Hadoop comes into play. Its fault-tolerant, replicated distributed file system (HDFS) splits the collected data into manageable blocks and distributes them across thousands of nodes, preparing the ground for extremely fast MapReduce-based parallel computation.

Such large-scale data processing effectively deals with the three Vs of big data (volume, velocity, and variety) and helps GE correct possible manufacturing flaws. Jeff Immelt claims that improving gas engine performance by as little as one percent a year via Hadoop-enabled analytics saves customers $2 billion.
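The MapReduce pattern behind this is easy to show in miniature. The sketch below runs the two phases in plain Python on a handful of made-up engine readings, computing a per-engine average temperature; on a real cluster, Hadoop would run the same mapper and reducer logic in parallel across nodes. The engine IDs and values are illustrative only.

```python
from collections import defaultdict

# Toy sensor records: (engine_id, exhaust_temperature) pairs,
# standing in for readings captured every 30 seconds.
readings = [
    ("engine-1", 612.0), ("engine-2", 598.5),
    ("engine-1", 640.2), ("engine-2", 601.1),
    ("engine-1", 655.9),
]

def map_phase(records):
    """Emit (key, value) pairs, as a Hadoop mapper would."""
    for engine_id, temp in records:
        yield engine_id, temp

def reduce_phase(pairs):
    """Group values by key and aggregate them, as a reducer would."""
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return {key: sum(vals) / len(vals) for key, vals in grouped.items()}

averages = reduce_phase(map_phase(readings))
print(averages)  # per-engine average temperature
```

The point of the pattern is that both phases are embarrassingly parallel: mappers never need to see each other’s records, and each reducer only needs the values for its own key, which is what lets Hadoop scale the same logic to thousands of nodes.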

Mitigating supply chain risks in the big data age

The supply chain is fraught with uncertainties. Reducing the possible risks and building good relationships with retailers and customers alike, again, calls for data analytics. In the supply chain, big data applications revolve around three main areas: traceability, procurement, and warehousing.

For example, IoT-generated data, mined for meaningful insights, allows manufacturers to track goods and defuse unfavorable situations. This may include making on-the-spot decisions about optimal truck routes in response to a food shortage, or identifying instances of food contamination in a split second.

According to the Chartered Institute of Procurement and Supply, natural disasters and extreme weather conditions are the top causes of supply chain disruption. To keep these scenarios from interrupting business, companies can analyze historical weather data on tornadoes, earthquakes, hurricanes, and the like, and use predictive analytics to estimate the probability of delays.
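The simplest form of such an estimate is an empirical conditional probability: how often were shipments delayed when severe weather was in play, versus when it wasn’t? The records below are invented for illustration; a production system would draw on years of shipment and weather history and a proper statistical model.

```python
# Hypothetical historical shipment records: (severe_weather, was_delayed).
history = [
    (True, True), (True, True), (True, False),
    (False, False), (False, True), (False, False),
    (False, False), (True, True),
]

def delay_probability(records, severe):
    """Empirical P(delayed | severe_weather == severe)."""
    matching = [delayed for weather, delayed in records if weather == severe]
    return sum(matching) / len(matching)

print(delay_probability(history, severe=True))   # 0.75
print(delay_probability(history, severe=False))  # 0.25
```

Even this crude estimate is actionable: if severe weather triples the delay risk on a route, a planner can pre-position inventory or book alternative carriers before the forecast materializes.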

Moreover, by mining historical and real-time data from external and online sources, such as financial analyst recommendations and media reviews, manufacturers can spot future trends and buy valuable time for contingency measures in case of a financial crisis. Other applications of big data include maintaining optimal inventory levels and improving procurement decisions.

Enhancing product quality with big data

Quality control (QC) is another area where big data shows its worth. Case in point: the multinational giant Intel has been using predictive analytics since 2012 to accelerate the production of its chips while increasing product quality.

By scrutinizing historical data gathered during manufacturing, Intel significantly reduces the number of tests every chip should go through. “Instead of running every single chip through 19,000 tests, we can focus tests on specific chips to cut down test time,” says Ron Kasabian, general manager for Intel’s data center group. “And as we’re ramping up new chips, we uncover lots of bugs and fix them.”

Moreover, big data assists Intel in testing equipment. By capturing and analyzing sensor-generated information, the corporation detects failures in its manufacturing line early on and takes preventive measures. This data-driven method has become a key enabler of enhanced QC as well as a strategic cost-cutter: in 2012 alone, it saved the corporation $3 million in manufacturing costs.
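One way to picture the test-reduction idea Kasabian describes is to rank tests by how often they have historically caught defects and skip the ones that almost never fail. The failure rates, test names, and cutoff below are hypothetical; Intel’s actual selection models are far more sophisticated, but the principle is the same.

```python
# Hypothetical historical failure rates for one chip family:
# test_id -> fraction of chips that historically failed that test.
failure_rates = {
    "voltage_sweep": 0.021,
    "thermal_cycle": 0.0004,
    "cache_stress": 0.013,
    "io_timing": 0.00008,
}

def prioritize_tests(rates, threshold=0.001):
    """Keep only tests likely to catch defects; skip near-never-failing ones."""
    return sorted(t for t, rate in rates.items() if rate >= threshold)

print(prioritize_tests(failure_rates))  # ['cache_stress', 'voltage_sweep']
```

Halving the test list this way directly cuts per-chip test time, which is exactly the lever the quote points at: spend the testing budget where defects actually show up.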

There’s more to the story

We’ve lifted the veil on just three applications of big data in manufacturing. Companies offering big data consulting services can certainly extend this list, depending on your business model and goals.

If you haven’t capitalized on data yet, consider the potential benefits: eliminating downtime, improving supply chain management, accelerating production and innovation, delivering better service, increasing customer satisfaction, balancing costs, and more.
