Analyzing Logs and More – A Big Data Architecture

sdesikan

Big data and log files

Splunk’s great success in providing the tools for a sysadmin to delve into previously inaccessible log files has opened up the market for deeper analysis on data in log files.

When processing log files, understanding their syntax is critical. How they are parsed and processed varies with the volume and the type of file, so the structure of each log format has to be understood before any analysis can happen.
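
To make that concrete, here is a minimal sketch in Scala (chosen only because a Scala DSL comes up again below). The access-log line and the regular expression are invented for illustration, but they show how the format of a log line has to be encoded somewhere before any field can be extracted.

```scala
object AccessLogDemo extends App {

  // A made-up web-server access-log line; the regex below encodes its expected syntax.
  val accessLine =
    """10.0.0.7 - - [21/Apr/2013:10:15:32 +0000] "GET /api/status HTTP/1.1" 200 512"""

  // Capture groups: client IP, timestamp, HTTP method, path, status code, response bytes.
  val accessPattern =
    """(\S+) \S+ \S+ \[([^\]]+)\] "(\S+) (\S+) [^"]*" (\d{3}) (\d+)""".r

  accessLine match {
    case accessPattern(ip, ts, method, path, status, bytes) =>
      println(s"$ip $method $path at $ts -> HTTP $status, $bytes bytes")
    case _ =>
      println("line does not match the expected access-log format")
  }
}
```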

Volume and Variety in log files

What is commonly referred to as “log files” is a set of machine- or user-logged text data containing information about the behavior of an application or a device. These log files come in various formats and hold the truth about how a product or piece of software is actually being used. Traditional uses of log file mining have been around weblogs and syslogs, but over the last few years game companies like Zynga and Disney (where I was responsible for analytics infrastructure) have perfected the art of logging usage data at regular intervals and mining it to understand customer usage patterns, performance issues, and longer-term trends in product usage, such as which features work and which don’t.

The infrastructure and tools are now available to capture logs from any device or software: think storage arrays or servers in data centers, medical devices, and sensors; they all produce data. Increasingly, the people who produce and support these products have to deal not only with the volume of data that comes out of these devices and software components, but also with the variety of information needed for customer usage analysis, problem diagnosis, installed-base analytics, and so on.

Traditional log file vendors like Splunk focus on problem resolution, which in Splunk’s case means providing a platform to index all log data and surface patterns through a very intuitive UI. This search-and-index based solution may not be suitable for complex log bundles and other non-time-series semi-structured data, or for use cases that require longer-term trend analysis and reporting.

A new reference architecture for mainstream log file analysis

Another approach, more relevant to product or app owners, is to use a stack that not only collects and indexes the log files but also goes a step further and derives structure from unstructured logs and log bundles for enterprise business intelligence. The reference architecture for accomplishing this is as follows:

  1. Applying context to data: A language for defining the semantics of the data is the first step in preparing to analyze the data sets in the log files. Using this language, one can delineate different sections within a log file, or across multiple files, and describe how they relate to each other. The language can also define and tag various elements of the log file in a repeatable, scalable, and flexible manner, to accommodate changing log file formats and new sections or attributes introduced with new versions. A DSL (Domain Specific Language) is one way to go about doing this (a Scala DSL, for example); a minimal sketch appears after this list.
  2. Collecting and routing data: Data comes from various sources and over various transports, and it needs to be routed and processed centrally. See Apache Camel as an example of a tool that can handle this. Whether you build it yourself or build on top of Camel, this is a key step that today tends to get buried in pre-processing, as a myriad of custom scripts that are difficult to maintain.
  3. Scalable backend: Data needs to be collected and stored in a NoSQL, column-family-based data store such as Apache HBase or Cassandra, which is used to create structure from the raw data. The advantage with Cassandra is that you can also use it to store the raw logs as a blob and integrate with Lucene/Solr for search on the files; DataStax, for example, offers the Solr/Cassandra combination as Solandra. A Cassandra-like NoSQL data store provides the flexibility to create schemas on the fly and to gather both structured and unstructured data in one place.
  4. Rules and Alerts: A way to define rules that identify common patterns when problems occur, so that an automatic alert can be triggered when a similar pattern of issues is seen with a different customer. Some of these rules can operate on structured data within a log, such as ‘count the number of errors in a section where the line contains the string “Error in device format”’. Often, however, the rules are more complex, allowing lookups on data across multiple sections in a file, and even across multiple files, and combining simple lookups with regular expression searches. Providing a tool to define these rules centrally allows you to manage and extend your knowledge base over time; a small sketch of such a rule also follows this list.
  5. Reporting and Analysis: A middleware layer plus a set of apps and reporting infrastructure for predefined queries that satisfy common business cases, for example installed-base analytics, performance analysis, product usage analysis, and capacity management. You can develop the middleware/app using a framework like Play. Expose the common business queries as web services so you can plug common BI tools like Tableau into the data.
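
To make step 1 concrete, the following is a minimal, hypothetical sketch of such a language, written as a small internal Scala DSL. The section markers, field names, and log format are all invented for illustration; this is not an existing library, just one way the idea could look.

```scala
import scala.util.matching.Regex

object LogDslSketch extends App {

  // Building blocks of the hypothetical definition language.
  case class Field(name: String, pattern: Regex)
  case class Section(name: String, marker: String, fields: Seq[Field])

  // A definition for one invented device-log format. A new section or attribute in a
  // later product version is handled by extending this definition, not by rewriting the parser.
  val deviceLog = Seq(
    Section("header", marker = "=== DEVICE INFO ===", fields = Seq(
      Field("serial",   """serial=(\S+)""".r),
      Field("firmware", """fw=([\d.]+)""".r))),
    Section("errors", marker = "=== ERRORS ===", fields = Seq(
      Field("message", """ERROR\s+(.+)""".r)))
  )

  // One generic parser walks the file, switches sections at the markers, and emits
  // (section, field, value) triples that the later steps can store and query.
  def parse(lines: Iterator[String]): Seq[(String, String, String)] = {
    var current: Option[Section] = None
    val out = Seq.newBuilder[(String, String, String)]
    for (line <- lines) {
      deviceLog.find(s => line.startsWith(s.marker)) match {
        case Some(section) => current = Some(section)
        case None =>
          for (s <- current; f <- s.fields; m <- f.pattern.findFirstMatchIn(line))
            out += ((s.name, f.name, m.group(1)))
      }
    }
    out.result()
  }

  val sample =
    """=== DEVICE INFO ===
      |serial=SN-42 fw=2.1.7
      |=== ERRORS ===
      |ERROR Error in device format on LUN 3""".stripMargin.linesIterator

  parse(sample).foreach(println) // (header,serial,SN-42), (header,firmware,2.1.7), ...
}
```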

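Step 4’s example rule, counting errors that contain “Error in device format”, could then be expressed against the same (section, field, value) triples. The rule structure, names, and threshold below are illustrative rather than any specific product’s syntax.

```scala
import scala.util.matching.Regex

object RuleSketch extends App {

  // A centrally defined rule: where to look, what to match, and when to alert.
  case class Rule(name: String, section: String, pattern: Regex, threshold: Int)

  val rules = Seq(
    Rule("device-format-errors", section = "errors",
      pattern = """Error in device format""".r, threshold = 1)
  )

  // `records` would be the (section, field, value) output of the parsing step sketched above.
  def evaluate(records: Seq[(String, String, String)]): Seq[String] =
    for {
      rule <- rules
      hits = records.count { case (section, _, value) =>
        section == rule.section && rule.pattern.findFirstIn(value).isDefined
      }
      if hits >= rule.threshold
    } yield s"ALERT ${rule.name}: $hits matching line(s), threshold ${rule.threshold}"

  val records = Seq(
    ("header", "serial",  "SN-42"),
    ("errors", "message", "Error in device format on LUN 3")
  )
  evaluate(records).foreach(println)
}
```

Keeping rules as data rather than as one-off scripts is what makes central management and gradual extension of the rule set practical.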
Conclusion

Parsing and processing log files tactically is still a grep/awk/sed or scripting exercise done by a lone super ranger in an IT department or elsewhere. But with the growing strategic value of the data in log files, major product and software vendors are looking to put together a robust technology stack to leverage this information across the enterprise. If done right, this becomes a very powerful and unique “Big Data” app, providing meaningful insights across the enterprise, from product support to engineering and marketing, and delivering both operational and business intelligence from machine logs.
