SmartData Collective

Text Analytics for Legacy BI Analysis

Doug Lautzenheiser
8 Min Read
A stumbling block for businesses trying to modernize legacy computer applications is the sheer volume of program files. An IT organization may own thousands of code libraries, each with thousands of programs. Often, the whereabouts of the original developers of old applications are unknown.
I have found this to be especially true for legacy end-user 4GL reporting tools such as FOCUS. 
A computer language developed by Information Builders in the mid-1970s, FOCUS became the industry standard as a multi-platform report writer for business end-user communities. With FOCUS, rather than ask the busy IT organization to develop reports, users could build their own.
But FOCUS was more than a report writer: it was in reality a full application development environment, originally designed to replace COBOL. Many enterprising users took advantage of robust features such as online screens, database maintenance, and batch processing to build very sophisticated systems.
A decade or two later, many IT shops now struggle to grasp what their FOCUS users developed. Trying to assess the purpose, functionality, usage, and complexity of these legacy applications by manually examining each program is nearly impossible.
To assist with this type of time-consuming detective work, Partner Intelligence developed text analytics software called the “BI Consolidator.” Written in C/C++ with a web-browser graphical user interface, the application has two main features: 1) automated textual discovery; and 2) automated translation into a new BI product.
For now, let us consider only the automated textual discovery feature, called the “scanner.”
Text Scanning
Computer programs are not completely unstructured like an e-mail message or the prose found inside a Word document. Instead, almost all computer programs follow a particular formal syntax which forces them to be at least semi-structured text. This simplifies textual analytics since we know what to expect (for the most part, anyway, as there can be user syntax errors and a fair share of junk). 
Our textual analytic scanner is smart enough to figure out the code dialect, but we provide it with some starting instructions. For example, we can tell the application to perform a very specific scan such as looking for FOCUS-to-Web FOCUS conversion issues, FOCUS metadata to find data formats, SAS statistical features, JCL batch job features, HTML legacy CGI calls, Crystal Reports features, or to parse SQL commands. 
When in a curious mood, we can perform custom ad-hoc textual searches.
While the results pulled from the text can simply be displayed on a screen, it is more useful to save them to a database and analyze the answer set later.
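As a rough illustration of this workflow (not the BI Consolidator's actual C/C++ implementation), a keyword scan that writes its hits to a database might look like the Python sketch below; the keyword list, file name, and table schema are all hypothetical:

```python
import re
import sqlite3

# Hypothetical keyword list for a FOCUS green-screen scan.
KEYWORDS = ["MAINTAIN", "-CRTFORM", "-PROMPT", "PFKEY"]

def scan_text(text, keywords):
    """Count stand-alone occurrences of each keyword in one program's source."""
    counts = {}
    upper = text.upper()
    for kw in keywords:
        # Keyword must not be embedded inside a longer token.
        pattern = r"(?<![A-Z0-9])" + re.escape(kw) + r"(?![A-Z0-9])"
        counts[kw] = len(re.findall(pattern, upper))
    return counts

def save_results(db, program, counts):
    """Store non-zero hits so the answer set can be queried later."""
    db.execute("CREATE TABLE IF NOT EXISTS scan_hits "
               "(program TEXT, keyword TEXT, hits INTEGER)")
    db.executemany("INSERT INTO scan_hits VALUES (?, ?, ?)",
                   [(program, kw, n) for kw, n in counts.items() if n])
    db.commit()

db = sqlite3.connect(":memory:")
source = "MAINTAIN FILE EMP\n-CRTFORM\n-PROMPT &NAME\n"
save_results(db, "EMPMAINT.FEX", scan_text(source, KEYWORDS))
rows = db.execute("SELECT keyword, hits FROM scan_hits").fetchall()
```

A real scanner would loop this over every program in every library, but the shape — count keywords, persist hits, query later — is the same.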
Online GUI and Batch Text Scanning
We started with a GUI front-end, but when working with a large number of libraries it quickly becomes tedious to repeatedly point, click, and run. As a result, we modified the Scan program so it can alternatively be run from the command line using a batch script.

Not only is batch mode easier to use, the scanner also runs much faster, since we eliminate generating HTML to display results in the browser. On our current engagement, we scan over 200 mainframe libraries containing over 80,000 programs within 15 minutes.

Keyword Frequencies
For many of the scans, the software performs keyword frequency counts. For example, to evaluate conversion issues related to green-screen application development, the scanner searches the text for a variety of FOCUS keywords whose presence or absence would be significant:
  • MAINTAIN
  • -WINDOW, -CRTFORM, -PROMPT, -FULLSCR
  • CRTFORM, FIDEL, FI3270 (used within MODIFY)
  • PFKEY, SET PF
To help with the accuracy of the scanning, we can apply a variety of criteria on searches such as:
  • Perform case-sensitive search (or uppercase all text first)
  • Perform stand-alone search (or allow the token to be embedded within a string)
  • Ignore blanks between search tokens (since developers often format code using spaces between words)
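These criteria map naturally onto regular-expression options. The real scanner is written in C/C++, so the Python builder below is only an illustrative sketch of how each criterion might be expressed:

```python
import re

def build_pattern(token, case_sensitive=False, stand_alone=True,
                  ignore_blanks=False):
    """Turn a search token into a compiled regex under the given criteria."""
    if ignore_blanks:
        # "SET PF" should also match "SET  PF": allow any run of
        # whitespace between the token's words.
        body = r"\s+".join(re.escape(p) for p in token.split())
    else:
        body = re.escape(token)
    if stand_alone:
        # Token must not be embedded inside a longer identifier.
        body = r"(?<!\w)" + body + r"(?!\w)"
    flags = 0 if case_sensitive else re.IGNORECASE
    return re.compile(body, flags)

# "SET   PF" matches despite extra blanks; "TABLEF" does not match "TABLE".
hit = build_pattern("SET PF", ignore_blanks=True).search("SET   PF RETURN")
miss = build_pattern("TABLE").search("TABLEF")
```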

Pattern Recognition
With the results of the keyword searches, we can group specific keywords together to help identify a pattern of usage within the application. For example:
  • Reporting App = high number of TABLE (report) requests but few MODIFYs (database updates)
  • Online Reporting App = Reporting App with a high number of -CRTFORMs (menu screens) or -PROMPTs
  • Online Maintenance App = MODIFYs, CRTFORMs (transactional screens), and PFKEY usage
  • Batch Maintenance App = MODIFYs with FIXFORM/FREEFORM (transactions) instead of screens
  • Multi-Step Batch Job = JCL with various FOCUS and non-FOCUS steps (perhaps a difficult platform port)
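The grouping rules above can be sketched as a small classifier over the keyword counts. The article gives no exact cutoffs, so the rules and thresholds here are illustrative only:

```python
def classify(counts):
    """Classify a program's usage pattern from keyword frequency counts.
    Rules and thresholds are illustrative, not the tool's own."""
    table = counts.get("TABLE", 0)
    modify = counts.get("MODIFY", 0)
    menus = counts.get("-CRTFORM", 0) + counts.get("-PROMPT", 0)
    screens = counts.get("CRTFORM", 0) + counts.get("PFKEY", 0)
    if modify > 0:
        # Database updates present: maintenance app, online if it has screens.
        return "Online Maintenance App" if screens else "Batch Maintenance App"
    if table > 0:
        # Report requests only: reporting app, online if it has menus.
        return "Online Reporting App" if menus else "Reporting App"
    return "Unclassified"

print(classify({"TABLE": 40, "-CRTFORM": 6}))  # Online Reporting App
print(classify({"MODIFY": 3, "PFKEY": 2}))     # Online Maintenance App
```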
Textual Parsing
For some textual analytics, we actually need to parse the semi-structured code and pull out more than just keywords. For example, we often find SQL (structured query language) embedded within reporting applications. Being structured, SQL follows a strict syntax of blocks of code in a specific order: SELECT; FROM; WHERE; GROUP BY; HAVING; ORDER BY.
This makes it possible to parse the syntax and extract which databases, tables, and columns are being used in the application. We can also distinguish between the columns shown on the report and those used in the selection criteria or for sorting and aggregation.
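The fixed block order is what makes a simple extraction possible. As a sketch of the idea (assuming well-formed, single-block queries; real embedded SQL would need a proper parser):

```python
import re

def parse_sql(sql):
    """Pull table names and selected columns from a simple SELECT block.
    Handles only well-formed single SELECTs; no subqueries or joins."""
    sql = " ".join(sql.split())  # collapse whitespace and line breaks
    m = re.search(
        r"SELECT\s+(.*?)\s+FROM\s+(.*?)(?:\s+WHERE\s|\s+GROUP\s|\s+ORDER\s|$)",
        sql, re.IGNORECASE)
    if not m:
        return None
    columns = [c.strip() for c in m.group(1).split(",")]
    # Drop table aliases ("PAYROLL P" -> "PAYROLL").
    tables = [t.strip().split()[0] for t in m.group(2).split(",")]
    return {"tables": tables, "columns": columns}

info = parse_sql("""
    SELECT DEPT, SUM(SALARY)
    FROM PAYROLL P
    WHERE REGION = 'EAST'
    GROUP BY DEPT
""")
```

Run over thousands of programs, an extraction like this yields the table- and column-usage inventory the next section builds on.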
Standard Content Analysis
With these textual contents extracted and stored inside a database, we can then perform standard reporting as well as custom queries. For example, one well-known client used the scan results to perform a redundancy analysis of their Business Objects environment, to evaluate replacing it with a new web-based solution.
The business sponsor was completely against a one-to-one conversion of these legacy reports. Instead, from the scanned contents of thousands of reports and SQL files, the client was able to identify commonalities and reporting redundancies which enabled them to categorize their BI needs into a dozen buckets. From there, they built a roadmap for replacing their legacy reporting environment with a collection of highly dynamic reporting solutions.
In addition to analytics, we have standard reports that help with the operational aspects of a modernization initiative, such as parallel test plans.
Building a Textual Analytics Engine
When companies need to modernize an application, they often view it as a one-time activity. With this mindset, they might not invest the time and money to build this type of textual analytics scanner and translator. Because we work with a variety of clients with this common need, it made sense for Partner Intelligence to create such a textual scanning tool as the BI Consolidator.
If you are interested in learning more, I would be happy to discuss the details of our software with you. Contact me at my DLautzenheiser at PartnerPS dot com address.