Why don’t they [IT] ever learn?


I was chatting with friends last weekend about work, and the conversation turned to how hard IT still seems to make things for users.  As an ex-CIO who started out as an analyst/programmer in the mainframe days, I appreciate the need for controls (security, data quality and so on), but we have come such a long way from the white-coat “lab rat” days, and we have developed so much enabling technology that should make life easy for both the business and IT. 

 

The conversation started with my friend, a business analyst who works in a government agency, complaining about how hard it was not only to get access to the data in their data warehouse (which is why it was of interest to me) but also to make corrections to information published on the departmental web site.  Here’s the saga:

 

ISSUE 1: SOURCING THE RIGHT DATA

The source data is provided to the department by external agencies to meet reporting requirements (presumably as extracts from their operational systems).  This data is loaded and transformed into the data warehouse by IT (so far so good), and while the business users are consulted on the transformation rules and on what data they need for their analysis and publications, that consultation unfortunately isn’t always effective.  Getting this step right is critical to end users’ ongoing trust in the data warehouse and its reliability.

 

Interestingly, having only just completed a Data Warehouse Maturity Assessment (or health check) for a customer, I heard the same observation from their business: when an application is being developed and implemented, no one [in the project team] asks the business what their future analytic needs are.  One executive identified it as something they should perhaps start doing for all projects.

 

ISSUE 2: DATA LINEAGE

OK – so if the data isn’t quite what it should be, then at least it can be corrected! Right? WRONG – the source data isn’t even retained!  [My jaw dropped on this one in disbelief.]  Can you imagine pleading with the external agencies for another copy just so you (the business analyst) can have the correct set of data supporting a public web site?  With disk space so cheap, and the business needing to be able to justify what it publishes, I couldn’t understand why an original copy of the data isn’t retained.  What auditor let that process omission through to the keeper?
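
Retaining the originals needn’t be elaborate, either. Here is a minimal sketch, in Python, of the kind of step that could sit in front of the load; the archive location and file names are purely illustrative, not anything from the department’s actual process.

```python
# A minimal sketch of retaining raw source extracts before they are
# transformed and loaded. Paths and file names are illustrative only.
import shutil
from datetime import datetime, timezone
from pathlib import Path

ARCHIVE_DIR = Path("/warehouse/landing/archive")   # hypothetical location


def archive_source_file(source_file: str) -> Path:
    """Copy an incoming extract into a timestamped archive folder so the
    original data can always be reloaded or audited later."""
    received = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
    src = Path(source_file)
    dest_dir = ARCHIVE_DIR / received
    dest_dir.mkdir(parents=True, exist_ok=True)
    dest = dest_dir / src.name
    shutil.copy2(src, dest)        # copy2 preserves file metadata
    return dest


# e.g. archive_source_file("agency_extract_q1.csv")
```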

 

ISSUE 3: DATA FRESHNESS

Anyway, after much begging, trial and tribulation, and a number of weeks, my business analyst friend had the tables with the correct data ready for uploading onto the web site.  Their unit is responsible for the published data, but getting the data published is IT’s responsibility, and IT then has to deal with yet another area of the department, creating further delays.  Bear in mind, this is a public web site carrying inaccurate data.  Instead of providing tools so the business owner can be self-serving and upload the data themselves, IT even changed the delivery format from Excel to a standard CSV file.  That seems reasonable on the surface, but since the original file was loaded from Excel, which can have multiple tabs, it seemed just as reasonable that the reload could use the original format; converting from multiple tabs to a single flat file is very time consuming and has the potential to introduce inaccurate data.  Just another roadblock and delay, demanded, it seems, “just because” they can.
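
And if the CSV format really must be imposed, the conversion at least shouldn’t fall on the analyst to do by hand. A rough sketch of how a multi-tab workbook could be split automatically; it assumes the pandas library is available, and the file names are made up for illustration.

```python
# A minimal sketch of splitting a multi-tab Excel workbook into one CSV
# per worksheet, so no one has to re-key or copy-paste data by hand.
# Assumes pandas (with an Excel reader such as openpyxl) is installed.
import pandas as pd


def workbook_to_csvs(xlsx_path: str) -> list[str]:
    """Write each worksheet of the workbook to '<sheet name>.csv'."""
    # sheet_name=None loads every tab as a dict of DataFrames keyed by name
    sheets = pd.read_excel(xlsx_path, sheet_name=None)
    written = []
    for name, frame in sheets.items():
        out_file = f"{name}.csv"
        frame.to_csv(out_file, index=False)
        written.append(out_file)
    return written


# e.g. workbook_to_csvs("published_tables.xlsx")
```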

 

ISSUE 4: IT PROCESS

At this point another friend chimed in: “I know what you mean.”  He has a small business selling a software product that is usually loaded into existing client databases and often integrated with other client systems. He said that if the business user (the buyer), on being told of the database/server/network requirements, comments “no problem, we get on well with our IT”, then the installation is usually a smooth one. If the client makes negative comments about delays in dealing with their IT, it is a warning that just getting the system loaded and up and running (a straightforward half-day job) will inevitably be delayed and become a major bottleneck for the business.

 

ISSUE 5: ACCESS TO DATA

By now my analyst friend was on a roll – still more challenges.  Apparently another group was enquiring about what they needed to do to get access to their data and were told by IT, “we can put it into cubes for you”, to which my friend replied, “good luck – don’t expect anything for about six months”, mainly because the cubes are still in development and not ready for user access.  This last response is so common it’s sad.  A common bottleneck for users of a data warehouse is limiting direct access to the data.  Fair enough for end users who just want the data fed to them, either as a fixed report or through a parameter-driven self-service portal.  The power user, on the other hand, should be given direct access to the data warehouse for ad hoc analysis rather than having to wait for controlled cubes. 
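
The gap between “wait six months for a cube” and giving a trusted power user read access is not a large one technically. Something like the sketch below is often all the tooling an ad hoc analysis needs; it assumes a read-only warehouse account and the usual Python data stack, and the connection string, schema and table names are invented for illustration (the exact SQLAlchemy dialect depends on the warehouse in use).

```python
# A minimal sketch of a power user running an ad hoc query directly
# against the warehouse instead of waiting for a cube to be built.
# Connection details, schema and table names are illustrative only.
import pandas as pd
from sqlalchemy import create_engine

# Hypothetical read-only account; the dialect/driver depends on the
# warehouse platform and its SQLAlchemy support.
engine = create_engine("teradatasql://analyst:password@dwh.example.gov")

query = """
    SELECT region, reporting_period, SUM(amount) AS total_amount
    FROM reporting.agency_submissions      -- hypothetical table
    GROUP BY region, reporting_period
"""

# Pull the result straight into a DataFrame for exploration.
df = pd.read_sql(query, engine)
print(df.head())
```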

 

PEOPLE – PROCESS – TECHNOLOGY

At this point I was a tad irritated – why don’t they learn?  Forty years ago things did take a long time: IT controlled how and what the computer did in an organisation, and we were accused of being hard to communicate with. I always wanted to be proud of the systems I delivered.  In fact, how many times did you read in the newspapers, whenever there was a ‘glitch’, that “it was a computer error” [technology] when you knew full well it was the human! [people and/or process] Grr…

 

But today there is no excuse – most people are computer savvy and expect tools that let them be self-sufficient, whether that’s loading data to a web site they are responsible for or, as power users, exploring the data itself.  Security, protocols, monitoring tools and the like exist to protect data and enable access, not to create major impediments for authorised business users. BI tools and data warehouses can (and do) support self-service and rapid, agile deployment. They also free up IT staff to do much more value-adding ‘cool stuff’ than loading data or building a cube for someone else.  Many of our customers can proudly say, “our Teradata data warehouse delivers the right data, at the right time, to the right people – quickly.”

 

If you are reading this and have a similar complaint about your data warehouse – or if you have a good data warehouse and want to do more with it – then I suggest it’s time for a health check. Find out where you can make improvements in people (skills), process and technology.  IT should be an enabler of delivering business value, not a constraint on it (the only real constraints are your imagination and budget!).

 

Christine Page-Hanify
