A Brazen Exposé on Big Data in the Government Sector

April 19, 2014

A few weeks back, I was contacted by Tamsin Rutter of The Guardian’s Public Leaders Network. She asked me if I would like to participate in The Guardian’s online debate about big data in the government sector. Given my experience with this, I was more than happy to oblige.

There were about 10 experts present and active in the debate. You can read the recap here if you’re interested in what everyone else had to say. I found the debate a bit challenging because Tamsin didn’t actually ask the questions that she had provided me to prepare with. For this reason, I have used the questions she originally provided as guideposts here in my Brazen Exposé on Big Data in the Government Sector.

***Before proceeding, please heed the following warning***

In this article I am only discussing my personal experience. I have made some pretty broad and condemning generalizations, and many of them may not be representative of what is happening in your business or organization. Maybe your IT department is really cool and has tons of great data scientists and analysts… maybe you are lucky enough to work for an organization where everyone is collaborating and helping each other out. If so, consider yourself fortunate! Based on my experience and many conversations I have had with fellow data analysts, data visualization specialists, and data scientists… you are in the minority. The content of this article only expresses my opinion, based on my experiences and the experiences of my colleagues who’ve practiced data science in the public sector.

And now, without further ado… let’s begin!

Which existing parts of central and local government are best-placed to use (big) data effectively?

I would say that all divisions could stand to benefit from using (big) data effectively. The problem is often resistance to change. Managers generally aren’t operating from a quantitative perspective. They are people managers, not data analysts. Getting them to adopt technology for data-driven decision-making is often the largest obstacle. It’s an “if you build it, they probably won’t come” scenario. To drive adoption of the data-driven decision-making approach, you’ll likely need to get that mandated by organizational leaders-at-large… and even if you do, adoption does not mean “effective adoption”. Managers have to want it for (big) data to be effectively used in government.

But, I digress… I think that any direct customer service line of business within the government has a lot to gain by adopting (big) data and analytics to improve customer experience. Internal business operations could also be made a lot more resource-efficient, but again… enforcing effective adoption of (big) data in government is a very difficult thing to do.

What changes are needed to effectively pull together (big) data in the government sector?

In many governments, as in many private sector organizations, there is still a significant struggle to overcome the “silo” effect. Beyond that, you have classic IT departments out there that act as a true hindrance to any sort of progress with respect to (big) data or its effective utilization. I think the classic IT department is the biggest obstacle, and the biggest thing that needs to change in order to facilitate the incorporation of (big) data and the business insights it generates.

IT departments classically serve a vital role! They make sure the infrastructure stays secure. They make sure that the infrastructure is adequate to support operational demands. IT performs a number of vital tasks, many of which I am unfamiliar with because I have never worked in proper “IT”, per se. This said, classical IT departments might be populated with great IT engineers and computer engineers and whatnot, but they’re generally not populated with great analysts… IT is not known for its strong skills in “Data Science”, “Data Analytics”, or “Data Visualization”. That’s generally because these roles are outside the range of their expertise. They are not statisticians, analysts, or data scientists – they are IT engineers. They host and secure business data, but they don’t have the substantive knowledge to make sense of that data.

Making matters worse, classical IT departments are generally VERY threatened by government (internal) businesses having coders and programmers working on and making sense of their own data. IT loses control in these situations… it’s a decentralization of “computing skills” to entities that are operating outside of their control. They often thwart such efforts by any means possible… so (big) data technologies are many times “banned” for no justifiable reason… just because the IT department says so.

For this reason, (big) data and its insights cannot be pulled together to benefit most government (internal) businesses. Until this changes, I honestly don’t see a way that (big) data can be used effectively in the government sector (at least in governments that face this same dilemma). Again, the answer to this is in the organizational managers mandating and ENFORCING changes to be adopted, this time in the IT line of government business.

What are some ways of approaching (big) data differently, through events such as hackdays?

Hack days are great! Open data is great!! It’s so inspiring to see what is being done in New York City. This said, it’s hard for me to imagine many government organizations TRULY making their data “open” like NYC has. There are political risks in doing that… and the IT departments would see it as a huge risk and vulnerability. There is a lot of talk about “open government”, but I think that governments are usually only as “open” as they need to be to check that box off the list of PR requirements to qualify as a “cool” government.

If government employees can’t even freely use and access the data that their line of business is generating, there is no way in heck that public citizens would get unfettered access to it. Governments have SO MUCH to gain from the work of data-do-gooders and digital volunteers who are eager to use “open data” to improve their communities… but governments are slow to adopt, and unless the citizen base knows and deliberately pushes for REAL “open data” and “hack days”… I’m afraid that many of these things will be little more than PR exercises.

What is the potential impact of good leadership and senior champions of (big) data in the public sector?

Good, strong, data-savvy leadership is the answer to almost all of the government’s (big) data woes, I would say. Government leaders are the only ones who have the power to mandate and enforce effective adoption of (big) data-driven decision-making at the managerial level. Because there is a steep learning curve, and because most managers are not that tech-savvy… leaders will have to push hard (both indirectly through inspiration and directly through enforcement). Leaders will have to mandate that IT let internal business analysts, staff, and managers have and use their data freely… No more of this stalemating ad nauseam so that nothing effectively gets done. There is going to be a HUGE HUMP, and leaders are going to have to push hard to get their organizations to the other side. Passive leadership will never do.

This said, when the changes finally propagate through, government is looking at a huge payoff in the form of resources and taxpayers’ dollars saved. Considering the deficit spending of most governments, it would be nice if the push for adoption came sooner rather than later.