Commentary, Culture/Leadership, Data Mining, Data Quality, Data Warehousing, Decision Management, Knowledge Management

SAP and The Hoover Dam

Esteban Kolsky
Last updated: 2011/06/01 at 1:47 PM
10 Min Read

First, I did not go to SAPPhire Now 2011.  I had to get that out of the way quickly (and to confess that I am late filing my report; I was busy, sorry).

I relied on the superb job that SAP did with the remote coverage (there were 7-10 sessions covered in real time, with great video and sound; it was almost like being there).  Kudos to whoever ran that part of the event; I can honestly say they are many steps closer to running virtual shows if they so desire.  Excellent job.

Yes, I missed many things by not being there.

Which is where collaboration and a phenomenal network of peers, friends, and fellow independents fills in.  I wasn't there, but I got sufficient information from chatting with many of them, reading their updates during and after the show, and talking with SAP people over phone and email that I actually feel I was there.

Alas, before we get deeper into SAP, let me tell you about the Hoover Dam.  Yes, the Hoover Dam.

Some time ago PBS ran a documentary on the Hoover Dam (you can watch it online if you want; it is quite fascinating).  Long recognized as one of the projects that helped rebuild the country's confidence following the Great Depression, it is mind-blowing to see what had to happen to finish it early.  Yep, it was completed ahead of schedule: the project manager (a gentleman by the name of Frank Crowe, considered "the finest dam builder in the world") wanted to avoid the penalties if the project went over the seven years allocated.  The most interesting part was how many things were invented, circumvented, or done differently on the road to building the Hoover Dam.

Case in point: concrete.  It turns out that concrete needs time to cure once it is laid, because the chemical reactions that make it hard first make it hot.  Very hot.  The elevated internal temperature of concrete during curing meant that, with the techniques that existed back then, building the Hoover Dam would have taken 125 years.  Frank Crowe, committed to making that seven-year deadline, designed a system of pipes built into the concrete subsections carrying cold water, cooling down the concrete and helping it cure faster.  By dividing the entire dam into segments and using this water-cooling method, the dam was done ahead of time.  Oh yeah, he had that and many other tricks up his sleeve.  The Wikipedia entry on the dam gives you all the details (if you don't want to watch the documentary).

OK, back to the world we live in.  Why on earth am I talking about the Hoover Dam and SAP in the same post?

Because SAP has taken on building the Hoover Dam of the modern era: in-memory analytics is the equivalent of building a Hoover Dam in our technological day, and it is going to need all the ingenuity, all the tricks we can muster.

To those who doubt it: we are not talking about moving data from hard drive to memory and processing it there; nah, that is something MS-DOS 3.1 could have done many years ago.  We are talking about revamping and redoing the entire way our programs process data and make things happen.  It is not simply about speed (which, in all frankness, would be like reducing 125 years of curing concrete to roughly two years); it is about working differently.  This is what SAP has committed to making happen.
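To make that distinction tangible, here is a minimal sketch in plain Python (purely illustrative, not SAP or HANA code; the data and names are invented) of why in-memory columnar processing means working differently, not just reading from faster storage:

```python
# Illustrative only: the shift is not "same query, faster disk" but a
# different processing model. A row store walks whole records; an
# in-memory column store scans one contiguous array per attribute.

# Row-oriented: each record is a unit; an aggregate touches every field.
rows = [
    {"id": 1, "region": "EU", "revenue": 120.0},
    {"id": 2, "region": "US", "revenue": 80.0},
    {"id": 3, "region": "EU", "revenue": 50.0},
]
total_row = sum(r["revenue"] for r in rows if r["region"] == "EU")

# Column-oriented: each attribute is its own array; the same aggregate
# becomes a tight scan over just the two columns it needs.
region = ["EU", "US", "EU"]
revenue = [120.0, 80.0, 50.0]
total_col = sum(v for g, v in zip(region, revenue) if g == "EU")

assert total_row == total_col == 170.0
```

The answers match, but the second shape is what makes memory-speed scans, compression, and vectorized execution possible, and it forces applications to be rewritten around it.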

Now, the question on all your minds: can they make it happen?

Well, here is the deal.  Most of the naysayers are pointing to the departures at the executive level and saying that SAP is crumbling, that it cannot keep key people, and that this is going to hurt it.  They are saying that the talent needed to make it happen is gone.  I think there is some truth to that.  Some of the people who left were key in building components of the new SAP, the one that is going to use and leverage in-memory.  However, many more remain, and they are the ones making it happen.  For each John Wookey who left, there are teams of developers and database specialists who implemented the vision and the product he designed.  Those people are, for the most part, staying.  The same holds true for other areas of the company where executives and high-level visionaries have left: their teams and their experience remain.

Yes, but – can they do it?  Can they build in-memory in a timeline that makes sense for the company or their clients?

Frankly, I don't know.  I think so… but it would not be the first time that SAP comes up with some killer thought leadership or innovative feature and is unable to deliver fully. The SAP dichotomy (which I wrote about last time I covered SAP) remains: excellent thinkers, great technologists, poor execution.  My hope is that some of the people who left were the ones holding back progress, and that having fewer people means fewer political battles, fewer egos to bruise, and more focus on what matters: delivering what the customers want.

Yeah, yeah, yeah: cute, corny, and all that… but can they deliver?

Fine, if you won't let me evade the question anymore: I hope so. I know that many at SAP hope the same, and it is something only time will tell.  The next step in this journey, as friend and excellent reporter Dennis Howlett reported, is SAP Europe in Madrid.  If we see the same message and some tiny progress, any progress, by then, we'll know the game is on.

Much of the remaining coverage of the show surrounded mobile (ho-hum from my vantage point, but it sure got lots of coverage). I know it is me; I just don't get what SAP can offer there with Sybase that it could not deliver before, nor do I see what the whole brouhaha is about.  I know we are all going to be running in-brain computers with mobile browsers in two years, or whatever the latest mobile hype is, but it is a small accomplishment compared to HANA, in-memory analytics, and even putting together a good plan for the cloud. Yes, that cloud.

There were other feature improvements in products, and new positioning and messaging for others (including analytics), but none of them, even taken together, compare to the massiveness of in-memory.  They made for nice buzzwords, a good hype-compliant alphabet soup or bingo card, but that is not where the money is.  The money in the future is going to be in near-real-time (you can never do true real time, but that is fodder for another post), in-memory processing.

In-memory is what is going to make or break (as in: be acquired) SAP in the next five to ten years (what? you thought it would take 18 months? dream on).  If they are going to succeed, we shall see some progress within 24 to 36 months.

From the (virtual) show floor, this is Esteban reporting, back to you in your office.  What says you?

*****
Note: If you want to read coverage of SAPPhire (or however many capital and lowercase letters you need to brand it properly), read Dennis Howlett and Vijay Vijayasankar's excellent three-part series: Day 1, Day 2, and Day 3.
Disclaimer: SAP is an active client. They also offered to pay my expenses to attend SAPPhire, which I had to decline on the advice of my wife's divorce attorney, given my rate of travel recently.  I cancelled at the last minute, and I feel really bad about it, promise.
