The Architecture After Cloud

I think that Zach Nelson (NetSuite’s CEO) was wrong when he said that “cloud is the last computing architecture”, but I also believe that his quote is a healthy challenge to take computing and business architectures to a new level.

Nelson went on to say “I don’t think there is anything after it (cloud). What can possibly be after being able to access all of your data any time, anywhere and on any device? There is nothing.”  His comments are available in full from an interview with the Australian Financial Review.

Aside from cloud, our industry has seen a range of architectures over the decades, including client/server, service-oriented architecture (SOA) and thin client.  Arguably, paradigms such as 4GLs (fourth-generation languages) and object-oriented programming are architectures in their own right.

I think that we can predict attributes of the next architecture by looking at some of the challenges our technologies face today.  Some of these problems have defined solutions while others will require inventions that aren’t yet on the table.

Security

Near the top of any list is security.  Our Internet-centred technology is increasingly exposed to viruses, hackers and government eavesdroppers.

One reason that so much of our technology is vulnerable is that most operating systems share code libraries between applications.  The most vulnerable application can leave the door open for malicious code to compromise everything running on the same machine.  This is part of the reason that the Apple ecosystem has been less prone to viruses than the Windows platform.  Learning from this, future architectures are likely to treat the inefficiency of duplicated code as a small price to pay for siloing applications from each other and reducing the risk of cross-infection.

At the same time, our “public key” encryption is regarded by many as being at risk from ever-increasing computing power for brute-force attacks, and even from future quantum computers.

Because there is no mathematical proof that encryption based on the factoring of large numbers won’t be cracked in the future, it can be argued to be unsuitable for a range of purposes such as voting.  A future architecture might consider more robust approaches, such as the physical sharing of ciphers.
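To make that last idea concrete, the one-time pad is the textbook example of encryption whose security rests on physically sharing a key rather than on mathematical hardness: if the pad is truly random, as long as the message, and never reused, it is provably unbreakable.  Here is a minimal Python sketch (illustrative only; a real deployment would also need secure pad distribution and destruction):

```python
import secrets

def generate_pad(length: int) -> bytes:
    # The pad must be truly random, as long as the message,
    # exchanged physically (e.g. couriered on sealed media),
    # and never reused.
    return secrets.token_bytes(length)

def xor_bytes(data: bytes, pad: bytes) -> bytes:
    # XOR each message byte with the matching pad byte.
    return bytes(a ^ b for a, b in zip(data, pad))

message = b"ballot: candidate 7"
pad = generate_pad(len(message))

ciphertext = xor_bytes(message, pad)    # encrypt
recovered = xor_bytes(ciphertext, pad)  # decrypt with the same pad
assert recovered == message
```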

Privacy

Societies around the world are struggling to define what law enforcement and security agencies should and shouldn’t have access to.  There is even a growing debate about who owns data.  As different jurisdictions converge on “right to be forgotten” legislation, and societies agree on what back-door keys to give to various agencies, future architectures will be key to simplifying the management of the agreed approaches.

The answers will be part of future architectures, with clearer tracking of metadata (and, indeed, clearer definitions of what metadata means).  At the same time, codifying what security agencies may access will hopefully allow users and governments to agree on what can and can’t be intercepted.  Don’t expect this to be an easy public debate, as it has to navigate the minefield of national boundaries.

Network latency

Another issue that is topical for application designers is network latency.  Despite huge progress in broadband across the globe, network speeds are not increasing at the same rate as other aspects of computing such as storage, memory or processing speeds.  What’s more, when transmitting between servers around the world we are far closer to a fundamental limit of physics: the speed of light.  Even the most efficient link between New York and London would need about 0.04 seconds for an instruction’s round trip at the theoretical maximum of the speed of light, ignoring router latency and assuming a perfectly straight network path.  In computing terms, 0.04 seconds is the pace of a snail!
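The arithmetic behind that figure is easy to check; the sketch below assumes an approximate great-circle distance of 5,570 km between the two cities:

```python
# Back-of-the-envelope check of the round-trip figure above.
SPEED_OF_LIGHT_KM_S = 299_792  # in a vacuum; signals in fibre are ~30% slower
NYC_LONDON_KM = 5_570          # approximate great-circle distance

round_trip_s = 2 * NYC_LONDON_KM / SPEED_OF_LIGHT_KM_S
print(f"Theoretical minimum round trip: {round_trip_s:.3f} s")
# -> 0.037 s, i.e. roughly the 0.04 seconds quoted above
```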

The architectural solution has already started to appear, with increasing enthusiasm for caching and on-device processing.  Mobile apps are one manifestation of this phenomenon, which is sometimes called edge computing.

The cloud provides the means to efficiently synchronise devices, combining the benefits of cheap, powerful on-device processing with masses of data right on hand.  What many people don’t realise is that Internet Service Providers (ISPs) already do this by caching YouTube and other popular content, which is why some videos seem to take forever to start while others play almost instantly.
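The underlying trade-off is simple to sketch: keep a local copy, accept that it may be slightly stale, and skip the round trip.  Below is a toy time-to-live cache in Python; fetch_profile is a hypothetical stand-in for a slow cross-ocean call:

```python
import time
from typing import Any, Callable

def ttl_cache(ttl_seconds: float) -> Callable:
    """Cache a function's results locally for ttl_seconds, trading
    freshness for latency -- the same bargain an ISP makes when it
    caches popular video content."""
    def decorator(fetch: Callable) -> Callable:
        store: dict[Any, tuple[float, Any]] = {}
        def wrapper(key: Any) -> Any:
            now = time.monotonic()
            if key in store:
                fetched_at, value = store[key]
                if now - fetched_at < ttl_seconds:
                    return value        # cache hit: no round trip
            value = fetch(key)          # cache miss: pay the latency
            store[key] = (now, value)
            return value
        return wrapper
    return decorator

@ttl_cache(ttl_seconds=60.0)
def fetch_profile(user_id: str) -> dict:
    # Placeholder for a slow remote API call.
    return {"id": user_id}
```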

Identity

Trying to define the true nature of a person is almost a metaphysical question.  Am I an individual, spouse, family member or business?  Am I paying a bill on my own behalf or doing the banking for my business partners?  Any future architecture will build on today’s approaches and understanding of user identity.

Regardless of whether the answer is biometrics, social media or two-factor authentication, it is likely that future architectures will make identity easier to manage.  The one thing we know is that people hate username/password management, so a distributed approach with ongoing verification of identity is more likely to gain acceptance (see Login with social media).
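Two-factor authentication, at least, is already well codified: the time-based one-time passwords generated by authenticator apps follow RFC 6238 and fit in a few lines of Python (a minimal sketch, using a well-known demo secret):

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, period: int = 30, digits: int = 6) -> str:
    """Time-based one-time password (RFC 6238), the algorithm behind
    most authenticator apps used for two-factor login."""
    key = base64.b32decode(secret_b32)
    counter = int(time.time()) // period          # 30-second time step
    msg = struct.pack(">Q", counter)
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                    # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# The shared secret would normally be provisioned via a QR code.
print(totp("JBSWY3DPEHPK3PXP"))
```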

Governments want to own this space, but there is no evidence that any single organisation has a natural right to be trusted with it.  Perhaps Estonia’s model of a digital economy and e-residency might provide a clue.  Alternatively, identity could become as highly regulated as national passports, with laws preventing individuals from holding more than one credential without explicit permission.

Internet of Things

The Internet of Things brings a new set of challenges with an explosion in the number of connected devices around the world.  For anyone who is passionate about technology, it has been frustrating how slow useful interfaces have been to emerge.  We were capable of connecting baby monitors, light switches and home security systems years before they were readily available, and even today they remain clumsy.

Arguably, the greatest factor in the lag between the technological possibility and market availability has been the challenge of interoperability and the assumption that we need standards.  There is a growing belief that market forces are more effective than standards (see The evolution of information standards).  Regardless, the evolution of new architectures is essential to enabling the Internet of Things marathon.

Complexity and Robustness

Our technology has become increasingly complex.  Over many years we have built layers upon layers of legacy, hidden under facades of modern interfaces.  Not only is this making IT orders of magnitude more expensive, it is also making it far harder to create bulletproof solutions.

Applications that lack robustness are frustrating when they stop you from watching your favourite TV programme, but they could be fatal when combined with the Internet of Things and increasingly autonomous machines such as trains, planes and automobiles.

There is an increasing awareness that the solution is to evolve to simplicity.  Future architectures are going to reward simplicity through streamlined integration of services to create applications across platforms and vendors.

Bringing it all together

The next architecture won’t be a silver bullet for all of the issues I’ve raised in this post, but to be compelling it will need to provide a platform for tackling problems that have appeared intractable to date.

Perhaps, though, the greatest accelerator of every generation of architecture is a cool name (such as “cloud”, “thin client”, “service oriented architecture” or “client/server”).  Maybe ambitious architects should start with a great label and work backwards to tackle some of the challenges that technologists face today.
