Five years ago, a provocative Gartner blog post declared, “The Edge Will Eat the Cloud.” The gist was that while everyone was rushing to the cloud, Gartner saw a rush in the other direction, toward the edge, driven by a need for lower latency and near-real-time processing. Fast forward five years, and with edge computing now forecast to be an $800 billion market by 2028, it’s a perfect time to revisit the topic.
So, has there been a rush to the edge? Absolutely. Is it “eating the cloud”? Well, the cloud is as strong as ever, so it is hard to make that case. But there is a strong chance the edge will transform the internet itself. To explain why, let’s go back to the beginning of the internet.
On October 4, 1957, the USSR launched the world’s first satellite, Sputnik. Four months later, a shocked US Department of Defense established the Advanced Research Projects Agency (ARPA) to ensure the US never lost another technology race. While landing on the moon may be the most famous outcome from ARPA, the most impactful outcome was the internet.
This began with a research paper written by J. C. R. Licklider in 1962. It foresaw much of what would eventually become the internet, including e-commerce, online banking and cloud computing. The first iteration was called ARPAnet (for obvious reasons). ARPAnet was conceived as a peer-to-peer network with no central core, an architecture shaped by Cold War research into networks that could survive a nuclear attack.
At first, ARPAnet was restricted to government and educational institutions. It was decommissioned in 1990, its role taken over by the broader internet, which was eventually opened up to the public. In its early days, the internet was still a peer-to-peer network. But over the next three decades, it increasingly shifted to a client-server architecture used mostly for consumer applications.
Today’s internet usage underlines its conversion to a consumer network. The bulk of traffic is consumer content, led by streaming video, with the rest split between messaging, search, gaming and, to a small degree, business computing. Note that mobile devices account for half of all internet traffic, and consumers downloaded 204 billion mobile apps in 2019 alone.
What does a consumer network look like? Overwhelmingly client-server. Nearly all consumer applications, from Facebook to YouTube to Spotify, are client-server: data and compute are co-located at the internet’s core, while users sit at the edge.
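As a trivial illustration of that pattern, nearly every consumer interaction boils down to a device at the edge asking a central server to do the work and ship back the result. The sketch below makes the point in Python; the URL is a placeholder for illustration, not a reference to any particular service.

```python
# A minimal sketch of the client-server pattern that dominates the consumer internet.
# The device at the edge only asks and displays; the data and the compute live at the core.
# The URL is a placeholder for illustration, not any specific service.
import urllib.request

with urllib.request.urlopen("https://example.com/") as response:
    page = response.read()  # the result is produced at the core and shipped back to the edge

print(f"Fetched {len(page)} bytes from a central server")
```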
Client-server is fine for a consumer-grade network, but it has characteristics that, while benign for consumers, are deadly for business applications: high latency, security exposure and limited data mobility.
Consumers can live with these challenges, but businesses cannot. In fact, the rise of edge computing is driven in large part by these very challenges of latency, security and data mobility. Sixty years after J. C. R. Licklider imagined the internet, we are watching it being reimagined. Flipping compute to the edge changes a great deal.
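To make the latency point concrete, consider the physics alone. The short Python sketch below works through the back-of-the-envelope arithmetic; the distances and the fiber propagation speed are illustrative assumptions, not figures from Gartner or anyone else.

```python
# Back-of-the-envelope latency: even at the speed of light in fiber, distance has a cost.
# All numbers here are illustrative assumptions.
SPEED_IN_FIBER_KM_PER_S = 200_000  # light in fiber travels at roughly two-thirds of c

def best_case_round_trip_ms(distance_km: float) -> float:
    """The round-trip time physics imposes before any queuing or processing is added."""
    return 2 * distance_km / SPEED_IN_FIBER_KM_PER_S * 1000

# A sensor talking to a cloud region ~2,000 km away vs. an on-premises edge node ~1 km away.
print(f"Cloud region: {best_case_round_trip_ms(2000):.1f} ms minimum round trip")  # ~20 ms
print(f"Edge node:    {best_case_round_trip_ms(1):.3f} ms minimum round trip")     # ~0.01 ms
```

Twenty milliseconds is invisible to someone scrolling a feed; it can be an eternity for a robot arm, a checkout terminal or a trading system.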
Many new technologies are helping to build the edge. 5G and Wi-Fi 6, for example, both promise speed and low latency. But one piece is still missing: those speed and low-latency advantages evaporate if they are poured back into a client-server consumer network. To fully realize the promise of edge computing and build a true business internet, we also need to move to a peer-to-peer edge computing model.
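What might peer-to-peer at the edge look like in code? The sketch below is only a loose illustration, not a description of any particular edge platform; the ports, addresses and payloads are invented for the example. The structural point is that each node both listens and sends, so data moves directly between peers with no central server in the path.

```python
# A minimal peer-to-peer sketch: every edge node is both a server and a client.
# Ports, addresses and payloads are invented for illustration.
import socket
import threading
import time

def start_peer(listen_port: int) -> socket.socket:
    """Bind a listening socket and print anything other peers send to it."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("127.0.0.1", listen_port))

    def listen() -> None:
        while True:
            data, addr = sock.recvfrom(4096)
            print(f"port {listen_port} received {data!r} directly from {addr}")

    threading.Thread(target=listen, daemon=True).start()
    return sock

# Two edge nodes exchange readings with each other; no cloud hop, no central broker.
node_a = start_peer(9001)
node_b = start_peer(9002)
node_a.sendto(b"temperature=21.4", ("127.0.0.1", 9002))
node_b.sendto(b"vibration=0.03", ("127.0.0.1", 9001))
time.sleep(0.5)  # give the listener threads a moment to print before the script exits
```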
So, how does this new business-grade network have to operate? Four things need to change.
So, was Gartner right? Is the edge eating the cloud? No, but the edge is eating the internet. We’re seeing the transformation of a consumer-grade, client-server internet into a business-grade, peer-to-peer network. A fast, low-latency, reliable, secure network. The network Licklider first imagined six decades ago.