Intelligent edge platform Akamai was born out of a conversation at MIT between Sir Tim Berners-Lee and the company’s CEO, Dr. Tom Leighton. Akamai’s CTO, James Kretchmar, explained how this led to the idea of a decentralised model for the Internet, one now deployed across more than 1,500 networks to improve performance.
“Dr. Berners-Lee understood that there were going to be these future difficulties in getting good performance for content on the web, and that the architecture was going to have bottlenecks and congestion overload of servers,” said Kretchmar. “He posed the challenge to Tom, a well-renowned expert in algorithms, to find algorithmic solutions to these problems. So, Tom and his students got to work, and the solutions they developed became the basis for Akamai as we know it today.
“The nature of those problems is that if you have a really popular website with end users around the world trying to visit that site, that content has to travel over many different networks to get to end users in other countries, which already introduces slowness.
“One thing that may not be obvious about the Internet is that the bottlenecks are in between those networks, and if the content is really popular, that puts a load not only on the hosting server, but also the networks around it. That’s what drove our model, which is a highly distributed platform, where if an end user requests a piece of content, they get sent to a server closer to where they are.
“This avoids those bottlenecks, which improves performance, and means that end users don’t have to all swamp one centralised server.”
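To make the routing idea concrete, here is a minimal sketch, not Akamai’s actual mapping system, of sending each user to the lowest-latency edge server rather than one central origin; the server names and latency figures are hypothetical.

```python
# Minimal sketch of edge-server selection: route each client to the
# lowest-latency server rather than a single centralised origin.
# Server names and latency measurements (ms) are hypothetical.

EDGE_SERVERS = {
    "edge-london":  {"client-uk": 12,  "client-us": 95,  "client-jp": 220},
    "edge-newyork": {"client-uk": 90,  "client-us": 8,   "client-jp": 160},
    "edge-tokyo":   {"client-uk": 230, "client-us": 150, "client-jp": 5},
}

def pick_edge_server(client: str) -> str:
    """Return the edge server with the lowest measured latency to this client."""
    return min(EDGE_SERVERS, key=lambda server: EDGE_SERVERS[server][client])

for client in ("client-uk", "client-us", "client-jp"):
    print(client, "->", pick_edge_server(client))
```

The point of the sketch is the architecture, not the lookup: because each client is served nearby, traffic never has to cross the congested inter-network bottlenecks Kretchmar describes.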
AI and ML impacts
Kretchmar went on to explain how artificial intelligence (AI) and machine learning (ML) affect the performance of the Internet, and how they can aid the management of networks as the Internet continues to grow.
“Part of the way we make the service work is by understanding Internet performance in real time, and getting content to users by working out what the best paths across the Internet are going to be, and those are constantly changing,” he said. “Machine learning and technologies like that help us figure out the structure of the Internet and the characteristics of different paths, both historically and right now.
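A hedged illustration of blending historical and live path measurements, far simpler than the machine learning Kretchmar describes, is an exponentially weighted moving average per path; the path names, latency samples, and smoothing factor below are assumptions.

```python
# Sketch: keep a per-path latency estimate that blends history with the
# newest measurement, then route over the current best path.
# Path names, samples, and the smoothing factor are illustrative.

class PathEstimator:
    def __init__(self, alpha: float = 0.3):
        self.alpha = alpha      # weight given to the newest sample
        self.estimates = {}     # path id -> smoothed latency (ms)

    def observe(self, path: str, latency_ms: float) -> None:
        old = self.estimates.get(path, latency_ms)
        self.estimates[path] = (1 - self.alpha) * old + self.alpha * latency_ms

    def best_path(self) -> str:
        return min(self.estimates, key=self.estimates.get)

est = PathEstimator()
for path, latency in [("via-ix-a", 40), ("via-ix-b", 55),
                      ("via-ix-a", 120), ("via-ix-b", 50)]:
    est.observe(path, latency)

print(est.best_path())  # "via-ix-b", once congestion on path A shows up
```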
“There’s also the challenge that there are so many bots out there now trying to do malicious things to websites. They want to steal data and compromise login credentials, and that’s a case where machine learning is very important in figuring out what is a bot versus what is a human.
“The approach, at the beginning, tended to be ‘Let’s see if we can form a strategy to identify what is a machine that would be connecting,’ and then we can say ‘Alright, we know this machine is malicious, so let’s block that or handle it differently than if it were a real connecting end user’.
“But the learning strategy there has now become sophisticated to the point where it is less about trying to identify whether it’s a machine, and more about whether we can identify that it’s an actual human. We can look at what interactions from real humans look like, and these exhibit a lot of telltale factors: on a mobile device, for example, when a user enters data by tapping the phone, the device moves a little, as shown by the accelerometer. Similar checks can be made on desktop devices also.”
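As a purely illustrative sketch of that behavioural signal (not Akamai’s detection logic), one could flag sessions whose accelerometer readings stay unnaturally still while input is happening; the readings and threshold are assumptions.

```python
import statistics

# Sketch of one behavioural signal: a real phone wobbles slightly when
# tapped, so near-zero accelerometer variance during input is suspicious.
# The readings and threshold are illustrative assumptions.

MOTION_VARIANCE_THRESHOLD = 1e-4  # assumed cut-off, tuned in practice

def looks_human(accel_readings: list[float]) -> bool:
    """Return True if motion variance during taps resembles a real device."""
    return statistics.pvariance(accel_readings) > MOTION_VARIANCE_THRESHOLD

human_session = [0.01, -0.02, 0.03, -0.01, 0.02]  # slight wobble while typing
bot_session = [0.0, 0.0, 0.0, 0.0, 0.0]           # emulator: perfectly still

print(looks_human(human_session))  # True
print(looks_human(bot_session))    # False
```

In practice such a check would be only one feature among many fed into a model, since a single signal is easy for a sophisticated bot to fake.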
5G and IoT
Looking to the next 10 years, Akamai’s CTO envisions AI being used to make better use of emerging 5G and Internet of Things (IoT) data.
“There will be more and more data that will be relevant to the Internet, like 5G, which will help to enable IoT, and the challenge will be making use of this growing data to get something done.
“Our intelligent routing systems, for example, have evolved from basic ways of learning from the data into something more sophisticated. But when we look at what’s coming, there will be more consumption of high-definition video online, more websites, and more end users with better connectivity.
“This will make the challenge more complex, so we’ll need smarter algorithms to work with the data to figure out how to provide the best performance.”
Protecting from threat actors
Another AI-related trend predicted by Kretchmar involves protecting networks against evolving cyber attacks.
“The ecosystem is getting more complex, including cyber attacks,” he explained. “Malicious attackers are always finding new, more creative and clever ways to take advantage of a system or the services that are offered online, so machine learning will be involved in that.
“We recently came out with a product oriented towards the way modern websites work, which is that they often pull in lots of third-party content to make the main site work. If you go to a major travel company or retail site, their website is constructed by pulling in lots of things from different places.
“What the attackers have done is they’ve gotten creative, and they will compromise one of the many third parties whose data is included in that main site’s delivery. Here, an attacker could insert code to, for example, try to steal a credit card number, while the owner of the main site isn’t aware that a compromised third party is including malicious code in their site.
“Now, one of the ways to combat that, and what we’ve developed, is the ability to figure out whether something malicious is happening on a site by understanding the request flow, and to detect it in an automated way, so that the site owner can be alerted and take appropriate action.”
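A highly simplified sketch of that request-flow idea, assuming a per-page baseline of legitimate third-party domains, might look like the following; the domain names are hypothetical, and this is not the product’s actual mechanism.

```python
# Sketch: alert when a page's scripts start sending requests to a domain
# outside its known baseline, a crude stand-in for request-flow analysis.
# All domain names are hypothetical examples.

BASELINE = {
    "checkout-page": {"cdn.example.com", "payments.example.com",
                      "analytics.example.net"},
}

def audit_requests(page: str, observed: set[str]) -> set[str]:
    """Return domains the page contacted that fall outside its baseline."""
    return observed - BASELINE.get(page, set())

seen = {"cdn.example.com", "payments.example.com", "card-skimmer.example.org"}
suspicious = audit_requests("checkout-page", seen)
if suspicious:
    print("Alert site owner about:", suspicious)  # unexpected exfiltration target
```

A static allow-list like this would be too brittle for real sites, which is precisely why the automated, learning-based detection Kretchmar describes is needed.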