Datacentre provider Telehouse says the rise of datacentre AI matters in two ways. The first is that AI applications need the global datacentre industry to ‘provide the necessary computational power’; the second is that ‘AI applications are being developed to improve data centers themselves.’ In other words, datacentres are both serving and being served by artificial intelligence.
Datacentre AI: drivers
Telehouse outlines the drivers for datacentre AI. These include:
- The need for global datacentres to have Graphics Processing Units (GPUs)
- The fact that artificial intelligence is helping datacentres to become more energy efficient.
- The use of artificial intelligence for server optimisation.
- The use of artificial intelligence for datacentre security.
- The prediction that increasingly large datacentres will require AI operators and robots to manage them better.
- The belief that some datacentre infrastructure management tasks can be handed over to artificial intelligence so that “humans can concentrate on the most critical and creative aspects of maintaining an efficient data center.”
It concludes: “At the moment, artificial intelligence is looking promising for the data center industry. The rise of AI-based applications will increase the demand for colocation service providers. Global data centers and colocation service providers will step up their game to meet this demand. And AI-based applications will help these data centers run efficiently to provide better service to their customers. If you are developing an AI-application, it is important to choose global data centers and colocation service providers who can help you provide cost-efficient services through the use of latest technology for energy efficiency, optimization, security, compliance, and disaster recovery.”
Creating true SANs
In my opinion there is a big opportunity to use AI for storage and for storage area networks (SANs). SANs are not a true network; they are pools of storage connected to servers. Yet there is now a multiplicity of storage devices from which a true multi-tiered storage configuration can be built, from the latest non-volatile memory express (NVMe) devices through flash, high-performance disc drives and high-capacity shingled drives, down to tape and optical disks. Everything within the datacentre therefore adds a layer of complexity that needs to be fully understood before efficiency gains can be driven.
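To make the tiering idea concrete, here is a minimal sketch of the kind of classification logic an AI-driven tiering layer might apply. The tier names, device attributes and latency thresholds are my own illustrative assumptions, not any vendor’s implementation:

```python
from dataclasses import dataclass
from enum import Enum


class Tier(Enum):
    """Hypothetical storage tiers, fastest first."""
    NVME = 0
    FLASH = 1
    PERFORMANCE_HDD = 2
    CAPACITY_SMR = 3
    TAPE_OPTICAL = 4


@dataclass
class Device:
    name: str
    latency_us: float       # typical access latency in microseconds (illustrative)
    sequential_only: bool   # e.g. shingled (SMR) drives, tape, optical


def classify(device: Device) -> Tier:
    """Place a device in the hierarchy using illustrative thresholds."""
    if device.sequential_only:
        # Anything with access latency in the seconds-or-more range is treated as tape/optical.
        return Tier.TAPE_OPTICAL if device.latency_us > 1_000_000 else Tier.CAPACITY_SMR
    if device.latency_us < 100:
        return Tier.NVME
    if device.latency_us < 1_000:
        return Tier.FLASH
    return Tier.PERFORMANCE_HDD


if __name__ == "__main__":
    pool = [
        Device("nvme0", latency_us=20, sequential_only=False),
        Device("ssd1", latency_us=250, sequential_only=False),
        Device("hdd7", latency_us=8_000, sequential_only=False),
        Device("smr3", latency_us=15_000, sequential_only=True),
        Device("lto9", latency_us=60_000_000, sequential_only=True),
    ]
    for d in pool:
        print(f"{d.name:6s} -> {classify(d).name}")
```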
This also means that AI has to be able to understand the value of the data, and to organise the increasingly vast amounts of data that people and organisations store by placing each piece of data in the appropriate tier – promoting it to the highest tier, or to a suitably performant tier, before it is required.
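As a rough illustration of value-driven placement, the sketch below scores data on access frequency, recency and a criticality flag, then maps the score onto a tier. The scoring formula and thresholds are hypothetical, chosen only to show the shape of the logic:

```python
import math
import time
from typing import Optional


def value_score(access_count: int, last_access: float, critical: bool,
                now: Optional[float] = None) -> float:
    """Illustrative data-value score: frequent, recent and critical data scores higher."""
    now = time.time() if now is None else now
    age_days = max((now - last_access) / 86_400, 0.0)
    recency = math.exp(-age_days / 30.0)      # roughly a 30-day decay constant (assumed)
    frequency = math.log1p(access_count)
    return frequency * recency + (5.0 if critical else 0.0)


def target_tier(score: float) -> str:
    """Map a score onto a hypothetical tier; promotion should happen before demand peaks."""
    if score >= 5.0:
        return "nvme"
    if score >= 2.0:
        return "flash"
    if score >= 0.5:
        return "hdd"
    return "archive"


if __name__ == "__main__":
    now = time.time()
    samples = {
        "quarter-end-ledger.db": (120, now - 3_600, True),        # hot and critical
        "marketing-video.mp4":   (4, now - 90 * 86_400, False),   # cold and non-critical
    }
    for name, (count, last, crit) in samples.items():
        score = value_score(count, last, crit, now)
        print(f"{name}: score={score:.2f} -> {target_tier(score)}")
```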
As new capacity is added to the datacentre, AI should automatically understand its position in the storage hierarchy and configure and use it as needed. It should also be able to manage and locate the vast array of files and objects across the entire online storage estate – not through the servers and their lists and catalogues, but by the same principles that are used in social media. Only when management and control are taken away from the servers can it truly be called a Storage Area Network.
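The ‘social media’ principle – finding data by descriptive metadata rather than by which server catalogues it – could look something like the toy tag index below. All names and the schema are assumptions for illustration only:

```python
from collections import defaultdict
from typing import Dict, List, Set


class TagIndex:
    """Toy global index: objects are found by tags, not by which server catalogues them."""

    def __init__(self) -> None:
        self._by_tag: Dict[str, Set[str]] = defaultdict(set)
        self._location: Dict[str, str] = {}   # object id -> current tier/device

    def add(self, object_id: str, location: str, tags: List[str]) -> None:
        self._location[object_id] = location
        for tag in tags:
            self._by_tag[tag.lower()].add(object_id)

    def move(self, object_id: str, new_location: str) -> None:
        """A tier migration only changes the location; the tag view users search is untouched."""
        self._location[object_id] = new_location

    def find(self, *tags: str) -> Dict[str, str]:
        """Return objects carrying all of the given tags, with their current location."""
        sets = [self._by_tag[t.lower()] for t in tags]
        hits = set.intersection(*sets) if sets else set()
        return {oid: self._location[oid] for oid in hits}


if __name__ == "__main__":
    idx = TagIndex()
    idx.add("inv-2024-q4.parquet", "flash/array-2", ["finance", "invoices", "2024"])
    idx.add("cctv-0001.mkv", "smr/shelf-9", ["security", "video"])
    idx.move("inv-2024-q4.parquet", "nvme/array-1")   # promoted ahead of quarter-end
    print(idx.find("finance", "2024"))
```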
Backup, recovery, and archiving
I also think there is another significant opportunity for datacentre AI: backup, recovery, and archiving. AI should identify the critical and most valuable data, prioritising it for backup and replication. Its other role should be to create a disaster recovery (DR) suite of files for the DR facility. Less valuable data may need to be pushed down the priority list, with the backup AI working in conjunction with the storage AI to create tiered save-sets.
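A minimal sketch of what prioritised, tiered save-sets could look like is shown below; the criticality scale and save-set names are hypothetical, not a description of any real backup product:

```python
from dataclasses import dataclass
from typing import Dict, List


@dataclass
class SaveItem:
    path: str
    criticality: int   # 0 = most critical, higher = less critical (illustrative scale)
    size_gb: float


def plan_backup(items: List[SaveItem]) -> Dict[str, List[SaveItem]]:
    """Split a backup run into illustrative tiered save-sets.

    The most critical data is replicated continuously and copied first; everything
    else falls into daily or weekly/archive sets, least critical last.
    """
    ordered = sorted(items, key=lambda i: (i.criticality, i.size_gb))
    plan: Dict[str, List[SaveItem]] = {"replicate-now": [], "daily": [], "weekly-archive": []}
    for item in ordered:
        if item.criticality == 0:
            plan["replicate-now"].append(item)
        elif item.criticality == 1:
            plan["daily"].append(item)
        else:
            plan["weekly-archive"].append(item)
    return plan


if __name__ == "__main__":
    plan = plan_backup([
        SaveItem("/db/orders.ibd", 0, 120.0),
        SaveItem("/home/alice/report.docx", 1, 0.01),
        SaveItem("/media/training-video.mp4", 2, 8.5),
    ])
    for saveset, members in plan.items():
        print(saveset, [m.path for m in members])
```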
My last thought concerns networks. Latency and packet loss loom over every wide area connection, and SD-WANs and WAN optimisation cannot sufficiently mitigate them, so WAN data acceleration tools are needed to limit their effects – otherwise they could leave a datacentre less performant than it aspires to be. Part of AI’s role in this context is automatic routing around congested and downed connections, and with the help of machine learning it can automatically prioritise key data flows across the networks.
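To illustrate routing around congested and downed links, here is a small sketch using a shortest-path search over a made-up inter-site topology, with a penalty applied to congested links. The sites, latencies and penalty factor are purely illustrative assumptions:

```python
import heapq
from typing import Dict, List, Set, Tuple

# Hypothetical inter-site links: neighbour -> base latency in milliseconds.
LINKS: Dict[str, List[Tuple[str, float]]] = {
    "dc-london":    [("dc-frankfurt", 15), ("dc-dublin", 12)],
    "dc-dublin":    [("dc-london", 12), ("dc-newyork", 68)],
    "dc-frankfurt": [("dc-london", 15), ("dc-newyork", 82)],
    "dc-newyork":   [("dc-dublin", 68), ("dc-frankfurt", 82)],
}

# Links currently reported down or congested (e.g. by telemetry); also illustrative.
DOWN: Set[Tuple[str, str]] = {("dc-london", "dc-dublin")}
CONGESTED: Set[Tuple[str, str]] = {("dc-dublin", "dc-newyork")}


def best_path(src: str, dst: str) -> Tuple[float, List[str]]:
    """Dijkstra over the link graph, skipping down links and penalising congested ones."""
    queue: List[Tuple[float, str, List[str]]] = [(0.0, src, [src])]
    seen: Set[str] = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == dst:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nbr, latency in LINKS[node]:
            if (node, nbr) in DOWN or (nbr, node) in DOWN:
                continue                              # route around failed links entirely
            congested = (node, nbr) in CONGESTED or (nbr, node) in CONGESTED
            penalty = 3.0 if congested else 1.0       # de-prioritise congested links
            heapq.heappush(queue, (cost + latency * penalty, nbr, path + [nbr]))
    return float("inf"), []


if __name__ == "__main__":
    cost, path = best_path("dc-london", "dc-newyork")
    print(f"chosen path: {' -> '.join(path)} (effective cost {cost:.0f})")
```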
WAN data bonus
With WAN data acceleration there is the added bonus of being able to send and receive encrypted data, which is something that WAN optimisation can’t do. So, AI and machine learning will have an increasingly important role in helping organisations to get more out of their datacentres, and for the companies that run them to manage them more efficiently.
It’s also worth remembering that even the largest datacentre can’t be highly performant if its networks cannot provide constant and reliable connectivity to other datacentres and to wide area networks – including the internet. It is therefore worth investing in a solution such as PORTrockIT, which can enable service continuity even when disaster strikes.
Simple human intelligence also suggests that people shouldn’t rely on just one datacentre, but on at least three located far apart from each other. Combining human and artificial intelligence is therefore the best way to ensure that downtime won’t affect your business, or your ability to securely store and retrieve your organisation’s data whenever your company needs it most.
David Trossell is the CEO and CTO of Bridgeworks.