The hype surrounding generative AI means it is no longer a tool to experiment with during a lunch break; it is now a technology that businesses are under pressure to adopt. The likes of ChatGPT have spawned new use cases that are upending entire industries. Whether organisations realise it yet or not, generative AI will also ultimately force them to consider how their infrastructure will hold up under the weight of such developments.

Several things are happening at once, giving companies options as well as dilemmas. Firstly, AI, which is far from new as a technology in itself, is now asking more of the underlying infrastructure that powers it because of the resource-hungry way in which large language models (LLMs) are trained. Compute, GPU technologies and other accelerators central to a successful rollout of generative AI systems have been advancing quickly, which means businesses are in a much better position than they might have been. However, cloud computing infrastructure as we know it will need to evolve to keep pace with growing demand.


