Saturday, May 31, 2025

Signalgate: A Call to Reassess Security Onboarding and Training

Mobile Device Trade-In Values Surge 40% in the US

DSIT Urges Ofcom to Prepare for Broader Regulatory Responsibilities Covering Datacentres

AI and Private Cloud: Key Takeaways from Dell Tech World 2025

Four Effective Strategies for Recruiting Technology Talent in the Public Sector

US Unveils New Indictments Targeting DanaBot and Qakbot Malware Cases

Imec ITF World 2025: Pioneering the Future of AI Hardware

AI Solutions for Network Administrators

What is a Passkey?

Nvidia advances accelerated computing advantage

In its second quarter of fiscal 2025, Nvidia’s datacentre business grew 154% year on year to $26.3bn. This helped drive record overall quarterly revenue of $30bn, up 15% on the previous quarter and 122% on the same period a year earlier.
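
As a quick sanity check on those growth figures, here is a minimal Python sketch (not from the article; the dollar amounts and percentages are the ones quoted above) that back-calculates the implied revenue for the comparison periods.

```python
# Back-calculate the implied prior-period revenue from the growth rates
# quoted in Nvidia's results. All figures are in billions of US dollars.

def prior_revenue(current: float, growth_pct: float) -> float:
    """Revenue in the comparison period, given current revenue and % growth."""
    return current / (1 + growth_pct / 100)

total_q2 = 30.0       # record quarterly revenue, $bn
datacentre_q2 = 26.3  # datacentre revenue, $bn

print(f"Implied previous quarter (total): ~${prior_revenue(total_q2, 15):.1f}bn")
print(f"Implied year-ago quarter (total): ~${prior_revenue(total_q2, 122):.1f}bn")
print(f"Implied year-ago datacentre:      ~${prior_revenue(datacentre_q2, 154):.1f}bn")
```

Running it gives roughly $26.1bn, $13.5bn and $10.4bn respectively, which is consistent with the growth rates reported.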

Nvidia’s founder and CEO, Jensen Huang, attributed this success to datacentre operators worldwide shifting to accelerated computing. Demand for Nvidia’s Hopper GPU computing platform, used for training and inference of large language models, recommendation engines, and generative AI applications, drove the growth in the datacentre business.

The company reported that the revenue growth came from cloud service providers as well as from consumer internet and enterprise companies. Nvidia’s networking products also posted a 16% sequential increase in revenue.

Huang emphasized the importance of accelerated computing in addressing the increasing demand for computational power. He highlighted the cost-saving benefits of GPU-powered servers and predicted that all datacentres would eventually incorporate GPU technology.

Huang’s vision for accelerated computing is to give datacentre operators more computational power in a sustainable way, while reducing the cost of computing and avoiding “computing inflation”. He also pointed to the advantages of liquid-cooled datacentres, which he said can deliver three to five times the AI throughput of traditional air-cooled approaches.
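
To put that multiplier in context, the short sketch below uses a hypothetical facility power budget and a normalised baseline; only the three-to-five-times factor comes from the claim above.

```python
# Illustrative only: the 20 MW budget and the normalised baseline throughput
# are hypothetical placeholders; the 3x-5x multipliers reflect the range
# Huang cited for liquid-cooled datacentres.

FACILITY_POWER_MW = 20.0  # hypothetical facility power budget
BASELINE_PER_MW = 1.0     # normalised air-cooled AI throughput per MW

air_cooled = FACILITY_POWER_MW * BASELINE_PER_MW
for multiplier in (3, 5):
    liquid_cooled = air_cooled * multiplier
    print(f"{multiplier}x: {liquid_cooled:.0f} throughput units vs "
          f"{air_cooled:.0f} units air-cooled, within the same "
          f"{FACILITY_POWER_MW:.0f} MW budget")
```

Read the other way, the same arithmetic shows the sustainability point: matching the liquid-cooled output with air cooling alone would need three to five times the facility power.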