The New Realities of Research Compute: HPC, AI and the Infrastructure Shift Ahead
TL;DR
Research computing is entering a transition phase where traditional HPC and AI workloads must coexist. High-performance computing (HPC) has evolved rapidly, with rising power, cooling and GPU demands reshaping how research institutions manage compute. Yet the cultural challenge remains: every new generation of researchers still needs to learn that powerful tools exist far beyond their laptops. AI workloads are now accelerating this shift, pushing universities to consider dedicated AI systems, national facility partnerships and hybrid infrastructure models. Demand for compute is soaring, and institutions able to scale flexibly, across on-prem, national and cloud platforms, will give their researchers a decisive advantage.
Meeting Demand at the Intersection of Traditional HPC and AI.
At this year’s Australasia eResearch Conference in Brisbane, Macquarie Cloud Services’ Head of Azure, Naran McClung, sat down with John Zaitseff, a high-performance computing (HPC) specialist from the University of New South Wales. With over 12 years in the field, John has witnessed multiple generations of researchers and several waves of technological transformation. Their conversation offered insight into what is changing in HPC, what still hasn’t changed, and how research institutions are managing the collision of established computing needs with explosive growth in AI workloads.
John’s role extends beyond maintaining systems. He helps researchers understand that powerful computing resources exist well beyond their laptops, a message that surprisingly remains necessary even after years of HPC availability. As he put it, every new generation of researchers needs to learn that there are different ways to approach computational research. It’s a cultural shift as much as a technological one, and this educational dimension of his work has remained constant even as the technology itself has evolved dramatically.
From Thermal Limits to Liquid Cooling.
The physical infrastructure supporting research computing has transformed significantly. Around five years ago, UNSW began exploring liquid cooling solutions because CPUs were already reaching thermal limits and required throttling. Today, with GPU workloads dominating AI and machine learning research, cooling has become even more critical. Data centres are racing to keep up with the thermal realities of modern compute, and direct-to-chip liquid cooling, once unthinkable, is now an emerging standard.
John noted that while some aspects of HPC have changed substantially, particularly around processor power and cooling requirements, other fundamentals persist. Researchers still submit jobs, wait for results, and return hours or days later to analyse their findings. But AI workflows are different. Training models, tweaking parameters, and iterating on results require a more interactive approach that doesn’t fit neatly into traditional batch job queuing systems.
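The contrast John draws can be sketched in scheduler terms. The commands below assume a batch scheduler such as Slurm purely for illustration; the source doesn’t name UNSW’s actual stack, and directives vary by site.

```shell
# Traditional batch workflow: describe the job up front, submit it,
# and come back hours or days later to read the output file.
# (Slurm is an assumption here; resource flags differ between sites.)
cat > sim.sbatch <<'EOF'
#!/bin/bash
#SBATCH --job-name=sim-run
#SBATCH --time=24:00:00          # wall-clock limit
#SBATCH --ntasks=64              # CPU cores for a classic parallel job
#SBATCH --output=sim-%j.out      # results land in a file, analysed later
srun ./simulation --input params.dat
EOF
sbatch sim.sbatch                # fire and forget; poll with squeue

# Interactive AI workflow: hold a GPU allocation open and iterate live,
# rather than round-tripping through the queue for every parameter tweak.
srun --gres=gpu:1 --time=04:00:00 --pty bash
# ...inside the session: train, adjust hyperparameters, retrain,
# and inspect results immediately.
```

The friction John describes comes from the second pattern: an open-ended interactive session ties up a GPU for its full reservation, which sits awkwardly in a queuing model built around finite, fire-and-forget jobs.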
The AI Workload Question: On-Premise or National Infrastructure?
Universities across Australia face a pressing question: should they invest in dedicated AI-specific compute systems, and if so, should these be locally owned or part of a larger national infrastructure? John indicated that UNSW is actively considering these options and expects to announce decisions soon.
This question of where workloads should sit reflects a broader trend in research infrastructure. Some workloads benefit from proximity to researchers and institutional data. Others require the scale and specialisation that only national facilities can provide. As AI workloads grow, institutions must decide where those capabilities should reside and how they should be funded and managed.
Demand Is Surging.
When asked about demand, John’s answer was unequivocal: very big. Queue times are already long enough that researchers are constantly asking for more capacity. The constraint isn’t just about completing individual jobs faster, although time certainly matters. It’s about enabling researchers to pursue multiple streams of inquiry simultaneously, to test different hypotheses in parallel, and to accelerate the overall pace of discovery.
Traditional HPC isn’t disappearing. Many researchers continue to rely on established workflows and don’t require anything beyond what’s already available. But the researchers who do need AI capabilities, who want to train models or work with generative tools, represent a growing and increasingly vocal portion of the community. Meeting that demand will require investment, strategic planning, and partnerships that span universities, national infrastructure providers, and commercial cloud platforms.
Building for a Hybrid Future.
This interview echoes a theme repeated throughout the eResearch Conference: the pace of scientific discovery is increasingly constrained not by ideas, but by infrastructure.
Macquarie Cloud Services’ launch of CAUDIT Cloud in partnership with Dell Technologies reflects the broader shift toward flexible, high-performance research environments. The ability to provision and manage compute resources through frameworks like ICTA gives institutions options. Research doesn’t happen in a single location or on a single platform: it happens across on-premise systems, national facilities, and commercial cloud, often within the same project.
John’s insights underscore a reality that many in the research sector are navigating: the future of research computing is hybrid, demand-driven, and rapidly evolving. Traditional HPC will continue to serve essential roles, but AI workloads are reshaping expectations around speed, scale, and accessibility. The institutions that succeed will be those that can balance investment in local capabilities with participation in national infrastructure, while remaining agile enough to adopt new technologies as they emerge.
The demand is real. The expectations are rising. And the institutions that adapt first, culturally and technologically, will give their researchers a powerful edge.
Full Interview Transcript.
Here we go again. My name’s Naran McClung. I work for Macquarie Cloud Services. Who am I here with?
Hi, my name’s Derek Knox. I work with Dell Technologies.
And we are at the Australasia eResearch Conference, 2025. This place is full of research-minded people. I don’t know, there’s a lot of energy here. How do you feel about it?
It’s a lot of energy. We’ve joined at the lunchtime rush.
Yes.
After a number of sessions. We started at seven o’clock today, it’s now midday. The atmosphere is electric, which you would expect in a research environment. But this is the peak of Australian researchers who use AI
Yes.
And use it in the production of what they actually do.
Right. Now, I spoke to Paulo. He was one of the keynote speakers from Edith Cowan University. He was on the Mars Rover Project with NASA as well, which I found super fascinating.
Did you mention the pyramid?
Listen, I may have mentioned some structures.
I think they’re trying to figure out whether there’s microbes and water and stuff on Mars. I think they need to go straight to the alien structures.
Yep.
That’s what interests me.
Total recall.
Right. I asked him that question directly and he said if he told me, he’d have to kill me. I was ready for that, and that’s when the conversation stopped, which says that he’s holding onto something and he’s not telling the full story.
Well, they’d have to actually prove that it was actually there.
Right.
The thing I found the most amazing about his speech was: they land the Mars rover on a big, giant bunch of balloons.
That’s right.
And it bounces for a kilometre. 36 times that thing bounced, and it still landed okay. And yet I can’t get my car out of the garage without getting a flat tyre.
I think if your car had those big blow up things around it, you could probably just bounce your way out of it and you wouldn’t have to concentrate at all.
Yep.
Right. Alright, good. AI seems to be the conversation here, as well as data. Now, the Mars Project, he says they’ve got, you know, decades of data to work through. They’ve got PhD students that are poring over it for all their projects, and yet AI has a role to play, and certainly there’s a lot of conversation in market around return on investment. I think AI in this eResearch use case,
Yep.
Seems to go very well together. And nobody said a bad word about it.
Yep.
So when you looked at this market, say 20 years ago or 10 years ago, um, AI’s been around for quite some time, but just not at this extent. The use of GPU would’ve been maybe 20, 30%. It’s now above 50% in the universities and research sector. That’s because they’re getting the most return on investment in research by being able to do the most experiments as fast as possible.
Yes.
By using GPU. A GPU is just a lot of memory and a lot of fast processing, doing multiple predictive modelling runs, and that is what they can do here. So research is being made better, and it’s proven by GPU.
I love that. Well, look, MIT said that the best use cases are back-office automation. I think with eResearch dealing with petabytes of data in domain-specific use cases, that seems spot on to me. And I know Dell Technologies, a strategic partner of ours underpinning our CAUDIT Cloud offering, I mean, that’s what it’s all about.
Yep. When you look at what’s happening in the industry, so people have to put this data somewhere, people actually have to process the data somewhere. They didn’t have many choices before on how to do it.
They can put it under their desk, or they can put it in someone else’s cloud somewhere. With CAUDIT Cloud, what’s been built is a purpose-built environment, made for research, made for GPU and storage. Sovereign in Australia, and that’s really important, because for government and defence and places like that, the more sovereign that sort of IT is, the better.
And that’s what we built in partnership with CAUDIT at Macquarie to service this market.
Good. I love that. Well, look, we’re gonna keep talking to people, and we’re gonna obviously attend some more sessions today, so I think we’re gonna learn more. And, uh, let’s keep going.
Thank you for inviting me along.
You’re very welcome.