Cloud Reset – The Podcast | Episode 13: Is Your Data Serving Someone Else’s Roadmap with Luc Betbeder-Matibet

June 5, 2025, by Cloud Reset | Category: Cloud Services

Show Resources:

Here are the resources we covered in the episode:

Follow Jono Staff on LinkedIn

Follow Naran McClung on LinkedIn

Cloud Reset’s YouTube Channel

Listen on Spotify

Listen on Apple Podcasts

Contact us at enquiries@macquariecloudservices.com with any questions, suggestions, or corrections!

And don’t forget to subscribe, rate & review!

Episode Summary:

As generative AI systems mature, the conversation is shifting—from what they can do, to what (and who) they’re trained on.

In this episode, Luc Betbeder-Matibet, President of the Australasian eResearch Organisations (AeRO), joins Jono and Naran to unpack the broader implications of AI infrastructure, data quality, and sovereign value.
Forget the hyperscale data centres and shiny AI demos for a moment—this is a conversation about control.
Control over data.
Control over what models learn.
Control over where value flows.

Inside this episode:

  • Why “sovereign” needs to mean more than just data residency
  • What it would take for Australia to own the full economic value chain of AI
  • Why we may not know when AGI is achieved—and why that matters
  • Who’s paying (and who should be paid) when public data trains private models
  • Whether the future of trust in AI comes down to who controls the training loop

Luc doesn’t speak in hypotheticals—he’s been inside the systems, built national infrastructure, and counted the GPUs. This episode won’t tell you how to prompt better—it asks whether your data, your infrastructure, and your business are serving someone else’s roadmap.

Watch or listen now.
#CloudReset #AI #Leadership #DigitalTransformation #TechStrategy #MacquarieCloudServices #Podcast

Episode Transcript:

Okay, here we are. This is a first, Jono. We are on the road with the podcast. I didn’t think we’d ever take this thing on the road when we got the neon sign. I didn’t think that it would come out of a studio, but here we are on the road. We are at THETA, The Higher Education Technology Agenda, in Perth.

We have a stand here. We obviously have the podcast mobile studio set up. We’ve been on stage. We’ve been talking about the launch of CAUDIT Cloud, which is our latest and greatest thing. We’re not here to flog that necessarily. I will just say a little thing about it. Obviously it’s the bringing together of different platforms.

It helps with research cloud. There’s high performance compute, there’s Azure Local. There’s all sorts of things in there. Ease of transaction and contracting through CAUDIT. That’s what we’re all about. That’s all I’m gonna say on that. More importantly, we’re here with a very special guest. How about you introduce our guest?

I can’t wait to introduce our guest, Luc Betbeder-Matibet. Luc is a person in the industry who’s doing some amazing stuff with cloud and research data and, you know, AI certainly, and so I’m pretty excited about today’s conversation. But before we hand to Luc, a little bit of a bio. Yep. Luc is a nationally recognised subject matter expert in high performance research computing, research data practices, and shared research infrastructure services.

He’s currently vice president of AeRO. President now. Oh, president, there we go, congratulations. The Australasian eResearch Organisations. And co-chair of eResearch Australasia, the peak annual conference. He’s an adjunct senior lecturer at the UNSW Faculty of Medicine Centre for Big Data and has been a visiting scientist with the Data61 Visual Analytics team in Australia’s national science agency, CSIRO.

He has held director-level roles for 15 years in higher ed, ICT and eResearch. Luc is currently the Executive Director of Research Technology Services at UNSW, a shared services function that he established. Recently, Luc has been working with colleagues to regularly count how many high performance computing cores, and how much research data, there are in Australia and New Zealand. That’s a bit of a mouthful of serious achievements. It’s great. That’s fantastic, and welcome, Luc. Very happy to be here. Thanks for the invite. Pleasure. It’s good fun. Luc, there are so many places we can go with this conversation. I think perhaps a good place to start would be: tell us a bit about what you’re up to.

Yeah, like I said in the bio, I’ve got lots of fun parts of the role at the moment. So, nationally, we’re involved in some of the conversations around the national digital research infrastructure strategy and roadmaps: what kind of research infrastructure we should have nationally. And that’s important to me from a workforce perspective.

So I’m mostly here speaking with, as we say, my AeRO hat on, as President of AeRO, our Australasian eResearch organisation. So teams like mine at UNSW that are doing digital research infrastructure work for organisations in Australia, whether that’s inside CSIRO, inside a uni, inside say the ARDC, or national infrastructure for HPC like NCI in Canberra or Pawsey, all of those folks who support research from an infrastructure perspective.

They’re the kind of people that I like to run events like THETA for. So we run the eResearch conference for our workforce, for our community, and that’s a passion for me: to invest in our workforce, to have places where we can tell them about the latest and greatest and start to benchmark off each other and share a bit of information.

So, coming to conferences is, uh, something that I like to do. I like to bring my team along to conferences. I like to share what we are doing in forums. So doing a podcast to share a little bit about what we are doing is, uh, another way of, you know, being engaged with the community. So thanks for the invite.

I really appreciate it. No, you, you’re welcome. You’re very welcome. And. Uh, let’s see. So, um, we know that in the conversations that we’ve been having with the various leaders around Australia, around generative AI, there’s a lot of talk around lived experience, um, and that, uh, rolling your sleeves up and just exploring this technology space really helps to figure out how to get better outcomes.

So can you give us some perspective on what are some of the obstacles and some of the lessons in how to embrace this stuff properly? And what are the kind of things that you’re imparting upon your members in this space? Oh, that’s a good intro. In a way, the first thing it takes me back to is one of my very early projects, one of my very early fields.

So I started as a researcher at UNSW, spying on older people living at home by themselves. Did they know about it? Yeah, yeah, with full ethical permission and full ethics approval. We wired up the homes of people living by themselves.

And I dunno if you knew the television show, Towards 2000, or Beyond 2000: we actually were on an episode of Beyond 2000. I used to love that show. Yeah, me too. You know, we had old PCs with old donated modems, dialling back to the university every night. I think I contributed to, or I was one of the co-creators of, Australia’s, or maybe the world’s, first toilet flush sensor that dialled information back to a central aggregation point.

So, you know, nothing like, um, the, the kind of data we can get out of an Apple watch, um, or out of an edge system, but these were our edge systems and we were trying to figure out. What was the functional health status of somebody living at home by themselves? And could we get that from the signal? And so old school AI expert systems, could we get signal from the noise?

Got nothing, of course. Hmm. Right. Got my masters out of it. Fantastic, very happy with that. And I got into the technology side of how that works, and these days, how does AI practically help us to do work, on the society side as well? For me it’s been a sort of thread of a passion: these computational tools can help us to do far, far more.

So that’s what started me on my infrastructure journey. And recently I did the trip back to the GTC conference, so I went to listen to Jensen, as a learner again. I explicitly went, hang on, I’ve been stuck in program management or getting my team ready for the next set of challenges, but I haven’t spent a lot of self-learning time.

When I’m going to conferences, like this one at THETA, I’m often coming in as a presenter or as an organiser. Sure. And I hadn’t been to a conference in a while where I was genuinely the student again. Mm. And for this last one, back in Silicon Valley, I really forced myself to be; I did every single stand that I could in a massive exhibition hall.

I spent hours and hours on the show floor talking to experts, relearning, because I think as uh, practitioners, we have to get back to kind of, where’s the utility? What’s the value? What are people selling? How are they selling it? What’s the engagement points? And do I still trust myself as an expert in the field?

Mm-hmm. And it was really humbling to kind of go, okay, there’s a lot I’m still not getting and why these people are positioning in this way. Mm-hmm. But robotics, agentic AI, and really seeing it from the show floor in, uh, Silicon Valley. Amazing experience. So, Luc, with all the conversation that’s been going on with universities and um, international students, has there now been a shift in optimism towards what Australia can do with research?

And where research is heading. And is Australia trying to pivot itself to be more prominent in that space? What kind of outcomes might Australia be hoping to achieve? Is that sort of something that’s discussed in your space? I dunno, but I, I think that it is, and we have strategies, there’s a bit of a gap in the AI strategy space and we might come back to that.

Mm-hmm. But I think fundamentally we’re optimistic in terms of, Australians have always been optimistic and especially optimistic towards how we use tech and how tech’s used around us. So that, uh, adoption part as Aussies taking advantage of the latest and greatest, or contributing to the latest and greatest, and being part of those ecosystems as, um, uh, expert users, uh, there’s no problem there.

Mm. Homegrown capability, a step change in capacity or a step change in GPU availability in Australia, that’s a bit of a gap at the moment, right? I think we could explore that in some of the later conversation. But let’s go with a model of, say, why we did open science high performance computing as a kind of starting point.

If we think about where AI’s come from, it’s come from supercomputing centres where we’ve aggregated cores, first CPU cores and then GPU cores. Okay, well, the places that did that at scale first were the big national and exascale HPC centres. Why did we build an HPC centre? We wanted to do science at scale using computational tools.

We worked with our partners back in that generation, when we were buying tin: Dell, Cray, HP, Lenovo, whoever sells us the kit. Okay, so from a foundation of the silicon, we started building these architectures of computer systems to get performance output quickly. For what? Big scientific workloads.

Yeah. Climate modelling or other types of models that we could put through these, let’s call them research instruments, at scale and at performance. And we did that, and we could compare the code that would run for the climate model in the US or in Europe or in Australia. Mm. And we wanted our capability in Australia to go, okay, does the model that we are developing, does the science, work on these computers?

And is the code replicable? Can we replicate the code, can we reproduce the results, whether it’s run in the US on Dell kit, run in Australia, yep, on Fujitsu kit, yep, run in Europe on Bull kit or something like that? So it didn’t matter what type of silicon or who was selling the silicon; we as a community agreed, in that open stack, open source community way of openly sharing information and code, that we could run and trust the scientific output and trust the science behind these machines.
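That cross-system trust comes down to a check researchers run all the time: did the same experiment produce numerically the same answer on different kit? A minimal sketch of that idea in Python, assuming two hypothetical output files saved from the same climate run on two different systems; the file names and tolerances are illustrative only, not any real workflow:

    # Minimal sketch: cross-system reproducibility check for a saved model output field.
    # The file names and tolerances are illustrative assumptions, not a real workflow.
    import numpy as np

    def outputs_agree(path_a: str, path_b: str, rtol: float = 1e-5, atol: float = 1e-8) -> bool:
        """Return True if two saved output fields agree within floating-point tolerance."""
        field_a = np.load(path_a)  # e.g. a temperature field from the run on one system
        field_b = np.load(path_b)  # the same experiment re-run on different hardware
        if field_a.shape != field_b.shape:
            return False
        return bool(np.allclose(field_a, field_b, rtol=rtol, atol=atol))

    if __name__ == "__main__":
        # Hypothetical outputs from the same climate experiment run on two HPC systems.
        print(outputs_agree("run_us_dell.npy", "run_au_fujitsu.npy"))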

So a good friend of mine ran a pharma lab, and he’s recently retired. Before he did retire, he said to me that they had such confidence in their modelling that they knew if they could scale their compute in a linear fashion, they could achieve outcomes that they weren’t able to within the timeframe and the compute they had.

Is it the same story with your models? Were you able to predict that if you could double, triple, 10x, you know, go up an order of magnitude, you would get different outcomes? Is that something you can predict? Yeah, so in that exascale journey, there are certainly some models that scale beautifully and there are others where we’re still trying to find

the optimisation patterns. But the bursting into sort of GPU land gave us a whole new set of both capability and optimism, optimism about catching back up again to Moore’s law and scaling factors. And we saw that leap as we started to move into AI processes. Where I’m kind of reflecting back on this model that could be useful for us is that gap in Australia.
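As an aside for listeners wondering why some models “scale beautifully” while others flatten out: Amdahl’s law is the usual back-of-envelope explanation, because the serial fraction of a workload caps the speedup no matter how many cores or GPUs you add. A minimal illustrative sketch, where the parallel fractions are assumptions rather than measurements of any real code:

    # Minimal sketch of Amdahl's law: why some workloads scale almost linearly
    # and others flatten out as you add cores or GPUs.
    # The parallel fractions below are illustrative assumptions, not measurements.

    def amdahl_speedup(parallel_fraction: float, n_processors: int) -> float:
        """Theoretical speedup when only `parallel_fraction` of the work parallelises."""
        serial_fraction = 1.0 - parallel_fraction
        return 1.0 / (serial_fraction + parallel_fraction / n_processors)

    if __name__ == "__main__":
        for p in (0.99, 0.90, 0.50):      # hypothetical parallelisable shares of a model
            for n in (10, 100, 1000):     # scaling the processor count
                print(f"parallel={p:.2f}  procs={n:4d}  speedup={amdahl_speedup(p, n):6.1f}x")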

We were 10x behind the best of the world in the exascale generations with the compute that we had available, sure, in Australia. We are more than a hundred x behind in this AI generation. In the open science world, where I’m mostly based, in the universities, in the supercomputing labs, the amount of GPUs that we have available to test and retest hypotheses in a science context, we just don’t have access to the kit.

Mm. And so not only is there a black box problem, which we can explore, of what it means to run a model in a cloud or in an LLM sort of situation, where are the black boxes in the science reproducibility question, but there’s also just the available capacity in Australia. So we are talking about aspirations and optimism.

Mm. It’s hard to be optimistic if I’m a hundred x or a thousand x behind the best in the world. I get it, I get it. And so when I’m going to the US and I’m in Silicon Valley and I’m talking to a lot of my friends in that ecosystem who are punching, trying to do that latest and greatest stuff, there is an investment in Taiwan.

There’s an investment in Singapore, there’s an investment in countries whose GDP is below Australia’s. There’s now an investment into sovereign GPUs, science GPUs. And I don’t wanna say that Australia’s not going to get there. I’m saying that at the moment, both on the strategy side and in the current roadmap investments, we are not signalling to the world, like we used to with HPC, that we want to be in that place.

Look, that’s really interesting. I don’t think a lot of our listeners would have had that view. I certainly didn’t: a hundred times behind in terms of the available high performance computing capacity for the science sector. For the science sector? Yeah, the research sector. Yeah. That’s incredible.

Um, what’s the roadmap to correct that? I know, you know, in industry as a cloud provider, we’re starting to see some green shoots around tenders coming out into the sector for some pretty large scale environments and some substantial investments from some universities. So if you’re saying large, if you have a view on the broader picture, what’s your definition of large?

What would get you out of bed in the morning, the supermodel-on-the-runway situation, you know: is it 50 million plus? Yeah, probably the 20 to 50 million capital investment. At 20 there’ll be a few, in the tens there’ll be lots, at 50, okay. The NDRI strategy that we’ve just been allocating out is $400 million total for all the streams, very well-planned, very strategic, very useful bits of the pie.

A $400 million pie, out of which we slice the high-performance computing, and out of that we need to slice the GPUs to do the AI work. Mm-hmm. So you have 400 million, and that’s not the university spend, that’s the national spend. The unis will spend on top of that and match in, and we’ll grow the pie. But at the moment, in terms of what’s planned and allocated from the national science budgets, not taking into account R&D credits or other ways of, mm-hmm,

gaming the game, or optimising, or getting co-investment, all good things that need to happen, just the scale is not there. We need a billion-dollar-plus investment. We need to catch up. And we need to be optimistic about the productivity and the workforce transformation that that’s going to bring into Australia.

Um, recent things in the US, you know, Sam Altman talking about, I think it was Sam saying again, it’s a transformation economy that we are building, similar to the investments in the highway system. So you can expect that that transformation will have both intended and unintended consequences, but it will have productivity benefits.

Yeah, so we are building infrastructure that we want to be able to use for transformations. If I want to then compare or test a model, maybe the model’s already been built so I don’t have to pay the cost of building the model, but I do want to test some of the truthiness of what the model’s outputting.

Sure. How does CSIRO or a university researcher who’s an expert in their field test model against model, or model against capacity? They’re going to need some tooling, some GPUs to burn, to be doing some of that validation testing that we would expect. What’s truth in that new economy? Yeah. If truth is filtered through our, you know, well-designed LLMs, what is it outputting?

It’s all probabilistic, right? So it’s not gonna say yes every time. But is it yes most times? And is it the right kind of yes, in the right kind of patterns, for us to go, oh, we trust that, so we can build public health on it. Yeah. If I’m gonna ask some public health questions of an LLM, I need to trust that it’s as trustworthy, agentically and output-wise, as our best researchers are.
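One hedged way to picture that “is it yes most times, in the right kind of patterns” test is to sample the same question many times and put a confidence interval around the agreement rate. The sketch below fakes the model call with a placeholder function; a real validation harness would swap in actual model queries and a curated question set:

    # Minimal sketch: how consistently does a probabilistic model answer "yes"
    # to the same question, with a Wilson score interval around that proportion.
    # `ask_model` is a hypothetical placeholder, not a real model API.
    import math
    import random

    def ask_model(question: str) -> str:
        """Placeholder for a real model call; simulates a noisy yes/no answer."""
        return "yes" if random.random() < 0.92 else "no"

    def wilson_interval(successes: int, trials: int, z: float = 1.96) -> tuple[float, float]:
        """Approximate 95% Wilson score interval for a binomial proportion."""
        p = successes / trials
        denom = 1 + z ** 2 / trials
        centre = (p + z ** 2 / (2 * trials)) / denom
        margin = z * math.sqrt(p * (1 - p) / trials + z ** 2 / (4 * trials ** 2)) / denom
        return centre - margin, centre + margin

    if __name__ == "__main__":
        question = "Is intervention X recommended for population Y?"  # illustrative only
        answers = [ask_model(question) for _ in range(200)]
        yes_count = sum(a == "yes" for a in answers)
        low, high = wilson_interval(yes_count, len(answers))
        print(f"answered yes {yes_count}/{len(answers)} times; 95% CI [{low:.2f}, {high:.2f}]")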

That’s right. And so what is the implication here? If we just take it back a step, because I think what you are saying is there needs to be a huge investment put into this space in order for us to catch up, perhaps, with other nation states. Um, well, just on that, just pause there for a second, because there are a couple of things that we know are going on in the industry, right?

Like, I can say with reasonable authority that there is a hyperscaler in Australia that has been steadily adding a hundred to 200 megawatts per data centre, and the next project might be 700-plus megawatts for a single data centre. I’m not sure where the power’s coming from for that. Mm-hmm. But that’s all in its infancy.

Right? And there’s GPUs obviously in the mix there as well. Now maybe it’s a question of cost, right? In the education sector, could they tap into a resource that would arguably be so expensive? I don’t know. But within Australia, there’s that kind of spend going on, right? Well, that was leading me to, I think we’re both going to the same place.

Yeah. Why can’t, and I’m sure you’ve got a good answer for this, Luc, but why can’t people just consume this stuff on tap, consume GPUs from a hyperscaler either onshore or offshore to do this work? So go where the investment’s already been made. Apparently the hyperscalers have deep pockets for this.

We don’t always have to own it to run it or to get the utility value out of it. I think that’s fair. You might think again about road models. How do we build a public road? A public-private partnership is okay to build public infrastructure; we’ve wrestled that out in the public debate, right? So there are models that have worked to leverage finance, to get us the infrastructure so that we get the productivity.

I’m not so much fussed about the building of it. We know how to build, you know; the market can solve aspects of those problems. But who owns the job that runs, and who pays, or helps us pay, for that? It’s a billion dollars of CapEx to build a thing, but it’s x amount per year to run a thing.

I want access to the run part to do the science. Sure. If I’ve got confidence that, say, the data is sovereign Australian data residing in Australia, yes, that some of the signal value, not just the utility value of crunching the numbers on those data, is in Australia; or it could be in a cloud globally, for that matter, but the signal that comes out of that, the feeds of the agents that are going to run on those systems, there’s value in that as well for us.

And then there’s: can I compare A versus B, and am I getting a subsidy for the research grant, or for the research activity, to run it on that kind of compute? So I don’t need the kit to be built and owned by, say, CSIRO or by a university. We need to be able to run those cores, but we need some assurances. Is the value staying in Australia?

Is a lot of that value staying in Australia? Is the information that I’m giving up valuable, and how are we monetising that in Australia? Is the scientific output going to be valuable for Australia? Is the industry partnership going to be valuable? That’s the macro ecosystem that we are talking about.

The delivery of who owns the compute is a very small part of that broad ecosystem. Just for our listeners, you mentioned sovereign. Why is that important? We have rules and obligations, and contracts, about what we work with and how we work with it, and we also have aspirations as a country. We have aspirations that are distinctly Australian for the type of community that we want to build here in Australia.

The computational, agentic AI that we build should reflect those Australian characteristics and assumptions and values and judgments, and be generating value for Australian productivity moving forward, and Australian tech benefits. And if we’re thinking about growing a country capacity-wise, we’re thinking about the workforce, we’re thinking about the workloads and where the work gets run.

And then in computational and data terms, I think we have an obligation there: what’s our contribution, and what value is going to be retained in Australia, for us to stay a, you know, prosperous global economy, with a way of living and ways of working and a healthcare system that we value.

Yeah. So having the data in Australia on Australian terms and conditions, maybe we can achieve that wherever it is. I don’t mean sovereign just as in we have to lock it up in a vault in Canberra. It could be data anywhere, get it? Yeah. Right. So by rules and by legislation and by probity. Also, what I’m doing with my data and who else can do things to it is important to me as an Australian.

Yeah. And so the value, and the trust, that I put into these systems should not be on-sold to other partners without my say-so as an Australian citizen. Yeah. And then there’s the additional value, the marginal value and the additional contributions that the macro parts of those data have at scale, as we know from AI patterns.

Okay, we’ve put all of those into these systems; who monetises that? I want that Australian value to come back to Australian citizens in the long run. Yeah. And so when I’m saying sovereign, I don’t just mean from a rules, protection, legislative perspective, I think that’s important, but I’m talking about the full economic value of us building a value chain for Australian industries to co-invest in.

Yeah. So if we’ve got really good bulk data from having done mining at scale, what’s the long-term value of that? Maybe we’ve extracted the value from the ground. Yeah. But those data might continue to have ongoing value as we take on more challenges with our company partners and say we need to go to more extreme mining techniques.

So say we’ve got a great Rio Tinto, a great BHP, we’re here in WA. Their skills and capabilities to extract iron ore in an efficient way at global scale are unparalleled, right? Yes. Let’s make that assumption. Say someday we get into asteroid mining. Who’s got the best trained models and the best national sovereign capability to go asteroid mining?

Nice. Wow. It’s like, okay, where have we built that sovereign capability and where is that value retained? Yeah. And where’s the workforce and where’s the capability that could leverage the whole IP food chain that we’ve built through models and agentic capabilities, robots that we’ve sent out into the Pilbara?

Yeah. And how do we repurpose those? That’s the sovereign value that I’m talking about, retained for Australians. Acknowledging the intent and purpose of AeRO, is there a healthy competition within universities to create the best possible climate for researchers? Uh, we’re the best frenemies in the world.

Okay. How does that work? How do you balance that, if every university’s trying to attract researchers? How does that work? The race for talent, but the race for credible talent that then wants to share that information and is passionate about their field. That’s the kind of ecosystem that we’ve built up in the world’s best universities.

Australia’s a great provider of the world’s best universities. Our research outputs, our research practitioners, our research infrastructure are genuinely world class. So if you think about our Group of Eight universities, the top research universities, and our peer friends in the sector, mm-hmm.

It is a great place to do research work with partners. And so there’s that healthy competition of going after research grants, or collaborating on research grants, or strategically saying: you are really good at these areas, we are really good at renewables and materials tech at UNSW, so come to UNSW to do some research into batteries or renewables or materials, the biggest and best engineering school in Australia.

Come on down. Mm-hmm. One of the best mining schools in Australia. If we want to do research, we have to provide places where researchers get the style of practice, the short term and long term projects that they want to get their teeth into, with industry input and what we call translational impact, real value flowing back.

So in medicine, we want people to live longer, people to have less disease; in mining, we wanna be more efficient. We want to have translational impact to our economy, to our people, to our sovereign capabilities. And so the university sector contributes to that. Now, we don’t mind if one of our researchers is a rockstar and wants to go off with their IP and found a company; that’s now totally possible within our ecosystem.

But you can’t just have one or two. You need to have an ecosystem where there’s enough competition at scale, and I think that’s what we have in Australia; we’re in a really good spot. Really interesting stuff there. A takeaway that I just got from that, and it’s a nice soundbite: once we’re finished digging stuff out of the ground, we’ve got all of this other value that we can retain, and an understanding of the future and what that might look like for Australian industry.

I think that there’s so much scope there, to use your words, to be optimistic about. So where did we get version one of the LLMs from? Oh, what a great question. Where did it come from? That’s a loaded question. I wanna believe it’s somewhere in Australia. Uh, we stole it. Okay.

Okay, okay. The web: we put stuff out on the web and the web got scraped. Yep. And we trained our LLMs on all that data. Yep. On basically everything that the spider robots, or ethical and unethical practices, popped the lid on, various sources of trawled content on the internet. Now the universities and DARPA and everybody helped to contribute to the web, and you know, a lot of us put posts up on Reddit; whether they were good posts or bad posts, they all got read.

Mm-hmm. And we trained the LLMs on research output of value, research output of potentially negligible value. We trained it on Reddit. We trained it on the whole corpus of the web. All right, I’m gonna ask you a question. How close are we to general intelligence? Question number one. Question number two.

Actually, let’s just start with question number one. We’ll see where it goes. Let me just finish my thought on version two of the LLM value, finishing off this thought towards asteroid mining robots. Is our university sector, is our commercial sector, is the web going to exist next week, next month, next year?

Is the information value that we are generating now through AI just making the web dumber, because it’s trained on the current content of the web? And the people with the information, research groups, companies, are they putting it out on the public web to be re-trawled by the next version of agents? And finishing off that sovereign conversation: company data, valuable IP, valuable monetisable content that’s going to train the next generation of LLMs, or that agents will do truth telling against, how do we monetise that, and are we gonna give it up

for free by putting it out in what we’d call the open science web? I like this; you and I have spoken about this, right? The value of quality data to train valuable LLMs, we’ve spoken about this, I think. Yeah, our concept was there’s a market in this, isn’t there, of data: data as the raw material for an AI factory and an insight. And all of a sudden, organisations everywhere are finding that data that had little to no value before

Mm-hmm. Now is incredibly valuable. So if you are reading the GIS data that Geoscience Australia has pushed out, why don’t you give them a click rate, right, like a click charge on that compute cycle? A hundred percent, so that we can repay them and so we can do the next generation. Yeah. Say I’m doing really good life science work, and say we put out a new trusted model.

Why can’t I, as a university, or my researcher, or a mini company that we spin out, where’s the value chain to feed back in? Mm-hmm. Or do we keep it, because protecting it is more important for us, and we’ll use the open models to test against our private ones? So I’m imagining now that takes us to: will we know when AGI has happened? Will OpenAI tell us? Mm-hmm. Well, so you know, we had Steven Worrall on the podcast, right? And I put the question, because in the contract between Microsoft and OpenAI, should they achieve it, everything stops, and the definition exists within those contract artifacts that no one’s allowed to know about. I asked Steven about it; he wouldn’t say a word.

Um, will we know about it? Almost certainly not. So again, why should we have it publicly available, and I mean publicly available for research, in the sense of creating common good for common problems at global scale? If you allow that type of development to happen by a private company, totally cool.

They’ve invested shareholder money to go into that. But where are the country’s efforts to create AGI at the moment? Mm. If it’s that level of importance, where are the public, open models heading in the AGI conversation? Mm. Does it become a privately held shareholder tool?

That’s an interesting model. And if it’s done by a private company, to private company rules, now here we’re going totally metaphysical, let’s go, but then you end up: if an agent is basically prototyped on a human, and AGI is closer to a human, and a company creates a human, I call that a slave. And if a government creates that activity?

We know as governments, we know as communities what we do with humans, and we treat humans differently in the public sector than we do in the private sector. So I apply a mental model: it’s great that all this investment into AGI and AI is happening, because we’re getting genuine benefits, and Sam speaks about this eloquently, that there’s a whole lot of passionate people wanting to do genuine scientific good through these company infrastructure patterns.

No disrespect meant at all; that is wonderful work by wonderful, smart people that is genuinely going to benefit humanity. Probably. I’d also want some of this to be happening out in the public, under public scrutiny, with public visibility over these kinds of capabilities. And if we generated AGI in the public, what would that mean as a public creation?

Mm. Versus a private creation? Mm. Excellent. Really interesting. Look, this has been a fantastic conversation. I’d love to ask you one more question to finish off our chat today, for people listening who might be questioning the value of some of this technology and where it’s all headed, and perhaps a little bit fearful of the unknown.

I think we’re at the precipice of, you know, a new era in terms of technological capability for humanity. I guess I take from this conversation that you take a more optimistic view of where this technology can take us. What would you say to people who are a bit unsure or a bit fearful of where all this is going?

Humans build tools to build. We’ve built amazing things with tools, and we’ll keep building amazing things with tools. Tools, and then infrastructure, transform societies. Healthy running water in a city stops disease, stops people dying; people live longer. Building roads to get us from A to B faster enables us to do commerce and trade more effectively.

Stuff that’s efficient to do over there can come over here quickly. We know the transformations that have happened digitally. The frictionlessness of not having to line up at a counter at my bank, where 99% of my banking work is now doable in my pocket, has saved humanity countless productivity hours.

I have a research assistant in my pocket with ChatGPT. The desire of the researcher has always been to have a team of researchers working with them at speed, at PhD-level quality, and we’ve always loved, for good or evil, yelling at our young students to produce output so that we could move faster.

Certainly, I have 20 PhD students in my pocket right now with ChatGPT ready to do my bidding to do little micro pieces of work that we can start pulling together. It’s incredible. What I’m hearing from you is that our aspiration will always be greater than the gains that we get from the technology.

Massively. I want to do this kind of stuff that we are doing and transform a couple of other planets; like, Elon’s got that Mars thing. Why stop at Mars? Yeah, we need these massive goals that humans have always had, at scale. I’m not scared about the agentic or AI or AGI future. I’m excited by it.

You know what? It’s a happy place to be, because it’s gonna happen anyway; we’re all on this path. It’s been amazing talking to you. Real pleasure, appreciate it. On set, here we are at THETA, Jono. Really insightful. Thanks so much for joining us. Thank you so much, Luc. Appreciate it.


Cloud Reset

About the author.

Cloud Reset is the podcast where no-nonsense meets cloud strategy. Hosted by Jono Staff and Naran McClung from Macquarie Cloud Services, it’s all about cutting through the noise with straight talk and real solutions for IT leaders. With decades of experience on both client and vendor sides, Jono and Naran arm listeners with strategies to save costs, reduce risk, and maximise cloud ROI.


