Cloud Reset – The Podcast | Episode 14: Don’t Give a Task to a Human That AI’s Better At – with Jim Cook

June 26 2025, by Cloud Reset | Category: Cloud Services

Show Resources:

Here are the resources we covered in the episode:

Follow Jono Staff on LinkedIn

Follow Naran McClung on LinkedIn

Cloud Reset’s YouTube Channel

Listen on Spotify

Listen on Apple Podcasts

Contact us at enquiries@macquariecloudservices.com with any questions, suggestions, or corrections!

And don’t forget to subscribe, rate & review!

Episode Summary:

In this episode of Cloud Reset, we’re joined by Jim Cook, Manager of Digital Innovation at the University of Sydney, to unpack the uni’s groundbreaking journey in building and scaling Cogniti – their in-house AI teaching platform now used by over 70,000 students and expanding across 22 institutions.

Jim shares how Cogniti was born out of the need to create safe, responsible, and equitable AI experiences in education – addressing challenges like authentic assessment in the age of generative AI, and ensuring students from all backgrounds have access to transformative tools.

He dives into:

  • The origins of Cogniti and how it was designed with educators and students, for students
  • The shift from policing AI use to embracing it as a teaching and learning tool
  • How Cogniti enables AI-guided roleplay and Socratic tutoring, and even allows students to build their own agents
  • The impact on educators’ productivity and how AI helps lighten the teaching load
  • Insights into the deep partnership with Microsoft, the commercialisation journey via Azure Marketplace, and managing infrastructure at scale

Jim also offers advice to any organisation wanting to get started with AI: “Just do one thing. Start small, play with it—and the use cases will come.”

If you’re interested in the future of education, practical AI implementation, or how to scale innovation within a large institution—this episode is a must-listen.

🎧 Tune in now to hear how Sydney Uni is reshaping the future of learning with AI.

Watch or listen now!
#CloudReset #AI #Leadership #DigitalTransformation #TechStrategy #MacquarieCloudServices #Podcast

Episode Transcript:

All right, so Cloud Reset podcast. Here we are, day two at the Higher Education Technology Agenda, THETA as it's known. We are in beautiful, sunny Perth. The mornings have been fresh, uh, I have to say, but the place is beautiful. This is awesome. There's heaps of people. Jono, you've just gotten off stage.

You’ve done a keynote. What did you talk about?

Well, it was a pretty, uh, pretty difficult keynote, I gotta say, because I followed a guy called Craig, who was pretty much single-handedly responsible for rescuing 12 Thai kids out of a cave in 2018.

What an amazing story. Amazing, incredible story. Yeah. So tough act to follow there with Craig.

Yeah. But, uh, what we’re there to talk about today is the launch of Caudit Cloud. Yes. We’re not gonna talk too much about that on this podcast. No. We’re gonna introduce our special guest next. We are, but a fantastic group of people and, uh, it’s been really great to be here.

Thank you, Jono. Well look, uh, without further ado, I’m gonna introduce our guest.

His name is Jim Cook. Jim is the Manager for Digital Innovation at Sydney University. I love that job title, Jim. I'm jealous of that already. Uh, let's see. He's, uh, he's got a mandate to become a leader in safe and responsible AI. He's led a transformational journey to develop robust AI capability, including Cogniti.

We’re gonna talk more about that. Sure. Their in-house AI tool, uh, which has now been commercialised. Huge investment and support from Microsoft. Jim, welcome to the Cloud Reset Podcast. It is fantastic to be here. Thank you for having me. You’re very welcome. Talk to us, right, right? What, what’s going on at Sydney University?

Why do you get to have such a cool job title? Yeah. I mean, what’s going on? And you also got a cool beard as well, so it’s all kinds of cool going on here. It’s, uh,

We were just saying when we came in, uh, the, the beard and the jacket are to make you remember the things I say. Very importantly, it’s working already.

Um, so at Sydney, we’ve been sort of on this AI journey for the last sort of three years, maybe longer than that, maybe seven years. Uh, we, with the digital innovation capability that I lead, uh, we’ve been in there for since 2012. Uh, and we were sort of, we were always tasked with the mission of let’s drag people kicking and screaming into modern technology.

So that’s been like our journey to cloud. Yep. That’s been, uh, you know, virtual reality in the classroom, those sorts of things. Lots of emerging technology and, you know, modern architectures and practices. Uh, but the last three years has been pretty much dominated by artificial intelligence as you can expect.

Uh, since, we had our, uh, all our executives lost their minds when ChatGPT came out back in November. What, not even two and a half years ago, three years ago at this point. Oh wow. That wasn't that long ago, was it? It's not that long ago. It's quite terrifying. Wow. Um, you know, and now we're sort of trying to push ahead with tools like Cogniti, because we're trying to create equity in the classroom.

Yep. Um, make sure everyone’s got access to Safe and Responsible AI, as you said in the intro there. Yes. Safe and responsible. That was the vice chancellor. He went to, uh, the Sydney Morning Herald before coming to us. Mm-hmm. And just said, we’re gonna be leaders in safe and responsible AI. So now I’ve been playing catch up for two years.

Okay. Uh, but I think we’re doing pretty well.

Alright. Now something like Cogniti, we wanna talk about what that is and, and what purpose it serves. How did that start? Was there a brief, was there a problem to be solved? Talk us through that. The, the origins of where that came from.

Yeah. So, you know, mid, mid 2023, with, with, uh, ChatGPT out and about everywhere, students are gonna start using it for assessment.

I see. Right. We just know they're gonna start, they're gonna start putting the homework into the, uh, into the ChatGPT. Yeah. The answer's gonna come out. They're gonna submit those. And now we've got a real, like, a problem with authentic assessment. I see. We, we know that we're not going to be able to assess them in a way that's, uh, meaningful.

Right. Right.

We’re not gonna be able to prove they’ve learned something.

Did it happen immediately, like as soon as Chat, uh, GPT rocked up on the scene? Was it just immediately in play?

Yeah. Look, there might’ve even been guys using other tools before that. Right? There was things out and about there. Um, and like it was a, it was a day one problem.

Yes. Uh, and so we had to think about, well, that's good. We could, we can take assessment and we can make authentic assessments where AI is allowed. Yes. Yeah. And we could say, use AI to do this homework, but you're gonna have to explain the working, or you're gonna have to explain how you used the AI to do it and things like that.

Yes. Or you're gonna do tasks that are Socratic in nature. And so Danny Liu, uh, Professor Danny Liu, uh, in educational innovation at Sydney, uh, came up with a great idea: what if we built an equitable system that we could control the prompt engineering around, where we could control the flow of discussion and we could review the students' work, yes,

inside the system as part of that teaching. It's threefold, because it gives us a better mechanism of assessment. Yep. We can control the quality of the discussion, yep, that the student has with the agent. Yeah. Uh, and it's, uh, it's just an equity-of-access component, because, you know, students from low socioeconomic backgrounds, they can't afford, what does Google charge now for the latest one, 200 bucks?

Yes. You know, or, uh, you know, ChatGPT, $200 for the, for the high-end model. Yes, yes. That's not a cost we can pass on. I get it. Got it.

And really interesting. Tell us more. If we could just, uh, dive a little bit deeper into the impact on student learning: you know, Naran and I have spoken to a few different industry leaders around this idea of the democratisation of knowledge.

As it relates to AI, and what that might mean for the education sector and the student. And you're now talking about actually doubling down and leveraging the tool to teach, as opposed to sort of moving away from it and being worried about plagiarism and these sorts of things. Is, is that, is that probably fair to say?

Uh, a hundred percent. I think one of the, the big challenges we face is that we’re taking a student. We’re molding them with, you know, the, the skills of Sydney Yeah. And the knowledge of Sydney. Uh, and we’re, we’re crushing all of this information into them, but we’re also building out things like making them good leaders, making ’em good researchers, making them be able to, you know, operate in teams and all these graduate outcomes that we’re pushing for.

And really, I think one of those graduate outcomes, emerging graduate outcome is fluency with artificial intelligence. We wanna send them out into the workforce ready to take the jobs that do not exist yet. Yes. Uh, we need to prepare them because, uh, I guarantee you my dad’s 70 and he is using generative AI in his work.

Wow. Right. Like I showed him how to do it. He's like, I'm gonna use this all the time. Yeah. Uh, you know, once people get it, and it's in every workplace. And he's, he's not a software engineer or anything like that. He, he works in plumbing and gas fitting. Yeah. Right. So every job is a technology job.

Every job is going to have some element of generative AI in it. Yes. If we're not building out our students to have those skills when they go to the workforce, then they're already behind when they leave.

Jono, this takes me back to the conversations we had about the data scientists in our own organisation.

Uh, we've got a very successful graduate program within Macquarie. Uh, and so we're always looking for grads that have got that aspiration, that desire to explore technology and to play around with it. And it seems to be self-fulfilling somewhat as well, the more you play around with the tools. And that's why I was really interested about the origins of Cogniti and sort of how you got that up from grassroots.

Um, we know within our own business that the more we work with the technology and the more we can encourage, um, members of our team and not just within our hosting management center either, but our engineers, et cetera, to use the technology to come up with capabilities and efficiencies that it will spawn other ideas.

And the more we play with it, the more we use it, the more we think, well, hang on a second, we can improve this or we can improve that. And we know that we reduced our mean time to respond in our security business by half, um, just by sheer determination. But more importantly, it was nurturing the aspirations of the people in our organisation that wanted to play with the tools.

And I imagine you’re doing the same thing.

Yeah, absolutely. Look, look, there's, there's a big difference between how our staff use AI, yeah, and how our students use AI. So Cogniti is specifically targeting our student cohort. Yeah. Um, it is pedagogically designed by, by teachers, for students. Uh, but it's a little bit co-designed by students, for students, as well.

There you go. Um, and you bring them in and you take them on that journey for it. We've just released a new feature actually, uh, that's coming out. I think it's in beta right now, so some users who are on Cogniti already will have access. Uh, but it's the Sandpit, and it essentially allows the students to then make their own Cogniti agents so they can engage with their own learning and teaching in that way as well.

So they can, uh, you can set, you can set an assignment now that is, uh, go and create an agent for this task. Here's the actual task, and you're gonna create the agent that achieves it.

That’s incredible. Do you see this technology, uh, speeding up the time that it would otherwise take to complete a degree, for example, and make somebody ready for the workforce?

Good question. I think, uh, we have to be careful with some of the things around, uh, regulations and things like that, around what, what level of readiness we need to provide to students. And we also have to, uh, it's not just making them more intelligent quickly. Uh, university is also a social experience.

Uh, and I think, if they did university in one year or one and a half years, they might miss some of those other important building blocks of being a, a, a human and a participant in society.

I’ve gotta say, just for my own personal experience, I didn’t grow up until about 23. Yeah. You know what I mean?

I needed every year of university just to turn into a reasonable human being. Well, there you go.

If there's any students listening out there, um, I've just, sorry guys. I tried. Yeah. It was, uh, it was,

it was a valiant effort. Okay. Look, can it, can it give them mastery quicker? That is potentially true.

So, uh, general capability in a degree, if you take a, you know, an engineering degree or something like that to get to the base level, that’s a general thing. But if you wanna be a master of your trade, uh, and that’s the, the students that you want for your graduate program, right? Yes. If you wanna get those ones that wanna be a master of their trade and going all the way to that thing, then it can really help people go from like, being good enough to excelling.

Well, we, we’ve got that mix of skills too, right? I mean, like, our business is very people facing. We, we, you know, we, we service other businesses and we are huge on customer experience. And so we need personable, aspiring technologists who can hold a conversation and care about, uh, customer experience as well.

Right? It’s a massive part of our business. So it makes perfect sense.

Yeah, a hundred percent. Yeah. And I think, uh, that, that interacting with people is one of the big skills, the soft skills. Yes. Uh, and AI can actually help with that quite a lot. In Cogniti, we have like, uh, a few different modes and a few different methods.

Uh, the Socratic method is that Socratic tutoring, it’s like, I’m not gonna give you the answer. Yeah. I’m gonna guide you to the answer, but there’s also the role play element to it. Yeah. Uh, so using it for role playing your future career. Got it. Uh, particularly, there’s a really good example that you’ll find in one of the videos that we’ve published with Microsoft, uh, around, uh, in, uh, occupational health therapy.

Uh, and essentially, uh, having that conversation with the AI as if they're the patient. Oh, wow. And the AI acts as the patient inside Cogniti, with all the guardrails and scaffolding of the subject matter, keeping it from going off the rails. Yes. Uh, and keeping it from hallucinating and things like that.

Yes. Uh, and just basing it in the curricula, uh, and having it role play in the context of being a patient. It’s just incredible. We do it in a, a second year physiology class, uh, blows the students outta the water every time. Wow. Often for some of them, it might be the first time they’re encountering like a guided AI experience.

They might be using, you know, ChatGPT on their phone, or Copilot on their phone, and just, uh, I'm chatting to it, having discussions, asking it where the best pizza in town is or something like that. But this is often their first guided experience with it. And it's like, you see the mind shift, uh, you see them like, oh, hang on, I could use this to, uh, do test assessments right up until, uh, you know, until the, the main assessment's due.

I’m still gonna have to do that main assessment prompted or, uh, in an environment where I can’t use the AI, but I’ve got so many practice opportunities as often as I need. Yeah. Right.

We, we are often looking at use cases in the commercial space. A lot of our customers and potential customers, um, for good reason, are very interested in agentic AI and generative AI for productivity gains.

Yeah. I'm curious, what does this tool mean from a productivity-gains standpoint, in terms of the effectiveness of the, the teaching staff at, at the university? Can they teach more students per teacher? Can you, how, how's that playing out?

Well, there’s been a, uh, I mean, a massive sweep through Australia recently that you guys will be familiar with is the right to disconnect.

Yes. Right? Uh, and the right to disconnect is not a thing that teachers have. Uh, like, legislatively they do, but practically they're always thinking about class, they're doing marking; they don't really do it. Yeah. Um, this is actually about what, what would a duplicate of you be able to achieve in the times when you are not available?

So can we put one in there that’s, uh, you know, we ask students to do, uh, homework essentially, right? What if we can put an agent together where they can engage about the homework with that agent after hours, and it has all the rubric, it has all of the knowledge of the subject, and it has all of the instructions that you provided it with.

What does that do to release the load on you? Because previously that might have been a, a chat forum or a, you know, study board or something like that, where you would have to engage, or the tutors might have to engage with the students. What if you've got a first tier that can capture all of that? And also around, like, the, uh, the operational things of running a unit, uh, managing the rubric and creating test questions and things like that.

AI is obviously, for synthetic data, one of the best use cases we have. Hmm. Uh, you could generate 50,000, uh, questions in the time it would take you to write five. So using it for generating really high-quality, uh, test exams and things like that is really, really powerful for the teachers.
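As a sketch only: the batch-generation idea Jim describes, where a short topic list fans out into a large bank of practice questions, could look something like this. The `ask_model` stub stands in for a real, rubric-grounded LLM call; nothing here is Cogniti's actual code.

```python
# Illustrative sketch: fan a short topic list out into many practice
# questions. `ask_model` is a stand-in for a real, rubric-grounded LLM call.

def ask_model(prompt: str) -> str:
    # Placeholder reply; a real implementation would call the teaching agent.
    topic = prompt.split(":", 1)[1].strip()
    return f"Q: explain {topic} in your own words."

def generate_questions(topics, per_topic=3):
    """Expand a few seed topics into a larger bank of practice questions."""
    questions = []
    for topic in topics:
        for i in range(per_topic):
            questions.append(ask_model(f"Write practice question {i + 1} on: {topic}"))
    return questions

if __name__ == "__main__":
    bank = generate_questions(["osmosis", "diffusion"], per_topic=2)
    print(len(bank))  # 4
```

The point of the loop shape is the asymmetry Jim mentions: the human effort is writing a handful of topics and a rubric, while the volume comes from the model.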

Such an optimistic, um, use of the tech. Well, it's just, like, it's an evolving list of capability. Yeah. But it feels like, with every month, I imagine you just keep coming up with new ideas and capabilities. You've opened it up to students, like you said, with the Sandpit, which is amazing. Now, obviously, um, models are changing all the time.

Like, we, we had our own project in-house, uh, a Customer Insights project led by our manager, James. Yep. Whereby every other week there was a new model or a capability that just suddenly found itself in scope, and it was delivering a different capability, a different requirement or a different output.

How are you dealing with that?

Yeah, so not so much Cogniti. I think we've kind of nailed that down. What we do is we use the AI Foundry to do evaluations across the models, uh, it's Azure AI Foundry, um, and evaluate across the models with the prompting that we have in place and the workflows we have in place.

That becomes a bit old hat now. It's kinda like, we have to do it. So, you know, right now we're looking at moving off of 4 Turbo, 'cause 4 Turbo will be retiring, uh, early July, I think. Okay. So we've got four or so weeks to get that, that outta the way. Yes. Um, and we're looking at, like, which of our, uh, agents will work best with 4o, which will work best with 4.1.

And then making sure that we have those decisions in place. I see. And working with the academics who technically own them. Yes. That’s their, that’s their agent. Mm. Right. And making sure that they understand what the new difference will be and how they can test it.
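That evaluate-before-migrating workflow can be sketched in miniature. Everything below is illustrative: the "models" are stub callables standing in for real Azure AI Foundry deployments, and the keyword check is a placeholder for a proper evaluation metric.

```python
# Sketch of a model-migration evaluation: run a fixed prompt set against
# each candidate model, score the replies, and pick a replacement only if
# it clears a quality bar. The "models" here are stubs, not real API calls.

def run_eval(models, prompts):
    """Score each model by the fraction of prompts with an acceptable reply."""
    scores = {}
    for name, model in models.items():
        passed = 0
        for prompt, must_contain in prompts:
            reply = model(prompt)
            if must_contain.lower() in reply.lower():
                passed += 1
        scores[name] = passed / len(prompts)
    return scores

def pick_model(scores, threshold=0.8):
    """Choose the best-scoring candidate that clears the bar, if any."""
    best = max(scores, key=scores.get)
    return best if scores[best] >= threshold else None

if __name__ == "__main__":
    # Stub candidates: pretend one handles the Socratic prompts better.
    models = {
        "candidate-a": lambda p: "Let's think about what the question asks.",
        "candidate-b": lambda p: "The answer is 42.",
    }
    prompts = [
        ("Guide me through this physiology question.", "think"),
        ("Don't give me the answer, just a hint.", "think"),
    ]
    print(pick_model(run_eval(models, prompts)))  # candidate-a
```

The `threshold` fallback matters: if no candidate clears the bar, the right answer is "keep investigating", not "ship the least-bad model".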

Because, I mean, there's, there's constant debate too of, uh, what state does my data platform and my data modelling need to be in to take advantage of generative AI?

Now there's, there's different schools of thought on this. Um, within our own business, for example, I know that, um, we applied a, an agent to a data source, and we had like 20, 20 different data sources, that allowed us to, uh, draw the sufficient insights necessary to explore where we're at with any given customer, on what we've sold to them.

Are they happy with service? Do they consider it value for money? Is the net promoter score tracking? Is the relationship good? And so on and so on. Now, we have agents for every data source, and there's a scheduler agent that sits on top that sort of corrals them. But we know that different models behave differently with different data sources.

Sure. Right. And so there was an art in figuring that out. And I think it was just through lived experience and playing with it that we were able to induce the right outcomes. I’m assuming you are doing the same thing. There’s a lot of trial and error and testing and playing around, going on. You must have a good cohort of people that are giving you that validation.

I think, I think the key thing is, is that, uh, expectation management with the, with the customer. Right? Right. And I, I use the term customer loosely. Mm. 'Cause everyone is a customer of the University of Sydney, of course. Right? You could be, you're, you know, you guys are gonna come back, do a postgraduate degree.

You are a future customer, right? Sure. We regard you in that way. Um, so I think what I mean is, with the coalition that we're working with, the students, the academics, uh, the expectation management is everything. Mm-hmm. Um, we need a new type of service model now, and I don't know what it looks like yet.

Interesting. We're, we're teasing it out as we go. Right. Because, as you said, there's a new feature every six days, right? There's a, there's a great piece of benefit that the organisation can take advantage of. Yeah. Microsoft's released it, you know, it's a new thing, it's in preview. Yeah. Uh, and you've gotta start working now so you can deploy it in three weeks' time, or four weeks' time, to get the benefit and stay as a leader.

Yes. Right. Uh, I think expectation management about this: you need a backup plan, 'cause this might not work, is the first thing. Yes. Um, uh, and we're not here to replace humans, importantly; we're here to augment humans. Yes. Uh, and just setting the expectation with people around what they can expect as you tune it.

Yes. As you, and not necessarily through fine-tuning, but maybe through adapting the grounding and things like that, right? Yeah. Across the model, uh, or fine-tuning if necessary, which we don't see a lot of now. I think, actually, uh, if you asked me two years ago, fine-tuning was the answer to everything. Uh, but now I think, actually, you know, better grounding technologies and better grounding approaches are the better way to do it with foundational models.

So that makes it a little easier. ’cause what we can do is we can plug all those prompt flows or those, uh, orchestrations, all those, um, functional outputs and things like that into the evaluation pipeline, and we can include them as part of it. Uh, and now with structured outputs, it’s a little easier too.

So as the models come more into the structured outputs, we can architect what we wanna get back from it a lot easier. It used to be very difficult, I'll say, just in 3.5 and, and early 4. You used to, you know, say, oh yeah, I want this back; well, you might get that back. Uh, you know, the, the non-deterministic nature of it has sort of been tied down by the technology a little bit.

And that's, that's through the work of people way outside of our scope. That's, uh, engineers at Microsoft in Redmond that are handling that sort of thing and making that easier for us. Uh, but I think before that, it was, it was too time-consuming. You wouldn't be able to do it. You'd just have to wear it.
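A minimal illustration of the point about structured outputs: once a model is constrained to return JSON in a known shape, the application can validate the reply up front instead of hoping free text parses. The field names below are invented for the sketch; they are not Cogniti's schema.

```python
import json

# With structured outputs the reply is machine-checkable: validate the
# shape once and fail fast, instead of parsing free-form text.

REQUIRED_FIELDS = {"question": str, "answer": str, "difficulty": int}

def parse_reply(raw: str) -> dict:
    """Parse a model reply and verify it matches the expected structure."""
    data = json.loads(raw)  # raises ValueError if the reply isn't JSON at all
    for field, ftype in REQUIRED_FIELDS.items():
        if not isinstance(data.get(field), ftype):
            raise ValueError(f"missing or mistyped field: {field}")
    return data

if __name__ == "__main__":
    reply = '{"question": "Define osmosis.", "answer": "Net water movement.", "difficulty": 2}'
    print(parse_reply(reply)["difficulty"])  # 2
```

With older models the `json.loads` line was the unreliable step; with structured outputs the shape is enforced on the model side and this validation becomes a cheap safety net.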

Yeah, of course. Um, uh, especially, like, Cogniti now has a thousand agents at Sydney. Uh, and there's 22 institutions already using it, and we've got a, a backlog of 120 institutions that wanna onboard it. Uh, so we can't test every single prompt that they're gonna do. No. Uh, in my presentation this morning, I showed that we had 70,000 users currently on Cogniti at Sydney.

Which is our entire student cohort, pretty much. Amazing. So, you know, we're almost, we're almost there, almost all the way to every class using it. Yes. Um, and, uh, we can't test 6,000 agents manually. No, of course. You've gotta do it with an automated process. Wow.

That's an incredible user base for a relatively short period of time, in two years. There's a lot of, um, ISPs that would kill for those kind of numbers, right, Jim?

Look, uh, testing on our own students helps though, right? For sure. So, you know, we built it in-house and then we, uh, had a big, uh, captive audience, not by choice.

Uh, well, no, I’m kidding.

Technically you're right though, because, yeah, if, if we're looking at it in that context, um, TEQSA is bringing in regulation around what assessment should look like.

Yeah. Right. So we're gonna see, uh, we're going to see the two-lane assessment approach that we've adopted at Sydney become probably more and more common across the sector. Um, which means they won't have a choice. Uh, uh, Danny Liu, the, the mind behind Cogniti, uh, Professor Danny Liu, he, he makes the great point that if an assessment could be completed by AI, then it probably should be completed by AI.

So don't give a task to a human that AI's better at.

Yeah. Right. So, pitch higher. Yeah. Like, have, have bigger aspirations. I get it. That's right. It makes perfect sense.

And so theoretically then the, the rising tide lifts all the boats. That's exactly right. Yeah. So we should have better quality, um, learning outcomes.

That’s right. And better quality graduates for your graduate program.

Now I wanna talk to you about your working relationship with Microsoft. Clearly we have a lot to do with Microsoft. I'm the head of the Azure business; I spend every other day talking to and working with Microsoft, bless them. Right. And they're a critical strategic partner for us, as is Dell, who we're here with, obviously, for THETA.

Um, Satya Nadella's language around this stuff has changed. He's now talking about peering relationships. Yeah. Right. He's talking about peering off to agents that can do work, and I loved your, the role-playing example that you gave. Yep.

And so there’s a changing nature of working with the technology as opposed to it just being augmentation.

But let's talk about working with Microsoft. Obviously they've, they must love you, for a start. You're a poster-child outcome. So they've certainly gotten what they wanted out of this, but equally, you must have gotten what you wanted out of them too.

Yeah, so we, uh, um, our, our chancellor and their former CTO are very good friends, both came through Telstra, uh, together. Uh, and, uh, they kind of got together and said, what can we do at scale?

Uh, how can we make, how can we make the relationship a bit better? 'Cause what we were doing is essentially, uh, you know, we buy Azure services, right? Of course. And we have, and everybody's got Office 365 and all those services. Uh, and so we have Copilot and things like that as well. Yes. But we are building things in their environment, and how can we make it, uh, a bigger story about, uh, education being a partner with Microsoft in that way.

Uh, so we signed a memorandum of understanding with them to, uh, look to advance the use of AI in the higher ed sector. And that was the goal. It was a very broad goal. Yes. Uh, but that was the goal in that context. And what that's meant is that Microsoft have really been, uh, uh, our partner beside us on, on that journey.

Um, I, I think that we would meet with Microsoft five to six times a week, uh, across different streams of initiative. Uh, Mondays, every Monday, we meet with, uh, Peter Zing and Katie Ford and Karen McGray; they're our, uh, um, you know, uh, data and AI people. Uh, and then Jen comes along, Jen Hogan, our account specialist. Uh, and, you know, they also bring in people that are necessary for other areas where we have questions about things, and they'll get in the weeds with us around development.

So, uh, in my team I’ve got a couple of great devs, but they’ll come along to those and they’ll pose their problems to Microsoft and they’ll say, I wanna try and do this. We’re building an app or a tech or a piece of kit to do this thing. What have you done before? What have you seen before? And they are very, very collaborative.

They'll come in, they'll help us on those sorts of things. Uh, also more broadly, you know, uh, we are doing all our migrations around our infrastructure and things like that. Sure. They're our advisors on that. The usual, everybody's gotta run their business, of course. Right. Uh, and then also, obviously, Cogniti: we're in the process of commercialising it through the, uh, Azure Marketplace.

Yes. Um, so, uh, probably by June 16th, mm-hmm, not sure when this comes out, but, uh, uh, if, uh, if it's after June 16th, it's out. Uh, go on, look on the Marketplace and download Cogniti. Um, they're helping us with the investment needed to deliver that, uh, because we're a small team in Sydney. Sure. Uh, you know, there's only a few developers that have been working across Cogniti for the last two years.

Uh, and to build a marketplace app is, that’s a startup job. Mm. Everyone in Sydney has their full-time job either teaching or doing research or something like that. Uh, so how are they gonna do that? Well, Microsoft’s gonna bring some money and help us out. We had the, in that context, we had the largest ECIF investment, uh, that Microsoft has ever done in Australia.

Congratulations. How much was that? Uh, yeah, yeah. Don’t have to answer that. You don’t have to answer that. I’m being cheeky.

Well, let, let’s do some cool things first. Let’s get the Microsoft logo there. Yep. Let’s get the Sydney Uni logo there. Love it. Alright. Really good. Love that.

Perfect. Good. Speaking of commercialising this stuff, we’ve got a bit of experience with the marketplace and, and delivering some offers on there as well ourselves.

It's, it's not an easy thing. It's quite a rigorous process, for those ambitious people out there. So we definitely have firsthand experience with what you're going through, and, um, obviously Microsoft is really generous, they've got deep pockets for great ideas, and, uh, it's fantastic to see them supporting the sector.

You're talking about, um, commercialisation of this offer. Is this a new revenue stream for the uni? How do you charge for this? Like, are there new commercial models for this tech? People are talking about paying for tokens or cycles. Like, what's your view on this?

So, it's, it's a great question, because it's such a hard question.

Um, if you go out, you buy a Copilot from Microsoft, right? Uh, Copilot for Microsoft 365, and you spend your 30 bucks or whatever, you get a bunch of capabilities and a bunch of tools that sit in your environment. You do. It's great. It's really, really useful for people who live, eat, and breathe in SharePoint, in OneDrive and things like that.

Uh, but we can't charge 30 bucks for every student at an institution. No. Uh, with Cogniti, because it's got a targeted use case, right. Um, and then we look in the market: ChatGPT, 20 bucks. There's different tools that are targeted. I, I use a great tool called Gamma, dunno if you know it, but it's like nine bucks a month or eight bucks a month.

There's a lot of different things around this, but, like, they're SaaS models. SaaS models are going to work for some use cases, but generally speaking, in the higher ed sector, what we're seeing is that people wanna deploy into their own Azure environment. Mm. So what we've done is we've built out all the Bicep and all the structures to deploy into their environment.

The entire Cogniti instance. I like that. And then they just pay a small licensing fee.
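As a back-of-the-envelope illustration of why per-seat SaaS pricing breaks down at university scale, here’s a rough cost sketch. The per-seat price is the $30 figure mentioned above and the student count comes from Cogniti’s current scale; the infrastructure and licence figures are purely hypothetical assumptions, not Cogniti’s actual pricing.

```python
# Illustrative comparison of per-seat SaaS pricing vs. a
# deploy-in-your-own-tenant model with a flat licence fee.
# All dollar figures are ballpark assumptions.

def saas_annual_cost(seats: int, per_seat_per_month: float) -> float:
    """Total yearly cost if every seat pays a monthly SaaS fee."""
    return seats * per_seat_per_month * 12

def in_tenant_annual_cost(infra_per_month: float, licence_per_year: float) -> float:
    """Yearly cost when the institution runs the infrastructure itself
    and pays a small flat licensing fee on top."""
    return infra_per_month * 12 + licence_per_year

students = 70_000  # roughly Cogniti's current student base

saas = saas_annual_cost(students, 30.0)            # $30/seat, Copilot-style
owned = in_tenant_annual_cost(10_000.0, 50_000.0)  # assumed infra + licence

print(f"Per-seat SaaS: ${saas:,.0f}/year")   # $25,200,000/year
print(f"In-tenant:     ${owned:,.0f}/year")  # $170,000/year
```

Even with generous assumptions about the in-tenant infrastructure bill, the per-seat model is orders of magnitude more expensive at this scale, which is the point Jim is making.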

That’s cool. Like a solution. I get it. I mean, that’s very similar to our own Macquarie Guard. Same sort of thing. Yeah. A bunch of controls that live within existing customers’ tenancies, subscriptions, et cetera. Makes perfect sense.

And so then the customer can tune their infrastructure costs according to the scale.

Yeah. Scale it up, scale it down as they need. Awesome. Yeah, exactly. And you can manage the infrastructure costs in that way, and you can do that evaluation component that we’re talking about, right? So if you think you need a different model, you’re free to go and plug it into Cogniti, you know, so you could go and get GPT-4.5 if you wanna spend that sort of money. Yeah. Plug it into Cogniti for, uh, some sort of service and deliver it in that way.

And how is your team keeping downward pressure on your infrastructure costs in terms of your test, dev and, yeah, you know, production environments? ’Cause I know that’s a huge concern for a lot of the people we talk to when they’re developing in hyperscale cloud.

Yeah. Uh, keeping it light, and, uh, essentially migrating through dev and UAT. We have a situation where we only deploy one set of tokens, one Azure AI Search, and then have different partitions inside it. Uh, it’s just not financially viable to deploy a full dev version of it in most cases.
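A minimal sketch of that “one shared Azure AI Search, partitioned per environment” idea: rather than standing up a full search service per environment, dev, UAT and prod share one service and are separated by a naming convention on the indexes. The environment names and prefix scheme here are assumptions for illustration, not Cogniti’s actual convention.

```python
# Environments sharing one Azure AI Search service, separated by
# index naming instead of separate (and expensive) service instances.
# The environment list and prefix scheme are illustrative assumptions.

ENVIRONMENTS = ("dev", "uat", "prod")

def index_name(env: str, base: str) -> str:
    """Return the environment-scoped index name on the shared service."""
    if env not in ENVIRONMENTS:
        raise ValueError(f"unknown environment: {env}")
    return f"{env}-{base}"

# e.g. index_name("dev", "cogniti-docs") -> "dev-cogniti-docs"
```

The trade-off is the usual one: you save the cost of duplicate services, but you rely on discipline (or tooling) to keep environments from touching each other’s indexes.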

Mm. Uh, the same thing with Fabric. We do a lot of development on Fabric. Yes. Um, and until very recently, I’ll say April, if you wanted to test any of the agentic things in a dev environment, you had to deploy an F64 SKU. Uh, you know, $30,000 a month on infrastructure.

Yeah. But now you can do an F2 or F4 for a couple hundred bucks. Yeah. I think Microsoft’s really coming to the party on enabling little dev environments and enabling scalability, because previously, if you wanted to do a Fabric integration, well, you’d better have full prod and dev environments spending $40,000 to $50,000 a month on each, or you wouldn’t be able to do it.
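To put numbers on the Fabric point, here’s the arithmetic using the ballpark monthly figures mentioned in the conversation; these are the speakers’ rough numbers, not official Microsoft Fabric pricing.

```python
# Rough annual saving from running dev on a small Fabric SKU instead
# of an F64. Dollar figures are the ballpark numbers from the
# conversation, not official pricing.
MONTHLY_COST = {"F64": 30_000, "F4": 400, "F2": 200}  # assumed $/month

def annual_saving(big_sku: str, small_sku: str) -> int:
    """Yearly difference between running the big SKU and the small one."""
    return (MONTHLY_COST[big_sku] - MONTHLY_COST[small_sku]) * 12

print(annual_saving("F64", "F2"))  # dev on F2 instead of F64: 357600
```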

I love that. Jim, we’re getting the wrap-up on time. I guess, uh, final thoughts for maybe other universities, or even organisations that are yet to really embrace generative AI. How should they start?

Do one thing. Do one thing. Uh, go and deploy some Azure AI tokens. A little GPT-4 or something like that.

And then plug it through Power Apps into Teams and fiddle. Yeah. Do the most basic thing. It’s low code. It’s not gonna require any developers or anything like that. Yep. Get that in there and start to fill it out. And you will, you will develop use cases. It’s great.
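If you do want to see what sits behind that low-code front end, here’s a minimal sketch of the kind of chat request it would send. The system prompt and deployment name are placeholders, and the payload follows the common chat-completions shape used by Azure OpenAI deployments.

```python
# Sketch of a "do one thing" starter: assemble a chat-completions-style
# request body you could wire up behind a Power Apps / Teams front end.
# Deployment name and system prompt are illustrative placeholders.

def build_chat_request(deployment: str, system_prompt: str, user_message: str) -> dict:
    """Assemble a chat-completions request body for an AI deployment."""
    return {
        "model": deployment,  # the name you gave your GPT-4 deployment
        "messages": [
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": user_message},
        ],
    }

request = build_chat_request(
    "gpt-4",  # placeholder deployment name
    "You are a tutor. Ask guiding questions instead of giving answers.",
    "How do I start with generative AI?",
)
```

That’s the whole loop: one deployment, one prompt, one front end, and the use cases come from fiddling with it, as Jim says.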

Love that. Great experience. That’s lived experience.

Jono, that keeps coming up in every conversation we’ve had, Jim: getting your hands dirty.

Thank you so much for joining us on the Cloud Reset Podcast. It’s my pleasure. Fantastic.

Thank you Jim.


Cloud Reset

About the author.

Cloud Reset is the podcast where no-nonsense meets cloud strategy. Hosted by Jono Staff and Naran McClung from Macquarie Cloud Services, it’s all about cutting through the noise with straight talk and real solutions for IT leaders. With decades of experience on both client and vendor sides, Jono and Naran arm listeners with strategies to save costs, reduce risk, and maximise cloud ROI.


Get in touch.

1800 004 943 +61 2 8221 7003

