Deepest Conversation on AI This Year | Deloitte's Edge Officer Opens Up about Mind & Machines
In this powerful episode of the Ignited Neurons Podcast, host Utkarsh Narang speaks with Pete Williams, Chief Edge Officer, about the evolving relationship between technology, leadership, and human connection.
About
Pete Williams is a dynamic speaker who makes complex technology accessible and practical. A leader in Generative AI, Pete has been immersed in the technology since its rapid adoption and helps create innovative platforms that integrate AI into our lives. He has spoken at major events for organizations like Meta, Deloitte, and RMIT, helping audiences understand and apply Generative AI to transform the way we live, work, and learn.

🎧 Tune in for a conversation brimming with wisdom, humanity, and actionable insights for leaders at every stage of their journey.
Transcript
Utkarsh Narang (00:01.126) Welcome to another episode of the Ignited Neurons podcast. Today I have a special guest, and I'd love to share how we met. What attracted me most to Pete, who I'm speaking with today, was his title: Chief Edge Officer. Welcome, Pete. I look forward to the conversation today.

Pete Williams (00:19.374) Hey Utkarsh, how are you?

Utkarsh Narang (00:20.656) I'm doing well, thank you. Before we even dive in, Chief Edge Officer, can you tell me a little bit more about that?

Pete Williams (00:28.878) Well, I was the founding CEO of Deloitte Digital. And when that started to go global... it had started to go global, but when it went to the US and was really scaling up, I stepped aside from that and went into more of a, I suppose, a futurist sort of research role, which is sort of where I prefer to play rather than managing large businesses. And it was in a thing called Center for the Edge. Center for the Edge was founded by two of the best thinkers, I believe, of the internet era: John Seely Brown, who used to run Xerox PARC, and if you're using a mouse today or a graphical user interface, you can thank John and his team. And John Hagel, John Hagel III, and just some of the stuff he wrote in the early 90s as to how the web would play out. So I got to join these two luminaries to set up an Australian branch of Center for the Edge. It's sort of like an applied futures think tank where we look at how technology and society are evolving together and what it might mean. So I'm thinking, you know, what will I call myself? You know, like, I was a partner at Deloitte... research partner? Yeah, that doesn't sound very good. You know, applied futurist? That sounds a bit, you know, self-serving. And then I sort of realised that in the old days you used to have like a CEO, a COO, a CFO, and that was it.
Now you've got COs flying around everywhere; they've even got multiple letters, you know, the CISO and the CHRO, because they ran out of three-letter ones. And I thought, well, I used to be a CEO.

Pete Williams (02:17.102) So if everybody else is calling themselves chief something officer, I wanted to call myself the Chief Edge Officer, because I'm running Center for the Edge. And the other sort of angle to it was I often found a lot of...

Utkarsh Narang (02:21.64) Mm.

Pete Williams (02:28.086) ...chief officers, with whatever was in the middle, saw their role as saying no, you know, like stopping you. And mine was like, if you want to do something edgy, the only thing that a Chief Edge Officer can say is let's go, and let's see how we can do it. So I can't say no; it's, how do I make it happen? And particularly focusing on... if we think about the word edge, you know, it's sort of where things come together. It can be on the edge, like, it's dangerous out here, edgy, something sort of sharp and interesting. So yeah, I thought that fitted pretty well.

Utkarsh Narang (03:04.259) Amazing. Thank you for sharing that. And you've almost given us a preview of what is to follow in our conversation today, because I always do this, where I have a hypothesis of what my guest and I might speak about. And today, I think it's going to be something that will keep our listeners at the edge of their seats, hopefully, and also give them a little bit of this phrase that you use, this interaction and play between society and technology. So we'll get into the weeds and the deep end. But the first question that we start the conversation with is: if that eight-year-old Pete of many, many decades ago were to come and meet you right now and have a conversation with you, what do you think? What kind of a conversation would emerge between you and that eight-year-old self?
Pete Williams (03:53.582) I'm glad you picked an easy question to start with. I think the eight-year-old Pete lived in a very different world to the sixty-five-year-old Pete. I grew up in inner-city Melbourne in a suburb called Flemington. A very rough suburb, surrounded by lots of, you know, sort of Greeks, Italians, lots of migrants who came in after the Second World War and, you know, who were my main friendship groups. So yeah, I think he would have said, you know, just go and do what you enjoy doing, mate. Yeah, because I think I tended to do that when I was a kid. And the other one is, yeah, be fearless. Yeah, don't be scared. Have a crack.

Utkarsh Narang (04:42.472) Hmm. Hmm. Be fearless. And fear is something that I have a very personal relationship with. Like, how do you really overcome fear? Because as a young kid growing up in India, there were multiple fears that, not for any specific reason, were instilled within me. But when you say that that eight-year-old will say be fearless, was Pete fearless from eight to sixty-five, as you're saying right now?

Pete Williams (05:11.694) Not always, not always, but I think, yeah, it was funny. I asked my mum once, you know, what three words would you use to describe me? And fearless was one of them. Some grads had to do an interview thing, like, what three words would your mum say? I don't know. So I rang my mum, and she said it. But I think it was always this sort of willingness to try new things or even do stupid stuff, you know. And, I mean, there are some things I'm scared of; like, I won't go bungee jumping or stuff like that. But yeah, get in, get in the middle of things and don't be scared to have a crack. And I think the other one for me particularly is don't be afraid to speak up if you've got a different view.
Utkarsh Narang (06:02.152) Yeah, so have a crack is one thing that I'm hearing you say. And then, when you have something to share, don't be afraid of speaking up. But it does not come naturally to people, because I've heard so many stories, I've had so many conversations with coaches, where people have the right ideas to share, but something stops them. What would be your guidelines, your rule book, for them to be fearless and to speak up? Which just seems like such a simple thing.

Pete Williams (06:35.79) It's like... I have this thing that I think quite a lot, and I have certain points of view, and I just think, well, if you've got something to say... I don't know, like, you know, when you go to a meeting, or, I'm on boards, right, and you go to a board meeting and some board members don't speak the whole meeting, and I'm like, what are you doing? Why are you here? If you're there and you're in the mix, well, put something in. Or the other one is: ask the dumb question. You know, that's number one, ask the dumb question. Because if you're sitting around the table and people are speaking and you're like, I have no idea what they're talking about, it's like: what are you talking about? What do you mean by that? You know, and then you see everyone else around the table going, oh yeah, yeah, what are they talking about? Or: why would you do that?

Pete Williams (07:32.69) I don't understand that. You know, or I might say, this might sound like a dumb question, but... And I realised over time, when I do that... because I don't have this sort of hang-up about it; I'm interested if I don't know something, or if somebody's looking at the world through a different lens than me. So, you know, sometimes you're having this conversation and you're like, why did they ask that question? That's a weird question.
You know, what that is to me is a signal that somebody's looking at the world from a completely different perspective than you are. That's the biggest learning opportunity you get. Or, you know, if you've got a better idea, put it in. So I don't know, I just sort of feel like if I'm there, I'm not there as an ornament. I'm there to participate. So say what you think.

Utkarsh Narang (08:20.936) Hmm.

Utkarsh Narang (08:24.712) Yeah, that's such a beautiful piece of advice, and it sounds so simple. But I think just to look at it from this perspective: that I'm not there to be just an ornament, not there to just sit around, but to add value, whether that's through a dumb question or whether that's... Yeah.

Pete Williams (08:40.238) I'll give you an example, right. So I was with a staff member and we'd done some work for a sort of prestigious high school around technology in the classroom and stuff. Anyway, so we're walking in, and she's done a lot of the work and she's talking; we get there, silence. We have the whole meeting, and I'm thinking... Anyway, so we walk out, boom, I said, stop, we're going back. She's like, what do you mean? I said, what's the point of talking about everything on the way there? You get there, you don't say anything, and now you talk on the way out. It was like, that's madness. You've got something to add; listen to how passionate you are. So what I learned from that, though, and that was relatively early in my career, was if I've got somebody who isn't used to talking... And sometimes it can be, you know, for some people it might be a status thing: in my culture...

Utkarsh Narang (09:25.788) Hmm.

Pete Williams (09:41.76) ...I'm younger so I can't speak. In my culture, I'm a woman so I have to let the man speak, whatever. And as an Aussie, I don't really have much culture because I'm a mixed bag of everything.
So it's a bit like, I would say, right, when we get in the meeting, I'm going to throw to you on these things. That's the stuff you know how to talk about; you'll be talking about it. You know, I think, as a leader, you learn these things. And sometimes you can say, why didn't you do that? Or you just need to say it. I do remember once I went to a meeting, and a client was doing something with a piece of technology that I didn't like, spending too much, taking too long. And I said to them after we left the meeting, do you think I was strong enough in my point of view that they needed to stop this project? He said, I think when you said, if you continue this project you should all be taken out and shot, that was fairly strong. So again, sometimes it's those... yeah, maybe not everyone's going to be as extreme as me, but sometimes it's just like...

Utkarsh Narang (10:44.945) Hmm.

Pete Williams (10:54.894) ...you know, throw politeness out the window and just say what you think. Not in an obnoxious way, just be really clear. I mean, I said to a client the other day, you know, you need to avoid that like the plague. They're a relatively small company, didn't have a proper IT function, but they were talking about doing this sort of integration project and self-building this stuff. And I'm like, no, you know, you don't have the capability to...

Utkarsh Narang (11:01.97) Yeah. Yeah.

Utkarsh Narang (11:12.552) Hmm.

Pete Williams (11:24.848) ...manage it, to build it, to maintain it. You know, avoid self-building like the plague. And at the end he said, I've written that down. So sometimes those strong messages are the ones that get across. But again, you don't have to be... and sometimes, you know, it's not unusual, sometimes you'll have an idea and everybody sort of shoots it down.
It's like, okay, and you might be pissed off. But the question then you've got to ask is...

Utkarsh Narang (11:31.112) Yeah.

Pete Williams (11:55.542) ...was that their fault or was that mine? What didn't I do to help them understand my perspective in such a way that they felt comfortable in saying, yeah, let's give it a crack? So again, own it would be the other thing.

Utkarsh Narang (11:57.352) Mmm.

Utkarsh Narang (12:07.048) Hmm.

Utkarsh Narang (12:12.828) Got it, got it. No, these are such amazing examples. And this idea you spoke about, how that person was speaking before you entered the room and after you left the room, but not in the room. It just shows that cultural nuance. It just shows maybe how they've grown up, what kind of leaders they've been around. So I think...

Pete Williams (12:33.314) Yeah, but again, it could have also been, you know, I'm with the boss of the school and with my boss, and I'm here. And again, it's like, you know, you're not there to carry the bags. I bring you to these meetings because you've got something to add. But again, whose fault was that? Was that hers or was that mine? Well, yeah, I assumed that, you know, maybe she would just be like that. And it's not that she was female or anything, it was just that she...

Utkarsh Narang (12:37.958) Yeah. Yeah.

Utkarsh Narang (12:49.648) Yeah.

Pete Williams (12:59.832) ...didn't feel comfortable. But then you learn: well, if I'm going to take people to meetings and expect them to talk, and they're not perhaps as unrestrained as me, tell them, I'm going to throw to you on this. And that way, they'll hopefully be ready.

Utkarsh Narang (13:17.094) Yeah, yeah, yeah. And that's your role as a leader, that you help the other person find the skills that they have, the things that are hidden within them. Love that.

Pete Williams (13:23.65) Yeah. Yeah.
That's, yeah... I mean, to me, you know, before we started, I said I've been working on things I like. I would say the number one thing I like is sort of... sometimes I call it lifting the veil: people who've got a certain view of themselves, and then you lift that up and say, actually, you are this. You always were, you just didn't ever let it out. And I do a lot of that sort of stuff. Like, I might be called in as a futurist, you know, what's the future? And I'll be like...

Utkarsh Narang (13:42.984) Hmm.

Utkarsh Narang (14:01.01) Hmm. Hmm.

Pete Williams (14:04.878) Actually, I'm sick of always having to tell you guys what the future is. Tell me the big issues of the future. You know, and they start talking and I write it down, and it's like, you know, energy, climate, aging...

Utkarsh Narang (14:08.86) Hmm.

Pete Williams (14:16.558) ...health and chronic disease, or transport, or employment, technology. And it's like, yeah, you don't need some futurist to come and tell you all that stuff; geopolitical, all that sort of stuff. It's like, okay, so you already know that. So, moving into that, how would we look at doing this? And then suddenly... Or, you know, sometimes I get people to prototype a mobile app...

Utkarsh Narang (14:30.428) Mm.

Utkarsh Narang (14:37.105) Yeah.

Pete Williams (14:43.18) ...a technology solution to something, and give them sort of a bunch of cards around social and technology issues and say, right, go and do this, you've got 20 minutes. And I might say at the start, who thinks they could design a mobile app for this? I couldn't do that, they say. After 20 minutes they've done one and they're prototyping and sharing it. It's like, who reckons they could do it now?

Utkarsh Narang (14:54.31) Yeah.

Utkarsh Narang (15:04.957) Hmm. Yeah.
Pete Williams (15:08.012) Yeah, they all could, because they hadn't given themselves permission to play outside their domain. Sometimes people think that they have to have a technical knowledge of the how, instead of actually thinking about the experience of the what, or the problem I'm trying to solve, or why this is important and how it might be better. And, you know, people can get there. Now, to build it... yeah, maybe they can't yet, but they probably can in the next six months with AI. But...

Utkarsh Narang (15:14.279) Yeah.

Utkarsh Narang (15:19.368) Yeah.

Utkarsh Narang (15:23.304) Hmm.

Utkarsh Narang (15:28.754) Hmm.

Pete Williams (15:37.936) Yeah, it's that sort of taking people outside of their norm and giving them the confidence and the self-belief to think, create, and solve collectively as well.

Utkarsh Narang (15:52.028) Yeah. So many, so many things that I want to speak about. And even before we dive deeper into AI and artificial intelligence, I mean, we're speaking about this idea of what we can do as leaders to bring up the human intelligence, because you spoke about removing that veil from people, where they've not even seen themselves in the way that maybe you observed them. And also this idea that you spoke about, giving them permission to play outside the current demands of their role or their experience, and that they need to give themselves that permission to get there. And you spoke about self-belief. What's been your journey like, Pete? And I'm assuming, and correct me if I'm wrong, we're not all born as leaders, right? We grow into becoming the leaders who we want to and choose to be. What's been your journey with leadership? Are there any leaders that you felt inspired by? Have you built this on your own? Yeah, just a little bit on that.

Pete Williams (16:53.326) Nobody does anything on their own. And perhaps, you know, maybe we go through that.
Like, I think I grew up with a sort of typical Western point of view of, you know, we're self-made and we do it on our own. I had to put myself through uni, I moved out of home when I was 17 and started at night school, and it was all me. But I had, you know, a couple of crises occur around me, and found that that sort of nonsense about being self-made, I did it all myself, you know, all that bullshit, was hopelessly inadequate as a way to go through life. And funnily enough, an audit partner at Deloitte said to me, mate, I know you've been going through some things. Here's a book.

Utkarsh Narang (17:35.389) Hmm.

Pete Williams (17:47.31) Called The Art of Happiness, by the Dalai Lama. And I'm like, I'm not gonna have some bloody Buddhist monk tell me what to do. But I was in Sydney, so I get on a plane, and I haven't got a book to read, so I'll read this bloody Art of Happiness. And I reckon I read about the first six pages, and it was like, you know, many people think they're independent, but we're all interdependent. Think about, you know, what you're doing right now, and, you know, the paper that's here, this phone, this water, this bottle: how many thousands of people have just contributed to that being here in this moment? Yeah, that's a strong point. I'll give you that one, Dalai Lama. And then the notion of impermanence, you know, like... we live in this world where we think, I have to control it and it has to be the same and I want it to sort of fit me. As opposed to: actually, nothing's permanent. Everything's impermanent. And I was sort of going through these things, and I got off the plane and I'm thinking, yeah. I think the other one too, and sorry for getting all Buddhist spiritual on you, is this notion that everything arises through causes and conditions.
And if we can sort of suspend our emotional reaction... you feel it here sometimes, like, you know, somebody yells at you or somebody does something...

Utkarsh Narang (19:13.128) Hmm.

Pete Williams (19:14.784) It's like, well, what's going on there? What's going on with me? What's going on with them? You know, what are the causes and conditions that are doing this? And I think being able to learn to control... I think self-awareness and learning to control your mind.

Utkarsh Narang (19:16.764) Hmm.

Utkarsh Narang (19:29.576) Yeah.

Pete Williams (19:30.158) You know, I'm a big believer that there's you and there's your mind, and your mind will run in plenty of directions, whether that be utopia, catastrophe, or something happens and then this is going to happen, and we pile on ourselves. Well, I think the other big one...

Utkarsh Narang (19:41.85) Mm. Mm. Yeah.

Pete Williams (19:50.094) ...is compassion. And from a Western point of view, like, I was brought up Catholic, compassion was very much something that you had for others, whereas in Buddhism, in an Eastern point of view, compassion is something that you can have for others but can also have for yourself. And I think that to me was a big...

Utkarsh Narang (20:00.37) Mm.

Pete Williams (20:13.346) ...a big change in terms of how I led, how I thought, how I viewed others. And you talked a bit about overcoming fear. I don't think I've told you... have I told you I do horse whispering? Yeah, just to make me feel a bit more normal. So I grew up in Flemington and there were stables everywhere. Flemington is where they run the Melbourne Cup. These days the horses tend to live on the outskirts of the track; in those days, they lived around the suburb that I lived in. So I had stables directly at the back of my house, at the end of my little street...
...across the bridge, and some of them were the best racing stables in the country, both harness racing and flat, you know, gallop racing. And so from about the age of, I don't know, six, I'd be hanging over the fence looking at the horses, and at eight, sort of summoning up the courage, I'd say, can I come and work with the horses? And so, you know, that would be: yeah, go and clean the stables. It's like, you get some little kid to come and pick up after horses for two hours who thinks it's the greatest thing that's ever happened to them in their life. But also then watering them, feeding them, riding them. So yeah, I had an affinity with horses, but...

Utkarsh Narang (21:26.312) Hmm.

Pete Williams (21:27.098) ...later in life I moved out to the country, and I've got two retired racehorses that live with me. And yeah, I watched a movie about a horse whisperer, and I said to my sister, who's a horse person, I'd really like to learn about horse whispering. And she said, you know what, probably the best guy in the world lives five k's from you. His name's Carlos Tabernaberri. And one of the first things he taught me about leadership (we actually did a talk on it called Lessons in Leadership: The CEO and the Horse Whisperer) was that every horse comes to you with its combined experience and its fears. You need to understand what its fears are and work with that horse gently to help it overcome its fears. Another sort of great comment he makes is that force comes in when knowledge runs out. So think about that: force comes in when knowledge runs out. Like, if you don't know what to do, you don't know how to handle a situation, you think, I'm the leader, I'm gonna rah-rah and force this stuff. It's like, no, force will create that equal and opposite reaction. So it's this notion of awareness of the horse, being present, overcoming fear, encouragement, acknowledging every small step.
If I'm asking a horse to do something... say it's me asking the horse to walk back, it's just a slight squeeze of a rope, with a thing like that, and the horse will be like, what's this guy squeezing the rope for? Then he'll be like, does he want to step back? And he might just move an inch back, and it'll be: stop, relax. You know, hey, well done. And there's another thing called pressure and release. It's like, I'm asking it to do something; if the horse is, woo, woo, I'm nervous... release, relax. Okay, you're okay, let's try it again. And yeah, it was interesting. I reckon I learnt more about leadership from Carlos the horse whisperer. I used to have weekly sessions with him. And I still practise that today, with horses and people.

Utkarsh Narang (23:16.616) Hmm.

Utkarsh Narang (23:23.74) Wow, fascinating. I did not expect that. I did not. This is what I love about having these open conversations, because I've just taken down a full page of notes on what you shared. First of all, let's go to The Art of Happiness by the Dalai Lama, one of the most beautiful books that I've read, and which you synthesized so beautifully: that we're all dependent on each other, right? So there's this interdependence that we live with, and we feel that...

Pete Williams (23:26.316) You didn't expect that one.

Utkarsh Narang (23:52.807) ...we are enough, but are we really? And that's a beautiful paradox that you could live with. And this idea of impermanence. This could be the last conversation that we have. This could be the last podcast episode that we release, because something might happen to me. And to me, that impermanence just makes everything so much more beautiful. So I love that idea.

Pete Williams (23:55.95) Hmm.

Pete Williams (24:12.62) I think the other side of impermanence is attachment. Yeah, this notion of, I'm attached to things staying as they are: it's futile. You know, you can't step in the same river twice.
We've all heard these sorts of clichés and sayings, but it's like, yeah, change is a permanent thing. So therefore don't get attached, you know, as though it can't change, because that then leads to this grasping, clinging, stressful behaviour. And, you know, you don't need to worry. I mean, people right now, as we're talking about this, are probably going, ah, look at the economy, it's all going to shit, I don't know what's going on. Yeah, it's weird. Will it kick back? Yeah. Is it good? Nah. But, you know, should you be going to sleep at night worrying about the future? If you're young, you've got plenty of time. If you're old like me, well, you know, it'll be what it'll be. So, yeah.

Utkarsh Narang (25:06.728) Fascinating. Yeah. Impermanence and attachment. I think, again, two powerful ideas. And you know, if our listeners are still listening to this, and if they continue to listen to the full episode, then... I don't know. I don't know. It's up to them. It's up to them, Pete. Are they ready to receive what you and I are speaking about? If yes, then they'll be listening. And so if...

Pete Williams (25:18.51) You reckon I've lost them already? I hope not.

Pete Williams (25:30.552) Yeah.

Utkarsh Narang (25:32.488) If they're still listening, then the person who entered this podcast is very different from the person that they are right now, and will be very different when they end this episode. And that's true for both of us as well. And then you spoke about you and the mind. Fascinating. Tell me a little bit more about that. You spoke about how the mind could go into utopia or could go into catastrophe, could think of the worst or could think of the best. But how have you separated you and the mind?

Pete Williams (25:40.856) Yeah.

Pete Williams (25:58.528) I just turn it off.
I mean, through... well, through meditative practice. It's like, you know, I can sit here listening to you, not thinking, what's the next answer I want to give? I'm like, what are you saying? Being present. And it's part of being present where you're just like, I'm not thinking about anything else. I'm not thinking, what's my next meeting? Or, you know, like when you've got those Apple Watch people, you're like, do you... do you... stop.

Utkarsh Narang (26:01.308) Hmm, that easy?

Utkarsh Narang (26:27.622) Yeah. What's next?

Pete Williams (26:28.398) I've never worn a watch; I've got a good sense of time. But it's like, yeah, just be present. And the other thing is, you can control your mind. And maybe that's this thing: your mind can be this sort of non-stop chattering monkey, that sort of existence where your mind never stops. It's like, you know...

Utkarsh Narang (26:34.95) Yeah.

Pete Williams (26:54.894) If you do some meditation or breathing, or, you know, like say with horses: you're dealing with some 500-kilo animal, yeah. And you know, my horses have never kicked me or anything, so that's a good sign. But by the same token, they could be injured and I might have to get them off a fence, which I've done. You know, you're not wanting to be thinking about what time the nine o'clock news is on, it's five to nine, better get... You've gotta be present. And if you can do that with people and not be thinking... Again, it takes practice, but it's surprisingly easy once you do it, just to get more into the habit of: my mind's not chattering away, or if I do want to think, I think, okay, I want to think about this. Good. You know, so, yeah.

Utkarsh Narang (27:41.874) Yeah. Yeah.
Yeah, it's almost like... we're speaking about this, and I love this idea, and I've spoken about this a lot: that there's this consciousness on top of the mind that is in our control. And if I can choose right now to be fully present to Pete, then I will choose that. And if I don't, then I'll be distracted with the hundred thousand things that can come into my mind. And I think the easiest thing that I've observed, Pete, is that when you're speaking to someone, and this is for in person but can also work virtually, try to look into the other person's eyes.

Pete Williams (27:54.658) Hmm.

Pete Williams (28:15.341) Yeah.

Utkarsh Narang (28:15.852) And if you can do that, and if you can share the color of their eyes after you've had a five minute conversation with them, then you were present. How does that resonate with you?

Pete Williams (28:25.644) Yeah, it's funny, you know what, I don't know if this is normal, but a lot of people say to me, like, after we've had a conversation, and I'll see them at work later and say, you know, when you spoke about this... you remember what I said? It's like, yeah, isn't that normal? No. Yeah, I don't know, I find it strange that...

Utkarsh Narang (28:42.972) Yeah.

Utkarsh Narang (28:50.972) Hmm.

Pete Williams (28:51.756) Yeah, but again, it's something that you learn. I think if you can get into that sort of zone, it just gives you that ability to, you know, survey a situation, understand, think about the emotional contexts of other people. And again, I'm not saying I'm perfect at it. You know, there are times where I'll be pissed off or whatever, but the ability to come back down and...

Utkarsh Narang (29:13.276) Yeah.

Pete Williams (29:18.958) ...talk to yourself about it, you know? Yeah, I find that something that you can do. And it's, you know, that sort of classic of people saying, no, I'm too busy to do that. Yeah, but you'd have a lot more time if you spent that time to, you know, just get in control of it.
Think about the stuff you want to think about; don't let your mind decide what it should think about. And I have these discussions with...

Utkarsh Narang (29:26.76) Hmm.

Utkarsh Narang (29:32.508) Yeah. Yeah.

Pete Williams (29:47.542) ...people quite regularly, and they just sort of... they can't accept that their mind and their consciousness are different, you know. They believe the mind controls them. Whereas...

Utkarsh Narang (29:57.362) Hmm.

Pete Williams (30:01.606) ...through meditation, and just relaxing, or letting things go, or even going on a holiday sometimes, or just being in an experience. Swimming's good, you know, because you're in water, and if you don't keep swimming you'll drown, or things like that. Fairly handy that way. You can do it when you're walking, you know, just feel the ground. Or just be intently listening to the sounds that are around you.

Utkarsh Narang (30:05.894) Yeah.

Pete Williams (30:31.284) It's, yeah, just sort of taking your mind out of that and putting it into: I want to feel my heel and my toe touching the ground, how does it feel in my shoe? And, you know, just practice. Because you don't have to sit there, sort of, you know, in the lotus position; you can do it anywhere. Yeah, although... yeah, anyway, let's move on.

Utkarsh Narang (30:35.368) Yeah.

Utkarsh Narang (30:44.316) Yeah.

Utkarsh Narang (30:49.168) Yeah, yeah, yeah. No, no, I think this is perfect. This is what the intention of the podcast is, because it's actually giving people those actions and insights that can actually help them be present. Because being present is such a super skill, Pete, that I think if we all got there, the world would be a very different place. But yes, as you were saying, we should go on. What I want to move into now... we've spoken a lot about this human experience, the human intelligence...
for us to be understanding this idea of impermanence and this idea of the horse whisperer, which we'll have more offline conversations on. But what I'm really intrigued about is, like, I'm seeing AI also become this whisperer to each one of us. I see that on LinkedIn, I see that in content, I see that in presentations, everywhere, where you see an imprint of AI in everything that we do. What do you think? How is that going to impact humanity? And how is that already impacting society? What's been your take? So yeah, take us through that thought process. Pete Williams (31:53.514) I was having a chat about this with ChatGPT last night. The thing is that oftentimes I don't have people I can easily talk about this stuff with. So I was talking about technology philosophy. Utkarsh Narang (31:59.192) How ironic that you have a conversation with GPT. Utkarsh Narang (32:12.057) Mm. Pete Williams (32:18.21) those who I follow, those who I don't necessarily agree with. The sort of, if I say, when I first went onto the web in 1993, I went onto the internet and we'd been living in this analog world, you know. Access to information or people largely was about place. It could be phone or whatever, but, you know, we weren't doing video calls like this. And so the analog world was very location dependent. So, you know, the cultures I'd be exposed to would be the people that I generally live with, unless I'd traveled. The books I wanted to read would be on a bookshelf or at a library. If I wanted news on some specialist topic, I'd go to a newsagent. If I, you know, wanted to watch a video, it'd be more likely on a limited choice of broadcast TV, and if I had a video player, you know, what was on the shelves there. So we had this very analog thing. You go on the internet and you're like, shit, I'm sitting in a cafe, the first time was in London, I'm sitting in a cafe in London, and I'm looking at information stored on a server in Boston at MIT. And then I'm like, who's paying for this?
You know, how does this work? You know, it's just the existing telephone lines, a computer, a modem, you know, there's a browser which has just got standard protocols, being able to take information if it's published in that format. Utkarsh Narang (33:41.352) Mm. Pete Williams (33:55.886) Well, who owns it? Well, no one owns it. What do you mean no one owns it? You know, it's just this open protocol that anyone can use. I'm like, shit, this is going to change everything. And I could see, like, wow, you know, more and more people will publish stuff. And then, you know, like, I'm running off a 14.4 modem or whatever back then, and I'm like, you know, do you think you'd be able to do video down here at some point? Yeah, you know, right now the bandwidth is that small, but it'll get bigger. And then so I'm thinking it's going to get richer. And then I started saying, you know, I reckon we'll be buying and selling stuff there. And then I read Nicholas Negroponte's book, Being Digital, and it was like, yeah, I can see this world. With the AI, it's not as obvious to me how it will play out. And there's two reasons, I think. One, it's moving very fast. You know, Melvin Kranzberg is the great technology philosopher. You know, it's his first law, I think it is: technology is neither good nor bad, Utkarsh Narang (34:43.25) Hmm. Pete Williams (35:01.89) but it's not neutral. So, you know, people make these things for a purpose, but what people do with it is where it plays out. So we look at the web, you know, back in the early days it was all very utopian. And now, you know, information wants to be free, unless I can get your data and try and sell everything, you know, if you have a conversation. I looked up a movie or something on Apple TV, and then the next day I'm reading the New York Times with an ad for Max, some streaming service called Max. Like, you know, who's this Max?
And then the next thing I'm going to go, like, who gave permission for Apple TV to take a bloody search thing that I looked at? And then, you know, it's like, this is not right, you know? And, you know, that's just yesterday, and it's happening. So again, I didn't sort of see that coming, but I'm more wary of it now. But right now, with the AI, it's like, firstly, the vast majority of people have yet to scratch the surface of what it could do. And if I go back to that analog world, it's like, with the web we got information Utkarsh Narang (36:04.296) Hmm. Pete Williams (36:13.078) that we could access through searches and stuff. And the better we are at searching, the better we are at finding information. But we got it in the form and context it was published. With this AI right now, particularly with the LLMs, we're getting it in the context that we're seeking it. We're getting it from a perspective that we ask it for, whether we're using act-as and play-this-role, or look at this through the lens of Melvin Kranzberg, look at it through the lens of Morozov, or, you know, look at it through the eyes of a bitcoin zealot or whatever. Actually, don't do that. Yeah, you know, it's sort of this thing, and there's this sort of Utkarsh Narang (36:47.912) Hmm. Pete Williams (36:59.01) two sides of a coin. One's called social constructivism, which is that technology evolves and it's the social and policy factors that shape it. And then there's technological solutionism: the existence of a technology means that that technology will be the solution, or a combination of technologies will solve everything. I'm more in the social constructivism camp, so I don't tend to have these absolute statements of things. And then the other big question with AI is, like, yeah, people... You know, the number one thing I say to people about using it is: don't be lazy. Utkarsh Narang (37:38.823) Hmm. Pete Williams (37:39.246) Always be validating, you know.
Learn to prompt, learn to use it well, learn to be critical. Put your thoughts in. You know, I do what I call reverse prompting, where I'll say, right, here's an easy one: I want a line, no more than 15 words, that explains what I think about creativity and what I think about innovation, and that highlights the difference. So anybody can do this. Ask me three questions on each, one at a time. Why one at a time? Because if you get three questions at once, it's all too much. So boom, boom, boom, boom, boom, boom. And then it'll come up with a line. And when I do that exercise with people, they're like, I want to tell you my line. It's like, I don't actually care about your line. What I care about is: how did you feel doing that? And then they're like, Utkarsh Narang (38:29.744) Mm-hmm. Yeah. Pete Williams (38:35.694) well, normally I'm waiting for it, but it was waiting for me. And it actually asked me these questions that sort of surfaced things in me that I hadn't really thought about, you know. And it's that sort of thing. It's like, best practice with digital, or the best outcomes with digital, has always been: take what you know, take what it knows, get an outcome in a context. So it's a two-way tool. Utkarsh Narang (38:38.504) Yeah. Utkarsh Narang (38:48.552) Hmm. Utkarsh Narang (38:58.8) Yeah. Pete Williams (39:05.648) When we get to the question of generalized intelligence, never say never, but it's the same as we were going to have driverless cars, the same as we were going to have flying cars. Where the hell are the flying cars? And then it's like, where are these driverless cars? Well, they were supposed to be here today, but they're still not, because we don't fully understand what consciousness and intelligence are. Utkarsh Narang (39:25.543) Hmm. Pete Williams (39:35.358) It's not all just in the brain, and it's not all just connections and parameters, and there are certain things that language can't describe.
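[Editor's note: the reverse-prompting exercise Pete describes can be sketched as a small helper that builds the instruction you would paste into a chat model. The function name and exact wording are illustrative, not a quote of his prompt.]

```python
def reverse_prompt(topic: str, word_limit: int = 15, n_questions: int = 3) -> str:
    """Build a 'reverse prompt': instead of asking the model for an answer,
    ask it to interview you, one question at a time, then draft from your answers."""
    return (
        f"I want a single line of no more than {word_limit} words that captures "
        f"what I think about {topic}. Don't write the line yet. First, ask me "
        f"{n_questions} questions about {topic}, strictly one at a time, waiting "
        f"for my answer before asking the next. Only after my last answer, "
        f"draft the line using what I actually said."
    )

# Example: generate the instruction for the creativity exercise from the conversation.
print(reverse_prompt("creativity"))
```

The point of the one-at-a-time constraint, as Pete notes, is pacing: three questions at once is too much, and answering sequentially is what surfaces the thinking.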
So yeah, I'm more: hey, this is fantastic. Utkarsh Narang (39:47.538) Hmm. Pete Williams (39:48.11) And some of the stuff that we're doing is quite amazing, you know, with the teams that I'm working with. It's opening up possibilities. But also, you know, I built one GPT the other day to write grants for my Landcare group. I do a lot of work in natural disasters. This is the National Indigenous Disaster Resilience CAP. I do a lot of work after major natural disasters, Utkarsh Narang (39:50.63) Hmm. Utkarsh Narang (40:07.688) Hmm. Pete Williams (40:19.624) and often writing grants and helping people get money to rebuild, or land care for the environment. And so I built this GPT, and I sort of had to do it quick. So I throw it in there and boom, out it comes. And I sent the first draft around to a few people. They're like, oh, you didn't pick up that the maximum capital you could spend was three grand, and you've got something there for ten grand. It was like: don't be lazy, always be validating. But then I reworked it to: right, here's the first step, do not move forward until you've asked the person to upload the guidelines. Then give me the key issues in the guidelines, the eligibility; measure the eligibility against the information I've given you about my organisation. Then ask me questions about why I'm thinking about this grant and what I'm doing. And then pull out the questions one at a time, Utkarsh Narang (40:47.144) Yeah. Utkarsh Narang (41:05.672) Hmm. Pete Williams (41:14.768) and let's work out the answers. So again, yeah, we're still learning. It's easy to make mistakes, even if you use it a lot. So yeah, don't outsource your mind. Utkarsh Narang (41:30.472) Don't outsource your mind. I'll come to that. Don't outsource your mind. I'm going to make a note of that. But what you're saying is: don't be lazy.
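[Editor's note: Pete's reworked grant GPT amounts to a gated, step-by-step system prompt. A minimal sketch of how such staged instructions might be assembled; the step wording is paraphrased from the conversation, not his actual GPT configuration.]

```python
# Steps paraphrased from the conversation; wording is illustrative.
GRANT_STEPS = [
    "Do not move forward until the user has uploaded the grant guidelines.",
    "Summarise the key issues in the guidelines, especially eligibility and spending caps.",
    "Check eligibility against the information provided about the user's organisation.",
    "Ask the user, one question at a time, why they want this grant and what they plan to do.",
    "Only then draft the application, flagging anything that may breach the guidelines.",
]

def grant_system_prompt(steps=GRANT_STEPS) -> str:
    """Turn an ordered list of steps into a gated system prompt for a custom GPT."""
    numbered = "\n".join(f"{i}. {s}" for i, s in enumerate(steps, 1))
    return (
        "You help write grant applications. Follow these steps strictly in order, "
        "completing each one before starting the next:\n" + numbered
    )

print(grant_system_prompt())
```

Gating the steps is what catches errors like the three-grand spending cap: the model is forced to read the guidelines before it is allowed to draft anything.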
And you're also saying that ChatGPT, or AI overall, AI and not just ChatGPT, is moving very fast. But I'm also seeing, and you spoke about reverse prompting, which I think is a great way to make it ask questions of you, so that the response is more in the context of what you need. Pete Williams (41:57.88) Hmm Utkarsh Narang (42:00.764) But what do you mean by this? Don't... What did you say? Don't give up your mind? Pete Williams (42:07.156) Don't outsource your mind. It's like, hey, I've got to do something. Hey, can you just do it? And boom. It's like, that's... Utkarsh Narang (42:13.35) Hmm. Pete Williams (42:19.086) It can help you be more productive. I'm not a fast typer. I grew up in the world of pens, and I'm left-handed, so I wasn't even allowed to use a pen, because in the old days you had those ink pens, and if you're left-handed you'd just have ink all over your hand. But in terms of this reverse prompting, I wanted to get my thoughts out there. In terms of thinking through a process, Utkarsh Narang (42:21.33) Mm-hmm. Mm. Utkarsh Narang (42:34.694) Yeah. Yeah. Pete Williams (42:48.372) and then constructing a GPT or an app to do it: what are those things? What really matters about that? How am I validating that? How am I getting it to self-validate? Utkarsh Narang (42:54.237) Yeah. Utkarsh Narang (43:02.952) Hmm Pete Williams (43:03.99) And, you know, the other one too: I find it works best in a domain that you've got. In another conversation, I was talking to it about AI and bias. I was saying that by its nature all information is biased, because we can't have perfect knowledge of all things at all times and make it available; it's just not possible. So we're always working off a limited data set. And we were sort of going through Voltaire and all of that stuff. And it actually attributed a comment to Morozov. And I'm like, that wasn't Morozov.
That was, you know, let's say it was Kranzberg or somebody. And it's like, oh shit, you're right. But again, I'm having that discussion, but I've got... Utkarsh Narang (43:35.464) Yeah. Pete Williams (43:57.708) the capacity to have this in-depth discussion about tech philosophy and history because it's the shit I read, I know it, and I've got perspectives. I want to have a conversation or be challenged, or, you know, I would argue this, what would be your counterargument? These sorts of things. But again, if it was like, you know, hey, I just went to China and I'm going to buy a property, but... Utkarsh Narang (44:00.232) Hmm. Pete Williams (44:23.886) I don't really want to waste money on lawyers, I'll get ChatGPT to write me a contract for the purchase in Mandarin. Utkarsh Narang (44:28.934) Mm. Mm. Pete Williams (44:33.032) I know nothing about that. I couldn't validate it. I can't even read it, you know what I mean? It's like, you know, the further outside your domain... And again, it's a bit like what I used to say to people about using Wikipedia. It's a great starting point. It's generally pretty good, but it has footnotes, and you can validate based on those footnotes. And that's the other one with a lot of the AI that I'm doing in organisations: having citations and stuff. So, you know, hey, here's the answer, but validate it. Well, here's the source. So you Utkarsh Narang (44:34.375) Mm. Pete Williams (45:02.936) might be using it to discover information or get a context, and then boom, here's my sources for that. And you see that in Perplexity and things like that. So yeah, I think that's the danger for all of us, that we just say, you know, ChatGPT can do it, and we don't actually apply thinking. And I think we're hearing that a lot in education as well: whilst it can be an amazing educational tool, it also can be, hey, I've got to write this Utkarsh Narang (45:07.208) Mm.
Pete Williams (45:32.768) assignment, can you write it for me? And, you know, teachers are smart enough these days to be like, you didn't write that. I mean, there's even one where... I'm on the Futures Committee for the Law Council of Australia, and I'm not a lawyer, but they wanted somebody who's a bit of a different sort of thinker or understands this space. There was a judge in the New South Wales Supreme Court who put out a note saying: do not use LLMs Utkarsh Narang (45:34.856) Yeah. Utkarsh Narang (45:39.292) Yeah. Yeah. Pete Williams (46:02.692) to do witness statements. Because, you know, he's reading these witness statements, and it's like, the witness, the statement, you didn't... that's not what he said. Because they were just putting the words into ChatGPT and saying, turn that into a witness statement. It's like, well, no, you've got to put proper statements there. So again, it's this sort of, we want the real, yeah, sometimes, not just the artificial. And if you're a shithouse prompter, Utkarsh Narang (46:04.808) Wow. Utkarsh Narang (46:10.184) Don't match. Utkarsh Narang (46:28.434) Yeah. Yeah. Pete Williams (46:28.568) you're going to get a mediocre result. So that's where I'm like: learn prompting, learn prompting, be advanced, share and learn from each other. Utkarsh Narang (46:37.256) Wow, powerful. I love these three ideas: that all information is biased, and that we need information that's well validated so that we know it's true in its sense. And then this contextualization that you're speaking about, I think that's very apt. But you know, there's a lot of fear as well, Pete. There are those who love and have these intellectual conversations with ChatGPT. But then there are those who completely fear AI and are blocking it right now. What do you think? Pete Williams (47:04.46) Yeah. Utkarsh Narang (47:05.352) How do we shift their mind? Because it's a bus you don't want to miss, like the internet you got onto in 1993.
Pete Williams (47:10.606) You should have been around in 1993. It was way, way harder. God. You know, I'll tell you what happens. New technology emerges, often information technology. We don't fully understand it, you know? It's going to change the way things are. Therefore, we'll ban it, because it could be bad, we're not sure. So, you know, if you go back to ancient history, Socrates, I think it was, wanted to ban the alphabet, because, you know what's going to happen? We're used to people talking and knowing, but they're going to write stuff down and they won't use their memories anymore. You know, in the 15th century, when the Gutenberg press came out, it was like, what about the scribes? You know, scribe was a high-status job in society; we're not going to need the scribes anymore, you know? And then it was like, shit, people are going to publish stuff other than religious texts. You know, there was a little precursor problem that had to be solved, as most people couldn't read because writing wasn't widely available. But you think of, you know, over the next say three to four centuries, the impact of printing, writing, publishing more information. And, you know, we still get that today. Oh, you know, people are going to put stupid stuff on the internet. Oh, look at the stuff they put on social media. So we've got this natural tendency to say: we don't understand something, it's dangerous, it could be bad, therefore we must stop it. Whereas I have a view of... Utkarsh Narang (48:43.72) Hmm. Pete Williams (48:46.67) it's interesting, it could be profound, it could change the way we're doing things. But I would put that caveat on it, having been an internet or digital zealot in the 90s: you know, beware the dark forces. There is a dark side. Utkarsh Narang (48:59.752) Hmm. Pete Williams (49:04.558) And even if we don't fully know what it is now, it will emerge.
And I feel strongly about that in terms of the behaviour of people and what they do with data, and how they use it in surveillance capitalism and all that stuff. I really don't like it. But again, yeah, let's just hope that... And the other one, people say to me, AI can be biased. And I'm like, yeah, I can make it... Tell me, give a point of view on this from the point of view of Donald Trump, and give a point of view on this from the point of view of the Dalai Lama. So, yeah, I can make it as biased as I want. Do a... Utkarsh Narang (49:40.04) Hmm. Pete Williams (49:45.966) you know, an extreme ideology of this versus an extreme ideology of that, you know? Okay, yeah. So it can be, it has to be, biased, because it hasn't got everything, you know, at all times, known to everyone. But yeah, we can expand data sources and all that stuff, but also we can apply our own, yeah, you know, find people with different points of view, you know, how does that play with you? So, yeah. And so I'm sort of saying, again, in Utkarsh Narang (49:51.068) Hmm. Pete Williams (50:15.92) and of itself it is neither good nor bad. But it's not neutral, in the sense that, say, OpenAI want to move to artificial general intelligence; that's what they want to do. But in the meantime, there's a lot of good stuff we can do with it, and we need to think about what the bad stuff is, which is: don't be bloody lazy and stupid, yeah. Utkarsh Narang (50:36.072) Yeah, yeah, there's good and bad to everything. And so I think we need to be very specific as to what we're taking away. Where do you see the future going here? Like, what's your best prediction? I know it's really hard with the speed at which everything is changing, but what's your prediction? What's going to happen somewhere in the future? Pete Williams (50:43.342) Hmm.
Pete Williams (50:55.502) All right, so I think, number one, the profound nature of this is that we're taking as much of humanity's knowledge as we can, that's written down, together with the conversations that are being had, and we're making that accessible to people in their context, in their words, in their language. So to me, it's democratising access to information and knowledge. Utkarsh Narang (51:11.133) Hmm. Pete Williams (51:24.95) I think that in terms of applying that knowledge, first you start with conversational searching, then you start doing stuff with it, then you start with: I want to do this, let's work together and work out how we can do that, you know. So suddenly it's becoming this enabler. And, you know, we talked before about giving people skills that they didn't know they had. Like, I learned to code: basic HTML, and I learned BASIC in '83, I think it was. Utkarsh Narang (51:44.968) Hmm. Pete Williams (51:59.864) You know what? I'm never going to be as precise in detail as you need to be if you're going to be a great coder. But you know what? I can do it. There are people who will be a gazillion times better than I ever will be. But I'm pretty good at seeing problems and how you might use technology to solve them. So I've been working with a thing called Bolt. What was it? Bolt or Jolt? Yeah, Bolt.new, last week, starting to work on that, you know, to develop some applications. One of the software engineers I work with was there, and I built a little app in half an hour. It was actually an idea-capture app, just for something different. And I know there are others out there, but they're too bloated. So this was a real simple one. And it was like, shit, you know? But again, I worked with ChatGPT to say: right, act as a specialist in prompting Bolt. This is what I want to do. Now, I want you to break it down step by step and explain why we're doing it.
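[Editor's note: a rough sketch of the "applied teacher" pattern Pete describes, asking one model to act as a specialist in prompting another tool and to explain each step of the build. The wording and function name are illustrative, not his actual prompt.]

```python
def teacher_prompt(tool: str, goal: str) -> str:
    """Ask a chat model to act as a specialist in prompting a build tool,
    walking through the work step by step and explaining each step."""
    return (
        f"Act as a specialist in prompting {tool}. Here is what I want to build: "
        f"{goal}. Break the work down step by step. For each step, give me the "
        f"exact prompt to paste into {tool} and explain why we're doing it, so I "
        f"learn the tool as we go. Wait for my result before giving the next step."
    )

# Example: the idea-capture app from the conversation.
print(teacher_prompt("Bolt.new", "a simple idea-capture app"))
```

The explain-why clause is what turns the exchange into the teaching experience Pete describes next, rather than a one-shot code dump.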
And so I'm using it as an applied teaching and learning experience. So again, wow, that couldn't happen before. Utkarsh Narang (53:07.548) Yeah. Yeah. Utkarsh Narang (53:15.208) Yeah. Pete Williams (53:15.51) And then the next one is interesting. So that's the world's information and our ability to learn. So I think that prompting, learning how to use generative AI in particular, will become a much bigger part of education in society, and also of how we actually interact with information over the long term. The next thing, I think, is that we'll start to see a lot more sort of codified processes. Like, you know, I've got this task to do, somebody's written a kick-ass prompt, I put in some variables, boom. So I don't have to craft the exotic prompt every time I want to do something, but I've got access to, and not so much just a prompt library, but, you know: boom, here's the task, here's the information, here's my context, here's this, here's that, run, do, adjust, you know, and I've got a great result. I'm finding Deep Research on ChatGPT to be really good at that. Utkarsh Narang (53:48.136) Mm-mm. Pete Williams (54:17.993) So there are more workflow apps and step-by-step apps and multi-party apps. You know, hey, I've come up with this thing and, you know, it sends it to you, boom. And then we move into the agentic stuff, where we're starting to say: right, I can do robotic process automation. You know, like, go here, log in there, pull that off, Utkarsh Narang (54:20.326) Yeah. Yeah. Utkarsh Narang (54:39.944) Hmm. Pete Williams (54:41.646) create a statement in a PDF, put it in that file, and tick that the task is done and the information's correct based upon a trusted source. But again, codified steps. Or: go and find me trusted sources of information around this person's identity. Again, that's giving the agent some agency. So it's not a codified process. Now, Utkarsh Narang (54:49.767) Mm.
Utkarsh Narang (55:01.222) Yeah. Pete Williams (55:05.846) It's, you know what they say: to stuff something up, use a human; to stuff it up at scale, use a computer. So again, it's like, okay, we're going to see these agents doing stuff. But the other thing is, let's say that you're working in an organisation and you've got a situation come up and you need to take some leave quickly. So: hey, how much notice do I have to give for leave? Utkarsh Narang (55:10.834) Hmm. Yeah. Utkarsh Narang (55:28.872) Hmm. Pete Williams (55:33.774) It could say, you know what, this person's worried about the notice they've got to give. Hey, you've got to give a week and get approval from your manager; but is there something that means you need to take it quicker? Yeah, well, I've had this issue. Okay, if you've had that, do you want me to draft an email to your manager? And by the way, you've got 10 days' worth of leave. And actually, based upon all your colleagues' calendars, it's not a bad time. Yeah, you know. So again, it's this sort of notion of Utkarsh Narang (55:57.382) Yeah. Pete Williams (56:01.602) there's a trigger, in the sense of the question you ask; the context of the question; the action that is most likely; and the ability for the system then to say, I'll go and do that for you, or, hey, is this what you want to do? No, actually I want to do this. Cool. So this sort of trigger, response, act. And so again, I think we're going to see agents come and take away a lot of crap that we have to do, Utkarsh Narang (56:23.858) Mm-mm. Pete Williams (56:25.39) to give us more time and just to make things easier. Now, the other thing is, the easier you make things, the more compliance and bureaucracy needs to come in. So yeah, it's a two-edged sword. And so I think we'll see some monumental stuff-ups with it. And again, you know, if we look at some of the stuff-ups, like say Robodebt, people say, oh, AI got that wrong.
No, it was the rules that were wrong, Utkarsh Narang (56:32.72) Yeah. Yeah. Utkarsh Narang (56:41.256) Hmm. Utkarsh Narang (56:49.384) Hmm. Pete Williams (56:49.536) and they just applied those rules at scale. It wasn't really an AI problem. It was inappropriate governance and inappropriate rules applied to a situation, and inappropriate oversight and all that sort of stuff. So again, you know, what is my governance framework in that agentic world? But it'll play out, I think, over the next few years. But then the one that I'm perhaps most excited about is the relationship that we can have with software. So software... Utkarsh Narang (57:15.74) Yeah, so powerful. Pete Williams (57:18.79) computing is a pain in the ass. We're trying to get better at interfaces and stuff, but, hey, the mute button, you know. It's still, you know, like, my Wi-Fi is down; it's just painful. And, you know, hey, did you know you can do that in Excel if you go click, click, click, boom, boom, boom? No, I didn't actually know that. But I can do the formulas; I'm good at that. I can do the copy and paste. So there's a lot of powerful software out there where, to actually get the full value from it, you have to have a deeper knowledge of all the menus and all that stuff. But, you know, this ability to just have a conversational interaction with a lot of the software that we use, I think that then brings that sort of computing power Utkarsh Narang (58:04.2) Mm. Pete Williams (58:08.844) to us. And so again, you know, this sort of notion of... I did an image on, I think it was Midjourney or one of them, DALL-E it was, and it was like: I want an image of, you know, the power that this technology gives to a person, but I don't want it anthropomorphised or whatever. I don't want it to, you know, make the robot be human. So of course it does a humanoid robot. It's like, no, you don't get it. It's like, it...
Pete Williams (58:38.888) a machine which we can control. It can do stuff for us, it gives us powers. And it gave me a guy in a Superman suit with a machine that was in control. And I think that's where the fear comes in. It's like: what if that machine actually does control us? And in some cases it will. And then, you know, how do we manage that? Actually, how do we manage that? There's an article Utkarsh Narang (58:54.79) Yeah. Yeah. Yeah. Pete Williams (59:06.541) that I co-wrote on, you know, what's the future? And the article is called Predicting the Unpredictable. You know, predicting, yeah, here you go. Can you see that? Predicting the Unpredictable: Exploring How Technology Could Change the Future of Work. And the picture that we used Utkarsh Narang (59:17.106) Predicting the unpredictable. Utkarsh Narang (59:21.682) Beautiful. Utkarsh Narang (59:31.25) Beautiful. Pete Williams (59:33.786) is a picture of a baseball guy running for an outfield catch. And the thing is that, directionally, Utkarsh Narang (59:38.034) Hmm. Pete Williams (59:42.22) we can see where things are going. So that baseball guy doesn't sit there and, you know, come up with his predictions. He moves towards the ball. And each step, and there's wind blowing and all that, but each step gets him closer to being able to make that final leap to catch it. So my way of believing that you engage with the future is to start doing it now. You know, that's why I'm playing with various systems. That's why I'm finding problems to solve. That's why I'm putting it in the hands of... Utkarsh Narang (59:47.986) Yeah. Yeah. Yeah. Utkarsh Narang (01:00:05.128) Powerful. Utkarsh Narang (01:00:10.545) Yeah. Pete Williams (01:00:12.144) One project that we're doing that's starting to get some international attention is one that we're doing with schools in Papua New Guinea. So I'm on the board of a company called LiteHaus.
It's a charity, and it's spelled L-I-T-E-H-A-U-S, in Utkarsh Narang (01:00:25.48) Mm-hmm. Mm-hmm. Pete Williams (01:00:31.532) Tok Pisin, the Papua New Guinean pidgin English. And what we do is we take end-of-life computers from large corporates, we refurb them, wipe them, and then we put them in schools and we teach the teachers. And we have a little box called a NUNET box, which has got six million articles on it: Wikipedia, Khan Academy, heaps of education stuff. And it's available to kids in those schools, and the teachers. Problem is, Utkarsh Narang (01:00:59.889) Right. most of those schools don't have internet. That's okay, because it's got a hard drive in it. But these are often teachers and kids who have never seen a computer before, let alone touched one. So we've got the training there, but it's still clunky. So we're building, using Llama and another Facebook thing called No Language Left Behind, to teach Llama to speak Tok Pisin, or to take input and output in Tok Pisin, the bridging language, and English, and provide learning plans, lesson plans, information Utkarsh Narang (01:01:07.336) Hmm. Hmm. Hmm. Pete Williams (01:01:27.618) to kids in the computer labs that we set up. Yeah, so you look at that, and, you know, that's bringing a lot of things to the table. But the other one that I'm often on about when I'm working up in PNG is, we've got to get more PNG content in here, you know. We've got to get their context, their situation and all of that stuff. And yeah, so, you know, to me it's learning about those things and also being aware of, like, Utkarsh Narang (01:01:29.65) Fascinating, fascinating. Utkarsh Narang (01:01:42.759) Yeah. Yeah. Utkarsh Narang (01:01:52.548) Mm. Pete Williams (01:01:54.574) yeah, how do we make sure that the teachers know how to use this? How do we teach the kids not to be lazy with it?
But how do we also help those kids be able to go home and say, you know what, I can really help fix up that corn crop that we've been having problems with, right? Yeah. And again, because a lot of people in Papua New Guinea live sort of semi-subsistence, so, you know, there'll be a village which will have a series of family sort of enclaves, and they'll be growing Utkarsh Narang (01:01:57.404) Yeah. Yeah. Utkarsh Narang (01:02:05.938) Beautiful. Utkarsh Narang (01:02:12.882) Hmm. Utkarsh Narang (01:02:17.373) Hmm. Pete Williams (01:02:23.01) a lot of their own food, and a couple of people might have a job, and they sort of live a bit more communally. But yeah, there's a lot of stuff that, you know, could help them. And also health, and issues like that are massive. You know, tuberculosis is still a big problem. Polio outbreaks happen. So, you know, how do we help communities like that, but do it in a way where, you know, we're not just giving them Western shit and saying this is the way you've got to use it, but being able to do stuff that's Utkarsh Narang (01:02:34.726) Yeah. Utkarsh Narang (01:02:39.41) Hmm. Yeah. Pete Williams (01:02:53.034) positive. So, you know, that's some of the stuff I do. The other one I've been doing a lot of is the natural disaster work: building GPTs that help people understand how to run a community recovery. So yeah, it's that, you know: I've got all this knowledge and information here, and here's a GPT, let's talk; but also here's a GPT I can leave with you, and you can just ask it questions about, you know, how do I do this, what happens if this happens, yeah. Utkarsh Narang (01:02:54.148) Yeah. Utkarsh Narang (01:03:04.936) Hmm. Utkarsh Narang (01:03:14.588) Yeah. Yeah. Yeah. Yeah. Amazing. What stands out for me is the enthusiasm and the energy and the curiosity with which you're playing with technology, Pete.
And I think if listeners can take away something, it is to engage with AI, engage with technology now, and learn to modify and contextualize it for the communities, the people, and the causes that they want to serve. But as we now get towards the end of our conversation, I love Pete Williams (01:03:43.235) Yeah. Utkarsh Narang (01:03:48.156) where we have reached. If, a couple of decades from now, that 80-year-old Pete were to come to you right now and give you one piece of advice, Pete, what would that be? Pete Williams (01:04:02.338) You made the right choice. Because when you get older, you end up in a situation... there's a few things about when you get older. And I'm not sure, like, say, from your upbringing and background, but nobody sort of told me or my mates at school that when our parents get old, it's really difficult, and we've got to put a lot back into helping and caring for them, as well as other relatives and all of that stuff. And you're not really prepared for it. And you know, the other one... I actually, you know, as you do, built a GPT for how to navigate aged care. Yeah, because it's just really difficult, nursing homes and all of this stuff. So there's that thing. Then the other question you've got is, like, what's the sort of purpose of my life? So I'm 61, 62 in June. And you sort of say to yourself, well, I've got a lot to offer, but I could retire now, you know, go and play lawn bowls, which I do play competitively, and I'm reasonable at it. Because I thought that's what you do, because I'm theoretically semi-retired. But it's like... Utkarsh Narang (01:04:57.554) Hmm. Hmm. Utkarsh Narang (01:05:06.642) Hmm. Pete Williams (01:05:23.758) Do you have an obligation to give back? Do you sort of say, you know what, it's all worked out for me, I'll go off and have a holiday home up in some sunny spot in winter, and...
And I had this chat with a guy called Steve Killelea, who had a couple of big tech exits and set up the Global Peace Index. And Steve, you know, he's a Buddhist as well. And, you know, we were talking, and he said to me, mate, you know, I've retired four times. He said, but, you know, the first time I retired, I decided... you know, heaps of money, money was never going to be a problem... I'll go on a tour of the world, but I'll do it, you know, through all these obscure places. And then he observed that whenever he went to a place that was involved in conflict, the economic situation was disastrous, which is what caused him to set up the Global Peace Index and stuff like that. And, you know, he also talked to me about this sort of never-ending wrestling between the good and the evil, and nobody's ever really going to win. Sometimes it might seem like one's on top of the other, but it's this sort of constant tussle. And, you know, in his view, if there's something that you can do to help the good guys, you've got to do it. And I feel an obligation to society, to myself, to my family, to get out there and keep doing good for as long as I can. And that's what's gonna happen. So if I do make it to 80, the 80-year-old will say, yeah, maybe you could have spent more time on holiday. Maybe you could have... Pete Williams (01:07:17.324) you know, spent more time with the family, although I work from home and spend a lot of time with the family. Or maybe you could have gone and made a difference in the world for, you know, a lot of people. And that's the path I chose. Utkarsh Narang (01:07:34.322) Wow. Thank you for this conversation, Pete. And I'm pretty sure that listeners took away so many beautiful things from this conversation. So yeah, wishing you the best as you go and influence and impact and inspire the societies and communities around you. Thank you for this time. And for our listeners who are listening on a podcast platform,
share this with someone who you feel would enjoy this conversation between Pete and myself. And if you're on YouTube, like, comment and subscribe, because that's how we make this wisdom go viral and beat the algorithm at taking over the world. Thank you for your time, Pete. Pete Williams (01:08:10.402) No worries, thanks for having me. All the best. See ya. Utkarsh Narang (01:08:12.872) Cheers. Just wait for it. Wait for it. Yeah. That was such a...


