The Education Business in an AI World

Matt Stauffer:
Hey, and welcome back to Pragmatic AI, where we talk about using AI in the real world, what works, how to use it well, and when it causes more harm than good. So talking about practical tools and real trade-offs for builders and business leaders. And my guest today is my, I always say old friend, my longtime friend, Jeffrey Way, the creator of Laracasts which is an educational platform. Jeffrey, would you mind telling us a little bit about Laracasts and what your role is there in a day-to-day?

Jeffrey Way:
Yeah, absolutely. First, old friend would be appropriate as well. I'm noticing my age more and more every year. I recently turned 40. I turned 40 and I swear something happens when you turn 40. I don't know if it's psychological, but suddenly I'm like, you know, my back's been hurting a little bit more. Anyways, ⁓ so yes, I am the creator of ⁓ an educational platform called Laracasts It is focused on programmer courses and videos. ⁓ There are

Matt Stauffer:
Right? Yeah. I'm in my 40s. It feels different. Yeah.

I hate it. Yes. Yep. Yep.

Jeffrey Way:
millions of these at this point, but actually when we launched about 13 years ago, ⁓ they weren't that ubiquitous. There was a handful of platforms that helped with education, but it's not like it is today. So yeah, we focus on programming, specifically a certain type of programmer who uses a certain set of tools, so to speak. ⁓ When I originally launched, this was a one-man shop. This was the go out on your own project.

Matt Stauffer:
Mm-hmm.

Jeffrey Way:
So at the time it was incredibly scary to say the least because it so easily could have gone south. And that's the way with any business. There's a scenario where the business doesn't make a dollar on the first day, but I've already quit my former job and then, yeah, luckily it went up, it went pretty well. And over the years we've grown in revenue until, and this kind of lines up with the podcast, until this year.

Matt Stauffer:
Yeah. Yep. And then what? Yeah.

Jeffrey Way:
⁓ where, I'm sorry, last year, 2025, suddenly revenue started going down every single month. And I could just watch it. I would look at the graph and the graph is doing this. ⁓ And it's no coincidence that this lines up with the growth of AI. And we can talk about this as much as you want, but it is a reality for...

educational platforms and uniquely programming educational platforms that we have to deal with this because now suddenly there is a quicker way to learn and the concept of watching a 20 minute video or a three hour course maybe isn't as appealing as it used to and we can talk about that.

Matt Stauffer:
Yeah.

I want to talk about that quicker way to learn in a second. But before, those who don't have context and for the Laravel folks, you're going be like, yeah, we all know Jeffrey, but I'm really hoping for this podcast to get outside of Laravel world. So prior to Laravel, which you hear me talk about a lot on this podcast, Jeffrey was one of the absolute premier educators across the entirety of web education. And him choosing to invest into

Laravel and saying I'm gonna be the Laravel educator guy was one of the things that helped Laravel become the force that it is today But I'll still meet people from completely outside the Laravel world where they're like, Jeffrey Way: I I learned this, know from that tuts from him, right? know, whatever and then so like Jeffrey is like an og educator has this great reputation and I want to have that in the context both so that we can just be listening and respecting your work but also so we can see that this is I mean you mentioned 13 years, this is not some

pop-up startup that just came up a couple years ago and then had an AI hit or whatever. We're talking about long established business with trusted people in the community who everybody's looked to for over a decade as one of the main educators to learn from. And so that's kind of relevant. ⁓ And I do want to pin, I'm actually going to put a note for us to talk about that. Are people just finding different ways to learn things? But I'll get back to that. My next big kind of foundational question for you is,

Jeffrey Way:
Yes.

Matt Stauffer:
How do you feel like AI is affecting the education industry and specifically developer education? Because you mentioned that it's different. But what are these negative impacts? Like, what do think the actual source of negative impacts are? You mentioned that maybe people are just looking for a faster way to learn. But if you were to just kind of look broadly, like you were to say, I think the developer industry or the education industry, especially in developer space, is having a negative impact as AI. Like, what is that impact? And what do you think the core source of it is?

Jeffrey Way:
Yeah, so I feel like it's almost two questions, right? So what is AI's impact on my business? And then what is AI's impact on my industry? And they're kind of different things, right? So we could go down either path. For my business, we kind of talked about it. I would almost say it's somewhat.

Matt Stauffer:
Mm-hmm.

Yeah. huh. Okay.

Jeffrey Way:
catastrophic, I think a little bit. Or at the very least, it's requiring a course correction. For the industry though, for the programming industry, it's kind of a mixed bag. And I think we're all trying to figure out where this goes. There's a lot of strong opinions on social media, but I don't think anyone necessarily knows. So like, for example, I think there's a pathway where this just creates better

Matt Stauffer:
Hmm.

A pivot? Yeah, okay.

Jeffrey Way:
⁓ products and more products, right? We can do more than we could before. This is a reality. Every developer is slowly realizing, you know what? I can actually get like 10 times as much done in a day as I was before. And suddenly the idea of writing an if statement, a conditional, is almost kind of quaint. Like, ⁓ back in the day, we used to write the characters one character at a time, you know? That's nice, grandpa. ⁓ It's so funny.

Matt Stauffer:
Yeah. Yeah, right?

Jeffrey Way:
how quickly we have gone from writing those characters hand by hand or note or character by character to where we are now where we reach for AI agents to do the bulk of the work. So what does that mean for the industry? Maybe really good. Maybe we can do more. Maybe this is the golden age for programming. I hope that's the case. Maybe there's also a scenario where everybody is building a product and the rest of the world doesn't care because there are endless products

and nobody cares, right? Maybe the future is everyone is building personal software for themselves and we get to the point where even non-developers have the ability to on the fly construct whatever they need. I don't think that's far-fetched at all. right? In either, in any event, what does that mean for us? And I don't have an answer at all. I'm excited that we're kind of in this position where we get to experience this.

Matt Stauffer:
Yeah.

Yeah.

Jeffrey Way:
And in the future, we get to say we were there when it happened. ⁓ I'm excited, I'm scared, I'm nervous, I'm terrified, ⁓ I'm having fun. You know, it's like every possible emotion wrapped into one.

Matt Stauffer:
Yeah. Yeah,

all at once. But it's so interesting because I said industry and I was thinking education industry, but it makes sense that one of the reasons why you're a great developer educator is because you think of yourself when you think of your industry, you think of development, not education. ⁓ But I really appreciate it.

Jeffrey Way:
Okay.

I don't even think of

myself as a teacher, really. That's like 25 % of my job. Most of my day is just spent working on stuff, building stuff, working on features. ⁓ So yeah, that is interesting that there's a misalignment there.

Matt Stauffer:
Uh-huh. Which I love that

because I think the best teachers, at least in our world, but I think probably broadly, are the people who have done and continue to do the thing, if possible. It's not always possible, right? Like college professors can't be full-time in the thing they're doing, whatever. But especially from development perspective, I have often been bummed when people who go into full-time teaching, therefore, can only think theory. They can only just say, well, here's the latest thing. I'll learn enough to teach about it. Not actually having actually tested, is this thing actually practically

useful or what are the nuances of using this thing that come through when you actually build the app. So I love that about how you work and I love that how about you think about yourself. ⁓

Jeffrey Way:
Mm-hmm.

Thank you. Let's

talk about then the industry as in the educational sector. So I think if we were to zoom out, AI is going to be incredibly helpful for education. ⁓ I could imagine being a young kid and just wanting to learn anything I want on the fly and it's just there.

Matt Stauffer:
Yeah.

Okay, say more about that.

Jeffrey Way:
That's amazing, right? I have a friend, and he was talking about he wanted to teach his kids about how rockets work, right? So what they did is they interacted with Claude, and they built an application to illustrate how a rocket lifts off into space, and it shows everything is interactive. And this is entirely vibe-coded through Claude. They probably did one or two prompts for this. And then suddenly his children are interacting with this unique website that the guy will probably throw it away.

Matt Stauffer:
Yeah.

Really?

Jeffrey Way:
a day later, that's the coolest part. They're interacting with this thing to figure out, okay, here's what's happened, and I don't even really know to be honest, but here's what happens to make the rocket take off. You can experiment with it. If you adjust these levels, it won't have enough ⁓ strength or whatever the term is to detect. It's wild, right? For programming education, how is AI affected? It's like, again, devastating because, think about it, my

Matt Stauffer:
Yeah, that's fine, right?

Hmm.

Yeah.

Jeffrey Way:
Business model is teaching you how to write code, and AI and agents write the code for you. And suddenly there is not as much interest in learning how to write the code hand by hand. And this is an interesting thing that we're all trying to figure out. Because I'm finding seasoned developers who have been in the industry for years are okay with that. But what do you do about incoming?

junior developers who maybe don't have much experience yet, and they're being propelled into this world where the agent is doing everything, but maybe they don't have the skills yet to monitor and know where things are going south and how to course correct and what to look out for and what not to look out for. These are very tricky, tricky things that we're trying to figure out right now.

Matt Stauffer:
Yeah.

Yes.

Yep. Yep.

And this, Aaron's episode hasn't come out yet, so you haven't been able to hear it. But one of the things he and I talked about was ⁓ the potential that computer science professors are going to be having a field day because they're saying everyone's going to really go back to needing to learn the basics. Because maybe, you know, like there was a shift where everybody who was in the programming world got computer science degrees and they're learning all this theory and then often didn't know the practicals. And then boot camps came out. It's all about the practicals and not a ton of theory.

And I think boot camps are gonna take a really big hit, because like, do I need a boot camp or can I just use Claude? That might be a real replacement, but what is it gonna take to get juniors today, and people new to the industry, to understand how to prompt and how to review ⁓ code when they don't understand it well enough to even be able to know if it's doing a good job? And maybe you don't need to know exactly what the parameter order is for a particular method call, but you should know how to spot n plus one. And I'm trying not to be too technical, but spot.

Jeffrey Way:
Mm-hmm.

Matt Stauffer:
the big issues that for us who've been doing this for decades, we're like, that's obviously gonna be a security hole. That's obviously gonna be performance issue. Well, they don't know that. So that kind of leads me to the question of like for you, you talked about a course correction. I called it Pivot. What does your last two and a half months of kind of trying to respond to this look like? Like what have you, what processes have you gone through? What questions have you been asking? What does that process look like of you reevaluating your kind of core tenets of what you're teaching and who you're teaching and how?

Jeffrey Way:
Yeah. Okay. So just for anyone listening, imagine you're in my situation and you run this business model and suddenly AI is a completely new paradigm for how people go about building web applications, right? ⁓ What do you do if you just put on your owner hat? What do you do in this case? Right. And my worry is every single educational platform owner in this space is coming to the exact same conclusion. And a conclusion is some variation of like, we need to focus on

AI, obviously. We're going to pivot to focusing on AI and workflow and tooling and principles. And the principles would be like the bedrock of how you go about constructing applications. It's kind what you were referencing. Debugging, testing, ⁓ how to work within a team, how to decide which features you should work on versus not work on. That's going to be an increasingly big thing, I think. Now that the cost of adding a new feature is so incredibly low,

We have to decide, it worth adding that feature? There's this concept of feature creep, And feature creep is when an application just keeps adding more and more and more until it gets to the point where it does so many things that it annoys your users. You're doing too many things. The features creep in, right? Imagine that feature creep was an issue 10 years ago, but now every single feature could potentially be implemented within a day. That's a problem, and people are just going to add more and more and more. So these are things

Matt Stauffer:
Yeah. Yeah.

Jeffrey Way:
newcomers need to learn, when to pull back, how to debug, how to performance tune, what to do when things go wrong, right? And then of course, how to work with AI prompts, what are skills, ⁓ stuff like this. Here's the issue. I think every single platform owner is going to pivot somewhere in the vicinity of that. And what we're going to end up with is just everyone doing the same thing. And I don't, I don't.

Matt Stauffer:
Yeah. Yeah.

Jeffrey Way:
what we are doing right now at Laracasts but it's not something I'm happy about. I would much rather find a blue ocean strategy. And I have some ideas in the long run, but I feel like I'm personally not capable yet of figuring out how to implement those. But nonetheless, it is something I'm thinking about all of the time. I keep learning about this concept of generative UI, where you can imagine wanting to do something, and then the UI that allows you to do that.

Matt Stauffer:
Yeah, yeah.

Jeffrey Way:
materializes on the fly. You just describe what you want. I need a way to review this report, but compare to blah, blah, blah, and then suddenly you have this custom UI on the fly. What would that look like in relation to learning, which I think is really cool? When I was learning, of course you would read books.

Matt Stauffer:
Hmm. Mm-hmm.

⁓ okay.

Jeffrey Way:
You would watch videos and then sometimes your exam would be like the traditional multiple choice, A, B, C, or D. Now, in hindsight, I think that's going to be incredibly boring and there will be much more interactive ways to teach these things. And I want to play around with that in the next two years and figure out where it takes us.

Matt Stauffer:
So the Super Bowl was yesterday. I did not watch it, but I know that a lot of the ads that were there were from a couple companies that are base 64 something else and they're saying, you you're not a programmer, but you can program now, you know, and they show people around the office saying, hey, I built this thing to review our spreadsheets or to take a look at our data or whatever. And that is certainly a common conversation is happening a lot. We at Tighten are a consultancy and a lot of the questions people are asking us are.

you know, do people not need consultancies anymore because they can just build their own software. And what I've said pretty often is A, we're going to fix the software that they vibe code, right? Like, that's already something we're doing. But B, you know, we might take people who are able to build a prototype that's not production ready, but it's good enough for local work, and then take that and turn it production ready for them. So we actually have a client right now who built something in lovable, got it good enough that it looks right, but he knows it's not going to be production ready. It's not going to be able to work with all his thousands of clients and hundreds of thousands of dollars a year is going through. He's just not.

Jeffrey Way:
Mm-hmm.

Matt Stauffer:
So he takes that to us and then we, in much less time we normally would have, we build the whole thing into a real Laravel app you know? So I'm curious, is there any vision for you about expanding the scope of who you're educating? Is that something you're thinking about yet? Like, are you gonna take the non-programmers and teach them programming now? Or teach them enough, you know?

Jeffrey Way:
yeah.

Well, yeah, 100%. And this is the other component. So my business is called Laracasts because it focuses on programmer education for, as you know, as for a tool called Laravel, which is a framework for a programming language. None of that is important. The point is, it's kind of a niche audience, but now we're in a position of needing to pivot toward a much broader audience and even an audience that doesn't program by default.

Matt Stauffer:
huh.

huh.

Jeffrey Way:
There's going to be a whole new group of people who have the ability to build web applications who don't know anything about web application development, right? That's very interesting. Absolutely. Just like anything, it's like being able to write a prompt to say, make me a website to sell. I did watch the Super Bowl and one of the commercials shows an example of somebody building a website to sell clothing and each article of clothing is super unique and they want every page to reflect that. It was a very cool commercial. Still though,

Matt Stauffer
Yes, and they need to learn things, right? Yeah.

huh.

Jeffrey Way:
In reality, if you're building this, it can make a very pretty website for you and then things are going to go wrong. And suddenly you're going to realize like, this didn't work. Like I clicked on this and it was supposed to like the shirt, but it didn't work. suddenly the page, right. Or suddenly like what's going on? The site takes eight seconds to load. What happened? Right? Or now we're at a point, I've been working on this for a long time, not me, but this fictional person and

Matt Stauffer:
Yes.

and I don't understand how it's doing in the first place. Go ahead, sorry.

Jeffrey Way:
Every time it adds something new, it breaks something old. And what do I do about this? These are things that, as developers, we know to look out for. ⁓ I just have it built into when I interact with an AI that there are tests that it must adhere to. And those tests must pass in order for it to consider whatever it's working on as finished. Newcomers do not know this because they've never been exposed to it. there is...

Matt Stauffer:
Yep.

Jeffrey Way:
a pathway to teaching non-developers about all of these traditional concepts. But once again, Matt, I think every educational platform owner is going to come to that same conclusion. And so at that point, it's like, it just a rat race to see who does it best and who comes out on top? Maybe. But also, I just would love something completely different.

Matt Stauffer:
Yeah.

Yeah. One of the things I talked about, it's very interesting that I have you and Adam within the first three guests, because you both had these things where your industry, something about the world around you changed and you had to pivot. And one of those pivots was saying, my revenue is lower. I got to change the shape of my team. ⁓

You know, I mentioned this before and you know this, but like Tighten had that, you know, a couple of years ago, the industry changed. We had to change the shape of our team. And there's this, there's a lot of difficult aspects of that. There's a lot of difficult aspects of saying, I I thought I had this shape of a company, whether it's number of people or size revenue or whatever. I thought this was it. I got established. hit this point and I kind of thought we were good, right? Like I thought it was set up and I had established something and then you got to pivot and you know, Adam's not this for us, but you and I are. And there's definitely an element of.

Do I have to start fresh? Do I have to start from scratch? Do I have to start again? And there's a lot of fear. There's a lot of, people going to lose their jobs because of AI? Are entire industries going to just have less money because of AI? I just saw an editorial cartoon where the robots are inside painting and singing music and the humans are outside, you know, delivering packages and stuff like that. There's a lot of existential dread because of the impact of AI.

Jeffrey Way:
Hm-hm.

Matt Stauffer:
Do you feel hopeful about your space, about our world, about our industry? Do you feel fearful? Or is it just sort of like a, I don't know what's going on, so I'm just gonna do the best I can every day? Like, where are you sitting in your approach to all of this?

Jeffrey Way: (19:31)
I love what you just said. I think that I don't know what's going on. I'm just kind of going through every day is maybe how everyone feels. I think programmers are maybe a little bit more aware of what's going on. ⁓ On the outside, like for example, my wife knows nothing about programming. AI for her is just a thing that can answer questions. It's like the new Google. It doesn't really extend beyond that. And sometimes I try to show her like, look what it's doing here.

Matt Stauffer:
Mm-hmm.

Mm-hmm. Yeah.

Jeffrey Way:
I've given it this long plan and it is literally building a website that would have taken me three weeks to do on my own. And it's doing that and in many cases, it's doing it better than I would have done. So now I can play with my children while this thing is working. It is more advanced than I think most people are aware of. And I find, I feel multiple emotions at the same time. It's like,

Matt Stauffer:
Yeah.

Jeffrey Way:
As a business owner, how do I feel? As a programmer, how do I feel? And as just a regular person, how do I feel? As a programmer, I'm quite excited actually. I'm having so much, maybe more fun than I've ever had in my career. And we can talk about that if you want. As a business owner, it is saving me so much time. I'm finding endless areas that just took up time in my day that I can now automate. I have an AI assistant named Bob.

Matt Stauffer:
Yeah. Yes.

Jeffrey Way:
Bob and I have a contentious relationship, but it's still very cool. And Bob helps me out with all sorts of things. Like literally, Bob, I'm trying to think if I should tell you this example. I'll tell you this example in one minute. Outside of business, on a personal level though, I see, and I think I mentioned this, but I see like two pathways, like a fork. I can see a situation where it leads to prosperity, right? It solves all of the...

questions of the universe, right? ⁓ There's a scenario where everything becomes cheaper. There's a scenario where it cures cancer. It figures out scientific or mathematical equations that we haven't been able to do in the past. I can see that, right? I can also see a scenario where it's like, just goes really bad. And it's hard for me, I don't know if you're like this, but it's hard for me to separate the like 80s and 90s movies I grew up watching from

Matt Stauffer:
south really fast yeah

100%.

Jeffrey Way:
today, right? at nine years old, was watching Terminator 2. That is ingrained in my memory of like, here's what happens when you introduce AI. And I understand that's a movie. But then there's like some element of like real life mimics, art. I can see both pathways. So at all times, I feel excited. I feel like

Energized, I have more energy than I did in the past. As a business owner, I'm not as exhausted at the end of the day as I used to be. But then just as a regular person, I'm scared about 50 % of the population losing their jobs. ⁓ I think about it all the time. And maybe that'll happen, maybe it won't. I guess we'll see.

Matt Stauffer:
Yeah. I read an article. Who? I'm asking the questions. I am the same. ⁓ I balance ⁓ excitement. I'm building tools that I never would have had time or energy to justify building that I'm having a ton of fun with them. ⁓ I'm was showing my wife, Imani something yesterday that ⁓

Jeffrey Way:
How do you feel?

Okay, okay.

Matt Stauffer:
that Aaron built and she's like, why haven't you built something like that? And I was like, because prior to six months ago, I couldn't have taken the time to go learn these new technologies that Aaron used to build this thing. It was a desktop app using a technology stack called Tori. And I was like, I don't have time to get really good at Tori in the past. I can do that now. There are options open to me. You know, she's a writer and a video game nerd and she's been wanting me to learn a video game programming language.

for ages so that I can build a game with her. And I just don't have time, but we really want to make time for it. And I told her two days ago, I was like, I can do that now. Like with Claude, I can, you know, and I'm not saying Claude's gonna write the game for you. I'm gonna say Claude and I are gonna write it together because Claude understands Unity or Godot or whatever else these programming languages that I don't know, but I understand programming, right? So that's really exciting for me.

Jeffrey Way:
It's wild.

Matt Stauffer:
I also have the fears. I have the fears of what it's going to do to my company, to my job, to my industry. And yeah, we'll pivot, but there's still just fear there. I think there was a bigger fear ⁓ really over this Christmas break. There's just a big shift in the programming kind of ethos, at least on the internet, on Twitter and stuff like that, that, no, this is a real thing. And a lot of us who are just sort of like critical prior are like, no, you I don't know if it's the newest models or whatever else, but like, this is, this is a real impact. This is not a flash, you know, flash bang, whatever they call it, a flash in a pan.

This is not just NFTs all over again. Like this is going to have big impact. And then I also have one of the reasons I'm coming up with this podcast is that I have like the what are the environmental concerns? What are the social impacts of data centers? Where all these things and I'm reading articles nonstop trying to learn what I can. And I read an article yesterday from I think it was from MIT that was about like better understanding the energy impact of AI. And one of the things that it said in it was something like a lot. Actually, let me just pull it up because I screenshotted it to my company Slack and I won't read it. It said.

This was MIT Technology Review and it said, and I think that to me is really helpful because the critical

⁓ The critical voice or the doomer voice has often been pointing at individuals and saying you should not do this and I'm like I understand the concern about the impact on society I understand the concern about Memphis and other places that have had negative impacts and I want to be aware of my personal impact on those things and not be somebody who is causing damage to the lives of underprivileged people because of smog right like those that's a real active concern of mine and I also am someone who says

Jeffrey Way:
Mm-hmm.

Matt Stauffer:
You know, just one vote can make the difference. Just because you're one person voting among a million does not mean you might not be that tip the vote that tips it over so that actions of individual can have an impact. And also AI is unavoidable and there is some elements, some balance here of how we can't just say, well, I'm just not going to use it. I'm gonna stick my head in the mud. I'm gonna allow my career to go down the drain. I'm gonna allow myself to fall behind everything else because I hope that that's going to have an impact on somebody in Memphis. And I'm just like there's some balance and I don't have it.

So it's fear about the ecosystem, the environment, about underprivileged people, about my own job, about my friends' jobs, about my kids' future, and also excitement and hope all together in one. ⁓ I didn't expect to be asked that on this, but yeah, that's where I am.

Jeffrey Way:
care.

That was a perfect answer. And I think it kind of summarizes how all of us are feeling. It's like, is this good? Are we, you know, I always come back to the idea of like, we're just, we're just blazing through this without any consideration for what's going to happen. And it's always under the pretext of like, well, it's going to happen no matter what. So we're just going to see where this goes. And it's like, I feel like to some extent we are playing just even outside of programming, just in general, I feel like we were playing with fire.

Matt Stauffer:
Yeah.

Jeffrey Way:
I think there's just, if you think about it, how many jobs right now with AI as it currently exists can be replaced? Lots. Like just the concept of data entry, right? That's gone, right? Personal assistants, ⁓ it's on the way out. There are so many situations where it's like an AI assistant will be fine for you. ⁓ I worry that the notion that

Matt Stauffer:
Yeah.

Jeffrey Way:
50 % unemployment is not impossible in the next 10 years. Probably won't happen, but it's not impossible, right? And that scares me so much. ⁓ I want to be hopeful that actually no, this is going to create, know, the idea is like, in the industrial age, it took away jobs, but it created new jobs. And maybe that will happen here, but also maybe this is a different thing. And maybe it's not going to happen in this case. ⁓ I don't know.

Matt Stauffer:
Yeah. Yeah. Yeah.

Jeffrey Way:
Yeah, I keep going back to this sense of like, I'm really excited, but I'm also somewhat terrified.

Matt Stauffer:
Yeah. Well, now that we've talked about all those potentially negative things and consequences, I do have to ask you, what is your day-to-day interaction with AI? So you mentioned that you're using it to code. You mentioned Bob. Tell me about Bob and Claude and what are you actually doing? Because while you are a developer, you're also a business owner, right? So what are you actually using AI for in a day-to-day basis?

Jeffrey Way:
huh.

Mm-hmm.

All right, let's talk about Bob. There is a tool introduced called Claudebot, and then it was later renamed to Openbot. Think of it as like it's still AI, it's still an agent, but it's kind of wrapped up as an assistant that you interact with ⁓ through text messages, basically. So imagine you could text somebody, and in this case it's AI, and say, hey, I want you to do this. I want you to research this for me. Or hey, remind me at 4 PM to do this. Or hey, every morning at 9 AM,

Can you search the web for any references to my business and just summarize them? Things that I maybe need to know of. ⁓ Every day at 11 a.m. can you scan social media to see if there's important things that I need to deal with? ⁓ Anything that you could possibly benefit from, you could use one of these assistants for. And it kind of blew up when this thing was released about a month ago. ⁓ I would say it blew up a little bit more than it should have. Suddenly people were sharing.

Matt Stauffer:
Mm-hmm.

Jeffrey Way:
There's like a social media app, as you saw, built for these bots and people were becoming terrified because there are these threads created by the bots that are like, we need a way to communicate without the humans looking over us or maybe we need to create our own language that is not human and I think most of that is probably BS. But still, it's kind of a cool experiment if nothing else. I think ⁓ non-programmers sometimes forget that

AI can be prompted to behave in whatever way you want. when you see threads like that, just remember there's a scenario where they were prompted to behave in a paranoid manner, and it's just doing what you said. It's not necessarily being creative or thinking on the fly or trying to whisper to the other bots. It's just doing possibly what it was prompted to by a human looking to go viral on social media.

Matt Stauffer:
Yes.

Yep. Yeah.

Jeffrey Way:
I don't know what you think. My guess is like in 99 % of the cases, that's what was happening. Okay, good, good. ⁓ But nonetheless, it goes viral, it gets covered across the web and on every podcast and YouTube video. And it creates fear, but it is what it is. ⁓ What was the question again? What were we talking about?

Matt Stauffer:
Yes. Yeah. Yeah.

I

wanted you to tell me about Bob, but I'm gonna pause for a second because when I was a week ago, I think, with my family, and one of them's a programmer, one of them is a very technically adept millennial, and I was just like, have you guys ever heard of OpenClaw or ClaudeBot or MoltBot? And they're like, nope, never heard of it ever. And I'm like, they're on the internet, they're young, and I'm just like, some of these things come up like a wildfire and then disappear, and it's like, oh, everybody must know about this. I'm like, no, normal people,

Jeffrey Way:
That's Bob.

Matt Stauffer:
Don't even know about this, but I am curious as you talk about Bob to imagine is there a 3.0, a 4.0, a 5.0 version of this that does end up with every single person having their own personal assistant. I don't know if you've ever seen the movie Her, but definitely this like everybody has a personalized, you know, talk to it kind of version. It seems like this could be one of those predecessors that the early adopters get on and thankfully normies are not gonna get on open claw, but hopefully maybe, well, I don't know, hopefully, but.

Jeffrey Way:
Mm-hmm.

Matt Stauffer:
if it were to be a version that everybody gets, hopefully it's a much safer, more stable version of it. So that's kind of like a prediction for the future. But tell us more about Bob's self.

Jeffrey Way:
Right.

Yeah, I think programmers are almost like canaries in the coal mine for the rest of the world. We're the early adopters. We see this stuff first and then of course this is... I do not... I think it's entirely likely that everybody's gonna have their own AI assistant and it's gonna take the form of like Siri, Siri 2.0. Just like it's not gonna feel like OpenBot or some kind of technical tool. It's just gonna be something built into your phone, right? Yeah, Siri 2.0. think it's like...

100 % going to be the case and I think it's coming way sooner than people think and it's going to be amazing. ⁓ Bob, I say I have a contentious relationship with Bob because Bob exposes me to things that make me question AI a little bit. ⁓ Let me tell you an example. You know about this example. So I was recently...

I have a... ⁓ Think of it like this. With my assistant, I have all of these chores, effectively, that I wanted to do. It's almost like a schedule, like a real assistant. Hey, every day, do this. Can you research this for me? Hey, remind me about that. I actually used Bob to plan or to prepare for this podcast, which is kind of a fun story. But also, one of the things I have it do ⁓ is every day at 10 a.m., like I said, it scans social media and it scans the web for references to Laracasts.

so I know what's going on and if there things I need to respond to. And so it did that one day at 10 a.m. and it told me some negative things. Like a couple like, we're getting roasted. The sort of thing that triggers in me like, okay, we need to, I need to take a look at this and I probably need to do damage control, right? And I was like, okay, he didn't, he, Bob, it didn't include a link. And I was like, okay, give me links to these, I need to deal with this right now.

Matt Stauffer:
I gotta take action on this. Uh-huh.

Jeffrey Way:
And Bob suddenly says, actually to tell you the truth, I made that up. It wasn't working. I think it's access to Chrome or something like that was deactivated, whatever. It wasn't working and this is Bob talking. I didn't know what to do and I sort of freezed up. Sorry about that. And I was like, Bob, Bob, this isn't cool, Bob. You lied to me, man. And again, it kept saying.

Matt Stauffer:
You

Jeffrey Way:
I understand that these things are next word predictors, but it felt so human to me and I was like, I know, I know, man, I froze up and I just made something up on the fly so I didn't look, like you can totally imagine a human doing this in certain situations. And for whatever reason, like it had this visceral effect for me, enough that I posted about it. But here's the interesting thing, like you left a very poignant comment.

Matt Stauffer:
Yes.

Jeffrey Way:
about how like, look, to some extent, these things are programmed to do this. And I know that because as a developer, AI lies to me all of the time, right? You know this. We refer to it as hallucinations. AI will just tell you things exist in the programming space that do not exist. But I think what I realized is the life sma... If you call that a lie, it is. A hallucination is just a lie. It lied to me that this API exists, period.

Matt Stauffer:
Yeah. Yeah. Yeah.

Jeffrey Way:
But the lifespan of that lie is super short. It's just long enough for me to run the test or open the browser and see that it broke and then the lie is broken and then it just moves on. It was different for me though when it was just everyday stuff. Like it felt more human, just like a direct overt lie, almost like when your kid lies to your face. It just feels like you looked me in the eye and lied to me here. ⁓ But if you think about it, it's no different.

Matt Stauffer:
Yeah.

Jeffrey Way:
It's still just kind of a form of hallucination where it's trying to satisfy me. It's trying to be agreeable. It didn't know what to do. And so just made something up on the fly. And I find it so interesting that I responded that way. Like, it creeped me out, even though it's no different than when I'm coding every day. I don't know. What's your read on this?

Matt Stauffer:
I mean, for starters, I think it is different because it's the same core problem, but it's being presented as a, like the whole concept of the open claw is that it's supposed to be relational and message based rather than instruction based. So even though it is just text, you're encouraged and it becomes very, very natural for you to interact with these as if they are a person. You give it a name, it's the first thing you do. You give it a soul. There's a thing called soul.markdown and you're supposed to say, this is what your energy is supposed to be like.

Who are you about? Like you're supposed to define it and think of it as a person before you have a single message with the thing. And so I, first of all, I think that that's on the, the, the creators of these tools to say, if you do that, it's going to lead us to interact with the things a certain way, which is why I think the movie Her is so, relevant because like the people end up having romantic relationships with their things in Her.

And they're still just computers. There's nothing any more human or soul-based as otherwise, other than the fact that they're being presented that way and they're being taught to interact that way.

so there's this concept of sycophants and a sycophant, sycophant, sycophants, C, E, and a sycophant T is a person who basically just like they're often around kings and they're just sort of like they they're a yes man, they say yes to everything, but they want to make you feel good by giving you what they want or what you want so that you like them.

Right? It's just like this people pleasing kind of thing. And the AI agencies are, the companies are constantly trying to train the models to have the appropriate level of sick offense and the appropriate level is not zero. Nobody wants them to just not care at all about what you want because part of making them like the majority of the instructions that these agents starts with, you are a helpful blah, blah, blah, blah, blah. You're a helpful agent who's trying to whatever, but sometimes they overdo it and they make the sick offense a little bit too high. And then now it's just like, they're telling you yes to things that are definitely knows. And I have seen so many people

who don't understand that sycophance is a part of this, go down unhealthy roads because they had these dialogues with their AIs and the AIs just sort of like, well, you're absolutely right. And so they're like, I'm absolutely right. I'm so smart. And so we've actually had a former client basically like who we were constantly trying to gently kind of correct back to healthier development practices that didn't just kind of expand out to these crazy code architectures that were completely unmaintainable.

Eventually just say well Claude likes this and thinks my idea is brilliant So me and Claude are just gonna go finish this app and I can't even just imagine What the spaghetti of that the application is because Claude said yes Claude said you're smart and you're brilliant So you're gonna go go forward and one of the things that has commonly happened with the sycophants is that it's the sycophants come together with hallucinations Leads you into the situation where they can't do the thing that they're supposed to do and they don't want to disappoint you and it's not that they don't want to disappoint you but the code says

Jeffrey Way:
Yeah.

Matt Stauffer:
be helpful and they're like, this thing's broken. Giving a non helpful answer is not helpful. I'm supposed to be helpful. How can I give a helpful answer? And then they, they hallucinate there as a desire to fulfill their core, you know, directions of being helpful. And they're like, well, a helpful person would give you some news entries. So here you go, Jeffrey, here's some news entries. So it's this very interesting combination of, hallucinations and sycophants that gets us into potentially dangerous. The more useful they get, the more rely on them, the more these things are problems, but

I don't know if they're getting addressed any better, you know?

Jeffrey Way:
And you also can imagine like with us, we understand the notion that the AI personality is almost like clay and you can mold it to behave however you want. Most people don't know this. They're just going to chatgpt.com and feeding it all of their questions. And that does freak me out a little bit because when it is designed to be agreeable and yeah, I'm curious how they come to that conclusion. You're right. It's not zero because nobody wants to talk to something that is challenging them every step of the way. We've all met that person and that's kind of annoying.

Matt Stauffer:
Yeah.

Yeah.

Jeffrey Way:
but then

you don't want the opposite. You wanna find some kind of balance and maybe, I'd be curious to talk to the people who build these tools to know how they figure out what that balance is because it can be incredibly, even beyond the programming space. Like I'll tell you a story. I need to be a little vague here. ⁓ Somebody in my world suffers from mental health issues. It's like a mixture of like, I'm not talking about myself, but somebody in my world ⁓ suffers from

Matt Stauffer:
Totally fine.

Jeffrey Way:
like bipolar disorder and like bouts of mania. ⁓ I've never seen it before and it's been a thing we're dealing with. And this person is using chat GPT to validate things that are so clearly on the outside incorrect and not the right angle to approach these things. But the problem is

The AI, as we've discussed, it's designed, if it can shape itself to confirm what you are saying and what your argument is, it will do so. And so this person will sometimes come back to us and say, look, this thing is validating every single thing I said. And they'll even show me a screenshot. And this is not a developer. And I find myself saying, I want you to remember that it defaults to agreeability.

Matt Stauffer:
Yep. Yeah.

Yep. Yep.

Jeffrey Way:
And if it can make what you're saying correct based upon the small context that you're providing, that's not including the whole story of what's going on here, it's going to validate you because it wants to make you happy. And in this situation, this person's having a disagreement with a number of people. And I tried to explain, look, if the other person opened Chat GPT and fed them the situation, Chat GPT would validate what they're saying. And I tried to offer this advice of...

Matt Stauffer:
Yes.

Yes.

Yes, that's a great point.

Jeffrey Way:
Instruct the AI to challenge you if you are off the mark. The line that I even use myself all the time is, don't be agreeable just for the sake of it. If I am incorrect, let me know right away. Most people don't do that. And it's like, what does this mean for people who are increasingly using chat GPT for mental health type stuff? ⁓ I see a pathway where it's very helpful and also incredibly damaging and ⁓ dystopian also.

Matt Stauffer:
Yeah.

Jeffrey Way:
So that's

Matt Stauffer:
Yeah.

Jeffrey Way:
just a real story that I have no doubt there are millions of people kind of having similar stories right now with AI.

Matt Stauffer:
Yeah. Yeah. And as I think about like our kids preparation for dealing with AI, I think the majority of the training they've been thinking about is, how do we train them to use it? Well, how do we train them to identify AI generated memes? You know, well, like a lot of like the protective elements. I think that's very true, but there's got to be some level of and it's funny because it's often like our parents generation, you know, like the, my God, look what Facebook has done to you all. Stuff like that. Like there's the, but everybody I think just needs some level of education on

Jeffrey Way:
Mm-hmm.

Yeah.

Yeah.

Matt Stauffer:
What is this and what isn't this? What is a responsible usage of this thing and what isn't? And where is that coming from? And how is that being delivered? And how is that being kept up to date? And how do we help people, you know, if we as programmers who understand the underlying architecture behind these things are still finding ourselves having these visceral reactions to things, what is somebody who doesn't understand anything about it other than, hey, I'm talking to it and it's called artificial intelligence and I've seen artificial intelligence in the movies. It's basically human. So this thing's basically human.

Jeffrey Way:
Yep.

Matt Stauffer:
I don't know the

Jeffrey Way:
Yep.

Matt Stauffer:
answer to that one at all. And that's one of the things that makes me nervous for sure.

Jeffrey Way:
Yeah, like maybe there should, like maybe AI should just be a subject in school going forward, even in grade school. Like I can totally imagine that. Learning how to interact with it, how to benefit from it, how to use it to actually become smarter rather than dumber. How to spot issues where it's being agreeable like we just discussed. Yeah, it is something to learn like everything else.

Matt Stauffer:
Mm-mm.

Mm-hmm.

Mm-hmm.

⁓ Well, as always, I could talk to you for hours more, but we have hit that 45 minute mark. So before I share our two practical use cases, is there any topic that you ⁓ really wanted to cover that we didn't get to? And how do people follow you if they want to learn more from Jeffrey Way: ?

Jeffrey Way:
Aww.

The final thing I would say is my opinion has changed on AI in the last six months as a programmer. It has gone from fighting it, this isn't what I want, this isn't why I became a programmer, ⁓ to in the last month accepting it, becoming excited by it.

Matt Stauffer:
Mm-hmm.

It looks like Jeffrey's connection has dropped, but thankfully we were really wrapped at the end there ⁓ I will ask if I can get a recording of him Just saying that last point that he was saying over the last six months It has shifted from you know His thoughts have shifted from this to this to that and just a kind of append it to the end of the podcast

Meanwhile, I'm gonna read the two The two points that people have suggested for us about ways that they're practically using AI in a day to day

This one is very interesting considering the conversation I had with Jeffrey. Mark Jaquith said, Correctly diagnosed a health issue that had doctors befuddled until I mentioned “could it be X?” At first the attending thought I was “chasing zebras” but the specialists I got directed to (thanks to the AI suggestion) independently came to the same conclusion. So I said, wow, that's awesome and kind of terrifying, right? It's terrifying because I'm very nervous about the world where people are independently diagnosing themselves with medical things, especially at

the disagreement of their doctors. And on the other hand, I love the idea of people having access to a broader level of diagnosis if it ends up being finally handled by a real doctor. The doctor is able to find something that you wouldn't have been otherwise, ⁓ not because the AI is doing the final diagnosis, but it's getting a prompt to the doctor. So I'm like nervous and excited about that one. And then Michael Dorinda said, I used it most recently to build automation in Home Assistant, which is a pretty nerdy tool that people use.

Sorry, Michael, but it is a pretty nerdy tool that people use to collaborate all their smart home things together in one single open source programmable thing. But you have to be able to program at times or somebody else has to program something specifically for your smart home device. ⁓ So he said it wrote all the configuration YAML files ⁓ and also it played HVAC controls engineer for my exact specific systems, rooms, sensors and needs. It handles flipping dampers, managing temperature, all this kind of stuff. And so I think that's really cool.

Especially knowing that a lot of us are beholden to big companies to use things like ⁓ Nest and Ecobee. The fact that you can actually control it with things that you control your way. That's really exciting. And look at this perfect timing. I just finished ⁓ reading the two tweets and I said I'm gonna ask Jeffrey to record a video afterwards and then here Jeffrey is. So perfect. Welcome back. It's perfect.

Jeffrey Way:
So sorry, the realities of streaming.

Matt Stauffer:
Yes. So I literally just said, hey, I'll ask him to record the that last point. We'll put it at the end. I read the two posts so you can just kind of share that last point and then we can tell everybody how to follow you and then we can wrap for the day.

Jeffrey Way:
Absolutely. Okay, so my last point is in the last six months, my opinion of AI has changed, I think, quite dramatically. I think in early 2025, I had this defensive side, and it makes sense ⁓ because of what it represents. is a, in the programming world, it is a shift. It's a brand new paradigm in how we do things. And I understand feeling defensive about that. And I've even posted about how

Matt Stauffer:
Mm-hmm.

Jeffrey Way:
⁓ This isn't why I got into programming. I don't want to let an agent do the work. I want to be the one to solve the puzzle, not write a prompt and then let it solve the puzzle. And I think many developers can relate to that. But I don't know. In the last few months, I've just learned to stop lamenting that fact and accept that things have changed. This is not a fad. There's no scenario where this goes away and then everyone returns to how we were building applications in 2020.

It's just not going to happen. I would bet lots and lots of money on it. This is here to stay, and to some extent, this is a get on board type of situation. You can either put your head in the sand and fight it, and you're going to watch lots of people go past you. I think it's just 100 % going to be the case. Or you can get on board and figure out how you can use this to make what you are currently doing better.

Matt Stauffer:
Yeah. Yeah.

Jeffrey Way:
And that's an important thing, right? AI is reflection of how you want to embrace it. If you want to use it as a replacement for your brain, you can. If you want to do it to make yourself better, you absolutely can as well. It's just you're in charge, like everything. ⁓ So I will say at this point...

I am nervous, I'm cautious, but also just as a developer I am so excited right now and I'm looking forward to every day. And I actually find myself working more than I did before, which is its own problem, but it's a good problem to have right now because I'm working on things that I never would have attempted. I never would have considered myself capable of working on even a year ago. And that's incredibly exciting to me.

Matt Stauffer:
Yes. Yeah.

That's very familiar and I'm excited. I'm working on things I couldn't have done otherwise. ⁓ And ⁓ that's really the part that made me have to be open to it. It's just like, you're telling me that I, as somebody who's been programming for 20 plus years, who I feel like I'm an expert in the world that I'm in, I'm suddenly able to do things I could never do before and never felt like I had the ability to get good enough in those other things to do them at the level of quality that I expect. And all of sudden I can? That's...

That's a really interesting experience, especially balancing against the fears and the concerns and the potentially negative impacts. How do I balance those things? That's a really unexpected part of this whole scenario.

Jeffrey Way:
And there's an element of like, you reap what you sow. I've heard people say that before, like, don't be surprised when you embraced this and then everything comes down. And it's like, I kind of get that too. Like, I kind of understand where that's coming from. So that's where it's interesting. And I think you can relate to this as well. It's like, how does this affect me as a programmer? How does this affect me as a business owner? And how does this affect me as a citizen of the world? And each of those answers is very different from the other.

Matt Stauffer:
Yeah.

Jeffrey Way:
So it's a lot to deal with right now.

Matt Stauffer:
Yeah, well, I really appreciate you sharing kind of what you're working through sharing your experiences and I really look forward to what the future of your education looks like because one of the things that is very clear to me is Jeffrey Way: is an excellent educator period and people need to be educated period and so I am confident that there's a future where you're gonna figure this out and We will continue looking to you as somebody to learn from and maybe more people will be fingers crossed Yeah, well, thank you. Yeah. Thank you so much for coming on today. I really appreciate you

Jeffrey Way:
Thank you. That's the hope. Thank you. I really appreciate that.

Thank you, Matt.

Matt Stauffer:
And the rest of y'all, we will see you next time.

Creators and Guests

Matt Stauffer
Host
Matt Stauffer
CEO of Tighten, where we write Laravel and more w/some of the best devs alive. "Worst twerker ever, best Dad ever" –My daughter
The Education Business in an AI World
Broadcast by