A Framework for Approaching Worrying Technology

Matt Stauffer:
Hey, and welcome back to Pragmatic AI, where we talk about using AI in the real world: what works, how to use it well, and when it causes more harm than good. Practical tools and real trade-offs for builders and business leaders. My guest today is my friend Reuben Johnson, also known as YoBigRube, the founder of Fly Duo, which is a two-person studio that specializes in narrative storytelling for complicated messaging. And I know that might not perfectly translate what it is that you do. So Reuben, would you mind saying hi to the people, talking a little bit about...

kind of what you do, and then we're gonna kind of jump right into it.

Reuben "YoBigRube" Johnson:
Absolutely. Thank you so much for having me, man.

Matt Stauffer:
My pleasure.

Reuben "YoBigRube" Johnson:
I was prepping for this episode, listening to some of your past episodes. I'm really excited to be here and have this conversation. But yeah, to answer your question: the way that we look at it, the way we think about it, is high-trust storytelling for complex messaging. So when you think about brands in general, trust matters, right? If you run an organization that is doing work under scrutiny, then trust matters even more. You can lose a lot of opportunity, and in

Matt Stauffer:
Ooh, that's good.

Reuben "YoBigRube" Johnson:
extreme cases, like if you're working in really sensitive spaces, you could potentially lose access to working with patients, or even the ability to operate legally in the public sphere at all. So think about organizations like Planned Parenthood, or researchers doing clinical trials for orphan drugs, things like that with patients, or folks who are working in the sexuality and intimacy space. What we do is help to tell the stories of the

work they're doing, to communicate their message in a way that's more accessible and more approachable, something that has emotional resonance in addition to all the great work they're doing around stats and data and all that. We try to break all that stuff down, get to the emotional resonance, and tell a story that will really resonate with human beings: why this thing should matter, or this project, that kind of thing.

Matt Stauffer:
That's good.

And because I know you and love you and I know the work you do is really, really good, I also want to make a pitch that you don't only work with those folks because you also, like if somebody came along and said, hey, I've got a business and it's been really hard to tell the story of why we exist, whether internally or externally, or, you know, we really need to do a promotional thing where we get to see the people side of this new thing we're doing. Like you could do that. You could tell those stories. You are focused in the place that's a need for high trust, but like they are really good at what they do, dear listeners.

So if you're like, you know what? I want that. I don't know if it's for me. Give him a call. So

Reuben "YoBigRube" Johnson:
Thank you, Matt. That is so kind. Yeah, we can definitely help with stories that have less sensitivity attached. Just think about it this way: we can tell really sensitive stories, things that have a lot of stigma attached, and make them approachable and workable in a mainstream environment. Stories that have fewer of those challenges, we can tell equally well, and it'd just be less work, I guess.

Matt Stauffer:
Mm-hmm.

Yeah, for

sure. Got it. Okay. So you know that when we get started, one of the first things I want to ask is: what's your relationship to AI? For some folks, it's really clear, right? I'm an AI researcher, okay. But there's no AI in your title. And we have lots of different things we want to talk about, but I want to start with just asking, what's the history and present situation of your relationship with AI, to give us a foundation for the conversations we'll have?

Reuben "YoBigRube" Johnson:
Nope.

So, yeah, thank you for that question. The way I think about it: your first episode, Aaron Lawrence, I think it was. One of the things I appreciated, and I'm gonna go dig in the crates for a sec: he talked about how he was, and I think Blessing did too, I think they were both self-taught developers, unless I'm confusing them. And because of the process of being self-taught and all of the, like...

Matt Stauffer:
And Francis, yeah.

That's fine, yeah.

Yes. Yep.

Reuben "YoBigRube" Johnson:
laborious work of digging in and learning the thing, learning the craft, and just developing over years, and how that very much informs your mindset, right? How you think about tech. So I think Blessing said she was a 90s baby, so she came up in that era. So I'm gonna date myself: I'm a 70s baby. So although I...

Matt Stauffer:
If you're watching on YouTube, you can see this man. You won't believe he's a 70s baby.

Reuben "YoBigRube" Johnson:
Yeah, I know.

I get it all the time. I had a neighbor who was in his 20s, and I told him, and he's like, nah, man, I don't believe you. He goes, I'll give you 30s maybe, but nah, I don't think so. But I mean, that was probably five, six years ago, so I've gotten more gray than I was back then. So for me, I was also a self-taught developer. I got my start in, well, before that, I got my start in CAD actually.

Matt Stauffer:
Yup. Right. Yup.

Reuben "YoBigRube" Johnson:
That doesn't really matter. I digress. So when my wife and I started in tech, we were self-taught. We didn't own a computer. We had to go out and do stuff the real old-school way. You make do with what you have. Learned at the library, taught myself how to code using Notepad. I did not know about Notepad++, probably because of the computer I had. The first computer I actually got, a friend gave me, and I think it was Windows Millennium. I think it was like 1999 or something like that. So I don't think it had Plus.

Plus, or maybe I just didn't know about it. So I built my first website using that, using books, VHS tapes, all that kind of stuff. I don't know if you remember the folks from Friends, I think it was...

Matt Stauffer:
Oof, the theater.

Reuben "YoBigRube" Johnson:
Ross and Rachel. Like Jennifer Aniston, who played Rachel, and Matthew Perry, who played Chandler Bing. They did a thing for Microsoft to talk about the internet. This was a brand-new thing, and it was made to teach you about it. So they were on the VHS tape, and I remember me and my wife were like, yes, this is going to be a thing. The internet's going to be a thing. And every time a website would come up in a commercial or anything, I'd go bananas. I'm like, yes, I tied myself to the right thing, because this is brand new, you know.

Matt Stauffer:
huh.

Uh-huh.

Nice. Uh-huh.

Reuben "YoBigRube" Johnson:
But anyway, I only go into all that because it has informed how I think about tech. I was not an insider. I was very much an outsider who did this only because I had to. I still worked a very blue-collar job, that was my day job, and then we learned how to build websites, so I would do that all through the night. You know what I mean? And we had a kid. And we were very much lower class to the nth degree. So.

As a result, the way that I saw tech in general, the internet, development, all that stuff, was through that lens. So it very much informs how I took a lot of this PR stuff coming out from the tech industry: the advertising, the think pieces, the stuff in Inc. Magazine and all that. There's a lot of that I had to go and get bootleg. We would go to the store and I'd take pictures of the books and the code samples, and I would write down stuff from magazines. I couldn't afford to buy them.


Like, I could see they had this big idea of solving all these problems. But then in my day-to-day life, I saw the reality of the world that I lived in, that my neighbors lived in, and what that looked like. And it's like, y'all can't even solve the problems that are happening right now where we live, and you're telling me you're going to solve everything. You can't even deal with the fact that, fast forward to the 2020s, the AI is like... you know, it started in

Matt Stauffer:
Yeah.

Reuben "YoBigRube" Johnson:
the 2010s, say, I can't remember when it was, when you had internet-of-things stuff like the automatic hand dryers, and Black and brown people couldn't use them because the sensors couldn't recognize their skin, you know, their complexion. And then you start finding out in the late 2010s that Android is informing on Android users to the police, putting people at the scene of a crime who were nowhere near the scene of the crime.

And now they have to figure out, how do I prove my innocence when my phone just ratted me out for something I had nothing to do with? And then you find out, you know, they're using facial recognition and people are getting caught up in things they have nothing to do with. So when I think about the fact that we're seeing the hype machine of PR, of press, of the ginormous tech companies saying, we will solve XYZ thing, I'm like, y'all can't even make a product that does not

create a dragnet to pull in innocent people who've got nothing to do with the technology you're building. And at the same time, I remember in the 2010s, I'd go to Atlanta Tech Village and...

I went to a couple of, I don't remember what they used to call them, those little startup events. You remember when the hype around startups was massive, right? So you'd go to these, what were they, codeathons or whatever they were called, and they're just talking about machine learning. Before they were even using the term AI, it was machine learning and something else. And that's back when I first heard about blockchain. It was probably like 20...

Matt Stauffer:
Yeah, 100%. Yeah.

Yeah, hackathons.

Mm-hmm.

Reuben "YoBigRube" Johnson:
12, 13: blockchain, machine learning, all that kind of stuff. And I just remember it was a mix. The stuff was so jargon-heavy that I couldn't make heads or tails of most of it, but I also felt like a lot of times they were hiding stuff in complexity

because they want it to seem bigger and less accessible to the layperson than it really is. And that always makes me nervous. What are you hiding, that you can't dumb this down to a five-year-old's language? What are you hiding in the complexity? So all of this stuff

goes into how I think about AI. I know I took literally the scenic route, and then I jumped off and walked through the woods, and got into somebody else's car, and we eventually made it there three weeks later. But that's kind of how I think about it. For me, I can't think about tech without thinking about the implications when it comes to social impact, the class divide, the gender divide, orientation. Like, I love tech and I love hardware, and it probably doesn't look like

Matt Stauffer:
Hmm.

Reuben "YoBigRube" Johnson:
it

from hearing me, because the first things I say probably make it seem, I like how you said maximalist and minimalist on one of your episodes, it probably looks like I'm absolutely against tech. I'm not. But I think that we need a hell of a lot more responsibility. Because if we are having conversations about AI and we're not talking about the Democratic Republic of Congo and little kids in mines digging up raw minerals, and we're not talking about Gaza and automated weapons that are firing on civilians because they're in an open-air prison, we're not talking about

Matt Stauffer:
Uh-huh.

Reuben "YoBigRube" Johnson:
Boston Dynamics, who have robots that they've promised they will not weaponize, but that the military is using right now, ostensibly for removal of ordnance. What happens when those are weaponized? I mean, I remember during COVID they were talking about, you know, getting rid of the limiter that's on police cars so they could heat up super high to kill the virus. And all I'm thinking is, yeah, I hope I don't end up in the backseat of a police car with somebody who decides

I'm a virus, because they could cook me alive and there's nothing I can do to get out of that car. And if you've ever been in the back of a police car, it is a confining space, especially if you're not even in there for having done anything wrong. Like if they gave you a ride, if anything, you were in an accident, you needed some place to sit, and they were empathetic enough to say, hey, you can sit in the back of the car

Matt Stauffer:
Mm-hmm.

Reuben "YoBigRube" Johnson:
because you're not allowed to sit in the front. There are a lot of reasons you could end up back there, and then what would happen? You know what I mean? There are a lot of things to think about, and I think a lot of times what happens is we think about all these cool things we could build with tech. Should we? Or at least, have we thought about how it can be misused, how it can be abused? So like...

Matt Stauffer:
Mm-hmm. Mm-hmm.

Reuben "YoBigRube" Johnson:
I've been, because you invited me on, I was so appreciative, I felt like, yo, let me do my due diligence and dive even further. Because I started off very much super hesitant, and then I did my own personal exploration of how do I integrate some of this into my work, how does this impact my work? But then at the same time, because I do work in a professional sense in the sexuality space, you know, I listened to Sam Cole's

new podcast, I forgot what the hell it was called, but she's dealing with deepfakes. So, non-consensual image-based abuse and things related to that. Colloquially we know it as deepfake porn, but it is not actual porn, because it's non-consensual and all those kinds of things. So I listen to that, and then I'm an...

Matt Stauffer:
Mm-hmm.

Mm-hmm.

Reuben "YoBigRube" Johnson:
avid reader of 404 Media. And then, fortunately enough, it worked out that my appearance here is right after the South by Southwest weekend, where they just did talk after talk after talk, all on AI. And I find it really interesting that folks think about talking about AI and the impact on the future and all that, but they never really talk about in-depth solutions that deal with the social impact.

Matt Stauffer:
Yeah.

Reuben "YoBigRube" Johnson:
And some of the folks, in their defense, talked about, you know, how our utility bills went up because of the impact of data centers and all that, and regulators being bought off, and those kinds of things. And then I got to the presentation I saw that really hit me. Timnit Gebru...

Matt Stauffer:
Mm-hmm.

Reuben "YoBigRube" Johnson:
Gebru, I believe that's how you pronounce her name. I apologize if I'm messing it up. And Karen Hao, and a guy from the MacArthur Foundation. And they just, they did it. They talked about the things, and they're people who are in tech. Far more, you know, I don't have the type of consequential influence they have, but they do. And they talked about these things. I'm like, we need more conversations that talk about the powerful potential

Matt Stauffer:
Mm-hmm. Yeah.

Reuben "YoBigRube" Johnson:
but also the real realities. I don't think we talk enough about the real realities.

Matt Stauffer:
Yeah, and that's one of the reasons why I wanted to have you on. First of all, you know, we've been friends for years, and I just want to have your voice heard in many places. But also, you know the reason I talk about the maximalist, the minimalist, and the pragmatist. Those are kind of my three categories, right? The maximalist is like, AI is the future, AGI is tomorrow, I have six agents running at all times and I skip social events so I can keep my agents running. The minimalist is usually a Luddite. They're usually like, I'm going to put my fingers in my ears, I'm not even going to think about AI.

It's evil, I've read a couple things about it, I've never actually engaged with it. And then the pragmatist has to find some middle space. Not every pragmatist is going to look the same, right? But my goal is to say: what's the range of pragmatists? And I want people here who are on each side. There are more-maximalist pragmatists, there are more-minimalist pragmatists. But no matter what, if you're in that center space, you have to engage with all the aspects of it. You can't just put your fingers in your ears in either direction, because there are people putting their fingers in their ears about the environmental impact

Reuben "YoBigRube" Johnson:
Yes.

Ahem.

Matt Stauffer:
And there are people putting their fingers in their ears about the reality that this is the new reality: it is here to stay, you can't just ignore it. So I'm grateful to have you as someone who is both. You just listed off in your intro, you know, 10 or 15 massive social impacts of tech, so you're aware and cognizant and ready to rattle those off. And also, you're in tech and you're trying to figure out how to use these things, and you've been doing that for 20 years, right? You've been asking these questions of: I have criticisms of technology, and I'm also not just going to completely

Reuben "YoBigRube" Johnson:
Yeah, yeah.

Matt Stauffer:
step away from it. So with that being the foundation, when it comes to your actual day-to-day relationship to AI tooling, let's in a second go into some of those specific criticisms you have. Do you use AI at all? Do you feel comfortable using it?

Reuben "YoBigRube" Johnson:
Yeah,

so I do, even with the things I have challenges with. So, for instance, when I found out about OpenAI and ChatGPT, now, I gotta stop, you know: I'm committed to not using ChatGPT. So instead, I'm in a professional community called hire.pro.

So they're a community for solo independent professionals and entrepreneurs. Think small orgs versus big orgs and medium and enterprise, right? So one of the things that comes with your, what do you call it, your subscription profile, whatever, is a customized instance that connects to Claude. I think theirs is called Sage, but I think it uses Claude behind the scenes. So what I do, I do a lot of strategy work and

Matt Stauffer:
Mm-hmm.

Okay.

Okay.

Reuben "YoBigRube" Johnson:
brand and all of that kind of stuff. I don't code nearly as much as I used to. I guess, asterisk: I don't code as much as I used to. Now I do a lot more on the strategy side, and then I do the person-to-person stuff, like hosting and on-camera stuff, right? I still like to code and everything, but I'm also so thankful to be doing something else. I wrote too much code

Matt Stauffer:
Right.

⁓ huh.

Reuben "YoBigRube" Johnson:
for too long

and then at a certain point I burnt out and I was like, I can't write code anymore. So that being said, so I'll use it to help me. What I prefer to do is I write whatever I'm writing first and then I will, you know.

Matt Stauffer:
Yeah.

Reuben "YoBigRube" Johnson:
put it in through there and I'll get feedback on it. How do you think this is gonna land for my audience? Or, one of the things that I like a lot: I'll put something in that I've written and I'll say, I need you to review this, and then I need you to tell me, am I putting out anything that's misinformation, disinformation, or what have you? I need you to only cite verifiable sources that are reputable outlets, and then I want you to give me all those citations. So I'll have it

Matt Stauffer:
Mm-hmm.

Reuben "YoBigRube" Johnson:
go through and look at my own stuff. Because, you know, especially as I've gotten older, I certainly do forget shit. I used to have a memory like a steel trap, not so much anymore. So when I write something longer, I'm like, let me have it verify it, because I don't want to put some stuff out that is wrong. And a couple of times it's said, well, you said this, this could be argued, but you might want to look at

Matt Stauffer:
Yeah.

Mm-hmm.

Reuben "YoBigRube" Johnson:
this and that, and then it'll give me the sources. I'll go read the sources and I'll decide, okay, yeah, I can see that, let me revise how I said that. I don't want to put out something wrong. I spend most of my time on LinkedIn as a social network. I used to be big into Twitter, but after, you know, ole boy bought it, I didn't wanna contribute to his empire, so I deleted all my stuff. So I spend most of my time on LinkedIn, right? So.
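For listeners who want to borrow this workflow, the review-and-cite request Reuben describes can be sketched as a tiny prompt builder. This is a minimal illustrative sketch: the exact wording and the `build_review_prompt` name are assumptions for the example, not his actual prompt.

```python
def build_review_prompt(draft: str) -> str:
    """Assemble a fact-check prompt along the lines described above:
    review the draft, flag possible misinformation, and demand
    citations from reputable, verifiable sources."""
    instructions = [
        "Review the draft below before I publish it.",
        "Tell me if anything in it could be misinformation or disinformation.",
        "Only cite verifiable sources from reputable outlets.",
        "Give me every citation so I can read the sources myself.",
    ]
    # Join the instructions, then separate them from the draft itself.
    return "\n".join(instructions) + "\n\n---\n" + draft


# Example: build the text you would paste into whatever assistant you use.
prompt = build_review_prompt("Notepad++ was first released in 2003.")
print(prompt)
```

The point of the last two instructions is the part Reuben emphasizes: the model hands back sources, and you still read them yourself before revising.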

Matt Stauffer:
Mm-hmm.

You

the Empire Room.

Reuben "YoBigRube" Johnson:
Um, where was I? Shit, where was I? See, I also have ADHD, like you. So, as you could probably tell from the frenetic pace at which I talk, in the back of my mind I'm thinking a million things at once. So yeah, I'll have it... Oh, go ahead. My bad.

Matt Stauffer:
Yeah.

huh.

Yeah, well.

No, no, you're good. So yeah, you kind of mentioned how you're using it, and you also mentioned in that, you know, I found out about OpenAI, so I'm using Claude. I mentioned in a previous episode, which has not come out yet, so you haven't heard it, that I actively use Claude. And in some ways we've seen Claude become like the good guy in some of the culture war aspects of this. At the same time, I am an author, and Anthropic scooped up my book and hundreds of thousands of other books to train the thing. And therefore I'm a part of a class action lawsuit against Anthropic for being the bad guy there.

So one of the things that I'm really curious about is

Almost every episode so far, we've had a moment, usually about 30 seconds, where we just go, you know, I'm not really sure exactly what to do about the potential social or economic or whatever negative impacts AI is having. And we're at a point where we're not really sure what to do about it. And the guest and I kind of go, yeah, I'm still sitting with that, I'm still watching. There's no super clear answer here, but it makes me a little nervous. And I think that's enough, that's fine. But I think that you have a lot

more

than your average technologist's history of trying to figure out how to balance the appropriate response to something you have active criticisms of, and that is also becoming necessary for interacting with your industry. And so you mentioned, yeah, okay, I don't use OpenAI, I do use Claude. So you've already got some metrics that you're coming up with there. But as you decide whether or not to engage with a tool like this, how do you make those decisions? Because there's something bad about all of them, and you

Reuben "YoBigRube" Johnson:
Yes.

Matt Stauffer:
decide to engage or not.

Reuben "YoBigRube" Johnson:
So, I love that question. The first thing I try to do, and one of the things I'm researching, and I'm thankful that I am in tech, because I've had a lot of friends like you who are super smart with this stuff and have a lot of resources, is how can I find an open source LLM that

isn't a malicious repository. Because I don't know if you've read the stuff on GitHub, but there are some malicious ones out there that folks are installing and setting up, and then now you've got an issue on your machine. So I've got some friends on it, because I'm looking at how do I get an open source one and set it up locally. Then I'm also not causing issues by running off a data center, which is powered by stuff that is

way lower down the supply chain, causing a lot of issues. Kids in mines, electric companies blowing out our electric bills, people's towns losing water, you know, because of these data centers. So that's the first thing I think about: how can I get it on my local system? Then I only use it on demand, as needed. And also I can control what data goes in and what data comes out, so to speak. On the other side, when you talk about the...

What do you call it, the intellectual property side. I think about Aaron Swartz, the young guy who unfortunately committed suicide. The record labels went after him for sharing... no, it wasn't the record labels, I can't remember. He was sharing scientific papers, and the federal government went after him. He was facing decades of prison time, and because they were making an example out of him, he ended up committing suicide.

Fast forward: Meta has scraped a bunch of data, and they're not facing anything. In fact, they're rewriting legislation to protect these AI companies. So I definitely think a lot of this has more to do with power, politics, and systems than it has to do with helping to elevate humanity. I understand the impact of, like...

a theology called transhumanism, and there are folks who are deeply invested in that belief system, where they want to transcend humanity, and that is their real push for generative AI, versus the type of machine-learning AI stuff we have now. But that's a whole nother thing. As far as problems and how I think about it: I think about what I can do to run it locally and have as little negative impact as possible.

I do think it's really important, if you're a person who's experienced being on the wrong end of systems of harm, to understand those systems. Because if a system is gonna be used to put you at a disadvantage, it is in your best interest to understand that system to the best of your ability. If only to be able to find innovative ways to counteract

the impact of that system on you or your community, what have you. So in that way, I don't think putting my head in the sand is a value. Like for a little bit, I was thinking maybe just completely stay away from AI, don't touch it, whatever, but I'm like.

Yeah, I don't think that's gonna help. Then I heard about the story of the young guy, I think it was in Mississippi or Alabama, at the beginning of COVID, where there were something like 100,000 people sitting in a backlog trying to get food stamps, and then everything was shut down, and these people were literally starving in the richest country in the world. And he went ahead, went into the data set, and approved all those people who were just languishing in a purgatory of no yes, no no. He just approved them.

Matt Stauffer:
Mm-hmm.

Mm-hmm.

Reuben "YoBigRube" Johnson:
Like,

yeah, he could only do that because he understood the systems. And he was the Robin Hood of the moment that those people needed. They got the food they needed. And for folks who are gonna argue, yeah, that's this, that, and the third: well, remember, a billion dollars a day is being spent on war to harm people. I really don't care if 100,000 people who are starving get food.

Matt Stauffer:
Hmm.

Mm-hmm. Mm-hmm.

Reuben "YoBigRube" Johnson:
For me, you can't argue against that in good faith. You know what I mean? So that's kinda how I feel about that. But when I think about it, I'm thinking about, how can I have the least negative impact? How can I learn the most about these things? And then I look at the companies who are making XYZ product: who's doing the least harm?

Matt Stauffer:
Yeah. Mm-hmm. Yeah.

Yeah.

Mm-hmm.

Reuben "YoBigRube" Johnson:
Sometimes

it's like, what is your least evil option? And then how can I at least temporarily use that? And then, how do I figure out my Windows-to-Linux switch, if you will? You know what I mean? What does that look like? Because some stuff, I can't flip a switch and do it overnight, even though I might want to.

Matt Stauffer:
Yeah.

Mm-hmm.

Reuben "YoBigRube" Johnson:
But I have to write a strategy so I can figure out how to switch over to an alternative. In the meantime, maybe instead of, you know, X company, I choose this. So this is my analogy. When Chrome came out, and I'm not a big Google person, I've been skeptical of them for many years, right? When Chrome came out, I found it really odd that they put out the browser, they put out the phone, they put out the email

when typically most stuff was getting paid for, and there was just something about it I didn't trust. I can't remember exactly why, but I'm like, okay, let me just go to Firefox. Firefox has its issues, because they take like $50 million a year from Google to make Google their default search engine. So that is something to think about, but for me personally, they were the best option

Matt Stauffer:
Mm-hmm.

Reuben "YoBigRube" Johnson:
versus the alternative, versus, say, Internet Explorer at the time. So that's kind of how I look at it. Who's doing the least damage? Who can I roll with right now? And then when I think about my path to the future and where I wanna be, how do I get there with the least harm,

Matt Stauffer:
Right. Yeah.

Yeah.

Reuben "YoBigRube" Johnson:
harming the least number of people? And then, how do I learn the most about it? I think one of the steps I haven't done as well at, that I'm working to do better at, is: how can I also talk about these things in a public forum, to maybe inform other folks who don't have the time to learn about it or aren't aware of it? Because that's how I've learned a ton, people sharing this information. That's something I'm working hard to do better at.

Matt Stauffer:
Mm-hmm.

Yeah.

So I heard you mention minimizing harm. I heard you mention

even searching for and helping create alternatives that are less harmful. You talked about the local LLM, and for those who aren't aware: if you run an LLM locally, like Reuben mentioned, A, you get the privacy aspect, because your data is not being sent up to them. And B, you don't eliminate all the energy costs, because the largest share of the energy cost is in the training, and unfortunately, for those local LLMs, that energy was already used to train them. But the usage side is more in your

control when you're running the thing locally. So you're minimizing harm, and you're actively looking for new ways to minimize harm. You said there was this potential to just put your head in the sand, and you said, I don't think that's helpful, in part because you can't be a part of changing and fixing outside systems as easily from the outside as you can from the inside, or at least from an awareness of what's going on there. So I hear that for sure. Do you feel... I think that a lot of folks

who have some sort of an activist bent, or a desire to make a change in the modern day, there is a very common thing of, like,

making the symbolic stand, or getting educated, is enough, right? Like, well, I chose to use this browser instead of that browser. I chose to use this social media platform versus that social media platform. I did whatever, and now I'm good, right? Is there a temptation to make the symbolic choice, I'm using Claude instead of whatever, or I'm using a local model or whatever, and then be done there? And if so, what does it look like to say, no, that's not enough?

I want to be a part of something that's further, you know. It's sort of like the identity politics, right? Like, I'm the type of person who uses Firefox instead of Chrome, and now that's enough. Like, is that enough, or what does it look like to be pushing yourself to do more rather than just taking that one little kind of step? Does that make sense?
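A minimal sketch of the local LLM setup mentioned above, for anyone curious what "running it locally" looks like in practice. This assumes an Ollama-style server on its default port, `localhost:11434`; the endpoint and field names match Ollama's API, and the model name is just an example.

```python
import json
from urllib import request

# Default endpoint for an Ollama-style local LLM server; the prompt is
# sent only to this machine, never to a third-party provider.
LOCAL_URL = "http://localhost:11434/api/generate"

def build_payload(model: str, prompt: str) -> dict:
    """Build the JSON body for a single (non-streaming) completion request."""
    return {"model": model, "prompt": prompt, "stream": False}

def ask_local(model: str, prompt: str) -> str:
    """POST the prompt to the local server and return the generated text."""
    body = json.dumps(build_payload(model, prompt)).encode()
    req = request.Request(
        LOCAL_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

With a model already pulled and the server running (e.g. `ollama run llama3`), `ask_local("llama3", "...")` returns the completion without the prompt ever leaving your machine, which is the privacy point being made here.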

Reuben "YoBigRube" Johnson:
It makes absolute sense. And it's something that I'm actively always working on for myself. I think part of it takes constantly educating ourselves and then constantly questioning ourselves. Because the second we think, oh, I've done enough, I've done all the things, then I think that's where we run the risk of making the mistake of thinking it's done, we've reached the thing. I think that's part of the problem, the idea that there's some sort of finish line and we can get there.

Because I think with humanity, with human beings, there's always room where we can do better. There's always somewhere we can say, can I evaluate my own thinking, my own kind of behavior, my own actions? Where can I do better? I'm a person who's made a lot of fucking mistakes. I know, like, from my own background, where I came from and some of the things I was connected to,

versus how I got educated over time and where I am today, meant I had to make a significant amount of changes. Part of why I was able to get there is because other people were not satisfied with just making change themselves. They went the next step and they put their ideas out there. They educated people. And also, I am a person who believes not all ideas are created the same. Not all ideas deserve the same amount of

Matt Stauffer:
Yep.

Reuben "YoBigRube" Johnson:
⁓ equal elevation. Like, your ideas really need to be founded on something. And I think you have to have an expertise and education on certain things, or at least a deep level of lived experience to give you hard-earned expertise. I wanna kind of, like, start from that vantage point. So yes, I mean, if those people had not put their ideas out there, their expertise out there, their lived experience out there, it may have taken me far longer to get to the evolution I got to.

Matt Stauffer:
Hmm.

Yeah.

Reuben "YoBigRube" Johnson:
So I'm trying to do similar. There's ways I wish I had done more. Part of it for me was, like, I didn't have a lot of confidence in my own perspectives or my lived experience. I didn't have, like, the college degrees. I didn't, you know, have all these things that are often, like, elevated. Like, once you have this formalized thing, now you have permission to speak. Now your ideas have value,

and your perspective has weight. Now you can put it out in the, I don't even wanna use the term marketplace of ideas, because now it kind of irks me. Now I can put it out there in the zeitgeist, you know what I mean? And I just, like, didn't have those things. So I'm like, my ideas, they're not comparable to other people that I look up to. But then as I'm watching stuff that I'd already made decisions on, and had already talked to my immediate family about, starting to come out, I'm like, shit.

Matt Stauffer:
Yeah.

Reuben "YoBigRube" Johnson:
If I was doing that 10 years ago, I should have said something about that. Maybe I could have had influence. At the same time, there is a risk of thinking we are too big or too important, that our ideas are, like, the holy grail of ideas. I think we gotta be balanced and say, hey, if I know something, I've done the due diligence, and it has value, I should put it out there to help others. But it doesn't mean that, like, I'm like, my ideas are, you know.

Matt Stauffer:
Mm-hmm. Yeah.

Sure.

Yeah.

Reuben "YoBigRube" Johnson:
you

know, the best or something like that. I think the thing is we gotta, like, collaborate with one another. So I don't know if I fully answered your question, but yeah, I think the next step is putting your ideas out there. And, oh, one more thing. This I take from, like, folks who are doing grassroots organizing and all that. One of the biggest things, the recommendation, is to do the work in community, do it with other people. And there's an idea of critique, self-critique, where

Matt Stauffer:
Yeah.

Hmm.

Reuben "YoBigRube" Johnson:
you've got to be willing to put your ideas out there and have them critiqued, and take that with grace. Be able to process critique with grace and then critique yourself. And that's hard to do if you're not positioning yourself and looking at yourself with balance and saying, hey, I always have to be able to question myself, my own thinking, about whatever. ⁓ I think that's really important. So doing it with community, I think, is very

Matt Stauffer:
Mm-hmm.

Reuben "YoBigRube" Johnson:
is valuable. I forgot to add, too, one of the things, and this is, like, off on the other side: I have used AI for code evaluation too. So, like, I was trying to update a site, and it was, like, a really old version of a CMS, and I had to bring it to a new version. And for me, that used to be forum territory. You go to all the forums and get all that. Unfortunately, all the forums are now on Discord and they've all disappeared, and, like,

Matt Stauffer:
Okay.

Yeah.

Reuben "YoBigRube" Johnson:
with the asynchronous communication, finding all the information is not so linear anymore. And while I'm not a linear thinker, or speaker I should say, communicator, linear, as far as how information is produced and presented, is really helpful. And I miss the hell out of forums because of that. And I wouldn't have to have a Discord account to find it, I'd just need a web browser and search. So that's disappointing. But one of the things I found through the process

of trying to upgrade that CMS and then update the template stuff, because some stuff was deprecated and new stuff would come out, it made me realize how much you have to be so careful with LLMs, because they will spit stuff back to you, lying to you. They just don't necessarily always have malicious intent. I do think sometimes there is some malicious intent written into those algorithms, if you kind of look at some of it, but that's another part of the conversation. But realizing the machine's like, no, this is the right code.

Matt Stauffer:
Yeah.

Reuben "YoBigRube" Johnson:
No, no, no, that was deprecated. I literally just read the docs, you're wrong. And then it sends back to me something that I already know, like, that is the wrong thing, that is deprecated, this is the right one. And then, like, 15 times of such uber confidence at presenting the wrong answer to me, and then, you know, like, hallucinating ⁓ that this code works, and then I finally arrive at something that is actually accurate.

Like Aaron Lawrence had said, the only reason I was able to know all that is that I already know how to code, I was just looking for help to figure this thing out. I think that's incredibly important. I think that goes back to the idea of community too, though. Because a lot of what I know, especially with modern development, is only because I was fortunate enough to build community with other developers ⁓ who were willing to share their insights and share their expertise with me, and back and forth.

Matt Stauffer:
Yeah.

Yeah, so one of the things you mentioned in there is you talked about kind of, like, potential bias in the algorithms, and the next question I was going to ask you I think does lead there. My question is going to be: do you see AI as a positive, negative, or kind of mixed ⁓ impact on the world? If you were able to just press a button and make AI go away, would you? Or are you like, no, this is, you know, something that has a lot of potential? And from there, I do want to dig into that. You said it's a topic

for another conversation, but I want to hear your conspiracy theory, or your actual practical thing, about what your concern is there with the bias.

Reuben "YoBigRube" Johnson:
Okay.

So that's an interesting question as far as the part about would I press a button to make it go away.

I don't know, because on one hand, I don't want to have that kind of power where I can just do that, but on the other hand, I'm like, that's a cop-out. But I don't want to have the power to affect a billion lives and take away their autonomy and their ability to act for themselves. But I do think, with AI, some people say it's neutrally written. It's just a piece of software.

Don't believe that. I think there is bias written into the code. Like, I know you look at code all the time. If you look at code, number one, a lot of the code comments are written in English. That is already putting English speakers at an advantage over speakers of every other language. So, like, you have to find a way, if you're learning code, to get it translated into your language. If I speak Farsi, then I gotta figure that out. That's on me to figure out.

So that is not, like, a bias where someone was trying to do something malicious, but that is a bias, you know what I mean? Most of this code is in English, and a lot of that is only happening because American foreign policy worked really hard to make English the international standard. That was an act on behalf of imperialism, that was not an act on behalf of unification of the human species. So on that hand, you've got that kind of bias.

Matt Stauffer:
Mm-hmm.

Reuben "YoBigRube" Johnson:
On the other hand, I think there's bias written into it because, like, if you ever watch those studies on gender bias, they'll put a bunch of pictures up, and they flash a message and you have to respond, right? And then people will find out, oh shit, I'm biased ⁓

toward males as being harder working, or this, that, and the third, right? That is bias too. And, as people saw in those studies, that stuff influences our decision-making. If we're looking at a resume or we're thinking about a candidate for a job or what have you, that stuff will enter into the equation. So that is in there. And then I think about the fact that AI will hallucinate and tell you something's true with a straight face, and it's dead wrong,

Matt Stauffer:
Mm-hmm.

Reuben "YoBigRube" Johnson:
or it's misinformed, that bias also comes in. I think about the, what do you call it? When they testified in front of whatever the legal body was ⁓

about DOGE, right? So, like, the videos from that, and the couple of guys, at least that I've seen, have said they put the grant application or the grant terms into ChatGPT. You know, they created a prompt, put these terms in there that they thought

Matt Stauffer:
yeah.

Reuben "YoBigRube" Johnson:
were biased or they thought were illegal because they were so-called DEI, and it spat back a response, and they used that reasoning to say, this is wrong, XYZ. That to me shows the issue with the AI-ification of everything. The fact that I can abstract responsibility away, if I can have an AI that will automatically disqualify candidates I don't want.

And instead of me having to make up a ludicrous term, ⁓ like, and apply it broadly in some weird way or say, hey, they're not culture fit, or they're not this or that because it's ambiguous enough to keep me out of hot water. Now I can use AI to further abstract myself away from that. You know, plausible deniability of that.

That puts a lot of people at a disadvantage who are already at a disadvantage. ⁓ Or I can use AI to deny health insurance to somebody. Since the Affordable Care Act passed, they can't say, oh, it was a preexisting condition, and ban you from health insurance, right? But who knows what goes into AI? Because it's essentially a black box. If you want to feed all your potential customers into that and then deny patients and have it come up with some answer,

we didn't do anything. We didn't, you know, unlawfully, like, keep people from having health insurance. The system said what the system said. I think that's concerning. Like, we can use it to help with efficiency, but we have to be careful how it's being used. And I don't think the rule there is use it now, figure out the solution later. Well, what about all the people who are being, you know, put outside of access

until you come up with a solution later? Wouldn't it be more responsible to not be using it for certain things now? Figure out your solution, then think about how do we integrate it in. And, by the way, create an alternate route to access this information, so that people's only route to access is not through AI. I think about, like, all the things like job applications and all of that. Like, you wanna work at Publix, you wanna work at

Walmart, you gotta go to the website and do it. You can't even do an application in person anymore. That also creates a barrier. Like, initially people wanted applications through the internet because it created additional access or reduced friction. But then to cut off all the other means that are, like, ⁓ analog, now you're creating a different type of friction. And I think we saw, when they took down information off the White House website, that digital can alter history

Matt Stauffer:
Mm-hmm.

Reuben "YoBigRube" Johnson:
in a way that analog is far harder to do. So I think that before we make it the AI of everything, maybe we consider making it an option and leave the existing stuff as your alternative, because I think it also creates, in my opinion, a set of checks and balances. Because you've got the existing system sitting side by side with this new thing, and if the new thing is not working right, you can go the other route.

Matt Stauffer:
Hmm.

Reuben "YoBigRube" Johnson:
And maybe that in itself is data we can measure against. Why are people not using the AI one? Why are they all going this route? Sometimes it's gonna just be fear and discomfort. Sometimes it's gonna be other reasons that are far more significant.

Matt Stauffer:
Yeah.

And that makes sense from, like, a public services perspective, right? Somebody should not be getting or not getting insurance or legal help or whatever else. But then it's also reasonable because public services and insurance companies probably have the funds to be able to maintain, you know, two side by side. There's a lot of people who are using AI who don't even have a business if they're not able to rely on AI, right? Telling everybody, so in the past you built an app, now you have to build an app and an AI version of the app, you can't just build the AI version, I think really mitigates and distracts

Reuben "YoBigRube" Johnson:
Yes.

Matt Stauffer:
from, or we lose, a lot of the benefits that people are hoping to get from AI, if you now have to double up on everything. ⁓

Are you aware of anything, either practices or movements or anything like that, to help mitigate the bias? Because you said malice in your first one, but I think from what I heard, you mean less intentional malice, like programmers setting out to make AI that is evil, and more inherent bias, meaning the bias inherent in all the materials that the models consumed was kind of internalized, and maybe even some of the biases inherent in the programmers. But it's those kind of, like, inherent biases,

rather than explicit malice. ⁓ What does it look like to mitigate those things, or be a part of ensuring that you're not encoding those into your own work? Like, how does one avoid AI bias ⁓ in their thing if they can't just say, I'm gonna build a side-by-side system? What other options do we have?

Reuben "YoBigRube" Johnson:
Yeah, so I think it starts with, we've got to educate ourselves on the most likely or most prevalent biases. I think it starts there. Some bias is, like, we live within systems, and all those systems are gonna bias all of us. And it's on us to, you know,

work to undo the bias: first recognize it, and then figure out how we work to undo specific biases. Other things are, like, unfortunately there's some people who have access to create software that allows them to have an outsized amount of influence and power over other people, and they have very, very definite biases. The guys who are in those videos, I can't think of what the hell the term is,

they clearly showed they had very significant biases. They really weren't doing a good job of hiding it. So I can imagine, if those guys were writing software, that bias is gonna be in that software. It makes me think about Gamergate and kind of like how those women who were reporters were being harassed, literally, for whatever they were reporting on. I can't really remember. ⁓

Matt Stauffer:
the Doge guys? Yeah,

Yeah.

Reuben "YoBigRube" Johnson:
So I think that's on the one side. To answer your question on when you have to create the side-by-side app, I think about the native app ecosystem, and the fact that we only have that because Steve Jobs didn't want to have to compete with the open internet standard. So he did what he could through the Apple operating system to kind of

make it more attractive than the open internet one. Like, I think about things like HTML5, and some of the things that didn't end up in there or weren't supported well, and then the apps could, or, I'm trying to, I can't give you a really good example and that's on me, I apologize. So maybe I should move on from that. ⁓ But I know, like, the expense of having to create two different things, when the open thing would have actually been the better thing, because it would have meant not maintaining

Matt Stauffer:
No worries.

Reuben "YoBigRube" Johnson:
two and three code bases. You know, ⁓ like, what was it, Xamarin? You wouldn't have had to do the middle one, the Android one, the iOS one, and the web app one. And people got the idea that, well, the web app one is the least valuable, we're not going to do that one. We'll figure out how many of our users, our customers, are on iOS, how many are on Android, and we'll invest in whichever one has the most customers first. Whereas the reality was, if

there had been more traction on the open standards, that would have been an easy yes. ⁓ It would have actually been more cost effective to manage one code base versus those other two. Putting all the challenges with responsive design and all that kind of stuff to the side, I know I'm oversimplifying, I'm flattening all that out, and I get that. ⁓ So, how do you address this when you're building your own AI thing?

⁓ Using the example of one of the guests on one of your previous shows, who talked about, like, an ⁓ AI app for accountants, I think they had said, ⁓ I feel like with some of it, maybe it's about getting more of it on your local system. ⁓ Okay, I don't have a good answer for you, let me start with that, let me take accountability for myself. Okay.

Matt Stauffer:
Mm-hmm.

Well, you know what? You actually gave a good answer. Let me step back. You gave a good answer at

the beginning and you brushed past it, because I think you didn't even fully register it, it's so native to you. You said: educate yourself about what the biases are. So one of the things that Aaron and Blessing, and maybe other folks, have said, I think Jeffrey said as well, is: I can do better to prompt and review AI's output when I know the thing that it's building. Right? So

Reuben "YoBigRube" Johnson:
Yes.

Matt Stauffer:
So

as a programmer, I can prompt and review better programming ⁓ than if I weren't a programmer, or if I weren't aware of this language. And one of the things you said was educate yourself. And interestingly, educating yourself about that, it's not as if, like,

a hundred brand new types of bias are being introduced every year, right? It's a lot of the same biases over and over again in the history of humanity and the history of the United States. And so, you mentioned educating yourself. And I do think that being aware of the biases that are present in the general internet, AKA the training material for all of these things, being aware of the biases that are present in government, in our society, likely prepares you to better evaluate ⁓ the output of AI work, but also build in safeguards.

Reuben "YoBigRube" Johnson:
Yes. Good point.

Matt Stauffer:
Because if you're like, hey, I know there's inherent biases in the training material against people who speak this language or who look this way or who aren't this whatever, then you now know how to try to build around it. Or if you can't build around it because you can't catch every edge case, you have to make caveats and exceptions knowing that might happen. Right. So it's not perfect, but at least you're able to say, I understand the bounds of the system. And I think a lot of my value of this podcast is helping people understand the bounds of the AI systems.

What is it good at? What is it not good at? Where do you need to be worried about it? Where can you be confident about it? Because the more we can do that, the more we can use it well or not use it well, right? And so when I asked this question about what does it look like to, you know, deal with these biases, I think you're, and I very much appreciate you saying I don't have the answer from a technical perspective. Thank you for being willing to say I don't know. I love people who say I don't know. But I do think from a sociocultural perspective, you're naming that like if you want to identify

Reuben "YoBigRube" Johnson:
Yeah.

You

Matt Stauffer:
and address bias in any context, you need to understand it in the first place. We can't just say, do do do, the system I'm working with is perfect and won't have any bias. Like that's the problem. That's how you're introducing bias by assuming there is none, right?

Reuben "YoBigRube" Johnson:
Yes.

Yes. Thank you for that, I appreciate that. When you said that, one of the things that came to my mind was, somebody gave a talk at the South By stuff over the weekend, and they talked about the whole thing with the AI girlfriends and AI romance, and then also AI,

I don't know if you'd call it a chatbot, AI characters I think is what they're calling it, AI characters, and they're talking about how kids are engaging with it, and obviously kids are easy to trick. One example: my kid had just turned five, right? And I was about to go to work, and I was like, you're five years old, or maybe it was the next day, you're five years old tomorrow, you're gonna come to work with me, because you're five. And they burst out crying, and I'm like,

Matt Stauffer:
Uh-huh.

Reuben "YoBigRube" Johnson:
I immediately was like, I thought it was just a cute little thing, but to them, this was real. And they're crying like, I don't want to go to work, I want to be a kid, I want to stay home. And I was just, like, mortified. Like, I thought you would know this was not real. You know, five year olds, I mean, in a good world, five year olds don't go to work. Unlike, say, the Democratic Republic of Congo, where you've got kids working in mines.

Matt Stauffer:
I know that feeling. Me too.

Uh-huh.

Reuben "YoBigRube" Johnson:
Which, that's perfect to remember when I think about my kid as a five year old, right? Crying because they believed they had to go to work with me. I thought I was being funny, and in their mind, they thought it was real. And they're bursting out crying, looking at someone to make their world make sense and feel safe again. And while we explained it and smoothed things over, I'll never forget that. Cause it

shows that, like, when you have proximity to the accurate information and you have a real understanding of things, where you're positioned versus where you're not, little kids can be taken advantage of by software and by AI. Which is not a living, breathing thing. It is not general artificial intelligence. It is pattern matching and math, from what I've learned at least. And, like, I can't explain it, so I'm the wrong person to talk about that, but, like,

then you take that and you make an AI girlfriend or an AI character. And if you don't understand, if your brain is not fully developed, which, you know, I think it's like 25 years old until a human being's brain is fully developed, you can be manipulated. I mean, that's why people are unfortunately harming themselves when their character bot or whatever says or does something. I think in that context, we really need to be thinking about:

Should I do this? Like, if I'm a developer creating an AI, right? Should I do this? And if I do this, how do I create, to use your term, guardrails, to constantly remind the human being that is using this piece of software: I am not real, I am not general artificial intelligence, please take anything I output to you with a grain of salt, or

trust your intuition and your gut before trusting the words on the screen or the visual I'm generating for you, because you're the human being, you're the authority on the real world; I'm a machine, I have no authority. Like, when my AI would constantly respond back to me with stuff like, yeah, as a black person, this and that, I finally had to say, don't ever say you're a black person again. You are a machine. You are not black. You never, ever,

Matt Stauffer:
Mm-hmm.

Reuben "YoBigRube" Johnson:
ever lived my life, you don't know what living in this world looks like. Don't ever equate yourself to a person. Don't ever say you're a person. In fact, when you talk to me, don't use the term we. You are not human, you're a machine. Remember that. And I'm thinking to myself, what went wrong, that this machine thinks it can, and I'm, what, anthropomorphizing?

Matt Stauffer:
Hmm.

Reuben "YoBigRube" Johnson:
⁓ I'm attributing human characteristics onto an inanimate object, even by how I'm describing it and how frustrated it made me. I was more frustrated by the systems in place that allow a machine to use language to communicate with me in a way that puts itself in a human's position. And I'm like, you can't do that. And I know this.

Matt Stauffer:
Yeah.

Yeah. Yeah.

Reuben "YoBigRube" Johnson:
But if it can do that to me and even me getting frustrated, what about someone who's not as informed? What about someone who's in a vulnerable spot? Like if I'm vulnerable, I'm much more likely to fall prey to something. I mean, that's how cults work. They prey on people who have a vulnerability, who have gone through a terrible situation. And that's human beings being able to spot the vulnerable and to exploit that to their own advantage. So I think about that, I'm like, how do we...

Matt Stauffer:
Yeah.

Mm-hmm.

Reuben "YoBigRube" Johnson:
put in guardrails so that, at minimum, we're constantly echoing back to the person on the other side: I'm a machine, do not always trust me. Especially, like, when I'm doing a lot of written stuff with it and it's spitting stuff back to me, I'm like, yeah, but when you say that about sales, I don't really think that makes sense. Cause some of the stuff you're telling me, as a person, I think that's just,

Matt Stauffer:
Mm-hmm.

Reuben "YoBigRube" Johnson:
I don't think that's gonna work. I think somebody's gonna read this and be like, this is crazy. Even the fact that the trope on LinkedIn now is, like, you can tell it's AI cuz it's, like, five pages to say hello or something. Some stupid thing where it's verbose amounts of text just, like, splattered all over the screen, because it's not a person, it can't be succinct for whatever reason. ⁓ Yeah, sometimes I feel like, yo, I'm sucking all the oxygen out of the room, man.
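The guardrail Ruben is describing, constantly echoing back "I'm a machine, do not always trust me," can be sketched as a thin wrapper around whatever model you're calling. Everything here is illustrative: the phrase list, disclosure text, and function name are made up for the example, not any real product's API.

```python
# A sketch of the echo-back guardrail: every model reply gets a standing
# disclosure appended, and replies that frame the model as a human peer
# get flagged. The phrase list is illustrative, not exhaustive.

DISCLOSURE = "[Reminder: I am a machine, not a person. Verify anything important.]"

# Phrasings that falsely position the model as a human speaker.
HUMAN_FRAMINGS = (
    "as a person",
    "as a black person",
    "we humans",
    "speaking as someone who",
)

def apply_guardrail(reply: str) -> str:
    """Flag human framing in a model reply and append the machine disclosure."""
    flagged = any(p in reply.lower() for p in HUMAN_FRAMINGS)
    prefix = "[flagged: human framing] " if flagged else ""
    return f"{prefix}{reply}\n{DISCLOSURE}"
```

In practice you'd run every raw model reply through something like `apply_guardrail` before showing it to the user; a production system would want a richer classifier than a phrase list, but the shape, intercept and annotate, is the point.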

Matt Stauffer:
Mm-hmm.

Yeah.

You're good, man. That's

why I have you here. But we are almost out of time and I want to make sure that if there's anything, especially as you kind of thought about preparing for the conversation, anything you wanted to get a chance to talk about that we haven't gotten to today.

Reuben "YoBigRube" Johnson:
Yeah, I think one of the things that's been helpful for me is trying to get my information, when it comes to tech news, what is happening now, what is on the horizon, from people who are thinking about the intersection of tech and the socio-cultural, the socio-political, that kind of thing, so that we're better informed when we're building tech. Because, believe it or not, from my perspective at least, when I'm doing tech, it is more than tech.

Business to me is more than business. I think that sometimes we forget that these things are not just abstract, a thing in a silo that doesn't ever step outside the walls of that. And I just don't think the world and humanity works like that. So, like, find organizations that are doing work in this space. I mentioned Mozilla, they have some great information, but then there's other places like The Tech We Want. There's an organization called Motherboard. I'm a big believer in 404 Media.

There are places, like Small Technology Foundation, they're doing some great stuff. ⁓ And obviously a podcast like this, what you're doing, I think is vitally important. Find resources who can help guide you, inform you. One of the challenges I think we have in software, that I've heard people from other industries point out, is that we don't necessarily have to do training that forces us,

or compels us, to think about ethics. And I think that is incredibly important. So if we're not already learning about ethics, if we're not already thinking about the humanities, find places to dive into, to help better inform yourself about the humanities, about ethics, about the world around us. And there's a lot of great resources out there. I'm gonna shout out ⁓ PBS, which,

Matt Stauffer:
Mm-hmm, yeah.

Reuben "YoBigRube" Johnson:
I'm a huge believer in PBS, ever since, you know, I was a kid. And then, yeah, those few places, they will not do you wrong. They will really help you out. And I don't mean you, I guess I mean the royal you, like you as in whoever's listening, the audience, what have you.

Matt Stauffer:
Yeah. Yeah.

Got it.

Yeah.

Yes.

That's amazing. And we'll make sure those all get linked in the show notes. And thank you so much for kind of your note there, Ruben. If somebody wants to continue following with what you're doing, you mentioned you're not on Twitter anymore and you're on LinkedIn. Is that the best place to follow you? Is there anywhere else they should be keeping track of you?

Reuben "YoBigRube" Johnson:
You're welcome.

Yeah.

So yeah, LinkedIn is a great place to follow me. That's, like, the one where I actually will write stuff, business stuff, tech stuff, personal stuff and all that, and I'll put it right on LinkedIn. ⁓ I have an IG, but I don't do anything with it, and I'm not a Meta person, so ⁓ now it's just kind of squatting there. And then you can find XOFly Duo, like, anything of our XOFly Duo, ⁓ that's me and Sherry, that's our

Matt Stauffer:
Yeah.

Reuben "YoBigRube" Johnson:
brand we've built in collaboration, feel free to follow our stuff there. And then I have another project called Sex, Tech and Chill. If you've ever been through, like, deconstruction and you're dealing with stigma, if you're at a certain point where you've gotten through the most challenging aspects of deconstruction and you're figuring all that out, and now you're like, how do I build community or find resources,

Our project, sextechandchill.com, might be something you'd appreciate. That's on YouTube and LinkedIn and IG as well. ⁓ Between those few places, that's kind of what I'm doing in the online space.

Matt Stauffer:
Amazing. ⁓ We have talked today about a bunch of things that don't have easy answers, you know? ⁓ And so...

I think that, you even mentioned this early in the episode, we inherently don't want to talk about the thing until we know we know everything. And Sam, who is the most recent guest, who you haven't gotten a chance to hear yet, and you, I'm really grateful to you both for saying, I'm not the expert here, I don't have the answers. Even hitting moments in the podcast, you're like, yeah, I got to be honest, I don't know how to answer that question. ⁓ But it's sort of like, here's what I'm figuring out, here's what guides me, here's what I have learned. And so I think it's hard

to take a stance, especially one that is, I wouldn't say against the most common way of looking at it, but if you're in tech and you're not pro-AI, you're not in the majority, right? And I'm not saying you're anti-AI, just that you're saying anything that's even mildly critical. So choosing to do so, and choosing to do so when you're like, and also I don't have all the answers and I'm still willing to talk about it, is to me very brave. And so I really appreciate you coming and hanging out, sharing what you do know with us, sharing your history, sharing your experience.

Reuben "YoBigRube" Johnson:
Thank you.

Matt Stauffer:
⁓ And I look forward to continuing to learn as you figure more things out and share them with all of us. I will tell you, you're the first person who's ever told me follow me on LinkedIn as the answer to that question. I've done hundreds of podcast episodes, and you're the first one that said LinkedIn. So I love that. But we'll definitely keep up with you on LinkedIn and everything, and just thanks for coming and hanging out.

Reuben "YoBigRube" Johnson:
Thank you so much, Matt. It has been a pleasure. Someone like you who's doing all the things you're doing, with everything you do in software and your level of expertise, it means a lot. It really does. And you're a great person. For your listeners out there, I know he sounds like a great person on the podcast, because he is, but in person, and I don't want to say real life, because this is real life, in the...

Outside of the podcasting, this guy is a great person. He is a really, really genuinely good person. If you're ever so fortunate... like, Matt is a really, really good person.

Matt Stauffer:
room.

Okay, ⁓ I hope you all are just... Yeah, I hope you are listening and not watching on YouTube so you won't see the flush in my cheeks. ⁓ Reuben, I appreciate you. ⁓ And for the rest of you, thank you for hanging out. We'll see you next time.

Reuben "YoBigRube" Johnson:
Take it and let it land, man.

No, I appreciate you too.

Matt Stauffer:
So right after I hung up with Reuben, I remembered that I was supposed to get back to the thing that I did for the first few episodes and failed to continue doing, which is share ways that people have said they're practically using AI in their day-to-day life. And I just completely forgot for the last few episodes. I remembered for this episode, had it up in front of me, and then hung up. So sorry, a little postscript, there we go. So I've got two people here.

I don't think I've shared these before. Kayla, my VA, and I are actually gonna make a list so I don't keep doing this. But I've got two in front of me that I'm very interested in. The first one: Kyle Zantos says, I made my best friend a Jeopardy quiz game with 10,000 real questions to practice for real Jeopardy, and it's jakesjeopardy.com. And I guess Jake is Kyle's friend who wants to go on Jeopardy, and Kyle was like, yeah, I'll make this thing for you. I thought it was really fun. One of the things we've talked about with AI is the idea for it to be like a

piece of personal software tooling. And I'm like, this is personal software. It is for Jake, and maybe other people will like it, but it's gotten to the point where it's worth building even if it's just for me, it's worth building if it's just for Jake, which is something I think is fun as a programmer. And then there's another one here, which I want to hear y'all's comments about. I've got to hear what y'all think about this. He says, I use Claude to write stories about my son and his friends, which are then read by my late mother's cloned voice in ElevenLabs,

and uploaded to custom cards for his Yoto player, which is a little audio player for kids. And then I generate custom pixel art with ChatGPT. And I'm like, I'm pretty sure there's a Black Mirror episode about this. ⁓ I'm very curious to hear what y'all think. This is both really creative and also sweet and also kind of scary, you know? So anyway, Kyle, thank you so much for sharing this. Kyle also mentioned making his toddler a hub of custom games that marry his interests and his learning level.

And quite a few more. ⁓ So yeah, I'll put those in the show notes as well. That's from Kyle Zantos on Twitter. So once again, thank you all for hanging out, and once again, see you next time.

Creators and Guests

Matt Stauffer
Host
CEO of Tighten, where we write Laravel and more w/some of the best devs alive. "Worst twerker ever, best Dad ever" –My daughter
Reuben Johnson
Guest
Narrative Consultant for High-Stakes Stories | Co-founder at FlyDuo™