The Nexus of Technology & Theology

The Pragmatic AI Podcast is sponsored by Tighten. T-I-G-H-T-E-N. We will take your AI ideas, prototypes or even vibe-coded apps, and we'll take them to production. Scalable and secure. Check us out at tighten.com.

Matt Stauffer:
Welcome to Pragmatic AI, where we talk about using AI in the real world, what works, how to use it well, and when it causes more harm than good. Practical tools, real trade-offs for builders, business leaders, and today also potentially pastors or theologians. My guest today is my good friend, Nick Peterson, Assistant Professor of Homiletics and Worship at Christian Theological Seminary in Indianapolis, Indiana. That is harder to say than I expected. Indianapolis, Indiana. Nick, because I can't talk, can you talk? Can you say hi to the people and tell them who you are and what you do?

Nick Peterson, PhD:
Hello everybody. Thank you so much for having me here today. Yeah, I teach in theological education as a primary kind of vocation. So I'm working with students across a range of programs. Some of my students are doctoral students who are looking to do sort of advanced degrees that are directly connected to the

Matt Stauffer:
Of course, my friend.

Nick Peterson, PhD:
the context where they serve. Others of my doctoral students are PhD students who are looking at careers potentially in teaching or serving, you know, in kind of denominational leadership contexts. I'm also working with master's students who are training to be pastors in some contexts, and in other contexts they're training to be professional licensed therapists.

My school offers two sort of credentialing counseling programs: a marriage and family therapy program, and then clinical mental health counseling as well. And so it's a pretty broad spectrum of students that I work with. In addition to my work in the classroom as a theological educator, I also am ordained in the African Methodist Episcopal Church and serve as executive pastor at a congregation, Bethel AME, here

in Indianapolis, Indiana, which is a mouthful to say. And I also do some grant work. So I am the grant director for a grant that the institution just landed with the Lilly Foundation. And that project is called Testify. And basically what we're trying to do in that project is really enliven testimony practices within Black churches across the country. So that's the nutshell. I'm married. I got two kids.

Matt Stauffer:
Why is it so hard to say?

Nick Peterson, PhD:
Yes.

Matt Stauffer:
It's a big nutshell.

Bethel AME is the most classic name for a church I think I've ever heard. Right, that's what I'd expect for an AME. Okay, so I heard three things there for the folks. I heard that you are a professor, teaching some combination of theology and also counseling. You are an executive pastor, and you are also a grant writer.

Nick Peterson, PhD:
It is. It is, especially AME, right? You know it.

Matt Stauffer:
So I imagine the question of what AI's interaction with your day-to-day life looks like is different in each of those. My friend Blessing came on a couple weeks ago, and she's like, I've seen people using AI to try to write out their, like, personal spiritual practices, and I was just like, oh my god, I had never thought that that would be a thing that people would choose to do. So normally I ask, like, what's your relationship to AI? And I imagine that your relationship in each of those spaces might be a little different. So do you want to give me like a

broad overview or do you want to take it one silo at a time? What makes the most sense for you?

Nick Peterson, PhD:
Yeah, probably a combination. So one is, AI has been a really, really big question as of late, both within theological education and in sort of, you know, worshiping communities. And so the thing running, you know, readily in my Facebook threads and conversations that I'm having with colleagues who are pastoring, and even my colleagues who teach, is like, the AI sermon.

Matt Stauffer:
Yeah.

Nick Peterson, PhD:
Right. So I recall, this is maybe two years ago now, at the American Academy of Homiletics, in one of the breakout groups, we had a discussion about preaching and AI. And so just right then and there, I put in a prompt in ChatGPT to say, hey, write a sermon on XYZ passage, you know, in this particular style. And it was really, really, really bad.

Matt Stauffer:
Ugh.

Thank God.

Nick Peterson, PhD:
It was clear that it

was sort of, like, pulling tropes. And what I shared with the group at the time was like, this is as bad as it'll ever be, right? Because of how much input these sort of large language models are getting and how they are able to then sequence that, right? All of the stuff that's happening behind the scenes, which is way above my thinking and pay grade.

Matt Stauffer:
huh.

Yeah.

Nick Peterson, PhD:
This is just as bad as it'll get. About a week and a half ago, I was with a colleague who was like, my gosh, I'm stuck with my sermon. I don't know what to do. And again, just out of curiosity, I put in a passage and I was like, well, what are some of your points, right? What do you have? And I put in the points they had and, you know, told the thing to draft a sermon. And sure enough, it was much, much better than what

Matt Stauffer:
Yeah, yeah.

Nick Peterson, PhD:
what was produced two years ago. A couple of months ago, there was this sort of trend that was happening on social media where people were saying, put this prompt in your ChatGPT. And it was something like, based off of everything you know about me, create a caricature of me in all of my various roles and responsibilities. And what was telling

Matt Stauffer:
Uh-huh.

Nick Peterson, PhD:
was how many of my religious colleagues ChatGPT showed at pulpits in churches, Bible studies, sermon manuscripts, right? And they, like, presented them sort of like, ooh, you know, this really nailed me. Right? And I'm thinking, like, this is telling us that your sermons and your Bible studies... yeah, it's like, you know, we know where

Matt Stauffer:
Look how cool this is. Uh-huh.

Yeah, why does chat know that about you, my friend?

Nick Peterson, PhD:
much of these ideas and things are coming from. So there's no question, right, that these technologies are being used and that they are, right, technologies, but...

Matt Stauffer:
Yeah.

Nick Peterson, PhD:
The whole apparatus of our human relations, from my perspective, is really grounded in technology. So it was a technology that produced language in the first place. That's like a primal technology. It was a technology that moved from oral tradition to written tradition, right, from the thing being spoken generation to generation to literacy, learning how to read texts. It was a technology in learning how to copy texts that had already been written.

It was a technology to then translate those texts within particular contexts. It was a technology to collect them all and bind them into a volume, to make them into a canon. It was a technology to then develop theologies to wrap around those texts and then to train people on how to

exegete those texts appropriately in light of the theology that's wrapped around them. It was a technology to develop songs and mnemonic devices to both reinforce the theology, and so on and so forth. So the propagation of religion has fundamentally been oriented to and in conversation with technology, right? So the technology that, you know, Martin Luther was able to make

use of with the Gutenberg press, right, and how that allowed for the Bible to not just be something that was held by the magisterium, but something that could be held by the wealthy elite, and subsequently meant that their interpretations could have weight. And then you take that into another iteration with King James and what it meant to move the Bible out of these kind of traditional classical languages

into a certain intonation of English language syntax and grammar and linguistic aesthetics. So for those of us who grew up in evangelical charismatic traditions, like, nothing tops King James English in that realm. Then you have radio, you have television. All of these have been spaces where religion has

Matt Stauffer:
Preach, man. It means something. Yeah.

Nick Peterson, PhD:
made use of technology. So I don't see AI as any different in terms of how people are negotiating that. It has a more clear impact on content development than, say, some of those other kinds of technologies, right? So when you were on radio, the discipline of your content was really based off of time, right? So now if I'm gonna preach and we only have 30 minutes, I can't preach my typical 55-minute sermon. I gotta figure out

you know, what needs to fit in this 30-minute window. But then you have the developments that take place through television and televangelism, which is, I can still preach 50 minutes and I'm just going to show you 20 minutes, and give you the 20 minutes that whets your appetite. So now you can figure out, how do I buy the CD, the tape, to get the other 30 minutes of the sermon that's going to change my life. So these technologies have always influenced both how we engage, produce,

receive, consume, you know, theological content. What I'm curious about, you know, as folks continue with engagement with technology, is at what point does it sort of bring, again, a kind of catholicity to our own understanding of religion. So it means universal.

Matt Stauffer:
You're gonna have to tell me what Catholicity means.

Nick Peterson, PhD:
So that's what catholic means: it's universal. So there is a way, from the Western tradition, that this large language model really moves toward a kind of Hegelian end of history. So we've synthesized everything that can be synthesized and there's no more to be said. So that...

Matt Stauffer:
huh.

Nick Peterson, PhD:
And by that I mean that's big, small, that's everything in between, you know, in terms of a four-minute redux in preaching. But one of the things, in terms of where I've seen this be effective in preaching, is as an analytical tool, right? So if you have a sermon manuscript, right, you have something that you've written and you want it to be engaged on its own quality,

Matt Stauffer:
Yeah.

Okay.

Nick Peterson, PhD:
you can prompt this AI to do an assessment. Do these points make sense? Does this flow make sense? If you have, for instance, research questions around early Temple Judaism and you feed that into a prompt, in the same way that you would Google, there are things that may come back that could be helpful. And even, again, over the last couple of years, the extent to which AI can point you to resources and/or correctly

Matt Stauffer:
Yeah

Nick Peterson, PhD:
identify resources for you has increased. So I think that there are definitely ways to use it to be helpful, but as with most things, there are easy ways for it to be abused. So I can tell, still, at this point with my students: if you've relied on AI to produce your content, your ability to actually synthesize at the register at which

the AI has synthesized it is just not comparable. So you can't access the kind of language that the AI has pulled and actually understand even what it's telling you, as you've uploaded things to it for it to synthesize. So it becomes telling, because it's like, yeah, I know the content. So I know how this is sort of processing. But you don't know the content, and you've produced something

Matt Stauffer:
Interesting. Yeah.

Nick Peterson, PhD:
that would be equivalent to what I would produce.

Matt Stauffer:
Yeah. It's interesting

because there's an article that just came out today in the programming world talking about how programmers who use AI for their programming often have similar velocity, but their comprehension is significantly diminished compared to other people. So I'm like, and I'm so sorry to interrupt you, but the thing I was most interested in about everything you've said so far is all these things about technology. And I love the idea that, like, every time a new technology comes along, everyone's like, this is going to ruin the world. And then it turns out it doesn't, you know. Somebody quotes that person saying,

newspapers are bad or whatever, you know, the printing press is bad. But I do have a thing that I think I want to tease out from what you said here so far, which is that there are some things that you said that sit right with me when it comes to religion and teachers and stuff like that, which is: they are writing the message, but they're maybe asking for help evaluating their delivery of the message, or they're asking for help, you know, coming up with historic sources, which we would use technology for anyway. You would have asked your friend to review it, or you would have

maybe even thrown it through some other automated tool, or you would have used Google, which is a technology. But I feel like up until now, the onus of actually coming up with the content either is on you, or the worst case scenario is you're kind of cribbing it from somebody else's work or somebody else's book or something like that. There's a human behind the message. Do you have a line of comfort where it goes from, I'm asking it for help with the message I'm generating, versus, it's generating the message itself?

And if you do have a line, where is that line of like where you're like, this is a good thing and this is a bad thing or is it too early or too difficult to have a good thing, bad thing line for it?

Nick Peterson, PhD:
I think it's probably both, right? So for the better part of the 20th century, if you were the traditional doctoral student, the white, male, married student, you had potentially a spouse who did your typing, who, you know, made sense of your notes. You had a research assistant who

Matt Stauffer:
Mm-hmm.

Nick Peterson, PhD:
would pull the books from the library, could read and draft annotated bibliographies for you to sort of redirect where your specific focus needs to land. You could have access to a research librarian who you could just go and meet with and say, I'm interested in X, Y, and Z. And they would craft a bibliography for you to work with that you could then assign your graduate student fellow to engage.

that you could then have your typist maneuver. In many respects, the distribution of labor, and the capacity to distribute that labor across a number of sources, has always been present, right? In the critical side of my work, I call this kind of like, it's a plantation economy, where you have all of these other kinds of

Matt Stauffer:
Yeah.

Nick Peterson, PhD:
you know, resources, we would understand them as human resources, who produce value that then gets to be beneficial for the namesake plantation, right? And so on and so forth. So there is this kind of undergirding, this underbed, of a kind of Marxist critique about alienation from labor,

Matt Stauffer:
Yeah.

Nick Peterson, PhD:
right, in terms of, the people who do the groundwork don't get the same kind of benefit as the person who is understood as the owner of that. In some regards, what AI does on the front end is that it reduces what we understand as the human load, by virtue of extensive access and cataloging of human language and how humans have used language to produce knowledge.

So I really do think, oftentimes, you know, the difference between talking about algorithms as opposed to talking about large language models is really, really helpful. Because really what we are asking AI to do is to perfect our language. Whether that's a computer language, whether that's textual language.

We're asking it to crack the code on what we do and what we mean when we use and when we access language. And so to that end, it is...

Matt Stauffer:
Mm-hmm.

Nick Peterson, PhD:
I think it does pose a challenge for how much we understand the human as human because of its use of language and its dependency on language to produce what we call society.

I think that that's kind of one way to think about what that line is: that we've never really been good at accounting for labor. We don't treat horses and oxen and all of those things with the level of respect that we treat the person who owns them. But that person can only garner that respect because they're able to exploit the labor of these beasts of the field. And so in the same way, AI will allow us to exploit access

Matt Stauffer:
Mm-hmm.

Mm-hmm. Yeah.

Nick Peterson, PhD:
to these large databases of language and the kind of back-end programming that makes them sensible for producing the kind of outputs that we want.

Matt Stauffer:
I understood all the words, and I understood almost all the concepts in everything you said. I want to acknowledge for the listener: Nick is brilliant. If you don't see it, at the bottom it says Nick Peterson, PhD. He's one of the smartest people I know. And of course it's coming up brilliant, because I'm just sitting here and I'm like, to me, my assumption is there's a very simple line, right? Like, if I am supposed to be presenting material that I wrote so I can get a

degree, if I'm supposed to be preaching at the front of a church or whatever else, it should be my ideas. And maybe something helped me get those ideas, and maybe something, you know, helped me research that. But in the end, it's got to be my idea. And so you're like, what does being my idea even mean? And I'm like, God. But...

Nick Peterson, PhD:
Right, so this

is the... it is a notion of possession. As a scholar, the way I'm trained is that I have to attach my ideas to others' ideas, right? So we lead with what we call a literature review. Such and such said this, such and such said this, such and such said this. This was responding to this, and then this person said this other thing about this. So you have to do all of this scaffolding to say all of these ideas

Matt Stauffer:
huh.

Mm-hmm.

Nick Peterson, PhD:
have existed in this constellation, and then you say, I'm sort of doing this thing over here, which is connected to this, this, this, and this, and this, but this is now my unique contribution. And so what happens is that people have to carve out space to sort of say the idea is original, but sometimes the originality of an idea is that I took a concept from another discipline, and I said, what would happen to my discipline if we ask this question in this way?

Matt Stauffer:
Mm-hmm.

Nick Peterson, PhD:
or if we took the issues that this other discipline has sort of named for us and we brought them to bear within our own kind of field. And so now I'm doing pioneering work, but it's not pioneering work because I've thought of something new. It's only pioneering work because I've reconfigured the ingredients. So none of us are really working ex nihilo, right? We're not working from nothing. We're just, like, you know... it's a...

Matt Stauffer:
Okay, so if you got one of your students, yeah, no, I love that.

And we like to think that we are.

So I appreciate very much your acknowledgement that no matter how deep and thoughtful, it's still all building on other people's stuff. If somebody submitted something to you, as a master's student or whatever else, where they're required to deliver this thing, and you were like, this smells like AI, in a way that I'm not happy with...

What is the problem? Is the problem not that AI generated the idea, but the person doesn't understand it? Like what is the actual core concern that you have there if someone's relying too much on AI, given all the nuance that you just shared?

Nick Peterson, PhD:
Yeah, it is. It's twofold. One, as an educator, I want my students to be educated. And part of what accountability means in a classroom is for the student to be able to track their own learning and for the professor to be able to affirm that tracking of the learning. And so if one short-circuits that,

right, if one has a means by which to deliver something that would be impressive to the professor without it really having been work for the student, I'm not at a loss, because I know what I know. Right. But the student is at a loss, because they don't know what they don't know. And, you know, even though they may get the good grade, they've short-circuited the formation.

Matt Stauffer:
Yeah. Right. Yeah. Yeah.

Nick Peterson, PhD:
And so I think that, and that's across the board, right? That's the failing of our fascination with advancement. And so my fear is that, because AI can do so much, if students are just even subtly using it to do their thinking, then all we're gonna do is increase the expectations. And that means that we constantly have to augment reality.

Matt Stauffer:
it.

Nick Peterson, PhD:
So it's not that we run races barefoot. It's, now you got a pillow cushion of a shoe that has this technology in it, and it has this, this, this. And so now we've been able to shave off, you know, seconds in that. So I think that there are trade-offs, but some of what AI could be is like, you know, giving a toddler a lethal weapon. And it's not that

Matt Stauffer:
Uh-huh.

Nick Peterson, PhD:
the toddler has to be bad, but it's that they just don't have the capacity to understand what it means to be responsible with that kind of resource. And so I think AI is really, really beneficial when you do have a good sense of how you learn and what you're learning and the kinds of engagement that you want. My best use of AI is really through, you know, falsifiability.

Matt Stauffer:
Yeah, tell me.

Nick Peterson, PhD:
I have an idea, I have a thought, right? This is kind of like scientific reasoning. So I have an idea, I have a thought, and I'm putting it in, I'm like, help me understand what the limitations are here. Now, some of what that is, is like, that's exactly what I would ask a colleague, that's exactly what would be expected in a peer review. But now, through AI, I don't have to wait the six to eight weeks for that to go out

Matt Stauffer:
Okay.

Uh-huh.

Right.

Nick Peterson, PhD:
to someone, for the AI bot to be like, hey, you have a red herring here in this argument, right? Like, these are the rhetorical fallacies that are at work in the way you structure the argument. Or even better yet, you don't have an argument, you have a gesture. And so it's like, okay, these are things that are helpful for my own thinking, and helping me to think about my thinking. But that's not, okay, now do the thinking for me, right?

Matt Stauffer:
Yeah. Huh.

Yeah.

Nick Peterson, PhD:
And I think that that's where it's troublesome for students: when you have the AI do the thinking for you, it can think at a register that outpaces what you don't even know you don't know.

Matt Stauffer:
That's fascinating.

Yeah.

Yeah. One of the things, when you and I were texting months ago I think it was, that you first introduced to me is the idea of asking the AI to look at something as if it was looking at it from a particular perspective. And you'd said, hey, I write this thing, I build this thing; imagine you're ABC. It's not like ABC is not going to eventually evaluate the thing, but you get AI pretending to be ABC to evaluate the thing ahead of time, which allows you to be more ready for what's coming.

I imagine that grant writing is a part of that, academic writing is a part of that, when you need to use that sort of stuff. What is your... have you found that you are using AI in the same way across your three different roles, or do you engage with it differently in each of those spaces?

Nick Peterson, PhD:
Yes and no. So some of what I want and how I oftentimes use AI is build me a curriculum. I know nothing about X, right? Help me cultivate a pathway of learning for X thing, right? And that becomes a way for me to, in my own pacing, figure out a course or a plan of study.

Matt Stauffer:
Okay.

huh.

Wow. Uh-huh.

Nick Peterson, PhD:
In terms of the grant writing, where I found AI to be really helpful, again, is in that gap between intuition and evidence, right? I have a sense that this is helpful, that this is beneficial, that this has value. But where I am stuck is in communicating that in a way

Matt Stauffer:
Okay.

Hmm. Yeah.

Nick Peterson, PhD:
that's legible to someone

who's thinking about this from, whether or not I'm going to do X, Y, and Z on the back end of the funding. So that side of it becomes a way for me to ask, what is the way that I need to learn how to communicate differently than how I typically communicate, right? And so particularly, even as someone who teaches, you know, in a communications

Matt Stauffer:
Yeah, it's good.

Nick Peterson, PhD:
area, right, through homiletics and liturgies. We're always thinking about arguments, we're thinking about persuasion and all of that. But the means and modes of persuasion vary across different contexts, right, and the ways that one is able to make that make sense. And so I knew in my own understanding that a lacuna for me, a gap for me, in thinking about grants was really in trying to understand, how does a grant maker

prioritize what their own values are for how they want their resources to be spent. I'm very good at knowing my ideas. I'm really good at knowing why I think it's important. But if I can't readily ascertain, from your perspective, what it is that you are saying and what it is that you are thinking is important, then I'm trafficking in a kind of risk, in terms of thinking that everything that's legible to me is going to be legible to you.

Matt Stauffer:
Were you able to identify that?

Yeah.

Yeah.

Yeah, yeah.

Nick Peterson, PhD:
And so the large language model, because it is a large language model, can actually start to sort the language that matters within a particular category. So I haven't done this, but I'd even be curious to ask, like, what are the action verbs that are most prominent in grants?

Matt Stauffer:
In successful grants? Got it, okay. Sure, that's true. Yeah, huh.

Nick Peterson, PhD:
in successful grants and in failed grants. It may be better to know the ones that are in failed grants, right? ⁓

That's something that 30 years ago I could not ask

outside of having a relationship with someone who has 30 years of experience working with grants, right? So for me, it's like, I'm not just interested in, can you produce a result. I wanna sort of get behind the thing and understand why that would be the result. So for me, it really is the ability of the large language model to work with large amounts of language that's helpful.

Matt Stauffer:
Yeah,

Yeah.

Yeah.

So earlier you mentioned the idea, which is totally foreign to me, of, in the past, people who were in school basically having the research assistant and often their spouses, stuff like that. And you mentioned the plantation model. When I hear the word plantation, I go, ew, gross, right? Like, that's not something I want to continue. If you had not used the word plantation, I would have used this to kind of

move into a thing where I'm like, this is a great way to replace that system. And now we all have that same system, but it's AI doing all those pieces. But I'm like, I don't know what to think here. I don't know, so basically I'll just be dumb.

Would you say that replacing those same roles with AI, so instead of having a graduate assistant who does the research, you've got AI to do the research; instead of having a spouse who does the typing and helps make it make sense, you have AI helping it make sense, right? Like, replacing those with AI, but continuing that same world in which I'm not doing all the effort to produce the final output, but there are things helping me in the beginning, right, in research, and after, in turning my ideas into reality. Is that a pejorative? Is that a negative

thing, or are we like, no, this is just how it works, and the plantation model is, like, you know... Do you see my discomfort? Like, talk me through this.

Nick Peterson, PhD:
Yeah, yeah, yeah.

Part of it, right, is the plantation model is only a discomfort because of what we want the human-being category to mean and do. But the plantation model is the model for the corporation. It's the model of, how do we contain surplus value in a very particular way? And how do we get the help that we need with the least amount of cost?

Matt Stauffer:
huh.

Nick Peterson, PhD:
That's the... I mean, what the plantation perfects is a certain kind of capitalist orientation to means and modes of production. So it's very efficient in that respect. So to maintain fidelity to the logic of the plantation is not just about

how we think of human beings and how they should be treated in relationship to each other. But it is about the way we really value the possession of output through sole proprietorship, right? And this is the currency in every discipline, right? This is, when I write a book, it's my name that goes on it. And even though I have my acknowledgements, where I get to talk about all the people without whom I would not have been able to do this, that's three or four pages at the beginning; the rest gets to fall into, I get the credit for that.

Matt Stauffer:
Yeah.

Uh-huh.

huh. Yeah.

Nick Peterson, PhD:
And I think that that notion of credit is part of why we have to, and I'm saying that loosely, why we have to maintain the logic of the plantation: we really do have to have a means by which to assign credit. What the large language model poses as a problem, or what it does well, is it obfuscates the labor behind its production.

Matt Stauffer:
Right.

Nick Peterson, PhD:
Right? So in that respect, it maintains a kind of honor economy that is mapped onto the plantation. Right? So nobody knows who the 300 workers are on the plantation who allow it to be so fruitful, but they know that it's the Jones plantation. So nobody knows, right, how much, you know, we may have leveraged office assistants, secretaries, spouses,

Matt Stauffer:
Right. Right.

Nick Peterson, PhD:
students, right? This is oftentimes a problem in higher education: as a graduate professor, I may have students who have wonderful ideas, and they can't articulate their idea as well as I can. And I can just kind of pick up their idea and thank them for being in my seminar on X, Y, and Z. And it sort of comes across as, you know, really, really beneficial. But it's mine; I get the credit because my name is on the book.

Matt Stauffer:
Yeah.

Hmm.

Yeah.

So it does sound like the bad old way is the...

Nick Peterson, PhD:
So AI just allows that cloud

to be even more nebulous.

Matt Stauffer:
Yeah, but it does sound like the bad old way is just moving into the... the bad new way, right? Like, in the past, human beings were not getting credit for their contribution towards the work of one person, whether that one person was the plantation owner, whether that one person was the academician who's saying, I made this paper. Or in the future, whether it is anybody saying, I did this, not acknowledging the fact that...

ChatGPT or Claude... I mean, I keep mentioning this on the podcast, but like, my book that I wrote by hand was scooped up into Claude, and there's a lawsuit now from me and all the other authors against Claude, being like, hey, you stole our stuff. So somebody's gonna write some blog post or something that says, I had this idea, they had Claude help them, Claude pulled mentions from my book, and my book is not attributed anywhere. So is it just the same thing going on and on and on, or is it worse with AI, because at least we had those three pages at the beginning of the book or the paper that aren't there now? Or, like,

How do you look at it? Is it any different? Are we in the exact same situation we've always been in? Is there a better way going forward?

Nick Peterson, PhD:
This gets at some of the notion of the universal, right? So the universal means that in some regards we move toward indifferentiation. So there won't be a need for any one person to publish anything as long as we continue to feed the engine, right? So as long as my inputs go into Claude, regardless of what Claude puts as the output, it means that all of my thinking becomes a part of this sort of universal mechanism.

Matt Stauffer:
Mm-hmm.

Nick Peterson, PhD:
So, in one way, the reality of these sorts of LLMs is that they will make obsolete the need to articulate anything from an individual perspective, because they've collected our individual perspectives to the extent to which it's null and moot to draw that differentiation. But in a case like with your lawsuit, the reason why that matters is that your name on a book entitles you to certain kinds of royalties.

Matt Stauffer:
Yeah.

Nick Peterson, PhD:
Right. It entitles

Matt Stauffer:
Yeah.

Nick Peterson, PhD:
you to certain means and modes of remuneration. And so if it weren't for the need to have remuneration, or whatever that economy is... in my economy, it's not that my books are going to make money. But if my books are cited by other scholars, it gives me traction within the field. It's like, this is a person who's doing amazing thinking. But if the machines are able to do that at scale and with even more efficiency than I can,

Matt Stauffer:
Right. Yeah. Yeah.

Yeah.

Nick Peterson, PhD:
then it will matter less that my name shows up if all of my production gets to be a part of how the machine continues to perfect its efficiency.

Matt Stauffer:
But you did mention we live under capitalism, right? And I love what you just said, because before you pointed out the fact that my lawsuit matters because of being compensated, I was feeling this discomfort, because I'm like: the idea that the machine consumes all knowledge, that we have this universality, this catholicity, and we can just

put all of our effort into, let's call it Claude, and then we can all get out of Claude. It's this very sharing, socialist way of working. That's great, except capitalism, right? And so, and I'm not trying to, like, I apologize, because I'm trying to reach people from all political perspectives and stuff like that, so I'm not trying to go too far in one direction. But at the same time, I'm just looking at this and I'm like, if I could still get paid, if I could still, whatever, if I could still provide for my family and...

all my contributions were missing, if I didn't need Tighten to get work that it gets because of my reputation for being an educator, right? Which I do, and so I need people to know that I created this stuff, because that's how I pay my bills, right? If I didn't need that, this catholicity, this universality thing is wonderful. But it seems like, the way it is today, what happens when all knowledge is scooped up and no longer attributed, and it's all present in this one place that is owned by a few giant tech corporations?

Nick Peterson, PhD:
Yeah.

Matt Stauffer:
It's just like, it's just making it worse. This is not heading towards a place where nobody has to worry about anything. It's heading towards a place where the consolidation of power and wealth is even greater than it was before.

Nick Peterson, PhD:
A plantation.

Matt Stauffer:
So

I'm like, so how do we use these tools? Because even if I don't use the tool, I don't think that has a practical impact. How do we adapt to the new world, given that that's the future it's looking like? How do you feel about the future, Nick? Are you hopeful?

Nick Peterson, PhD:
Well, I think, right, it's not just the future, it's the present and the past, right? So the ways

that we can see this scenario, right, is through the Age of Enlightenment, but also the Age of Discovery, right? So this is the fact that you could have 13% of the global land reserves in European countries determine what the other

87% of the globe will look like, what it will speak, what its laws would be, and all of that. This is how you have the dominance of Great Britain throughout the 17th, 18th, and parts of the 19th century. As this small consolidation of isles in the North Atlantic that has the Americas, that has the entire western coast of Africa,

that has southern Africa, that has India, and that has this whole other mass called Australia. This little small set of isles, right? So the notion of colonialism, I mean, this is really what we're talking about, is what happens when we have a colonialism that's grounded in knowledge. This is also the birth of the museum,

where you get all of the goodies from other parts of the world and you house them, you know, where you want to house them, and make them available how you want to make them available. And you can tell the stories that you want to tell about them. So we're never outside, right, of that kind of limitation. I think the lure, or where it really becomes dangerous, is the extent to which our participation in the process is grounded in a certain kind of social experimental enjoyment.

Right? So this is the social media phenomenon. Like, oh my gosh, if you get a social media page, you can say what you want to say. You can get out your ideas, your opinions, your thoughts. It's, you know, made access to the public sphere egalitarian, right? Anybody can post. Everybody can post. Everybody can say what they want to say. And on one hand, it seems like, wow, look at how amazing, this has sort of

flattened access to the public square. But we know it's done a lot more than that, right? Particularly for the small class of people who own these domains and spaces, and what that has meant for our own limitations in being community, but also what it has meant for people who otherwise felt they didn't have a voice until they found

the 700 other people in the country or world or their city, you know, who also have similar interests and perspectives. So I don't know that it's reasonable to think that there's an outside that we can get, you know, from or to, or that it will save us in ways that we need saving.

Matt Stauffer:
Yeah.

Nick Peterson, PhD:
I think the other side of this as well is when we talk about the cost, right? This is the other side of a kind of plantation logic, is that you just have to be really good at not doing math well in terms of the cost. So you have to have all of this flexible and fungible equipment that has to be valued in certain ways so that you can maintain a coherent narrative about the need for this order and this structure,

but also not account for the ways that it doesn't add up. So I'm of the mindset that we will likely find ourselves in environments that are no longer hospitable for human life long before we as humans opt to live other ways.

Matt Stauffer:
trying to keep up.

Okay,

so... it's so interesting, because I ask these questions thinking that I know the question, and then you're like, well, your question belies an underlying assumption or ignorance about all these really deep and thoughtful things, and I'm still trying to process it as you're talking. So I apologize that I'm a little bit behind. I want to go back and ask that question again and ask you to try and

Nick Peterson, PhD:
Yeah.

Matt Stauffer:
be in the world that I'm in. So one of the things that happens often, you know, is a political thing will happen where people will say, oh my God, this terrible thing is happening. And everyone's like, that terrible thing has been happening for hundreds of years. It's just now impacting you. Is that what is happening when someone, you know, an American upper-class, white-collar tech worker, says, oh no, what's going to happen? Are you saying, well, that's been happening to everybody else? Or is that not kind of the situation? Is it different than that? Like, basically, do you think the world is measurably

worse in 20 years as a result of AI? For everybody, for some people, for nobody? Or how is it measurably better? For everybody, for some people, for nobody? Like, what's the, you know...

Nick Peterson, PhD:
Yeah,

yeah. The short version is, my prediction is that, in the way that we understood our generation, right, those of us who were born in the latter part of the 20th century, the last quarter of the 20th century, we were sort of given the message: you go to school, you get a good education, you get a good job, you stay on that job, and then you retire.

And you may move to Florida, you may not, you know, that kind of thing, you know, you'll take vacations, so on and so forth. What we learned as we moved into the late nineties, early 2000s, is that that is not the case, right? So the way that my grandparents were on jobs for 30-plus years, or the way that my parents were on a job for 20 years that changed names multiple times because it was bought by somebody else. The notion of a pension and all of those things are no longer accessible.

Matt Stauffer:
Yes.

Nick Peterson, PhD:
But also, I grew up in the Midwest, right? And so my family, you know, and the areas where I lived were all connected to the auto industry. And so for my dad, coming of age in the seventies, the idea was like, you get a good job at General Motors or Ford or Chevy and you're set, you know? And then when the American auto industry crashed, I mean, you could see it all across the Great Lakes, right? Along the Highway 80 corridor,

Matt Stauffer:
You're set for life. Yeah.

Nick Peterson, PhD:
and then coming south into the 70 corridor, right? From Buffalo all the way across, right? Into the St. Louises and these other kinds of places where, when those jobs dried up,

Matt Stauffer:
Yeah. Yeah.

Nick Peterson, PhD:
They dried up. Right.

And so I think what became the kind of upper-middle-class job of the 21st century, in terms of thinking, in terms of tech, in terms of the medical profession and some of those things, that is going the same way. In the same way that they were able to find cheaper labor to make cars in other parts of the world, they'll find cheaper labor to produce code. Right?

Matt Stauffer:
Mm-hmm.

Nick Peterson, PhD:
which is its own kind of universal language in this iteration. So I think in terms of the working class, what it means to think about your livelihood through working for another is already sort of inscribed in the notion of: how do you become obsolete?

Right? So I have a friend, his wife is an interior designer, and she's gotten multiple offers from places like Home Depot or Lowe's to train their AI bots on interior design. And it's sort of, here's something: we'll give you a prompt to design something, and we'll give the AI the same prompt, and then you kind of evaluate what it should have done better.

Matt Stauffer:
Wow.

tell it what it should have done better and everything. Yeah.

Nick Peterson, PhD:
And we'll pay you, you know, $15,000, you know, to do this work. And it may seem like, wow, this is really great. But ultimately, you're training a much cheaper replacement, you know, in that respect. So I think that that's the same thing with the large language models. The more we use them, the better they get. And the better they get, the more we use them. And the more we use them, the better they get. And the better they get, the more reliably we'll use them.

Matt Stauffer:
replacement.

Yeah. Yeah.

Nick Peterson, PhD:
And from the owner class, right, of these various platforms, it's great, because if we start to lose our own knowledge, right... and this is the sort of basics: when we lose the ability to farm, we become more dependent on a grocery store. When we lose the ability to care for our children and our elderly, we become more dependent on nursing homes and on schools to cover all of these gaps, to the extent that we don't know how to do those things anymore.

Matt Stauffer:
Mm-hmm, yeah.

Right.

Yeah.

Nick Peterson, PhD:
And now the purveyors, right, those who have found a way to own the knowledge of those things, are the plantation owners that we then have to beg and plead to make a space for us.

It's the coal town, right? So what happens when, you know, Anthropic decides that they're going to do a city, and, you know, they'll need human engagement to sort of feed that city, and we're then under their employ to be those human agents?

Or better yet, we are so dependent on these various platforms, like our entertainment, we have our subscriptions, right? So I subscribe to the thing, and God forbid the internet crashes or something, and now it's like, I can't do anything, because my email is run by my email bot.

Matt Stauffer:
Mm-hmm

I feel bad, because, to the listeners: listeners have been giving me crap for ages, because they're like, we want these to be longer, and I planned for them to all be longer. Unfortunately, Nick and I booked this long before I did that, so we both have to go at the hour mark. So we have to cut this short. And then I especially feel bad because I'm taking a brilliant

theoretical, academic person and I'm asking you to dumb it down, but I want to ask you to dumb it down one more time, and then we're going to start wrapping. Which is: if there were one action that you hope the listeners would take in response to what you've shared today, or not even in response, but just one action in the world of AI and technology and our relationship with these things, do you have an answer for that? And I know that's a lot to ask of you, but like: if everybody just did this one action, we'd be a little bit better.

Nick Peterson, PhD:
So yeah, my one action would be: buy paper books.

Matt Stauffer:
Okay. I love that.

Nick Peterson, PhD:
In large part because, if we move into some type of crisis where we don't have access to our cloud-based media, if we can still read and we still have something in print... print has proven durable throughout human history,

Matt Stauffer:
huh.

Nick Peterson, PhD:
by virtue of the fact that all of our wisdom traditions have existed for thousands of years, because people had ideas about putting some notion of something, pen to paper, in the various ways that paper is understood, or art is understood, even in that sense, so that that knowledge could be passed on. So buy hard copy.

Matt Stauffer:
Yeah.

Hmm. Yeah.

physical media. It's funny,

because, you know, my wife Imani is a filmmaker, and she's like,

I don't like the fact that streaming just means that even when we buy something, we're really just buying a license, and they can take it away at any point. And so she's been having us buy DVDs, Blu-rays, CDs, and stuff like that again. And at least that doesn't depend on the internet. That doesn't depend on somebody else approving us, right? We have the physical devices, but it still depends on technology in some ways. Whereas a printed book, I'm like, as long as I have light and the ability to read, I can engage with the printed book. So I'm like, yeah, I like that a lot. Okay, so.

Nick Peterson, PhD:
Yes.

Matt Stauffer:
There is so much more we could talk about, with apologies to the listeners who want us to keep talking. But if someone is like, this Nick Peterson guy, I need to know more about what he's doing, what does it look like for them to follow you, engage with your materials, or interact with the type of stuff you're talking about?

Nick Peterson, PhD:
So I have a website, but I'm not very good at keeping it up to date. That's brothernick.com. So that'll be one. And, oddly enough, the project, the grant that I'm working in, is very much about spiritual formation and digital spaces. But I primarily use my own digital spaces as pressure-valve releases for humor, and for other kinds of random thoughts that come to mind.

Matt Stauffer:
Mm-hmm.

Alright.

Nick Peterson, PhD:
So Nick Peterson on, like, Facebook is where I am as well. And then I think it's The Nick Peterson Experience on Instagram. In terms of my academic work, I am working on a monograph. That'll be my first book, and so hopefully that'll come to the world within the next year or so. But outside of that, the idea is like, okay, when I have a book, then I will sort of...

Matt Stauffer:
Okay.

Yeah, certainly

be out there. Yeah. Well, we'll make sure that all those things are linked in the show notes. And any other questions that I have, I'm texting you later, being like, you've got to explain this piece again. I will make sure that whatever you teach me, I will link in the show notes too. So all the resources that I beg out of you after this, I will make sure to share them in the show notes for everybody else as well.

Nick Peterson, PhD:
Right, like a splash in the puddle.

Awesome.

Matt Stauffer:
Is there anything that you wanted us to talk about today that we didn't get a chance to? Obviously there's tons, but is there any individual topic where we're like, you know what, a couple minutes on this would be worth it?

Nick Peterson, PhD:
No, I mean, I think that there's a lot to talk about. To be honest, right, I do think about a lot of this theoretically, and so I welcome the opportunity, again, to bring this to kind of a ground level for how this tracks. But much of what my own thinking is trying to orient around is: what are the patterns that we think we've escaped that we actually haven't?

Matt Stauffer:
Yeah.

Nick Peterson, PhD:
So that's oftentimes where I'm trying to make connections with how we make use of various technologies. Like, what are we thinking this solves without recognizing that we've already been trying to solve a thing?

Matt Stauffer:
Yeah, I mean, the amount of history that you've shared today is just amazing to me. I'm like, yeah, I guess it makes sense that that happened, and that you all have studied it as people who know more than me. In the day-to-day, I just tend to interact with it, like, this is the first time this has ever happened and we will engage with it. And you're like, no, this is in the history of this thing that's been going on 400 years, this thing that's 2,000 years old, this thing that's been there since the beginning of history. And I'm just like, oh, yeah, okay. Definitely there's a lot of learning and translation to get to the point where I can say I can apply that to my everyday, but I'd rather do that than

just be like, every experience I have is entirely unique. So I appreciate that. Nick, you're incredible. I still miss you and I wish you'd come back to Atlanta. But still, thank you for hanging out with us for a little bit and sharing your knowledge. And yeah, I really appreciate you.

Nick Peterson, PhD:
Thank you, I appreciate you as well, man.

Matt Stauffer:
All right, for the rest of you, we'll see you next time.

Nick Peterson, PhD:
Ciao, thank you.

Creators and Guests

Matt Stauffer
Host
CEO of Tighten, where we write Laravel and more w/some of the best devs alive. "Worst twerker ever, best Dad ever" –My daughter

Nick Peterson, PhD
Guest
Assistant Professor of Homiletics and Worship at Christian Theological Seminary