
AI Ethics

AI for Pastors in 2026: What's Helpful, What's Risky, and Where to Draw the Line


A pastor I know recently told me: "Half my elders think I should be using AI more. The other half think I shouldn't be using it at all. And honestly, I'm not sure which side I'm on."

That's the moment most pastors are in. The tools are here. The ethical maps are not.

This post is one pastor-friendly framework. Not a final word — there isn't one yet — but a practical lens for the next time you're staring at a blinking cursor at 11pm on Saturday wondering whether it's okay to ask Claude for help.

The framework has three categories: helpful, risky, and off-limits. The exact lines move depending on your tradition. The categories themselves are usable across most.

A premise to start with

The objection to AI in ministry isn't really about technology. It's about formation.

A pastor's voice — the way you handle a passage, the way you press an application, the way you sit with a grieving family — is not a skill you developed by typing alone. It was formed over years of reading, prayer, conversation with mentors, mistakes preached and repented, weddings and funerals and counseling sessions you'll never write down.

The fear of AI is, at root, the fear that this formation gets short-circuited. That a pastor who outsources the wrestling outsources the becoming.

That fear is legitimate. It's also not what most pastors are actually doing with these tools. So the real question isn't "is AI okay?" — it's "where does AI accelerate formation, and where does it bypass it?" That's the line we're trying to find.

The framework at a glance:

HELPFUL (accelerates formation): drafting routine emails, summarizing long documents, editing your manuscript, repurposing sermons, brainstorming angles.

RISKY (handle with care): counseling prep, sermon outline drafts, Bible interpretation questions, letters to specific people, posts that quote you.

OFF-LIMITS (bypasses formation): fully AI-written sermons, AI counseling congregants directly, prophetic speech, fake testimonies, hidden AI assistance.

What can AI help pastors with? (Helpful)

These are uses where the pastor is still doing the thinking. AI is doing the typing.

1. Drafting routine communications. Weekly email newsletters. Funeral programs. Volunteer thank-you notes. A welcome letter to a first-time visitor. None of these require pastoral discernment in the writing — they require it only in the editing. Letting a model produce the first draft you then revise is functionally the same as a pastor 30 years ago having a secretary type from dictation.

2. Summarizing long documents. A denominational policy paper, a 200-page commentary chapter, a 3-hour staff meeting transcript. Asking AI for a 200-word summary you then verify against the source is a research speed-up, not a discernment shortcut.

3. Editing your own sermon manuscript for clarity. You wrote the sermon. You wrestled the text. Now you ask AI: "Where in this manuscript am I being too academic? Where am I burying the lead? Where does the application get vague?" The model is functioning as a coach, not a co-author.

4. Generating discussion questions, devotionals, and social posts FROM your sermon. This is the pattern Sermoneer is built on. The pastor preached. The artifact already exists. AI extracts and reformats — it doesn't invent content. Ethically equivalent to a communications director writing the small group guide based on listening to your sermon. (See Why Your Sermon Dies at Noon on Sunday for the long version.)

5. Brainstorming sermon angles you might miss. "I'm preaching on Mark 4. Give me five angles a pastor in [your context] might miss." You then ignore three, file one for later, and use one as a starting point for your own study. Same use as paging through three different commentaries to be jolted out of your default reading.

The common thread: the pastor stays the author. AI is a tool in the workshop, not the carpenter.

When is AI risky for pastors? (Risky)

These are uses where AI can contribute, but where outsourcing too much will quietly hollow out your ministry.

1. Pastoral counseling preparation. Asking AI "what are common questions to ask someone walking through anticipatory grief" — fine. Asking AI to write the response you give to the specific person sitting in your office tomorrow — not fine. The first is research. The second is replacing pastoral presence with text.

2. Sermon outlines from scratch. A pastor in a crunch week asks AI for a sermon outline on a passage. The risk isn't that the outline is bad. The risk is that the pastor stops doing the wrestling that makes them the pastor of this congregation. Once a quarter, in a hospital-week crisis? Probably fine. Every week? You're slowly becoming a delivery vehicle for someone else's exegesis.

3. Bible interpretation questions. "What does the Greek word parakletos mean?" Fine — it's a lookup. "What does this passage mean for our church?" Risky — that's pastoral discernment, not lookup. AI doesn't know your church, your moment, or your tradition's interpretive commitments.

4. Letters to specific people in your congregation. A condolence note. A confrontation letter. A reconciliation request. AI can give you a structure, but the actual words should come from you. The recipient will know the difference, even if they can't articulate it.

5. Social media posts that quote you. A clip of you preaching, captioned with words AI wrote, posted under your name — that's a misrepresentation, even if the words are fine. If something is published as your voice, you should have written it (or at minimum, deeply edited what AI drafted, not just rubber-stamped). This applies whether the post is a sermon clip or a carousel built from your sermon.

The common thread in this middle zone: the more pastorally specific the audience, the less AI should be in the actual words.

What should pastors never use AI for? (Off-limits)

A short list. Most pastors will agree on these even across very different traditions.

1. A fully AI-written sermon delivered as your own. Not because the AI's exegesis is bad. Because preaching is a relational act. Standing in front of your people on Sunday and reading words you didn't think through is a quiet betrayal of the trust they extended when they showed up.

2. Pastoral counsel given by AI directly to the congregant. Pointing a hurting person to ChatGPT instead of sitting with them is not delegation. It's abdication. Pastoral presence is not a deliverable that can be outsourced.

3. Prophetic speech. "What does God want to say to our church right now?" is not a prompt. The discernment of the Spirit's voice — through Scripture, prayer, community, your tradition — is the irreducible work of the pastor. AI cannot do this. Treating it as if it can is dangerous.

4. Generating fake testimonies, reviews, or congregant content. Self-evident, but worth saying. Posting AI-written testimonials as if they came from real congregants is fraud, even when the underlying claim is true.

5. Denying AI assistance when asked directly. If a congregant asks you "did you write this?" — answer honestly. The sin isn't using AI. The sin is concealing it.

How do you decide when AI is appropriate?

Five questions to ask before any specific AI use:

  1. Am I still the one doing the discernment work, or am I outsourcing the wrestling?
  2. Would I be comfortable telling my elder board exactly how I used this tool?
  3. Is the audience for this output a specific person or congregation I shepherd, or is it general communication?
  4. Am I editing the output enough that it carries my voice, or am I rubber-stamping?
  5. Is this saving me time so I can pastor more deeply, or so I can pastor less?

That fifth one is the hardest. It's also the one that matters most. AI in ministry is a question of whether it gives you back your week to spend on what matters — or whether it lets you avoid what matters.

A note on Sermoneer

Since we're a tool used by pastors, we should be transparent about where we sit on this map.

Sermoneer takes your already-preached sermon and produces drafts of the formats your congregation needs anyway: a small group guide, a devotional, social cards, shorts. The pastor is the author. The sermon is the source. The tool is doing the typing.

Sermoneer doesn't write sermons. It doesn't generate exegesis. It doesn't produce content from a topic prompt. It only works on text the pastor has already preached. That's a deliberate limit — the kind of limit that keeps us in the Helpful column and out of the Risky one.

If you find yourself wishing it would do more (write the next sermon, give pastoral counsel to a congregant, generate content without a source), we'll consistently say no. Those belong to you.

The line we're drawing is the line we hope you draw too: AI accelerates work you would do anyway. It does not replace the work that makes you a pastor.

Try Sermoneer

One sermon → small group, devotional, cardnews, and shorts in 90 seconds.

Start free