As someone who was raised extremely religious, strayed to the polar opposite, and is now trying to find my way in between the two, I do find this interesting. While the understanding of LLMs and when/how to apply them makes sense, I would argue that they fit right alongside human interpretation of scripture. Consider that many pastors "teaching" scriptures aren't even formally educated.
Arguing that you can't use an LLM for Christian apologetics because it "might not be true" overemphasizes the definition of "truth" when it comes to scripture and those teaching Christian apologetics, which is entirely influenced by what doctrine you subscribe to.
But this isn't just a tool claiming to produce some benign facts; it is making claims about absolute truths with consequences as dire as "going to hell".
Even if you don't believe, the creators certainly intend for their bot to speak to matters of eternal consequence. The intent behind the apologetics bot is as reckless and conceited as selling an LLM with the claim that it gives better advice than most doctors and should be used as such.
That's not really what the article is talking about. The article is referring to the fact that when you ask for specific verses from some version/translation (whose text is KNOWN), the LLM can confidently generate a completely fabricated or subtly different copy.
And going a step further, any follow-up questions to the LLM will use this incorrect copy as the source for interpretation, pushing it even further in the wrong direction.
Incidentally, this was occurring with a custom fine-tuned model plus an added RAG layer.
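One practical mitigation for the compounding problem above is to never feed an LLM-quoted verse back into follow-up prompts without first checking it against a trusted copy of the text. A minimal sketch (the lookup table, function name, and example quotes here are all hypothetical, not from any real apologetics bot):

```python
# Trusted reference text, keyed by (book, chapter, verse).
# Only one real entry for illustration (KJV, John 11:35).
KNOWN_TEXT = {
    ("John", 11, 35): "Jesus wept.",
}

def verify_quote(book: str, chapter: int, verse: int, llm_output: str) -> bool:
    """Return True only if the LLM's quoted verse matches the trusted
    text exactly (after trivial whitespace normalization)."""
    expected = KNOWN_TEXT.get((book, chapter, verse))
    if expected is None:
        return False  # no trusted copy available -> treat as unverified
    return " ".join(llm_output.split()) == expected

# An exact quote passes; a subtly altered one should be rejected,
# not carried forward into interpretation.
assert verify_quote("John", 11, 35, "Jesus wept.")
assert not verify_quote("John", 11, 35, "And Jesus wept bitterly.")
```

The point isn't the code itself but the design: interpretation should only ever start from a verified copy, otherwise every follow-up inherits the fabrication.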
> Arguing that you can't use an LLM for Christian apologetics because it "might not be true" overemphasizes the definition of "truth" when it comes to scripture and those teaching Christian apologetics, which is entirely influenced by what doctrine you subscribe to.
But the author is pretty explicit about wanting a high standard (e.g. insisting on using the best sources possible), and doesn't think that using LLMs is compatible with that goal.
Did you read the article?
The main example he gives is a simple factual matter about the words of a specific early Christian manuscript. The LLM invented new text that's not at all what's in the manuscript.
He also convincingly argues that people performing poor apologetics is no excuse to deploy an LLM that performs poor apologetics.
Don't feel obligated to answer if you are not comfortable, but why are you trying to find a middle ground? When you say middle ground, what does that mean for you? Does that mean you have some faith, but maybe not in a particular sect?
> now trying to find my way in between the two
What's your motivation for doing that?
There's a large gap between:
1. This interpretation might not be "true" but it is a good-faith effort that respects the text.
2. This LLM is fabricating verses, chapters, and even books of the Bible.
If you've used LLMs much, you know that #2 is not only possible, it's quite common. This is the kind of "might not be true" that you should be aware of when using an LLM for apologetics, or any effort where "truthiness" is important.