There was a paper recently about using LLMs to find contradictions in Wikipedia, i.e. claims on the same page or between pages which appear to be mutually incompatible.
https://arxiv.org/abs/2509.23233
I wonder if something more came out of that.
Either way, I think that generation of article text is the least useful and interesting way to use AI on Wikipedia. It's much better to do things like this paper did.
> Unfortunately, these models virtually always fail to properly source claims and often introduce errors.
A quote for the times.
May be a bit of a Sisyphean task, though...
This is hardly surprising given the recent announcement, "New partnerships with tech companies support Wikipedia's sustainability". That sustainability relies on human content.
https://wikimediafoundation.org/news/2026/01/15/wikipedia-ce...
I opened a random page with the label: https://en.wikipedia.org/wiki/Ain%27t_in_It_for_My_Health
Curious, what are the signs that this particular page has been written by an AI?
I’m not saying it wasn’t, I’m probably not seeing something and wondering what to look for.
I enjoyed the recent talk looking at the reasons people add generated content: https://media.ccc.de/v/39c3-ai-generated-content-in-wikipedi...
The Sanderson wiki [1] has a time-travel feature that lets you read a snapshot from just before the publication of a book, ensuring no spoilers.
I would like a similar pre-LLM Wikipedia snapshot. Sometimes I would prefer potentially stale or incomplete info rather than have to wade through slop.
I wish they also spent effort on the reverse: automatically rephrasing the (many) articles that are obscure, very poorly worded, and/or lacking any neutral tone whatsoever.
And I say that as a general Wikipedia fan.
It is really good that they are taking steps to remove this stuff. You can usually tell right away when something was not written by a human.
Contrarian take: Wikipedia could use more AI, as well as less.
A major flaw of Wikipedia is that much of it is simply poorly written. Repetition and redundancy, ambiguity, illogical ordering of content, rambling sentences, opaque grammar. That should not be surprising. Writing clear prose is a skill that most people do not have, and Wikipedia articles are generally the fruit of collaboration without copy editors.
AI is perfectly suited to fixing this problem. I recently spent several hours rewriting a somewhat important article. I did not add or subtract information from the article, I simply made it clearer and more concise. I came away convinced that AI could have done as good a job - with supervision, of course - in a fraction of the time. AI-assisted copy-editing is not against Wikipedia rules. Yet as things stand, there are no built-in tools to facilitate it, doubtless because of the ambient suspicion of AI as a technology. We need to take a smarter approach.
Signed up to help.
On PickiPedia (bluegrass wiki - pickipedia.xyz), we've developed a MediaWiki extension / middleware that works as an MCP server and causes all contributions from the AI in question to appear partially grayed out, with a "verify" button. A human can then verify and either confirm the provided source or supply their own.
It started as a fork of a MediaWiki MCP server.
It works pretty nicely.
Of course, it's only viable in situations where the operator of the LLM is willing to comply and be transparent about that use, so it doesn't address the bulk of the problem on Wikipedia.
But still might be interesting to some:
It may be that AI made Wikipedia worse (I have no idea), but Wikipedia itself made several changes in the last 5 years which I hate. The "temporary account" annoys me; the strange sidebars that are now the new default also annoy me. Yes, they can be hidden, but why are they shown by default? I never want them, and I don't want to use them either.

And some discussion pages cannot be modified either. I understand that main articles cannot so easily be changed, but now discussion pages as well? This happened to me on a few pages, in particular for "ongoing events". I don't usually revisit ongoing events at a later time, so I want to give feedback and then move on. With that changed policy, I can skip bothering to give any feedback at all, which makes Wikipedia less interesting to me, since my feedback is about QUALITY: what to improve. It is really sad how different people can worsen the quality of a project such as Wikipedia. Wikipedia is still good, but it was better, say, 6 years ago.
Although Wikipedia has no firm rules (WP:PILLARS), the admins reference the policies (which aren't rules) when reverting content and banning. So here's what I gathered:
* No new articles from LLM content (WP:NEWLLM)
* "Most images wholly generated by AI should not be used" (WP:AILLM)
* "It is within admins' and closers' discretion to discount, strike, or collapse obvious use of generative LLMs" (WP:AITALK)
There doesn't seem to be an outright ban on LLM content as long as it's high quality.
Just an amateur summary for those less familiar with Wikipedia policy. I encourage people to open an account, edit some pages and engage in the community. It’s the single most influential piece of media that’s syndicated into billions of views daily, often without attribution.
[dead]
[dead]
Inb4 Wikipedia is lost to the same narrative control as MSM.
I don't see how this is going to work. 'It sounds like AI' is not a good metric whatsoever to remove content.
Isn't having a source the only thing that should be required? Why is AI speak bad?
I'm embarrassed to be associated with US Millennials who are anti-AI.
No one cares if you tie your legs together and finish a marathon in 12 hours. Just finish it in 3. Its more impressive.
EDIT:
I suppose people missed the first sentence:
>Isn't having a source the only thing that should be required?
I found the page Wikipedia:Signs of AI Writing[1] very interesting and informative. It goes into a lot more detail than the typical "em-dashes" heuristic.
[1]: https://en.wikipedia.org/wiki/Wikipedia:Signs_of_AI_writing