Hacker News

Someone1234 · last Saturday at 11:00 PM · 13 replies

They will no doubt blame this on AI, somehow (ChatGPT release: late 2022, decline start: mid 2020), instead of the toxicity of the community and the site's goals of being a knowledgebase instead of a QA site despite the design.

PS - This comment is closed as a [duplicate] of this comment: https://news.ycombinator.com/item?id=46482620


Replies

nospice · last Saturday at 11:09 PM

Right. I often end up on Stack Exchange when researching various engineering-related topics, and I'm always blown away by how incredibly toxic the threads are. We get small glimpses of that on HN, but it was absolutely out of control on Stack Exchange.

At the same time, I think there was another factor: at some point, the corpus of answered questions grew to a point where you no longer needed to ask, because by default, Google would get you to the answer page. LLMs were just a cherry on top.

f311a · last Saturday at 11:41 PM

People overestimate the impact of toxicity on the number of monthly questions. The initial growth was due to missing answers. After some time there is a saturation point where all the basic questions are already answered and can be found via Google. If you ask them again, they are marked as dups.

zahlman · yesterday at 8:00 AM

> the site's goals of being a knowledgebase instead of a QA site despite the design.

A Q&A site is a knowledge base. That's just how the information is presented.

If you want a forum — a place where you ask a question to get it answered one-on-one — you have countless options for that.

Stack Overflow pages have a different design from that, explicitly to encourage building a knowledge base. That's why there's a question at the top and answers underneath it, and why there are no follow-up questions, "me too" posts, discussion of annoyances related to the question, tangential rants, generic socialization, etc.

Jeff Atwood was quite clear about this from the beginning.

rorylawless · yesterday at 12:15 AM

The downward trend seems to start ~2017, and was interrupted by a spike during the early months of COVID-19. I'd be interested to know what drove that jump; perhaps people were less hesitant to post when they were working from home?

brickers · last Saturday at 11:07 PM

If you ignore the early pandemic bump, it even looks like the decline started in late 2017, though it's more variable there than after the bump.

nicce · yesterday at 1:06 AM

I wonder what role the moderation of duplicate questions plays. The more time passes, the more existing data there is and the less need for new questions. If duplicate questions are moderated away, do they disappear from these charts? Is this decline actually logical?

In 2020 there was a new CEO, and a moderator council was formed: https://stackoverflow.blog/2020/01/21/scripting-the-future-o...

crystal_revenge · yesterday at 12:17 AM

Many people are pointing out the toxicity, but the biggest thing that drove me away, especially for specific quantitative questions, was that SO was flat out wrong (and confidently so) on many issues.

It was bad enough that I got back in the habit of buying and building a library of serious reference books because they were the only reliable way to answer detailed technical questions.

fabian2k · last Saturday at 11:19 PM

There is an obvious acceleration of the downwards trend at the time ChatGPT got popular. AI is clearly a part of this, but not the only thing that affects SO activity.

dpkirchner · yesterday at 1:01 AM

I wonder if we can attribute some $billion of the investment in LLMs directly to the toxicity on StackOverflow.

macNchz · last Saturday at 11:24 PM

Ironically they could probably do some really useful deduplication/normalization/search across questions and answers using AI/embeddings today, if only they’d actually allowed people to ask the same questions infinite different ways, and treated the result of that as a giant knowledge graph.
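The embedding-based deduplication idea above could be sketched roughly like this. This is a hypothetical toy illustration, not anything Stack Overflow actually does: the hand-made vectors stand in for real sentence embeddings, and the threshold value is made up and would need tuning against labelled duplicate pairs.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Toy "embeddings" for three questions; the first two are phrased
# differently but mean the same thing, so their vectors are close.
# A real system would get these from a sentence-embedding model.
questions = {
    "How do I reverse a list in Python?": [0.90, 0.10, 0.00],
    "Reversing a Python list?":           [0.85, 0.15, 0.05],
    "How do I open a TCP socket in Go?":  [0.00, 0.20, 0.95],
}

DUP_THRESHOLD = 0.95  # hypothetical; tune on labelled duplicates

def find_duplicates(qs, threshold=DUP_THRESHOLD):
    """Return pairs of question titles whose embeddings are close
    enough to treat as candidate duplicates (edges in the graph)."""
    items = list(qs.items())
    pairs = []
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if cosine(items[i][1], items[j][1]) >= threshold:
                pairs.append((items[i][0], items[j][0]))
    return pairs

print(find_duplicates(questions))
```

Linking every near-duplicate phrasing to a canonical question this way is what would turn the accumulated variants into the "giant knowledge graph" the comment describes, rather than closing them outright.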

I was into StackOverflow in the early 2010s but ultimately stopped being an active contributor because of the stupid moderation.

wraptile · yesterday at 2:34 PM

Toxic community is mostly a meme myth. I have like 30k points and whatever admins were doing was well deserved as 90% of the questions were utterly impossible to help with. Most people wanted free help and couldn't even bother to put in 5 minutes of work.

mellosouls · yesterday at 5:25 AM

Use of GPT3 among programmers started in 2021 with GitHub Copilot, which preceded ChatGPT.

I agree the toxic moderation (and tone-deaf ownership!) initiated the slower decline earlier that then turned into the LLM landslide.

Tbf SO also suffered from its own success as a knowledgebase where the easy pickings were long gone by then.

IshKebab · last Saturday at 11:07 PM

It is sort of because of AI - it provided a way of escaping StackOverflow's toxicity!
