I won’t try to pretend otherwise: I’ve had mixed feelings about Stack Overflow for a long time. I think the concept of the place, ask a question, get an answer, is great. No question. How that was handled, the tone of moderation, other stuff, maybe it wasn’t all handled as well as it could have been. However, it was a model that proved itself. That is, until AI came along.
First, go read this. It’s a tough story. I see the same thing happening in a lot of places. Heck, I go to several different AI engines for answers more than I go to anyplace else (and then, yes, I validate those answers through other types of research, testing, experimentation, but critics might be surprised at how accurate AI answers can be… depending).
From where I sit (admittedly, in the cheap seats), this is what I’m seeing. First up, yeah, probably Stack Overflow is dead. And that sucks. For a few reasons. First and foremost, it truly was useful. A lot of the people hanging out there were some of the most knowledgeable and capable technologists out there. One example: Paul White. He’s probably forgotten more about execution plans and the optimizer than I ever knew. Even that is undoubtedly an understatement and my own ego talking. I regularly learned stuff from him over the years and I’m truly grateful (although I never showed it properly. Sorry, Paul.). If the site goes away, we all lose those people. But it goes further: AI is losing too. Let’s face it, they trained off all the hard work done by the SO people. Those well-polished questions with well-polished answers? That’s what built the AI. Uhm, when a new language, data stack, whatever, comes out… how’s the AI going to train up on all the best answers to the most common questions? Oof. No more Stack Overflow to give you a free, full, excellent training data set.
Second thing, this is a bit of an opportunity. Understand, I don’t know how best to exploit this opportunity, but I see it plain as day. You have a question, you want an answer. Here’s the AI to give you that. Ah, but now you want to discuss this, either from the foundations or at a more advanced level? Back to the AI? Yeah, maybe. Do you know how to validate the AI answers (other than tossing them into production and hoping for the best)? Really, you know what you want? A little human connection. Insights. Guidance. Interaction. Something beyond just an answer to a question. Think discussion forums. Think user groups and meetups. You know, primitive, old-fashioned, caveman, face-to-face communications. The AI can’t do that. Well, it sort of can, but not that well. Yeah, people will use it that way, but inevitably, many (can’t say most) will turn to actual humans.
You still have to overcome some of the problems that come with that. How much gatekeeping do you do to let people ask questions at your group? That was one of the things that led to Stack Overflow being created. Can you maintain civility, maybe even lead with kindness? No? Mmmmm. Heck, how good are you at the tech in question? That’s going to matter as well. And just, can we be empathetic about people asking the same questions repeatedly? ’Cause the AI sure can. See, I’m not going all Pollyanna on this.
So, yeah, I’m saddened by Stack Overflow, but I think we have an opportunity if we can just figure out what to do with it.
This is scary on many levels because WAY too much faith and trust is being placed in these LLMs being called AI when they really aren’t. Yes, they are a “form” of AI, and while we’d hope the more tech-savvy among us would know better, I’ve seen others view these as being everything an AI would be short of self-aware; just imagine how the non-tech-savvy think of them.
I’ve conversed with Steve Jones (Voice of the DBA) from Redgate on his own site about this and the dangers of it. The biggest concern I have isn’t how many devs (as well as others) believe these things are more than they are, but the CEOs who can only see the cost-cutting benefits. By replacing humans with these fancy bots, just as they did a few decades back with offshoring, they can lower costs, not so as to keep the business profitable but to pad their own bonuses. Just as with offshoring, it sounds good and works initially, but once more than a small percentage are doing it, the cost benefits are negated since you no longer have a cost advantage, and your actions have now made things worse in the long run.
We’d like to believe that at least within Tech/Dev work the higher-ups wouldn’t fall prey to the same bad habits executives of large corporations in other industries follow, placing emphasis on increasing their bonuses (via cost cutting) over improvements in quality, customer service, and productivity, but they absolutely will. What I see is a golden age at first that, after a short period, goes very wrong. In your example with the training, if resources like Stack Overflow are ended because of these bots, then at some point the bots’ knowledge will fall behind, with no real sources to pull from and stay up to date. This is akin to firing all the teachers in all the universities to use LLMs; after a few years, those in fields where information does change will fall behind, with no way to catch up. It’s leaping without looking “far enough” ahead.
This is really a war on two fronts. We have those placing too much faith and trust in something that doesn’t deserve that level of faith and trust, and the decision makers whose focus is going to be on how to increase their next bonus. This isn’t monolithic, meaning some will abstain from these foolish missteps, but will that be enough to prevent a technical dark age? I don’t believe this is hyperbolic because we as humans, when in a collective, tend to ignore issues until they reach the point where they become very difficult to address, and that’s assuming they can be. We are too reactive and not proactive enough. Thankfully we have people like you who are asking now, “Is this really wise?” which is a start.
I fall pretty heavily into the “LLM/AI is just another tool” camp. That said, tools matter. How you use them. What you use them for. Their longevity, etc. It’s not dismissive to say it’s just another tool. Further, yeah, no idea what management is going to do.
I really wonder though, without someone else building the knowledge, how the heck an LLM is going to deal. BTW, I say this as I’ve just had an LLM hallucinate an answer for me on the very first question (usually takes two).