I Tried ChatGPT and It's Not the Best Thing Ever
Will ChatGPT make developers redundant?
As a web developer, I was excited to try out ChatGPT, the language model developed by OpenAI that's been making waves in the software development world. With its ability to provide quick and accurate answers to natural language queries, I was eager to see how it could improve my workflow. However, after using it for a few weeks, I can say that my experience with ChatGPT was less than stellar.
Don't get me wrong, ChatGPT has a lot of potential. It's trained on a vast dataset, so it has a broad understanding of the world and can answer almost any question you have. But in my experience, it didn't live up to the hype. Here's why.
One of the biggest issues I had with ChatGPT was its inconsistent responses. While some answers were accurate and helpful, others were vague or outright incorrect. This made it difficult to trust the information I was receiving from ChatGPT, and I found myself double-checking its answers on other sources.
Another issue I had with ChatGPT was its limited scope. While it's trained on a vast dataset, it's not always up to date on the latest developments in the software development world. This can lead to outdated or incorrect information, which can be problematic when working on projects.
For example, I asked ChatGPT about a new framework that had just been released, and it was unaware of its existence. This made me question its ability to provide relevant information and made me less likely to rely on it in the future.
Lack of context
Finally, ChatGPT lacks context. This can lead to answers that are not relevant to your specific needs. For example, if you ask ChatGPT about a particular error message, it might provide a general solution that doesn't apply to your specific case. This can be frustrating and time-consuming, as you have to sift through the information to find what's relevant to you.
In conclusion, while ChatGPT has a lot of potential, it's not the best thing ever for software development. In my experience, its inconsistent responses, limited scope, and lack of context make it less reliable than other sources of information. However, I believe that as the technology improves and becomes more refined, it has the potential to revolutionize the way we work as software developers. Until then, I'll be sticking to my tried-and-true sources of information.
In case you somehow hadn't clocked it, this blog post was written entirely by ChatGPT
The human verdict
Back to me, the real Dave. Okay, so it's a nice surprise to see ChatGPT is capable of talking about its own limitations (a little repetitively for my taste, but the criticisms it's composed about itself are the basic ones real people commonly level against it). The blog post came from the prompt "Write me a blog post titled 'I tried ChatGPT and it's not the best thing ever', from the point of view of a web developer."
Safe to say I'm not going to start spamming you with AI-generated content any time soon. There's a lot of future potential in this kind of natural language technology, particularly when coupled with Codex-style systems and other new, innovative ways of aggregating information, building data models and solving problems through pattern analysis. I can see certain fields - medicine and diagnostics in particular, but yes, also software development - taking advantage of a much greater degree of automation in the future. Even a few years down the line, I can see this kind of tech being much more reliable and less prone to the problems it's so helpfully described above for me.
But as a developer, I'm not quaking in my boots just yet; I don't feel the threat of redundancy hanging over me. It's probably still more realistic at this stage that we'll see physical automation take over labour jobs (warehousing, construction, delivery, driving) before we see a significant scaling back of humans at the helm of writing software.
ChatGPT in its current form really seems like little more than a combination of a Google search and a sophisticated natural language processor, able to comb through some web results and draft a semi-convincing set of paragraphs which loosely resemble something a human would come up with. It's the kind of thing conspiracists will think is sentient because you can ask it what its hopes and dreams for the world are, conveniently ignoring the fact you can also get it to just as cheerfully describe what it's like to be an otter who works part time as a circus acrobat.
You may already know I'm a big fan of GitHub Copilot, the AI programming assistant based on OpenAI's Codex system. But this is how I see ChatGPT - kind of like Stack Overflow, it'll just be another tool in our kit, a place where we human programmers can ask questions and efficiently get answers. As the tech continues to improve, we might be able to rely on those answers with more confidence and not have to second-guess or manually verify the information we're being given, but I don't see an existential threat to our jobs here.
ChatGPT is cool, but it's not the best thing ever.