The 2010s were quite the rollercoaster, especially if you worked in new media. The cycle of layoffs, each followed by a chorus of “learn to code, your industry is dying”, was… memorable, to say the least. Eventually, succumbing to the pressure (or perhaps as a strategic pivot), I did just that: I dove into the world of code, seeking the seemingly stable shores of “web development.” Fast forward to today, and the rise of sophisticated AI has thrown a new curveball: whispers that coding jobs themselves are becoming obsolete. It feels almost comical. My career shift has landed squarely in the middle of the AI chatbot revolution, amid programs that have also “learned to code” and are already outperforming humans at certain coding tasks.
For those unfamiliar, code can appear as an intimidating, almost alien language. Yet, the proponents of AI proclaim that this barrier is crumbling. Why wrestle with complex syntax to simply display white text on a black background when you can ask an AI chatbot in plain English to do it for you? The chatbot obligingly provides the code, complete with instructions.
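To make that concrete, here is the kind of snippet a chatbot typically hands back for that exact request. It is a minimal sketch in plain CSS, assuming the styling is applied to the whole page; the selector and comments are my own illustration, not output quoted from any particular chatbot.

    /* Give the entire page a black background with white text */
    body {
      background-color: black; /* page background */
      color: white;            /* default text colour */
    }

A few lines of styling, and the intimidating alien language suddenly looks rather mundane. But knowing why “body” is the right element to target, or what else this rule will quietly override, still takes the understanding that comes from having learned the language.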
Having experimented with these AI chatbots, I can say they’re not infallible. Mistakes happen, often enough that a solid understanding of code remains invaluable for making corrections. However, the iterative process of pointing out errors to the AI and letting it refine its output hints at a future where it might plausibly handle a user’s needs on its own. This raises a critical question: is the role of the human developer destined for the history books?
It’s easy to get swept up in the doom and gloom surrounding AI’s potential for job displacement. The loudest voices in the tech world, often those profiting most from AI advancements, are eager to paint a picture of a robotic utopia in which human skills and knowledge become relics of the past. But this narrative fundamentally confuses knowing how to do something with understanding why you do it, and the deeper judgment that comes with that understanding.
AI chatbots haven’t unlocked some secret code; they’ve ingested and processed the vast ocean of publicly available resources and open-source materials created by humans, for humans to learn from. While a user might bypass the learning curve by relying on a chatbot’s pre-digested knowledge, they risk forfeiting the crucial understanding of the decisions the AI makes, the reasoning behind them, the quality of the output, and, most importantly, the broader landscape of possibilities.
One of the most rewarding – and often overlooked – aspects of web design and development is the element of creative, lateral thinking. There’s rarely a single, objectively “correct” approach to a problem. A human developer considers the myriad contexts a user might encounter on a website, how to guide their interaction, the desired emotional response, and even practicalities like device performance and potential issues. An AI, trained on aggregated web data to predict and conform to patterns, lacks this nuanced, holistic approach. A user solely reliant on AI assistance risks missing this broader perspective as well.
Personally, I’ve been fortunate to work on projects where my value isn’t just in possessing coding knowledge or generating creative concepts, but in the synergy between the two. Learning to code has been not only professionally rewarding but also surprisingly enjoyable. That spark of excitement when a seemingly outlandish idea actually works is unique. I genuinely believe in the browser as a dynamic, creative canvas. These are the kinds of projects I pursue, regardless of immediate financial gain.
While AI may impact certain facets of my earning potential, I refuse to believe that my craft is reducible to simply typing prompts into a chatbot. No programmer should.
There’s a concerted push, particularly from Silicon Valley, to convince us that the human mind is predictable, easily replicated, and ultimately unremarkable. They want us to believe that creative fields are simply sets of equations and keywords, justifying the billions they have invested in AI that can churn out imitations of creative work, whether passable Harry Potter/Balenciaga mashups or derivative art.
When asked about the potential applications of AI, OpenAI co-founder Greg Brockman offered a telling glimpse into their vision of the future of entertainment. He suggested using AI to rewrite the ending of “Game of Thrones,” allowing users to customize narratives and even insert themselves into the story.
But this capability – to imagine and reimagine – has always resided within the human mind. It reveals a profound lack of imagination on the part of AI proponents that they are asking us to imagine… having imagination. They seem unable to grasp the inherent satisfaction of creating art, the desire to craft our own narratives instead of outsourcing the process to a machine. They lack the confidence in their own ideas to even create “Game of Thrones” fan fiction without algorithmic assistance.
The most fervent enthusiasm for AI often stems from those who see it as a powerful cost-cutting tool, a way to liberate capital from the perceived constraints of human labor. The notion that human cultural output is finite, neatly packaged, and ready to be fed into AI models to “take it from here” is absurd.
Feeding every piece of art ever created into a machine to produce a homogenized, average result is not artistic expression. It might be a clever parlor trick, a fleeting novelty, but AI’s ability to generate convincing imitations rests entirely on the foundation of original creations born from human thought, skill, and passion.
The specter of AI will undoubtedly be used as a scare tactic by those who equate creative worth solely with monetization. But they are fundamentally wrong. A machine lacks the capacity for self-expression, the intrinsic human drive to communicate: this is who I was, this is what I felt, and this is what I stood for. We possess this drive, and in all our endeavors, we must resist attempts to diminish the immeasurable value of our humanity.
– Tristan Cross is a Welsh writer based in London