The buzz around Artificial Intelligence (AI) is impossible to ignore, especially for those charting a new course into the tech world. Lately, I’ve been hearing from many aspiring developers, particularly those in the early stages of their journey, who are feeling increasingly uneasy. The impressive capabilities demonstrated by AI tools like GPT-4 have sparked a wave of anxiety. They worry: by the time they master the fundamentals of HTML, CSS, and JavaScript, will there even be jobs left for them?
This apprehension is amplified across social media, with posts suggesting that a front-end skillset is becoming obsolete.
But I strongly disagree with this sentiment. The role of the front-end developer is far from disappearing. In fact, I’m pushing back against the wave of Fear, Uncertainty, and Doubt (FUD) circulating online.
In this article, I want to share my perspective on what the future likely holds for programming careers in the age of AI. Change is inevitable, but it’s not the job-eliminating apocalypse some are predicting.
Here We Go Again: Tech Evolution and Developer Resilience
Let’s take a historical perspective. Cascading Style Sheets (CSS), a cornerstone of web design, emerged in 1996 with Internet Explorer 3. Within just two years, the first “no-code” website builder, Homestead, was launched.
Homestead empowered individuals to create personalized web pages without writing a single line of code.
From almost the inception of web development, there have been predictions of developers becoming redundant due to emerging technologies. In the 2000s, WordPress was seen as a potential job killer. The 2010s brought Webflow, and the early 2020s witnessed the rise of “no-code” tools.
And to some extent, these advancements did change the landscape. Today, a local bakery, a dentist’s office, or an independent artist who needs a website is unlikely to hire a developer at significant cost to build something from scratch. They’re more likely to use a platform like Squarespace, pick a pre-designed template, and manage their online presence for a modest monthly fee.
Yet, despite these shifts, web developers are not only still around, but in high demand.
Recently, OpenAI showcased the capabilities of GPT-4. One demonstration was particularly striking: GPT-4 could transform a hand-drawn sketch of a website into a fully functional webpage, even incorporating JavaScript to activate elements like a “Reveal Punchline” button.
This is undoubtedly impressive and holds considerable potential for rapid prototyping. However, it’s crucial to recognize the context. We haven’t needed dedicated web developers to create these kinds of basic pages for decades. There’s a vast difference between this simple HTML output and the complex code that modern front-end developers craft daily.
Looking Ahead: AI’s Role in Shaping the Future of Programming
Many AI demos focus on narrow applications: generating simple HTML pages or isolated JavaScript functions – tasks that a developer could handle in a short time.
But AI technology is rapidly evolving. If this pace continues, will AI be capable of building entire applications within a few years, potentially making a career switch to programming far less appealing?
While I’m not an expert in Large Language Models (LLMs), the underlying technology powering GPT-4, I have a general understanding of their mechanics.
At their core, LLMs are sophisticated text prediction engines. Given a prompt, they leverage machine learning to predict the most probable sequence of characters that should follow.
Companies like OpenAI invest substantial resources in refining these models. Human reviewers evaluate and “grade” the model’s outputs, enabling continuous learning and improvement.
If you’ve interacted with tools like ChatGPT or Bing’s AI-powered search, you’ve probably noticed that responses are often mostly accurate, perhaps 80% right, yet delivered with unwavering certainty.
LLMs lack the capacity to validate their assumptions or rigorously test hypotheses. They cannot independently verify the truthfulness of their statements. They operate on probabilities, estimating the character strings that are most likely to logically follow the given prompt.
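If it helps to see that idea in code, here’s a deliberately oversimplified sketch in TypeScript. The `predictNextToken` function is a made-up stand-in for the actual neural network; what matters is the shape of the loop:

```ts
// A toy sketch of how an LLM produces text. `predictNextToken` is a
// made-up stand-in for the real model: given everything written so far,
// it returns whichever token the model considers most probable next.
function predictNextToken(textSoFar: string): string {
  // In a real LLM, this is a giant neural network trained on enormous
  // amounts of text. Here it's just a placeholder.
  return textSoFar.endsWith("?") ? " Probably." : " maybe";
}

function generate(prompt: string, maxNewTokens: number): string {
  let text = prompt;
  for (let i = 0; i < maxNewTokens; i++) {
    // Note what's missing: there is no step where the model checks
    // whether what it just wrote is actually true. It only ever asks
    // "what is most likely to come next?"
    text += predictNextToken(text);
  }
  return text;
}

console.log(generate("Will AI replace developers?", 3));
```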
Consequently, responses can sometimes contain nonsensical or inaccurate elements, which the OpenAI team terms “hallucinations.”
While accuracy will undoubtedly improve as the technology matures, it’s unlikely to reach perfection. This inherent imperfection poses a challenge when considering AI’s potential to replace developers. Someone without programming expertise might struggle to discern accurate code from flawed or hallucinated code. The inability to reliably identify these errors is a significant hurdle.
One might argue that GPT-4’s demo showcased self-correction capabilities. By simply copying and pasting error messages, the AI could identify and rectify issues.
However, not all hallucinations manifest as explicit errors. For instance, I recently used GPT-4 to generate a React component. While the output was surprisingly functional, it contained accessibility flaws. A non-developer user might overlook these subtle but critical issues, which would negatively impact the end-user experience, particularly for users with disabilities.
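To make that concrete, here’s a hypothetical snippet in the spirit of what an LLM might hand back (the component name and styling are invented for illustration). It works perfectly with a mouse, so a non-developer would never notice anything wrong:

```tsx
import * as React from "react";

// Hypothetical AI-generated markup: a "button" built out of a <div>.
// It looks right and responds to clicks, but screen readers don't
// announce it as interactive, and keyboard users can't focus or
// activate it at all.
function RevealPunchline({ onReveal }: { onReveal: () => void }) {
  return (
    <div className="punchline-button" onClick={onReveal}>
      Reveal Punchline
    </div>
  );
}

// What a front-end developer would write instead: a real <button>,
// which provides focus, keyboard support, and correct semantics for free.
function RevealPunchlineAccessible({ onReveal }: { onReveal: () => void }) {
  return (
    <button type="button" className="punchline-button" onClick={onReveal}>
      Reveal Punchline
    </button>
  );
}
```

The page looks and behaves identically for a mouse user either way, which is exactly why this kind of flaw slips past someone who can’t read the code.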
Accessibility is just one concern. What about security vulnerabilities potentially embedded in AI-generated code? And who bears the responsibility when these flaws lead to significant problems?
Furthermore, there is a monumental difference in scale between generating a 50-line HTML document and developing a production-ready web application. Even a relatively small JavaScript application like this blog can comprise around 65,000 lines of code spread across over 900 files. This figure excludes the written content and solely accounts for TypeScript/JavaScript code.
Even with a hypothetical 95% accuracy rate in AI code generation, that would still leave thousands of potentially flawed lines scattered across a codebase of that size, and debugging it would be an incredibly complex undertaking. It would be akin to a developer spending months building a large project without running a single test until completion – a developer’s worst nightmare scenario.
AI is not magic; its capabilities are limited by its training data. Code snippets are abundant online and often generic. In contrast, every code base is unique and tailored to specific project needs. Large, open-source codebases are relatively scarce. How can AI effectively learn to build substantial, real-world projects based on limited examples of complete systems?
We are rapidly approaching a point where non-programmers can use chatbots to create small, self-contained projects – similar to what tools like Webflow are currently used for. This is a positive advancement.
However, I believe we are still a considerable distance from major tech companies replacing their development teams with prompt engineers. Several potentially insurmountable challenges stand in the way of that wholesale replacement. Therefore, switching careers to programming remains a viable and promising path.
Augmenting, Not Replacing: The Future of Developers and AI
Despite the tone of this discussion, I am genuinely optimistic about the potential of AI. 😅
I believe the most probable future involves integrating tools like GPT-4 into developer workflows to amplify the capabilities of skilled programmers.
Think about other professions: Carpenters weren’t replaced by power tools, accountants weren’t replaced by spreadsheets, and photographers weren’t replaced by digital cameras or smartphones. In fact, the number of professional photographers has been increasing year over year. The U.S. Bureau of Labor Statistics projects 9% job growth for photographers over the next decade, compared with the 5% average across all occupations. Similarly, I don’t foresee LLMs replacing developers.
I initially considered whether the total number of developer jobs might decrease. If AI makes each developer significantly more productive, wouldn’t we need fewer of them?
Not necessarily. Currently, there’s a significant imbalance, with demand for software developers far exceeding supply. In every company I’ve worked for, we’ve had a backlog of projects and features, limited primarily by developer availability.
Imagine if developer productivity doubled overnight due to AI tools. More bugs would be fixed, more features would be released, and businesses would become more profitable. There’s no shortage of software projects to be built, so it’s unlikely we’d run out of work for developers.
In fact, AI augmentation could actually increase the total number of developer jobs.
Currently, numerous companies don’t employ in-house software developers at all. I previously worked at Konrad Group, an agency that develops web applications for various companies, many of them household names. Given the high cost of software development, it’s often more economical for these companies to outsource their development needs to agencies rather than build and maintain their own in-house teams.
These Fortune 500 companies make these decisions based on the current cost of software development. Consider a hypothetical scenario: a company needs four developers at $150,000 each, totaling $600,000 annually. Outsourcing to an agency for $500,000 might be more financially sound. However, if LLMs significantly enhance developer productivity, the same work might be achievable with just two developers at $150,000 each, totaling $300,000. Suddenly, hiring in-house developers becomes a much more attractive proposition.
Economists have a term for this phenomenon: the Jevons Paradox. When technology makes a resource more efficient to use, total demand for that resource often goes up rather than down. This concept dates back to 1865! (Thanks to Tim Grant for pointing this out.)
To be clear, I’m not an economist, and this is speculative. I’m not definitively stating this will be the outcome. My point is that it’s not predetermined that AI will negatively impact software developers. No one can predict the future with certainty, and I’m weary of the prevailing narrative that assumes the worst-case scenario is a foregone conclusion.
We’re Not Alone: This Conversation is Happening Across Industries
Aaron Blaise, a seasoned animator and illustrator with nearly 20 years at Disney, contributed to iconic films like Beauty and the Beast (1991), Aladdin (1992), and Pocahontas (1995).
Recently, he shared a YouTube video, “Disney animator reacts to AI animation.” If you’ve read this far, his perspective will sound familiar: he views AI tools not as a threat, but as a way to boost animator productivity and potentially create more opportunities in animation.
Professionals and knowledge workers across numerous industries are engaged in similar discussions. Many fear that AI like GPT-4, DALL-E 2, and Midjourney will soon render their jobs obsolete.
GPT-4 has even demonstrated the ability to pass a simulated bar exam, achieving a score in the top 10% of test-takers. Consequently, lawyers are also having the same conversations.
My personal belief is that most working professionals will integrate AI technology into their workflows, boosting their productivity and value. Certain tasks may be delegated to AI, but full jobs are less likely to be entirely replaced. Therefore, switching to a programming career, which emphasizes problem-solving and adaptability, remains a solid choice.
However, what if I’m wrong, and LLMs do completely replace software developers? In that scenario, I suspect LLMs would replace a vast majority of knowledge workers across various fields.
This isn’t a localized threat that can be avoided by switching professions. There is no safe haven. Therefore, instead of trying to predict and gamble on the uncertain future, why not focus on pursuing what you are passionate about, what genuinely interests you, and what you are naturally good at? If programming sparks your interest, the rise of AI shouldn’t deter you.
Using LLMs to Help You Learn: AI as a Learning Tool
I’ve heard from several people who have found ChatGPT incredibly helpful for learning technical skills. If something in a tutorial confuses you, you can ask the AI to explain it.
This is a compelling application of AI. Essentially, ChatGPT can function as a virtual pair programmer, assisting you in understanding complex topics. You can pose specific questions and receive targeted explanations.
However, it’s crucial to use these tools judiciously. There’s an effective and an ineffective approach to leveraging AI for learning.
The ineffective approach is treating it like a GPS navigation system. When driving, I input an address into my GPS and blindly follow its directions. While I usually reach my destination, this process requires minimal mental engagement. Consequently, my sense of direction has significantly deteriorated. I now struggle to navigate without a synthesized voice guiding me. 😬
Instead of treating AI like a GPS, I recommend adopting the mindset of a juror evaluating a defendant – in this case, the LLM taking the stand.
Listen attentively to what it presents, but don’t automatically accept it as absolute truth. Maintain skepticism and critically analyze every piece of information.
Instead of blindly copy-pasting AI-generated code, examine it line by line, ensuring complete comprehension. Seek clarification from the AI on unclear aspects. And cross-reference anything that seems questionable with authoritative sources, such as official documentation. Remember that LLMs exhibit 100% confidence but not 100% accuracy.
By adopting this critical and engaged approach, I believe LLMs can be invaluable learning aids. 😄
A Message to Aspiring Developers: Embrace the Future
My primary motivation for writing this article is to address those learning web development who are feeling anxious and discouraged, worried that investing time and energy in these skills is futile given the perceived threat of AI-driven obsolescence. If you’re contemplating whether it still makes sense to switch careers into programming, I hope this provides some reassurance.
I cannot guarantee that everything will remain unchanged. AI will likely influence how we work. Having started with HTML/CSS/JS in 2007, I’ve witnessed significant evolution in the field. Developers have always needed to be adaptable, evolving in tandem with technology.
However, nothing I’ve observed so far suggests that our jobs are genuinely at risk of disappearing. I’ve tried to envision a scenario where non-developers can build entire web applications without any understanding of web technologies, and I consistently encounter numerous practical obstacles, even assuming future AI iterations significantly reduce hallucinations.
I could be wrong. I don’t possess a crystal ball 🔮. For all I know, the sun might explode tomorrow. But I sincerely doubt that we are on the verge of web developers becoming obsolete. And I worry that many potential developers are prematurely abandoning their aspirations without sufficient cause.
I don’t want you to look back in five years, when software developers are potentially even more in-demand, and regret giving up on pursuing your dreams. ❤️