Do LLMs Dream of Electric Sheep?

If you think that LLMs are the new programming languages, you may be missing that instructing computers isn't the challenge in programming.

4 min read
🦉
Before Growth is a weekly column about startups and their builders prior to product–market fit.

I think most people who claim that LLMs are the new programming languages miss that telling computers what to do isn’t the hard part of programming.

I say this with relative certainty, knowing that nothing is ever truly certain when it comes to groundbreaking technology, and here’s why.

  • I’m an engineer! I don’t hesitate to incorporate new tools like ChatGPT or GitHub Copilot in my daily work. I use the former as my primary pair programmer every day and the latter as a convenient way to avoid boilerplate when needed.
  • I’ve spent considerable time pondering chatbots over the past decade, having founded a chatbot startup in 2015 during the initial wave of conversational UIs. Looking back, none of the products developed during that period managed to gain mainstream popularity. Even Amazon’s Alexa never established a scalable business model, generating an annual loss of about $10 billion. It wasn’t until ChatGPT that the first mass-market product of this kind emerged, quickly becoming the fastest-growing consumer app in history. That’s huge.
  • In 2016, I authored an article for a publication discussing startups that claimed chatbots would become the next programming language everyone could use, simply because it involved natural language. I must admit, I feel a bit proud for being ahead of the curve on this topic, especially since nobody’s that great at getting technology trends right until they seem obvious in retrospect—myself included.

I’ve changed my perspective since then, and several factors have contributed to this shift. One of these factors is the moderate success of no-code apps like Airtable. While these businesses are doing well helping people automate processes at work or in their personal lives, they haven’t really disrupted the engineering profession. I suspect that LLMs will follow a similar trajectory.

Another aspect to consider is that equating engineering to just “coding” is like saying writing is only about typing words. Sure, everyone can type, but not everyone is a writer. This is because writing demands a certain mindset and the belief that you have something interesting to share, which some people simply don’t possess. (Ego!) Similarly, being an engineer involves thinking through flows, processes, design patterns, and tech stacks, even if AI writes the code for you.

For example, try asking someone without an engineering background to generate a list of acceptance criteria and possible test cases that they could use to train a large language model to write code for them. Chances are, they won’t get very far. Most aren’t naturally inclined to think in such a way. They will consider it boring and tedious; they’ll prefer interacting with people, real people, over thinking through edge cases and commands with a chatbot. Not everyone’s a nerd—that’s good!
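To make the “thinking through edge cases” point concrete, here’s the kind of reasoning that writing acceptance criteria demands, sketched as test cases in Python. The `split_bill` function and its spec are invented for illustration; the interesting part isn’t the code, it’s the enumeration of what could go wrong:

```python
def split_bill(total_cents: int, people: int) -> list[int]:
    """Split a bill evenly; hand out any remainder one cent at a time."""
    if people <= 0:
        raise ValueError("need at least one person")
    if total_cents < 0:
        raise ValueError("total can't be negative")
    base, remainder = divmod(total_cents, people)
    # The first `remainder` people each absorb one extra cent.
    return [base + (1 if i < remainder else 0) for i in range(people)]

# The "boring and tedious" part: acceptance criteria as assertions.
assert split_bill(1000, 4) == [250, 250, 250, 250]   # happy path
assert split_bill(1001, 4) == [251, 250, 250, 250]   # remainder: no cent lost
assert sum(split_bill(99, 7)) == 99                  # totals must reconcile
assert split_bill(0, 3) == [0, 0, 0]                 # zero is a valid total
try:
    split_bill(1000, 0)                              # zero people must fail loudly
except ValueError:
    pass
```

Nothing here is hard to type. What’s hard is remembering to ask about the leftover cent, the empty bill, and the division by zero before a user finds them for you.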

The only exception I can imagine would be if AI generated the entire app from scratch for you, but I don’t think we’re there just yet. When I pair with ChatGPT, it behaves like an intern with photographic memory, which, don’t get me wrong, is extremely impressive given the state of affairs just a few years ago. It’s still an intern, though.

I personally believe that the future will see a continued evolution of product-oriented engineering instead—and a further merging of the three primary product roles: software engineers, designers, and project managers. Over the past two decades, these roles have increasingly overlapped. It’s common for engineers to manage projects independently and be better at handling user experience, for designers to create code-friendly components and design systems that Figma translates into CSS or React, and for project managers to automate workflows or transition into product managers, conducting user interviews like UX researchers, and so on.

I think that the boundaries between these roles will continue to blur, as generative AI will assist people in supplementing their skill gaps. While specialization will still exist, as we abstract more and more, we’ll eventually approach a unified profession: a software builder of sorts, capable of delivering projects end-to-end with a passable level of quality in each area.

On a somewhat related topic: enthusiasts predict that chat will soon become the new operating system, a revolution similar to iOS and mobile. My experience has shown that chatbots work much better in use cases that don’t require browsing and comparing product lists, but instead consist of single actions, where the purchased product is either very specific or highly interchangeable and the user doesn’t think deeply about its features. A good example is purchasing a specific book through Alexa, where there is no alternative to weigh, or buying groceries with the same assistant.
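The single-action pattern can be sketched in a few lines: each utterance maps to exactly one intent with a slot or two, and nothing needs to be browsed or compared. The intents and regex patterns below are invented for illustration (real assistants use trained NLU models, not regexes), but they show where command UIs fit and where they don’t:

```python
import re

# Hypothetical single-action commands: one utterance, one intent.
COMMANDS = [
    (re.compile(r"set an alarm for (?P<time>[\w: ]+)", re.I), "set_alarm"),
    (re.compile(r"buy (?P<item>.+)", re.I), "buy_item"),
    (re.compile(r"what's the weather", re.I), "get_weather"),
]

def route(utterance: str):
    """Match an utterance to an intent and its slots, or give up."""
    for pattern, intent in COMMANDS:
        match = pattern.search(utterance)
        if match:
            return intent, match.groupdict()
    return None, {}  # no browsing fallback: the command either fits or it doesn't

# A single, specific action routes cleanly:
assert route("Set an alarm for 7 am") == ("set_alarm", {"time": "7 am"})
# A comparison task has no natural mapping -- that's the limit of the model:
assert route("compare these five laptops") == (None, {})
```

The design choice is the point: a flat list of commands handles “do this one thing” beautifully and has nowhere to put “show me my options,” which is exactly the discoverability gap designers complain about.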

Frankly, I don’t think this will change; I see bots as a layer that will reduce the need for complex workflows buried in program menus. True enough, state-of-the-art LLM-based chatbots handle multi-step data transformations other apps can’t easily match. But anyone with an Echo speaker at home has probably noticed that sometimes, after setting an alarm, Alexa casually mentions that if you want, you can also check tomorrow’s weather using a specific voice command. This annoys me, but… it’s the only way to help users discover new features if they don’t show enough imagination on their own.

Ask any designer worth their pay and they’ll tell you that good universal UIs not only present information—they also allow it to be easily discovered. Command-based UIs are bad at that… and honestly, lots of other stuff. I doubt they’re going to replace screen-based interfaces. This doesn’t mean they’re worthless, though. For instance, Zapier—an automation suite—was valued at $5 billion in 2021, equivalent to approximately $3.5 billion in 2023. If ChatGPT “simply” becomes a 10x better Zapier, that’ll be a product line potentially worth $35B. But is it going to decimate entire industries? We’ll see.

Both Uber and Airbnb are twice that size, but taxis and hotels still exist.

They felt pressure, though.

🙌
Thanks for being a part of Before Growth! The newsletter has grown mainly through word of mouth. Would you mind sharing it with a couple of friends who might like it too?

Kamil Nicieja

I guess you can call me an optimist. I build products for fun—I’m a startup founder, author, and software engineer who’s worked with tech companies around the world.

