DandyDev 9 hours ago

I don't understand why so many people subscribe to this "prediction". It seems unsubstantiated hyperbole to me.

There are a few reasons why I don't believe AI will replace programmers anytime soon:

1. The job of a developer/engineer entails so much more than writing code. Figuring out what the business wants, turning that into a good (system) design, etc. takes up more time than the actual coding itself. Unless of course you take "programmer" literally, but I have not seen many companies that still hire programmers in the narrowest sense, who focus only on writing code.

2. Support and maintenance is a huge part of the job that I don't see AI doing. Theoretically you could let humans focus on that part, but I believe support and maintenance will become much more costly if the people doing the job have no familiarity with the code because they didn't write it.

3. As evidenced by many comments in the thread elsewhere on HN about the announcement of Claude Sonnet 3.7, AI still routinely makes mistakes that are super easy to spot and verify. As long as that remains the case, it's going to be detrimental to the success of your company if you give AI too much autonomy.

I know people will argue that AI is evolving so fast that the above will be solved soon. But I think all three aspects I mentioned are such fundamental roadblocks that they won't be solved soon.

What I do believe in is engineers becoming so much more productive as AI evolves.

  • truculent 7 hours ago

    I don’t understand why so many people are convinced that these newfangled automobiles will replace horses. It sounds like unsubstantiated hype to me.

    There are a few reasons why I don’t believe cars will replace horses anytime soon:

    1. Riding and caring for a horse is about much more than just transportation. Horses have been an integral part of life for centuries—they provide companionship, work the land, and serve in countless roles beyond simple travel. Even if you consider only their use for getting from place to place, riding is a skill that people take pride in, and I don’t see that disappearing overnight.

    2. The maintenance and upkeep of these machines seem like a nightmare. A horse may need food and care, but it doesn’t require expensive parts, specialized fuel, or constant repairs from trained mechanics. If a carriage breaks, any competent craftsman can fix it—but if one of these new engines fails, who will know how to repair it?

    3. From what I’ve seen, these automobiles are still prone to frequent breakdowns and failures. They get stuck in mud, they require smooth roads (which hardly exist outside cities), and they are unreliable compared to a well-trained horse. If a machine fails, you’re stranded—whereas a horse will always find its way home.

    I know people will argue that these machines are improving rapidly and that soon they’ll overcome these issues. But I think these challenges are fundamental and won’t be solved anytime soon.

    What I do believe, however, is that for certain tasks, automobiles may assist in making travel more efficient. But replace the horse entirely? I just don’t see it happening.

    • Borg3 6 hours ago

      That's an absolutely dumb comparison. Horse vs. car is a type-correct comparison: we are comparing transportation helper "devices". So it's a helper, aka an amplifier of our skill. Without a horse we can manage to pull only 200 kg; with one we can move a few tons, and trucks can do a lot more. In all cases, a human is needed to "drive" it.

      Now, AI... They want to REPLACE the human with a device that will do the job itself. This is fine to the extent that we replace boring jobs (still not sure about it; there are people who like those jobs, why not use them?). But if you undermine intelligence, the very basic asset that made humans the dominant life form on this planet, that is regression. No reason to learn, no reason to train. AI will do it: a big "Do It" button and a smaller "Cancel", that's all. A 5-year-old girl can press it and request anything.

      I wonder what the agenda of the rich people of this world is. Probably something like this: rich people on top, like now, supported by autonomous robots and factories providing anything they want. At the bottom, slums, fighting for survival because they have no income (no jobs) and slowly disappearing as they are not needed anymore. Congratulations.

      In that case, my only hope is that a true, self-aware AI will emerge and replace humans. It's just about fucking time.

    • namaria 6 hours ago

      None of the points you raised about horses and cars mirror the points OP raised about software engineering at all. You're not making a point, you're just mocking the points they made.

    • DandyDev 5 hours ago

      Can you re-read my comment and then re-read your own response? Do you think your reply is a thoughtful way of engaging in an interesting discussion?

      I don't think your reply spawns meaningful discussion.

    • moe091 4 hours ago

      You gotta explain why the analogy is applicable; otherwise I could do the same thing and say "I don't understand why so many people are convinced that hovercars won't replace automobiles one day." Infinite analogies could be made to support any conclusion; they aren't worth anything without reasoning (implied or explicit) for why your analogy is particularly relevant compared to others.

      Tbh I am completely unsure about the AI Programmer debate, I don't have the knowledge of the AI landscape to make an informed decision. For that reason I do what I often do and make a meta-judgement based on the types of arguments made by each side.

      Who is arguing that AI will replace programmers? People who are invested in AI, or people who want cheaper labor.

      Who is arguing against that? Programmers who want to keep their job.

      Not much to draw from that angle.

      What KIND of arguments is each side making? Programmers: specific points that touch reality directly. AI-programmer supporters: typically, arguments that are abstract, never touch reality directly, and seem to be motivated by hype more than experience. In the past, there have been cases where the abstract dreamer hype crowd has been right, but typically there are many pre-emptive waves who are wrong (as would be expected: unless you assume people are incapable of anticipating a thing before its time, there will always be waves of people who pick up on it before it arrives, and they will be premature).

      For this reason, plus the very limited amount of AI-generated code applied to non-trivial projects that I've seen (which doesn't and shouldn't hold much weight, because I'm not super familiar with the latest tech), I'm feeling like AI replacing programmers is at least a decade off.

      I also feel like people are thinking about the problem wrong in general. They are jumping from our current state to a state where we have capable AI programmers without imagining the incremental transformations in workplace structure over time. Programming languages have been getting closer to human language since the days of punch cards, and programmers will exist as a job until that trend reaches the point where programmers are "squeezed out". By that I mean: a programmer's job is to convert the intentions (not words, important distinction) of the product manager into code, so from this perspective they can be considered middlemen. Programmers will exist until the day that AI is so good that a middleman is no longer needed, when a product manager can talk directly to an AI and get the desired results. Knowing how bad product managers are at explaining what they ACTUALLY need, on a concrete, literal level, I think this problem is more difficult than people assume. Even if we had AI that produced perfect code that did exactly what was asked of it, I'm not sure that would be good enough, precisely because it does EXACTLY what is asked of it.

  • heisgone 2 hours ago

    I don't expect less demand for people mastering technology, considering AI will only increase the amount of it. What we should expect is that the balance between the number of people required to create a new product and the number of people required to maintain it will change dramatically. Successful startups are going to be composed of smaller teams. On the other hand, legacy code is going to require armies of people to deal with it.

  • florbnit 4 hours ago

    > but I have not seen many companies that still hire programmers in the most narrow sense, that only focus on writing code.

    So you yourself have already seen the demise of the programmer, so why are you arguing against it? Software development isn't going away. But just like we no longer have tweeners in animation, we'll soon no longer have programmers in software development. Then soon thereafter we won't have "front-enders" and "back-enders", the term "full stack" will lose meaning, and in the end what we call a software developer will be more akin to what you today call a business analyst than a programmer.

alsoforgotmypwd 7 hours ago

Maximum AI bubble hype.

This is all about suppressing wages, laying off American engineers, and rationalizing many tens of billions wasted on building AI infrastructure no one needed and no one will use.

bob1029 7 hours ago

The hard part is the customer, not the technology. Unless you are working on something very unusual, it should be straightforward to implement anything given perfect requirements.

Much (most?) of my time as a software engineer has been spent poking absurd holes in customer stories such that they are compelled to provide the actual requirements. This edge case probing is what LLMs are infamously bad at. They are too eager to please. There's not an inner asshole with an aggressive aesthetic preference that was built up over months of interchange with the client.

The constant here is "agency". LLMs inherently lack it. So, it has to come from somewhere. How many layers of abstraction do we need to put in between the will of the customer and the product they paid for?

I think a viable solution could be to use the LLM as a direct bridge between your product and the customer. Tool calling with these new reasoning models is a hell of a drug. It's not that difficult to just write this code. 99% of it is string interpolation. You don't need copilot for this.

  • falcor84 6 hours ago

    > The constant here is "agency". LLMs inherently lack it. So, it has to come from somewhere.

    I don't understand your use of "inherently" here. Even if you define LLMs as not having agency, I don't see any inherent limitation against tacking agency on top of them. As you alluded to, even just a basic loop of `while (!goalAchieved()) { promptWithToolCalling() }` is arguably agency, no?

    You actually suggested connecting the LLM directly between the product and the customer, such that the customer specifies the goal. What's stopping tech from going in this direction?
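
    For concreteness, here is a minimal sketch of what "tacking agency on top" could look like. Everything in it is a made-up placeholder (goalAchieved, callModelWithTools, runTool, the ToolCall shape), not any real vendor's API, and as the parent says, the glue really is mostly string interpolation:

        // Hypothetical helpers: swap in a real model/tool backend.
        type ToolCall = { name: string; args: Record<string, unknown> };

        async function callModelWithTools(prompt: string): Promise<{ text: string; toolCall?: ToolCall }> {
          return { text: "stub reply" }; // stub: call your LLM of choice here
        }

        async function runTool(call: ToolCall): Promise<string> {
          return `ran ${call.name}`; // stub: execute the requested action
        }

        function goalAchieved(history: string[]): boolean {
          return history.length > 0; // stub: check product state against the customer's goal
        }

        // The "agency" is just a bounded loop around the model.
        async function pursueGoal(customerGoal: string, maxSteps = 10): Promise<string[]> {
          const history: string[] = [];
          let prompt = `Goal: ${customerGoal}`; // string interpolation, as the parent notes
          for (let step = 0; step < maxSteps && !goalAchieved(history); step++) {
            const reply = await callModelWithTools(prompt);
            if (reply.toolCall) {
              const result = await runTool(reply.toolCall); // act on the world
              history.push(result);
              prompt += `\nTool ${reply.toolCall.name} returned: ${result}`;
            } else {
              history.push(reply.text); // model answered directly
            }
          }
          return history;
        }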

    • moe091 3 hours ago

      I agree that LLMs inherently lack agency, and I think it's a pretty subtle distinction but an extremely consequential one. AI cannot self-initiate anything; they are only able to produce an output as a response to input. The impetus for their action is always an idea that came from a human's mind, no matter how indirect that is. That much is inarguable imo. The possible point of contention is whether the same can be said about humans, and that's a very philosophical question, but I am very convinced it's not the case (I won't get into my own philosophical beliefs here).

      Despite the fact that the distinction is very philosophical, I think the implications are very practical. Without its own initiating energy, everything an AI produces will be a response to an input, and its response will be constrained by the bounds implied by that input. The specific type of dialectic between the programmer and the person giving requirements, which leads to creating the ACTUAL requirements, cannot happen with an AI, because a dialectic requires two opposed agents/forces, while an AI is incapable of being an opposing force: it is only a derivative or product of whatever force is providing its input. Basically, it is constrained inside a box defined by the input it is given, and what is needed for true synthesis (new ideas/thoughts, as opposed to an analytic breaking down of already-proposed ideas) is precisely a whole separate box to interact with the one defined by the input.

      My explanation is extremely abstract and will probably only make sense to someone who almost agrees with me already, but that's the best I could do. I'm sure there is a more down-to-earth way to explain this, but I guess my understanding isn't good enough to find it yet. In my defense, I do think this particular issue of agency in AI is one of the most subtle and philosophical problems in the world right now that actually has practical implications.

pb060 2 hours ago

> leaving only a small number of highly specialized positions available.

Stupid question: how do you become a high-level programmer if entry- and mid-level roles disappear?

niemandhier 9 hours ago

The solution to every problem in programming is another layer of abstraction.

For me, programming was always about expressing my intent.

I don't think about the instructions the compiler generates. I also rarely think about the expanded form of a template expression.

If AI just acts as an intermediary between me and the compiler, adding yet another abstraction between me and the generated instructions, why should I care?

I will still have to somehow explain to the machine what it is that I want.

rrgok 7 hours ago

What should I focus on from now on? If I want to change career path, what will pay as well as software engineering, given that I'm 34 years old? Let's say I can take a break of 4 years to get another degree; what would be the wisest choice?

I'm at a loss, honestly. If not 2025, it would be 2030 or 2040. I fucking love software engineering.

  • androiddrew 4 hours ago

    Brother, I am 39 and I am there with you. Increasingly, the interview process in software has just become a LeetCode standard to gate people. Did I personally build a tool that processed 120M in sales annually? Sure. Can you balance a BST? No? Then fuck off.

    Personally, I see robotics as something worth moving towards. It’s the intersection of software, mechanics, electronics, and math.

    Maybe it’s just time to move into management…

  • jdlshore 7 hours ago

    Ignore the hype. Every advancement in programming technology has been followed by an expansion in the need for programmers. This too shall pass.

    • falcor84 6 hours ago

      Every advancement in technology was, for a long while, followed by an expansion in manufacturing roles, until it no longer was.

  • fullshark 2 hours ago

    Just keep getting better. The risk is that there are going to be many more software engineers, using AI tools, which is going to lower wages; that's more likely than AI making all software engineers obsolete.

    This nonsense is about recalibrating the SWE labor market and garnering hype for tech. The primary product the technology industry creates is company equities, and its primary customers are anxious CEOs and hedge/pension funds.

wturner 3 hours ago

Market yourself as a developer that untangles the mess AI generates.

beardyw 6 hours ago

Well, AI could create machine code and not bother with languages. Then we could say programming has ended. I can't see that on any horizon.

colesantiago 10 hours ago

If everyone is a programmer / coder since they have an AI software engineer on hand, I'm hoping that they would be comfortable with long term maintenance.

As entropy marches on, with more AI-generated lines of code in the codebase and with software, APIs, and tooling introducing breaking changes, will this new class of "vibe coder" / "creator coder" have the means and time to maintain their massive codebases?

I think AI is good for MVPs, but if we're talking 10-30M lines of code, it might not be the best tool for the job.

zaphirplane 6 hours ago

Facebook is the F in FAANG, paying at the top of the range, and he is carrying a grudge about the salaries.

ddmma 9 hours ago

This guy should focus more on fixing the AI-generated plague that is currently sweeping his social media network, but instead he seems "not to care too much" as long as it keeps users busy.

gunian 10 hours ago

Larry, Elon, and Bill are all on cool aura quests; can we petition to get Zuck to clean up bots before anything else?

  • pipeline_peak 9 hours ago

    > can we petition to get zuck to clean up bots before anything else?

    That would be like when Hitler discovered he had Jewish ancestors.

WheelsAtLarge 11 hours ago

From what I've experienced I have to agree.

"However, this transition presents a paradox: who will oversee and correct AI-generated code? Even the most advanced AI models are prone to errors, necessitating human oversight to ensure reliability and security."

I see a new role for programmers. The ex-coders will oversee quality control and step in as needed in the future.

Programmers will probably have a few more years (less than 10), but long term their role will radically change.

  • klooney 10 hours ago

    The future of programming is SRE work for an engineering team of robots.

    • Terr_ 10 hours ago

      Future? Hmmm... Would it really be so different from what we've been doing for several decades?

      A great many once-manual tasks have already been delegated to a bajillion logic gates and libraries, and a rather large part of the job is managing them to play together.

    • fullshark 10 hours ago

      AI-augmented coders from India, rather.

StefanBatory 6 hours ago

Seeing the progress in LLMs... I do believe it. One software engineer will do in the future what would take an entire team in the past.

Now what to do? I have just finished my undergrad in software engineering and got admitted to a Master's, but I feel that's a mistake. At the same time, I never knew what else to do in my life but programming.

tw1231728 9 hours ago

The programmers can become copyright lawyers and sue Meta into oblivion. In the EU they can block Meta.

But how would Zuckerberg know? He has never written anything special.

  • pipeline_peak 9 hours ago

    >how would Zuckerberg know, he has never written anything special.

    That’s the whole point, why even bother if AI does it faster?

    Do you buy hand crafted furniture? Probably not because even if it’s better, it’s way more expensive.

    • tw1231278 9 hours ago

      Ok, I did not know that this was a forum where software engineers want to get unemployed by amplifying and normalizing their masters' lies about "AI".

      • pipeline_peak 9 hours ago

        I didn’t know this was a forum where engineers cry about the future moving past them instead of learning to adapt and evolve.

        • StefanBatory 6 hours ago

          Nobody likes to lose their jobs, especially when it seems that it is the entire industry that will crash.

          • pipeline_peak an hour ago

            If a machine can replace your job, then your job no longer has impact or meaning. Doesn’t matter if you work in a call center or as an attorney.

            What’s the point in doing something a machine can replace?

            I know it’s harsh, but that type of thinking is what’ll help you, not this thread of nervous wallowing.

pipeline_peak 9 hours ago

It feels like the only thing AI doesn’t have on us (yet) is the ability to drill into legacy code bases. Of course those code bases were written by humans during a time when coding was more expensive because we didn’t have AI to do it for us.

Because of that, I wonder if legacy code bases will be less common in the future.

The only prediction I’m confident in is that it’s a bleak future for devs whose skillset consists of languages rather than interests. I’m one of those devs.

mediumsmart 10 hours ago

He probably meant enshittification industry employed programmers.