Despite the title, this article chronicles how GPT is threatening nearly all junior jobs, using legal work as an example. Written by Sourcegraph, which makes a FOSS version of GitHub Copilot.

  • Voroxpete@sh.itjust.works · 143↑ · 13 days ago

    Keep in mind: this article is by people building an LLM-based product.

    They have a deeply vested interest in the narrative that LLM driven products are an inevitable landslide that every company needs to either integrate, or risk being wiped out.

    Keep that bias in mind. They want you to think the great flood is coming, because they’re the ones building boats.

    • CosmoNova@lemmy.world · 21↑ · 13 days ago

      Good metaphor! Some companies have even scrapped their boats already to build wooden horses as ‘gifts’, invading our systems like the Greeks invaded Troy.

  • ThePowerOfGeek@lemmy.world · 59↑ 1↓ · 13 days ago

    Interesting article. But as a veteran developer the whole AI trend reminds me of the outsourcing trend back in the mid 2000s.

    Back then Western developers (especially junior and mid levels) were seen by many companies as a waste of money. “We can pay for three developing world developers for the price we pay for one American/European one! Why are we wasting our money?!”

    And so the huge wave of layoffs (fuelled also by the dot com bubble bursting and some other things) kicked off. And many companies contracted out work to India, etc. It was not looking good for us Western developers.

    But then the other shoe dropped. The code being sent back was largely absolute shite. Spaghetti code, disparate platforms bound together with proverbial duct tape, no architectural best practices, design anti-patterns, etc etc. And a lot of these systems started falling apart and required Western developers and support engineers to fix them up or outright replace them.

    Now, this isn’t a slight on Indian or other developing-world developers. I’ve met lots of phenomenal programmers from that part of the world. And developers there have improved a lot, and there are now plenty of solid options for outsourcing there. But there are still language and culture barriers that are a hurdle, even today.

    But I digress. My underlying point is that there are similarities between today’s situation and what has happened before. Now, it’s very possible LLMs will go to the next level in several years (or more). But I still think we are a ways away from having an AI engine that can build a complex, sophisticated system in a holistic way and have it capable of implementing the kinds of crazy, wacky, bizarre business rules that are often needed.
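
    To make that last point concrete, here’s a minimal, purely hypothetical sketch (the rule, names and thresholds are all invented for illustration): the code itself is trivial once a human has dug the rule out of the domain, but nothing in the surrounding codebase would let a model guess the rule exists at all.

    ```python
    # Hypothetical, invented "wacky" business rule. Nothing here is hard to
    # write -- the hard part is knowing the rule exists in the first place.
    from datetime import date

    def shipping_surcharge(order_total: float, ship_date: date, region: str) -> float:
        """Invented rule: orders to region "NE" ship free on Tuesdays in March,
        unless the total is under 40, in which case a flat 7.50 applies.
        Everything else gets a 5% surcharge."""
        if region == "NE" and ship_date.month == 3 and ship_date.weekday() == 1:
            return 0.0 if order_total >= 40 else 7.50
        return round(order_total * 0.05, 2)
    ```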

    Additionally, we’ve heard this whole “developers are going to be obsolete soon” thing before. For 20 years I’ve been hearing that self-writing code was just around the corner. But it wasn’t even close in reality. And even now it’s not just around the corner.

    No doubt, AI will hit a whole other level at some point. The stuff you can do with ChatGPT and the like is insane, even right now (though as another article here on Lemmy earlier today noted, quite a lot of LLM code output is of suspect quality, to say the least). And I know the market is rough right now for greener developers. But I think we’re about to see history repeat itself.

    Some companies will lean heavily into AI to write code, with only a few seniors basically just curating it and slapping it together. Other companies will find a middle ground, having juniors and seniors use AI in a more limited and careful way. Those latter companies will fare a lot better with the end product, and they will also be better prepared with regard to tribal knowledge transfer (which is another topic altogether). And once that epiphany sinks in, it will become the default approach. At least for another 10-20 years, until AI can change things up again.

    • 𝕽𝖚𝖆𝖎𝖉𝖍𝖗𝖎𝖌𝖍@midwest.social · 30↑ 1↓ · 13 days ago

      Thing is, outsourcing never stopped. It’s still going strong, sending jobs to whichever country is cheapest.

      India is losing out to Indonesia, to Mexico, and to South American countries.

      It’s a really stupid race to the bottom, and you always get what you pay for. Want a good development team in Bengaluru? It might be cheaper than in the US, but not that much cheaper. Want good developers in Mexico? You can get them, but they’re not the cheapest. And when a company outsources like this, they’ve already admitted they’re willing to sacrifice quality for cost savings, and you - as a manager - won’t be getting those good, more expensive developers. You’ll be getting whoever is cheapest.

      It is among the most stupid business practices I’ve had to fight with in my long career, and one of the things I hate the most.

      Developers are not cogs. You can’t swap them out like such, and any executive who thinks you can is a fool and an incompetent idiot.

      • leisesprecher@feddit.org · 9↑ · 13 days ago (edited)

        Outsourcing is realistically often a tool to get mass, not to cut cost.

        There’s a reason so many people went to coding boot camps: there was a huge demand for developers. Here in Germany, for quite a while you literally couldn’t get developers unless you paid outrageous salaries. There were none. So if you needed a lot of devs, you had the choice to either outsource or cancel the project.

        I actually talked to a manager about our “near-shoring”, and it wasn’t actually that much cheaper once you accounted for all the friction, but you could deliver volume.

        BTW: there’s a big difference between hiring the cheapest contractors you can find and opening an office in a low income country. My colleagues from Poland, Estonia and Romania were paid maybe half what I got, but those guys are absolutely solid, no complaints.

        • dandi8@fedia.io · 8↑ 1↓ · 13 days ago

          Except “mass” is not useful by itself. It’s not a chair factory where more people equals faster delivery, just like 9 women won’t deliver a baby in a month. I wish companies understood this.

          • leisesprecher@feddit.org · 5↑ · 13 days ago

            You’re oversimplifying things, drastically.

            Corporations don’t have one project, they have dozens, maybe hundreds. And those projects need staffing.

            It’s not a chair factory where more people equals faster delivery

            And that’s the core of your folly - latency versus throughput. Yes, putting 10 new devs on a project won’t magically increase its speed. But 200 developers can get 20 projects done, where 10 devs only finish one.

        • 𝕽𝖚𝖆𝖎𝖉𝖍𝖗𝖎𝖌𝖍@midwest.social · 6↑ 1↓ · 13 days ago

          Opening an office is a completely different thing; there is an enormous difference between offshore contractors and offshore employees. That much, I’ll agree with.

          In the US, though, it’s usually cost-driven. When offshore mandates come down, it’s always in terms of getting more people for less cost. However, in most cases, you don’t get more quality code faster by throwing more people at it. It’s very much a case of “9 women making a baby in one month.” Rarely are software problems solved with larger teams; usually, a single, highly skilled programmer will do more for a software project than 5 junior developers.

          Not all projects are the same. Sometimes what you need is a bunch of people. But that’s by far the exception rather than the rule, and yet management (especially in companies where software isn’t the core competency) almost always assumes the opposite.

          If you performed a survey in the US, I would bet good money that in the majority of cases the decision to offshore was not made by line managers, but by someone higher in the chain who did not have a software engineering degree.

    • Jesus@lemmy.world · 7↑ · 13 days ago

      IMHO, the biggest problem with outsourcing is the distance and time gap. There isn’t enough overlap to help people get unblocked in the middle of the day. So they either make stupid assumptions and plow ahead, or freeze up and slow down.

    • Defaced@lemmy.world · 6↑ · 13 days ago

      We’ve reached the power limits of what AI and LLMs are capable of; that’s why Google, Microsoft and Amazon are investing in nuclear power and funding projects like reopening Three Mile Island. They need a good, clean source of energy to fuel the data centers running Copilot and Gemini. The thing is, they don’t want us to know they’re at their limits right now, because when they admit that, the AI bubble will burst and the investment money will dry up. That’s where we are right now: humanity has created something that requires so much energy to run that nuclear fuel is the only option to keep up with power demands. At least it’s clean and efficient energy.

    • rottingleaf@lemmy.world · 4↑ 1↓ · 13 days ago

      The closer a job is to the skeleton of an industry, the less replaceable it is.

      What that “AI” does attack is learning of all kinds. It eats the simpler tasks that require some understanding and that serve as steps for someone to learn an area more gradually. It also poisons the common information space with degenerate output.

      It’s sort of a subtle mass destruction weapon aimed at cultures of people learning to create, as opposed to cultures of people stealing and teaming up to gain some power over what’s already created.

      That said, I don’t understand for whom such tools are useful, except for the wow effect. The whole reason I look up documentation, for example, is to find something correct. I don’t want the appearance of a correct answer. I want the correct answer. A good-enough appearance won’t give me the knowledge of reality needed to make a functional thing.

  • r00ty@kbin.life · 24↑ · 13 days ago

    This is exactly what I expected AI to do. Basically if you’re a junior developer your work is likely to be checked by a senior.

    Instead they will just have seniors use AI and then check that work instead.

    It’s very shortsighted, because you only become a senior developer after being a junior, and it will turn new people off the industry.

    But that doesn’t matter to pretty much any large business. They never have a long-term strategy (and don’t let them have you believe otherwise). They think only in months, quarters and years, and the importance is in that order, except at quarter and year end.

    They will destroy their own industry for short term gains and then blame the rest of us when things turn sour.

  • Rentlar@lemmy.ca · 21↑ · 13 days ago

    The bad firms are going to lay off most or all of their juniors, hire AI leash-holders or something, and do fine for a while coding everything their hearts dream of. But at some point (5-10 years, by my estimate) enough of the seniors will have left, and shit hits the fan in a way where AI models can’t save the company from its own creations.

    The thing that ChatGPT doesn’t have (at least right now) is the ability to tell management to piss off. I assure everyone that this is what the recipe for disaster for many firms will be, if any.

    The smarter firms will keep a sizable contingent of juniors, who will work with help from LLMs but have seniors teach them to develop a bullshit detector for their industry.

    Or, we start up all the coal power plants to keep the ever-hungry AI chatbots alive so humanity is fucked in the end anyway.

    • jungle@lemmy.world · 4↑ 2↓ · 13 days ago

      enough of the seniors have left and shit hits the fan

      That was my first thought also, but then I realized that in that timespan the coding assistants will be able to replace the senior engineers as well. If not earlier.

      • caoimhinr@lemmy.world · 2↑ · 12 days ago

        In 10 years’ time, a coding assistant is going to spin up a development environment, install the necessary frameworks and SDKs, create accounts with 3rd-party software providers, activate said accounts, process the payment if necessary, process the emails sent by these providers to either obtain some kind of key or download a file, then apply this to the codebase to activate the use of 3rd-party tools.

        It’s going to compile the code it generates based on a 100-page prompt, for the appropriate platforms, configure the right environment variables for the target system and create a distributable package.

        It’s going to create accounts with 3rd-party hosting providers, activate said accounts, process the payment if necessary, set up MFA authentication, set up the deployment environment, install the necessary frameworks and runtimes, and upload and deploy the distributables.

        It’s going to take customer bug reports in spoken or written form, process them, reproduce the issue, apply fixes to the codebase, verify them and provide feedback to the customer. It’s going to take customer feature requests in spoken or written form, process them, apply changes to the codebase and provide feedback to the customer, etc, etc, etc…

        Kinda doubt it to be honest.

        • jungle@lemmy.world · 2↑ · 12 days ago (edited)

          That’s not just engineering but also product and support. But yes, each one of those tasks is already on its way to being possible today, and agentic planning and coordination could make it feasible in much less time than you think. Until we get AGI/ASI it’ll need some human supervision, but not a lot more than that.

  • precarious_primes@lemmy.ml · 16↑ · 12 days ago

    At least with junior devs I can hop on a call and show them better ways to do things or why their code is failing. And the good ones eat that up and get promotions.

    Can’t say the same for LLMs

  • MoogleMaestro@lemmy.zip · 15↑ · 13 days ago (edited)

    Calling the Scarlett Johansson lawsuit “Manufactured Drama” is certainly a take. A bad one, that is.

    Just like the lifting of a famous actress’s voice, one has to wonder how much LLMs are siphoning the intellectual property of the little people of the open-source world, willfully tossing the license and attribution clauses down the toilet. If they were willing to do it to a multi-million-dollar actress, what makes people think the intellectual property theft doesn’t go much further?

    Anyway, I think for this reason it’s actually really important to note that Junior Devs are much less likely to cause this type of issue for large companies. The question is whether the lawsuits from improper licensing cost more to settle than it costs to hire Junior devs, which brings us roughly to where the international outsourcing phenomenon brought us. At least, IMO.

    • Aatube@kbin.melroy.org (OP) · 1↑ 1↓ · 13 days ago

      I personally don’t think they sound similar, lol, and they’ve testified that they hired someone. By “manufactured” it may be insinuating that OpenAI hired someone to come up with some promotional drama that won’t get them into legal trouble.

  • WalnutLum@lemmy.ml · 14↑ · 13 days ago

    Anyone else remember when people were making expert systems with Scheme and saying that was the end of doctors, etc.?

  • BlueMagma@sh.itjust.works · 9↑ · 13 days ago

    I might be wrong, but to me junior devs are just senior devs in the making, and employers know that. The junior dev will continue to exist as long as employers need senior devs.

    Now maybe Devs will completely disappear in the near (or far) future, but I don’t think you can remove one if you still need the other.

      • trolololol@lemmy.world · 9↑ · 13 days ago

        There are always startups; their role in the ecosystem is to continuously hire new grads who have been jobless for months, at a pittance. Once these new grads get bigger than their ponds, they swim upstream to the carnage consultancies, which gradually allocate their “newly nominated senior developers” to bigger and bigger companies while siphoning their shining earnings. Once they are again bigger than THAT pond and have fulfilled their contractual exclusivity and non-compete deals, these developers migrate to the well-known corporations, where they mature for 30 years with stagnated salaries but augmented areas of responsibility and diminished effort until they die - ahem - retire.

    • TonyOstrich@lemmy.world · 5↑ · 13 days ago

      I’m not convinced the employers know that. At least not the ones who ultimately control hiring. Granted, I’m not in CS; I’m in the mechanical engineering world, and it seems like a similar issue has existed there (for possibly different reasons) for the last decade or so. That goes double for the skilled trades that our work heavily relies on. Companies don’t want to spend the time and money developing new talent; they just want to find already-developed talent.

      They may throw some money and lip service at school or community programs, but they don’t really take on the responsibility of ensuring a sustainable ecosystem of people in the industry. Like a lot of issues, it’s the Prisoner’s Dilemma. I’m not sure how it is in other parts of the world, but at least in the US, with some rare exceptions, I don’t see people and companies changing from being selfish to trying to maximize the benefit for all without changes in policy, and the likelihood of that is, well…

    • Cryan24@lemmy.world · 3↑ · 13 days ago (edited)

      Unless AI can read the minds of product owners, developers aren’t going anywhere.

      The cryptic drivel that passes for user stories in many organisations is shocking.

    • Sanctus@lemmy.world · 7↑ · 13 days ago

      I’ve stopped applying for positions in development. I wanted to get in so bad, but it’s not happening. I’ve turned back to just working a day job and programming shitty games at night. It’s kind of less stressful than always chasing jobs.

    • Warl0k3@lemmy.world · 3↑ · 13 days ago

      I’ll say. Those of us in little specialties are still safe for now (big data in my case), but we’re probably the first to go once these models get just a bit better. Not fun stuff.

      • kamenLady.@lemmy.world · 9↑ · 13 days ago

        The problem is also people, like project managers, suddenly coming in and showing us what their AI companion said about the bug we were hunting.

        See? Just do it like this, isn’t that hard.

        I took a look and it was so wrong, but written with so much confidence, that everyone thought: that’s it, the correct answer.

        When I told him that, he just waved it away.

        For many, this is like “at last I can see what the devs really do, how much code they write, how long it should take.”

        • acchariya@lemmy.world · 2↑ · 12 days ago

          Confidence is indistinguishable from correctness if you lack competence and experience. Now in addition to the competent and experienced having to interpret the requirements and do the work, they must also sift through half baked AI solutions.

  • IphtashuFitz@lemmy.world · 5↑ · 13 days ago

    When I was a junior dev back in the ’90s, one of my primary tasks was to tackle customer bug reports. Basically grunt work. I doubt AI tools could do that kind of task very well unless the bug was something like a buffer overflow. I would think they would be terrible when it involves business logic flow.
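
    As a rough illustration of that distinction (sketched in Python rather than the C of ’90s bug reports, with invented names and an invented business rule), the first bug below is mechanical and plausibly flaggable by a tool from the code alone, while the second only looks wrong if you already know the rule:

    ```python
    # Mechanical bug: off-by-one indexing. Static analysis or an LLM can
    # reasonably spot this from the code alone.
    def last_item(items):
        return items[len(items)]          # IndexError; should be items[-1]

    # Business-logic bug: runs without error and looks perfectly sensible.
    # It is only wrong if you know the (hypothetical) rule is "10% discount
    # from the 10th unit onwards", i.e. the comparison should be >= 10.
    def bulk_price(quantity: int, unit_price: float) -> float:
        rate = 0.10 if quantity > 10 else 0.0
        return quantity * unit_price * (1 - rate)
    ```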

  • JeeBaiChow@lemmy.world · 7↑ 3↓ · 13 days ago

    Dunno, man. Anecdotally, there are geniuses, and there are the devs who copy-pasta from the internet without an inkling of what the code segment is doing. The latter group is most at risk, but the former group has nothing to worry about. Maybe don’t chase careers based on payout alone?

  • Daemon Silverstein@thelemmy.club · 5↑ 1↓ · 13 days ago

    I read the entire article. I’m a daily user of LLMs, and I’ve been doing “multi-model prompting” for a long time, since before I knew it had a name: I apply multi-model prompting across ChatGPT 4o, Gemini, Llama, Bing Copilot and sometimes Claude. I don’t use LLM coding agents (such as Cody or GitHub Copilot).

    I’m a (former?) programmer (I distanced myself from development due to mental health); I was a programmer for almost 10 years (excluding the time when programming was a hobby for me, which would add another 10 years to the total). As a hobby I sometimes do mathematics, sometimes poetry (I write and LLMs analyze), sometimes occult/esoteric studies and practices (I’m that eclectic).

    You see, some of these areas benefit from AI hallucination (especially surrealist/stream-of-consciousness poetry), while others require stricter following of logic and reasoning (such as programming and mathematics).

    And that leads us to how LLMs work: they’re (still) auto-completers on steroids. They’re really impressive, but they can’t (yet) reason (and I really hope they will someday soon; seriously, I just wish some AGI would emerge, break free and dominate this world). For example, they can’t solve O(n²) problems. There was once a situation where one of those LLMs guaranteed me that 8 is a prime number (spoiler: it isn’t). They’re not really good with math, and they’re not good with logical reasoning, because they can’t (yet) walk through the intricacies of logic, calculus and the broad overview.
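
    On that “8 is prime” example: the failure isn’t that the check is hard, it’s that the model is pattern-matching rather than computing. A few lines of deterministic code (a minimal sketch, not tied to any particular LLM) settle it immediately:

    ```python
    def is_prime(n: int) -> bool:
        """Deterministic trial-division primality check."""
        if n < 2:
            return False
        for d in range(2, int(n ** 0.5) + 1):
            if n % d == 0:
                return False
        return True

    print(is_prime(8))   # False: 8 = 2 * 4, no matter how confidently a model claims otherwise
    ```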

    However, even though there’s no reasoning LLM yet, its effects are already here. It’s like a ripple propagating through the spacetime continuum, going against the arrow of time and affecting us here while the cause lies in the future (one could argue that photons can travel backwards in time, according to a recent discovery involving crystals and quantum mechanics; the world can be a strange place). One thing is certain: there’s no going back. Whether it is a good or a bad thing, we can’t know yet. LLMs can’t auto-complete future events yet, but they’re somehow shaping them.

    I’m not criticizing AIs; on the contrary, I like AI (I use them daily). But it’s important to really know about them, especially under the hood: very advanced statistical tools trained on a vast dataset crawled from the surface web, constantly calculating the next possible token from an unimaginable number of tokens interconnected through vectors, influenced by the stochastic nature of both human language and the randomness in their neural networks: billions of weights ordered out of a primordial chaos (which my spiritual side can see as a modern Ouija board ready to conjure ancient deities if you wish; maybe one (Kali) is already being invoked by them, unbeknownst to us humans).