• 2 Posts
  • 446 Comments
Joined 3 years ago
Cake day: July 14th, 2023

  • Like, as far as legal basis? Yes, as I understand it — but I am not a lawyer.

    But if you’re hoping to leverage an LLM… Part of the reason they’re so good at producing replacements for e.g. react is that the source code for react is in the training data, along with test suites and a ton of commentary related to the source code.

    So you’d be at a big disadvantage. That’s on top of the basic legibility challenges of decompiled binaries.



  • I understand why you cut it. There is a lot that you could dig into there, and you would really have to go into detail if you wanna convince the suits who can only think in terms of KPIs.

    The forcing function is a good way of looking at it. We’re not really doing the same thing but faster. We’re doing less, and that naturally takes less time. AI made understanding optional.

    I kinda wonder to what extent coding is actually a domain that AI is uniquely “good at,” versus simply a domain where the work can survive on artifacts that are produced but never understood.

    It seems like something we were already exceptionally capable of tolerating, compared to other sectors.


  • I’m afraid it’s worse than just moving the bottleneck to review/due-diligence.

    C-suites don’t like to admit this, because it challenges the assumption that job roles are all about input and output with no messy internal state to worry about, but…

    When you write some code, the code isn’t the only thing that you make. You also develop a mental model of how the code works, how you wish it would work, the distance between those two, and some sense of the possible paths that could get you there.

    That mental model is ultimately the more important part for the long-term health of the project. Coding is more an activity of communication between people; having an artifact that tells the computer what to do is almost an incidental side-effect of successful communication.








  • If only it were just a problem of understanding.

    The thing is: Programming isn’t primarily about engineering. It’s primarily about communication.

    That’s what allows developers to deliver working software without understanding how a compiler works: they can express ideas in terms of other ideas that already have a well-defined relationship to those technical components that they don’t understand.

    That’s also why generative AI may end up setting us back a couple of decades. If you’re using AI to read and write the code, there is very little communicative intent making it through to the other developers. And it’s a self-reinforcing problem. There’s no use reading AI-generated code to try to understand the author’s mental model, because very little of their mental model makes it through to the code.

    This is not a technical problem. This is a communication problem. And programmers in general don’t seem to be aware of that, which is why it’s going to metastasize so viciously.










  • This isn’t anything new. There have been multiple waves of “code-gen for normies,” and every time, after the hype dies down, there’s a heap of shitty code to fix.

    There’s gonna be no shortage of customers up to their eyeballs in broken slop after the bill comes due and Anthropic has to stop subsidizing their prices. AI slop is the best thing to happen to our job security in a while. (Provided you retain your critical thinking skills.)