• kescusay@lemmy.world
    5 hours ago

    This is a recipe for SQL injections, race conditions, memory leaks, and keys being placed directly in code.

    Trust the output of an LLM at your peril. Literally.
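A minimal sketch of the SQL-injection risk mentioned above, using Python's built-in sqlite3 (the table and data are hypothetical). LLM-generated code frequently interpolates user input straight into the query string; a parameterized query treats the same input as data only:

```python
import sqlite3

# In-memory database with a sample table (illustrative only).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

user_input = "' OR '1'='1"  # classic injection payload

# Vulnerable pattern often emitted by LLMs: string interpolation into SQL.
vulnerable = f"SELECT secret FROM users WHERE name = '{user_input}'"
leaked = conn.execute(vulnerable).fetchall()
print(leaked)  # [('s3cret',)] — the payload matches every row

# Safe pattern: placeholder binding; the payload matches no row.
safe = conn.execute(
    "SELECT secret FROM users WHERE name = ?", (user_input,)
).fetchall()
print(safe)  # []
```

The two queries differ by one habit, which is exactly why this class of bug slips through review when no one is checking every line.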

      • kescusay@lemmy.world
        4 hours ago

        Unless you’re checking every line and understand the codebase well enough to spot the subtle bugs it introduces that your tests don’t catch, you’re still opening yourself up to problems.

        • mx_smith@lemmy.world
          2 hours ago

          You’re right, and that is part of the workflow. You really should only do this in a language you’re familiar enough with that you know exactly how you would write it without the bot’s assistance.