I have fleas. https://www.snand.org/

  • 0 Posts
  • 51 Comments
Joined 3 years ago
Cake day: June 28th, 2023




  • 4grams@awful.systems to Lemmy Shitpost@lemmy.world · Website · 45 points · edited 3 days ago

    This goes back some years, back when the ping of death was still a thing. I used to hang out in IRC channels, and someone decided they needed to show me what a real hacker could do. The dork asked for my IP, which was hilarious to begin with, so I replied “127.0.0.1”. About two seconds later I saw them disconnect from IRC.

    A minute goes by and they’re back online, spitting mad. They tell me I’m lucky their computer crashed, but I’d better get ready… and then they disconnect again.

    Back again, and folks are dying laughing at my 1337 teenage hacker skills, but eventually someone spills that 127.0.0.1 is localhost. Instantly I’m talking to Zero Cool again, who taunts that I was too scared to give out my actual address. Being a hardened nerd, this time I complied.

    I was on Slackware and had already figured out their game from the get-go; oh, and I actually knew how to find an IP address. So right in the middle of this future titan of industry’s insults and threats… they disconnect one last time. 😎




  • I am playing with it, sandboxed in an isolated environment, only interacting with a local LLM and only connected to one public service with a burner account. I haven’t even given it any personal info, not even my name.

    It’s super fascinating and fun, but holy shit, the danger is outrageous. On multiple occasions it’s misunderstood what I’ve asked and fucked around with its own config files. I’ve asked it to do something and the result was essentially suicide: it ate its own settings. I’ve only been running it for about a week but have had to wipe and rebuild twice already (I probably could have fixed it, but that’s what a sandbox is for). I can’t imagine setting it loose on anything important right now.

    But it is undeniably cool, and watching the system communicate with the LLM has been a huge learning opportunity.
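    The isolation described above (throwaway environment, nothing personal inside, easy to wipe and rebuild) can be sketched with a container, assuming Docker is available; the image name and mounted directory here are placeholders, not the commenter’s actual setup, and a real setup would loosen `--network none` enough to reach the one allowed service:

    ```shell
    # Hypothetical sandbox for an experimental agent: disposable container,
    # no network, read-only root filesystem, and writable state confined
    # to a single mounted directory that can be deleted to "rebuild".
    # "agent-image" is a placeholder image name.
    docker run --rm -it \
      --network none \
      --read-only \
      --tmpfs /tmp \
      -v "$PWD/sandbox:/work" \
      agent-image
    ```

    With `--rm`, anything the agent breaks inside the container disappears on exit; wiping `$PWD/sandbox` is the equivalent of the rebuild mentioned above.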