• 4grams@awful.systems

    I am playing with it, sandboxed in an isolated environment, only interacting with a local LLM and only connected to one public service with a burner account. I haven’t even given it any personal info, not even my name.

    It’s super fascinating and fun, but holy shit the danger is outrageous. On multiple occasions it’s misunderstood what I’ve asked and fucked around with its own config files and such. I’ve asked it to do something and the result was essentially suicide as it ate its own settings. I’ve only been running it for like a week but have had to wipe and rebuild twice already (I probably could have fixed it, but that’s what a sandbox is for). I can’t imagine setting it loose on anything important right now.

    But it is undeniably cool, and watching the system communicate with the LLM has been a huge learning opportunity.

    • baltakatei@sopuli.xyz

      Reminds me of a quote from Small Gods (1992) about an eagle that drops vulnerable tortoises to break their shells open:

      But of course, what the eagle does not realize is that it is participating in a very crude form of natural selection. One day a tortoise will learn how to fly.