• Auth@lemmy.world

    More Larian aura farming. Please take a break; you’re already fully maxed out on community respect, and it’s actually getting unfair to the other game developers.

  • BaraCoded@literature.cafe

    Writer who has toyed with AI here: yeah, AI writing sucks. It’s consensus-driven and bland; it never goes into unexpected territory, or it completely fails to understand human nature.

    So we’d better stop calling AI “intelligence”. It’s text-prediction machine learning on steroids, nothing more, and the fact that we’re still calling it “intelligence” shows how gullible we all are.

    It’s just another speculative bubble from the tech bros, like crypto was, except this time the tech bros have come out as Nazis.

    • RobotsLeftHand@lemmy.world

      I remember reading a longer post on Lemmy. The person was describing their slow realization that the political beliefs they were raised with were leading down a dark path. It was a process that took many years, and the story was full of little moments where cracks in their worldview widened and the seed of doubt grew.

      And someone who was bored or overwhelmed by having to read a post longer than three sentences fed the story into an AI to make a short summary, then posted that summary as a “fixed your post, bro” moment. So basically all the humanity was removed. It reminds me of the famous “If the Gettysburg Address were a PowerPoint”: https://norvig.com/Gettysburg/

      • NotMyOldRedditName@lemmy.world

        That’s really sad.

        I’ve used AI to help clean up my sentence structure for copy, but if I’m not super explicit that it must not rewrite what I wrote, it does exactly what you said and takes the human element out of it.

  • kibiz0r@midwest.social

    I like the way Ted Chiang puts it:

    Some might say that the output of large language models doesn’t look all that different from a human writer’s first draft, but, again, I think this is a superficial resemblance. Your first draft isn’t an unoriginal idea expressed clearly; it’s an original idea expressed poorly, and it is accompanied by your amorphous dissatisfaction, your awareness of the distance between what it says and what you want it to say. That’s what directs you during rewriting, and that’s one of the things lacking when you start with text generated by an A.I.

    There’s nothing magical or mystical about writing, but it involves more than placing an existing document on an unreliable photocopier and pressing the Print button.

    I think our materialist culture forgets that minds exist. The output from writing something is not just “the thing you wrote”, but also your thoughts about the thing you wrote.

    • voracitude@lemmy.world

      Your first draft isn’t an unoriginal idea expressed clearly; it’s an original idea expressed poorly

      I like this a lot. I’m going to thieve it.

      • CheeseNoodle@lemmy.world

        Tangentially related: the easiest way to come up with a unique and cool idea is to come up with a unique and dumb idea (which is way easier) and then work on it until it becomes cool. (Think how dumb some popular franchises’ concepts are if you take the raw idea out of context.)

  • hperrin@lemmy.ca

    I think the reason so many AI bros are conservative is that conservatives have historically had really bad taste in art/media, so they see the drivel AI creates and think, “oh wow, it looks just like what the artists make,” not realizing that they don’t have the eye to see what it’s missing.

    • mnemonicmonkeys@sh.itjust.works

      Behind the Bastards had a different take. A lot of fascist movements get weirdly focused on futurism and try to portray their movements as belonging to that future. Unfortunately, I can’t remember the exact episodes where this was mentioned.

      • boonhet@sopuli.xyz

        Makes sense when you think about it

        You’re running an oppressive dictatorship. You still need enough people to support you to keep the thing going. What do you do? Make a bunch of people believe you’re going to improve their lives. How do you do that? First, you find an enemy to blame for everything. For Nazi Germany it was Jews, for the US now it’s mostly Latinos, but really all foreigners. But that alone might not be enough. So what else do you do? Pretend you’re doing everything economically and technologically to make things better for “the right people”.

      • saltesc@lemmy.world

        Implying people are happy to buy the shit, which isn’t likely, especially in a competitive environment.

        • Leon@pawb.social

          People buy AAA games all the time. Look at Starfield. Garbage game, still sold well.

          • Mister_Feeny@fedia.io

            Starfield is estimated to have sold 3 million copies; Baldur’s Gate 3, 15 million. The Microsoft/Bethesda marketing budget makes a difference, but not being garbage makes a much bigger difference.

            • Leon@pawb.social

              Well yeah, I’m not going to argue that a well made game that respects the player isn’t going to do well. But that doesn’t matter to the publishers and their shareholders when they can pump out AI slop garbage year after year and still have people that drink it up. Just look at the yearly shooters and sports games, they sell enough.

              Besides, what happens when this sort of slop has been normalised? Look at the mobile market, no one bats an eye at the intensely predatory microtransactions, and you’ll even find people defending things like gacha games.

              There was a time when people scoffed at the notion of paying ~$2 for some shitty cosmetics, but now people don’t even blink at the idea. Hell, it’s downright cheap in some cases. The AAA industry just has to slop things up for long enough for people to stop caring, because they will stop caring, and then they’ll continue to shell out for the dubious privilege of guzzling mediocre, uninspiring garbage.

  • rafoix@lemmy.zip

    Tons of shit games are going to have lots of dialogue written by AI. It’s very likely that those games would have had shit dialogue anyway.

    • DdCno1@beehaw.org

      Sure, but human-written shit still had that human touch. It could be unintentionally funny, it could be a mixed bag that reaches unexpected heights at times. AI writing is just the bland kind of bad, not the interesting kind of bad.

      • SketchySeaBeast@lemmy.ca

        Great point. There’s no opportunity for “so bad it’s good”. The Room wouldn’t have been a thing if Tommy had used AI.

  • Maiznieks@lemmy.world

    Nah, can’t agree. I had postponed a few ideas for years, was able to vibe them out in a week of evenings, and now I have something usable. 70% of it was vibe-coded; I just had to fix the stupid stuff, which was partly down to my own prompts.

      • Maiznieks@lemmy.world

        Wow, look, a professional right here! You must have some serious job insecurity to care about “machines took our jooobs”. Grow up and realise a POC solution is better than no solution; it’s not like products never get rewritten, lol.

        • optissima@lemmy.world

          We are talking about the quality of fiction writing. There’s nothing wrong with being an amateur, but being rude when a professional has a higher skill cap than you is just kind of sad. This man has likely honed his craft over tens of thousands of hours; respect that. You’ve basically told me that when you see someone just beginning to practice a skill you have honed, you can’t tell the difference, or that you lack the ability to transfer that concept to a skill you are unpracticed in. If you still find this hard to understand, try reading the Eragon series and then LotR. Both are fun reads, but you can tell that one was written by someone early in their writing career and the other later on.

  • melsaskca@lemmy.ca

    That means in about six months the AI content quality will be about an 8/10. The processors spread machine “learning” incredibly fast; some might even say exponentially fast. Pretty soon it’ll be like that old song: “If you wonder why your letters never get a reply, when you tell me that you love me, I want to see you write it”. “Letters” is an old version of one-on-one tweeting, but with no character limit.

    • Skua@kbin.earth

      Only if you assume that its performance will keep improving for a good while and (at least) linearly. The companies are really struggling to give their models more compute or more training data now, and frankly it doesn’t seem like there have been any big strides for a while.

      • kibiz0r@midwest.social

        Yeah… Linear increases in performance appear to require exponentially more data, hardware, and energy.

        Meanwhile, the big companies are passing around the same $100bn IOU, amortizing GPUs on 6-year schedules but burning them out in months, using those same GPUs as collateral on massive loans, and spending based on an ever-accelerating number of data centers which are not guaranteed to get built or receive sufficient power.

    • xthexder@l.sw0.com

      What improvements have there been in the previous six months? From what I’ve seen, the AI is still spewing the same 3/10 slop it has since 2021, with maybe one or two improvements bringing it up from 2/10. I’ve heard several people say some newer/bigger models actually got worse at certain tasks, and clean training data has pretty much dried up, so there isn’t much left to train new models on.

      I just don’t see any world where scaling up the compute and power usage suddenly improves the quality by orders of magnitude. By design, LLMs output the most statistically likely response, and that, almost by definition, is going to be the most average, bland response possible.
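
      To put that in toy terms: if a generator always picks the single most likely next word, it returns the most average continuation every time; anything unusual only survives if you deliberately sample away from the peak. A rough sketch (purely illustrative, with made-up probabilities, not any real model’s behaviour):

        # Toy sketch (made-up numbers, not any real model): greedy "most likely
        # next word" selection always collapses to the most common choice, while
        # temperature sampling occasionally lets rarer words through.
        import random

        next_word_probs = {
            "said": 0.40,       # the bland, most frequent continuation
            "replied": 0.25,
            "whispered": 0.20,
            "ululated": 0.15,   # the interesting, rare continuation
        }

        def greedy_pick(probs):
            # Always take the highest-probability word: maximally "statistically likely".
            return max(probs, key=probs.get)

        def sampled_pick(probs, temperature=1.0):
            # Sample in proportion to probability; temperature > 1 flattens the
            # distribution and gives unusual words a better chance.
            words = list(probs)
            weights = [p ** (1.0 / temperature) for p in probs.values()]
            return random.choices(words, weights=weights, k=1)[0]

        print([greedy_pick(next_word_probs) for _ in range(5)])        # always 'said'
        print([sampled_pick(next_word_probs, 1.5) for _ in range(5)])  # varies run to run

      And even then, the temperature only rearranges what was already common in the training data; it buys variety, not intent.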

    • Arkthos@pawb.social

      I doubt that. A lot of the poor writing quality comes down to choice. All the most powerful models are inherently trained to be bland, seek harmony with the user, and generally come across as kind of slimy in a typically corporate sort of way. This bleeds into the writing style pretty heavily.

      A model trained specifically for creative writing without such a focus would probably do better. We’ll see.

      • Holytimes@sh.itjust.works

        I mean, look at a hobby project like Neuro-sama vs ChatGPT.

        It’s a night-and-day difference in terms of responses and humanity.

        While Neuro and her sister both come across as autistic 7-year-olds, they still come across as mostly human autistic 7-year-olds. They have their moments where they just lose it, which every LLM has.

        But comparing them, it’s really obvious how much of the inhumanity and blandness is a choice by the large companies to keep their LLMs marketable and corpo-friendly.

        In a world where these models could be trained and allowed to actually have human-ish responses and focus on being “normal” instead of sterile robots, they would at least be way more fun.

        Not much more reliable, mind you. But at least they would be fun.