Artificial regurgitation
Message thread
Matthias Zöchling wrote on
Everything—everything—that comes out of these “AI” platforms is a “hallucination.”
Spot on, @beep@follow.ethanmarcotte.com, as always.
“Hallucinating.”
Matthias Zöchling wrote on
Continued from previous comment.
And if you are interested in an ever so thorough take on the subject of AI, I highly recommend this @btconf@mastodon.social talk by @tink@w3c.social:
“There is no spoon”
Matthias Zöchling wrote on
Continued from previous comment.
Having watched Léonie’s talk this morning, I was about to write something along the lines of: If AI can help a blind person “see” the world around them in real time, I’m all for it. But if it helps you write an e-mail that nobody cared about in the first place, it’s just a waste of energy.
But apparently I can simply quote @hdv@front-end.social:
I do think usefulness can outweigh ethical issues in some very specific use cases, eg if users want to make use of these tools to remove accessibility barriers they’re affected by
So now I’ve got two more blog posts to read. I’ll start with
“Is ‘ethical AI’ an oxymoron?”
…
Matthias Zöchling wrote on
Continued from previous comment.
… and then move on to @adactio@mastodon.social’s
“Uses”
with the following opening line:
I don’t use large language models.
I feel seen.
Ethan Marcotte wrote on
In reply to: @CSSence@mas.to.
Thank you, Matthias—I really appreciate that.
(And this is a wonderful thread you’ve pulled together. Thanks for that, too.)
Brian Dear wrote on
In reply to: @CSSence@mas.to.
They thought they were creating Artificial Intelligence.
What they actually were creating is Artificial Hallucination.
Simon Jones wrote on
In reply to: @CSSence@mas.to.
Agree. I made the same point on a panel at a CMS conference a few weeks ago. Everything is a hallucination.
Comment 8 is unavailable
This may be for legal or technical reasons.
Matthias Zöchling wrote on
In reply to: @beep@follow.ethanmarcotte.com.
You’re too kind, Ethan.
So yeah, ethics aside, it only took copyright infringement and vast amounts of energy to produce hallucinations. And if we use these hallucinations to generate new content, it’ll become new training data. Garbage in, garbage out, on a loop. AI regurgitation reminds me of re-uploading to YouTube over and over again.
At least we’re already seeing the resistance: people putting “100% human-made” stickers on their blogs.
Brian Dear wrote on
In reply to: @CSSence@mas.to.
So in the end: YouTube videos of YouTube videos, uploaded as YouTube videos, which in turn get uploaded as YouTube videos, each generation worse, like going from 4K to HD to VHS to a blurry mess. In the end we get the “gray goo” everyone feared was coming with nanobots, only it’s the “AI slop” version… hopefully only digital and not rendered in actual molecules…
Get involved
Have your say on Mastodon, or simply share this thread.