And then suddenly this is no longer something that fascinates people… in 10 years, "non-synthetic" becomes the new "bio" or "artisan" or whatever you like.
Humanity has its ways of objecting to accelerationism.
Put another way, over time people devalue things which can be produced with minimal human effort. I suspect it's less about humanity's values, and more about the way money closely tracks "time" (specifically the duration of human effort).
I strongly disagree. How many of the clothes you buy are machine-made with a 100 thread count, versus hand-knit sweaters or the like?
When was the last time you asked a person for directions, or other important questions, instead of Google?
You can wax poetic about wanting "the human touch", but at the end of the day, the market speaks -- people will just prefer everything automated. Including their partners: once your boyfriend can remember every little detail about you, notice everything including your pupils dilating, know exactly how you like it and when you like it, never get angry unless it's to spice things up, and has been trained on 1000 other partners, how could you go back? The same goes when robots can raise children better than parents: with patience and discipline, teaching them with individual attention, knowing 1000 ways to mold their behavior and achieve healthier outcomes. Everything people do is being commodified as we speak. Soon it will be humor, entertainment, nursing, etc. Then personal relations.
Just extrapolate a decade or three into the future. Best case scenario: if we nail alignment, we build a zoo for ourselves where we have zero power and are treated like animals who have sex and eat and fart all day long. No one will care about whatever you have to offer, because everyone will be surrounded by layers of bots from the time they are born.
PS: anything you write on HN can already have been written by AI; pretty soon you may as well quit producing any content at all. No one will care whether you wrote it.
> PS: anything you write on HN can already have been written by AI; pretty soon you may as well quit producing any content at all. No one will care whether you wrote it.
People theoretically would care, but the internet has already made producing things pseudo-anonymous, so we have forgotten the value of actually having a human being behind content. That's why AI is so successful, and it's a damn shame.
What exactly is the value of having a human behind content if it gets to the point that content generated by AI is indistinguishable from content generated by humans?
The fact that anyone would ask this question is incredible!
It's so that, in a fraction of those cases, we can develop real relationships with the people behind the content! The whole point of sharing is to develop connections with real people. If all you want to do is consume, independently of that, you are effectively a soulless machine.
A few may be interested in developing a relationship. The vast majority, though, are only interested in consuming content and moving on. I fail to see how that preference makes one a "soulless machine". But maybe I'm already a lost cause, *shrugs*.
I think "indistinguishable" is a receding horizon. People are already good at picking out AI text, and AI video is even easier. Even if it looks 100% realistic on the surface, the content itself (writing, concept, etc) will have a kind of indescribable "sameness" that will give it away.
If there's one thing that connects all media made in human history, it's that humans find humans interesting. No technology (like literally no technology ever) will change that.
You'd be surprised by how many people already mistake AI-generated content for human creation [0], even while the tech is still so young. And it cuts both ways, since people also mistake human creations for AI content [1]. I see no reason why it won't eventually get to the point where they're fully indistinguishable, given AI is continually being trained to be better. Bits are bits, whether arranged by man or AI.
> People are already good at picking out AI text, and AI video is even easier.
Source? My experience has been that people at most might be "ok" at picking out completely generic output, and outright terrible at identifying anything with a modicum of effort or chance placed into it.
> My experience has been that people at most might be "ok" at picking out completely generic output, and outright terrible at identifying anything with a modicum of effort or chance placed into it.
Bold of you to assume any effort is placed into content when the entire point of using AI in the first place is to avoid this.
> Bold of you to assume any effort is placed into content when the entire point of using AI in the first place is to avoid this.
I mean, I've seen people using it in that way, yes. These are normally the same people I saw copying and pasting the first Google result they found for any search as an answer to their customers, co-workers, etc. Or the people to whom you would say "Do not send this to the customer, this is my explanation to you, use your own words, this is just a high-level blah blah", and then five minutes later you see your response, word for word, having gone out to a customer with zero modification or review for appropriateness.
I equally see a very different kind of usage, where it's just another tool used for speeding up portions of the work, but not being used to produce a work in its totality.
Like, sadly, yes, I've now seen salespeople with rando Chrome extensions that just attach AI to everything, and they just let it do whatever the fuck it wants, which makes me want to cry... but again, these people were already effectively doing that, they are just doing it faster than ever.
If a fish could write a novel, would you find what it wrote interesting, or would it seem like a fish wrote it? Humans absorb information relative to the human experience, and without living a human existence the information will feel fuzzy or uncanny. AI can approximate that but can't live it for real. Since it is a derivative of an information set, it can never truly express the full resolution of its primary source.
All that may or may not be the case, but whether or not it is, given that AI is trained on the works of humanity, it stands to reason that it'll inevitably get to the point where the content it creates evokes the same response as if created by a human. "Roses are red" is composed of the same bit sequence, regardless of creator.
> What exactly is the value of having a human behind content if it gets to the point that content generated by AI is indistinguishable from content generated by humans?
What would be the point of paying for AI content if nobody did anything to produce it? Just take that shit!
There won't even be any "paying" or "taking" per se. What is generated by one can be generated by another, making those concepts generally moot in that regard.
>PS: anything you write on HN can already have been written by AI
Yeah, in some broad sense, the same as we've always had: back in the 2010s it could have been generated by a Markov chain, after all. The only difference now is that the average quality of these LLMs is much, much higher. But the distribution of their responses is still not on par with what I'd consider a good response, and so I hunt out real people to listen to. This is especially important because LLMs are still not capable of doing what I care most about: giving me novel data and insights about the real world, coming from the day-to-day lived experience of people like me.
HN might die but real people will still write blogs, and real people will seek them out for so long as humans are still economically relevant.
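For context on the Markov-chain comparison above, here is a minimal sketch of the kind of word-level generator that was common back then (the corpus, function names, and parameters are purely illustrative, not anything from the discussion itself):

```python
import random
from collections import defaultdict

def build_chain(text, order=2):
    """Map each run of `order` consecutive words to the words seen after it."""
    words = text.split()
    chain = defaultdict(list)
    for i in range(len(words) - order):
        chain[tuple(words[i:i + order])].append(words[i + order])
    return chain

def generate(chain, length=25):
    """Random-walk the chain: each next word depends only on the last `order` words."""
    state = random.choice(list(chain.keys()))
    out = list(state)
    for _ in range(length):
        followers = chain.get(state)
        if not followers:                      # dead end: jump to a fresh state
            state = random.choice(list(chain.keys()))
            followers = chain[state]
        out.append(random.choice(followers))
        state = tuple(out[-len(state):])
    return " ".join(out)

# Illustrative corpus; any pile of scraped text from the 2010s web would do.
corpus = ("people will still write blogs and real people will seek them out "
          "because people find people interesting and the market speaks")
print(generate(build_chain(corpus)))
```

The contrast being drawn is roughly this: a model like that produces locally plausible word salad with no memory beyond a couple of words, whereas an LLM's shortcomings show up in the overall distribution of its responses rather than in obvious gibberish.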
I have both machine-made and hand-knit sweaters. In general, I expect handmade clothes to be more expensive than machine-made, which kinda proves my point. I never said machine-made things had zero value. I said we will tend to devalue them relative to more human-intensive things.
Asking for directions is a bad example, because it takes very little time for both humans and machines to give you directions. Therefore it would be highly unusual for anyone to pay for this service (LOL)
> Humanity has its ways of objecting to accelerationism.
Actually, human objection typically only slows it down, and often the objection becomes a fringe movement while the masses continue to consume the lowest common denominator. Take the revival of the flip phone, the typewriter, etc. Sadly, technology marches on and life gets worse.
Does life get worse for the majority of people or do the fruits of new technology rarely address any individual person’s progress toward senescence? (The latter feels like tech moves forward but life gets worse.)
Of course, it depends on how you define "worse". If you use life expectancy, infant mortality, and disease, then life has in the past gotten better (although the technology of the past 20 years has RARELY contributed to any of that).
If you use 'proximity to wild nature', 'clean air', 'more space', then life has gotten worse.
But people don't choose between these two. They choose between alternatives that give them analgesics in an already corrupt society, creating a series of descending local maxima.
TikTok's main attraction is the people, not just the videos. Trends, drama, etc. all involve real humans doing real human stuff, so it's relatable.
I might be wrong, but AI videos are on the same path as AI-generated images. Cool for the first year, then "ah ok, zero-effort content".
Sure, humanity has its ways of objecting to Accelerationism, but the process fundamentally challenges human identity:
"The Human Security System is structured by delusion. What's being protected there is not some real thing that is mankind, it's the structure of illusory identity. Just as at the more micro level it's not that humans as an organism are being threatened by robots, it's rather that your self-comprehension as an organism becomes something that can't be maintained beyond a certain threshold of ambient networked intelligence." [0]
See also my research project on the core thesis of Accelerationism that capitalism is AI. [1]