
LLMs aren’t people, and the authors have not convinced me that they will behave like people in this context.

This was my initial reaction as well, before reading the interview in full. They admit that there are problems with the approach, but they seem to have designed the simulation in a very thoughtful way. There really doesn't seem to be a better approach, apart from enlisting vast numbers of people instead of using LLMs/agent systems. That has its own problems as well of course, even leaving cost and difficulty aside.

There’s no option to create original content...

While this is true, I'd say the vast majority of users don't create original content either, yet they still shape the social media environment through the kinds of actions the authors did model. Again, it's not perfect, but I'm more convinced it might be useful after reading the interview.



I'm not sure the experiment can be done any other way than by trying interventions on real users of a public social media service, as Facebook did in the article I linked. Of course, the people running those services usually don't have the incentive to test harm reduction strategies, and they certainly don't want to publicize the results.

> the vast majority of users don't create original content

That's mostly true now, but I think it's as much a result of design and algorithmic decisions by the platforms to emphasize other types of content. Early Facebook in particular was mostly original content shared between people who knew each other. The biggest problem with that was that it wasn't very profitable.


> > There’s no option to create original content...

> While this is true, I'd say the vast majority of users don't create original content either, but still end up shaping the social media environment through the actions that they did model. Again, it's not perfect but I'm more convinced that it might be useful after reading the interview.

OK, but this is by design. Other forms of social media, like Mastodon, have a far higher rate of people creating original content.


LLMs don't learn, though.

The fundamental problem with social media (and many other things) is humans, specifically our biological makeup and our (lack of) overriding mechanisms. One could argue that pretty much everything we call 'civilized behavior' is an instance of applying a cultural override to a biological drive. Without it, we are very close to shit-flinging murderous apes.

For so many of our problems, what goes wrong is that we fail to stop our biological drives from taking the wheel, to the point where we consciously observe ourselves doing things we rationally or culturally know we should not be doing.

Now the production side of media/content/goods evolves very fast and does not have a similarly strong legacy biological drive holding it back, so it is very, very good (and ever improving) at exploiting the sitting duck that is our biological makeup (food engineering, game engineering etc. are very similar to social media engineering in this regard).

The only reliable defense against that is training ourselves to not give in to our biological drives when they are counterproductive. For some that might be 'disconnect completely' (i.e. take away the temptations altogether), but having a healthy approach to encountering the temptations is far more robust. I am of the opinion that labeling the social media purveyors and producers in general as evil abusers is not necessarily inaccurate, but counterproductive in that it tends to absolve individuals of their responsibility in the matter. Imagine telling a heroin addict: "you can't help it, it's those evil dealers that are keeping you hooked to the heroin".


Nonsense. The vast majority of my Facebook friends post at least some original content.


Fortunately we don't have to rely on your anecdata; people actually study this stuff:

https://news.gallup.com/poll/467792/social-media-users-incli...

> U.S. adults commonly engage with popular social media platforms but are more inclined to browse content on those websites and apps than to post their own content to them. The vast majority of those who say they use these platforms have accounts with them, but less than half who have accounts -- and even smaller proportions of all U.S. adults -- post their own content.

https://www.pewresearch.org/internet/2019/04/24/sizing-up-tw...

> Most users rarely tweet, but the most prolific 10% create 80% of tweets from adult U.S. users

https://www.pewresearch.org/internet/2021/11/15/the-behavior...

> The analysis also reveals another familiar pattern on social media: that a relatively small share of highly active users produce the vast majority of content.
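The concentration pattern these studies describe can be sketched numerically. The snippet below uses entirely synthetic, made-up post counts (a heavy-tailed Pareto draw; the distribution parameters are assumptions, not taken from Pew or Gallup) to show how one would compute the share of content produced by the most active 10% of users:

```python
# Sketch: how concentrated content production is in a user population.
# The per-user post counts here are synthetic and purely illustrative;
# the 10%/80% figures in the studies above are not reproduced exactly.
import random

random.seed(0)

# Hypothetical post counts: most users post rarely, a few post a lot.
posts_per_user = [int(random.paretovariate(1.16)) for _ in range(10_000)]

def top_decile_share(counts):
    """Fraction of all posts produced by the most active 10% of users."""
    ranked = sorted(counts, reverse=True)
    cutoff = len(ranked) // 10
    total = sum(ranked)
    return sum(ranked[:cutoff]) / total if total else 0.0

share = top_decile_share(posts_per_user)
print(f"Top 10% of users produced roughly {share:.0%} of posts")
```

With a heavy-tailed distribution like this, the top decile ends up producing a large majority of posts, mirroring the pattern the Pew analyses report.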


That's junk science and doesn't refute the specific point I made. Facebook users are far more likely to post original content than X users. It might just be some blurry backlit vacation photos but it is original content.


They post, but it doesn't get read; all their friends' feeds are swamped with crap just like theirs is.


But then we’re back to blaming the algorithm.


Algorithmic choices are likely a major contributor to the phenomenon. If posting vacation photos on Facebook gets interactions from friends and family, more people will do it. If it doesn't, fewer people will.


Yes, but the point of this particular study was that the results did not depend on the algo.


And the point of my critiques is that this particular study cannot reasonably model a real life social media system.



