I feel like the author is missing a huge point here by fighting this. The entire reason the GPL and other copyleft licenses exist in the first place is to ensure that a user's rights to modify and redistribute a work can never be taken away. Before, relicensing as MIT - or any other fully permissive license - would have opened the door to restrictions being applied going forward, but with AI this is now a non-issue. Code is now very cheap. So the way I see it, anyone who is for copyleft should be strongly embracing the idea that AI-created works are not copyrightable (and that a rewrite can be relicensed).
The user is the end-user of the product. If the relicensing means that someone down the line receives a locked-down binary application that they cannot modify, that's a violation of the user's rights.
But it's a non-issue as said user can just have AI reverse engineer said binary. Or reimplement something with the same specs. That's what it means for code to be cheap.
It may be "cheap" at the moment. Let's revisit when the AI companies decide they need to recoup some of the hundreds of billions of dollars in losses they're racking up.
China is always waiting for this. And the US won't allow China to get all the users who'd emigrate over increased costs, so the costs will remain low. They'll have to find ways to recoup that don't involve raising the cost of code.
That is still true, but it was more relevant back when "user" meant "programmer at another university". The "end-user" for most software is not a programmer these days.
If I release blub 1.0.0 under GPL, you cannot fork it and add features and release that closed-source, but I can certainly do that as I have ownership. I can't stop others continuing to use 1.0.0 and develop it further under the GPL, but what happens to my own 1.1.0 onwards is up to me. I can even sell the rights to use it closed-source.
You can do whatever you want. And someone else can - and will, if it's worth it - now do a deep analysis and have it reimplemented from that analysis, all via LLM. In particular, since the law now holds that AI creations can't be copyrighted, that reimplementation will be firmly in the public domain. Nobody will even have to look at 1.1.0. See Valkey as one of the more recent examples, and there isn't even any LLM involved there.
Well put. It seems like a lot of people have tunnel vision on this. It's going to take a while for people to realize that copyright is obsolete, especially in its current form.