> The point was that LLMs are not well set up to find new insights unless they are already somehow contained in the knowledge they have been trained on.
The author is, to use his phrase, "deeply uninformed" on this point.
LLMs generalize very well, and in every scientific field I've bothered to look up, they have pushed the frontier on at least one open problem.