Maybe you'll like this website:
https://stopcitingai.com/
😬
That website was made by someone suffering from some cognitive dissonance. They correctly observe that LLMs “can produce convincing-sounding information, but that information may not be accurate or reliable,” and then somehow immediately conclude that “summarize this for me” is the sort of thing LLMs “might” be “good at”.