In fairytales of yore, magic incantations could stop great forces merely through their utterance; no sooner were the words spoken than magical beings would grind to a halt. In the 21st century, something very similar seems to be happening with LLMs. Twitter users have discovered that ChatGPT glitches if cleverly […]
Economics Professor Explains Why ChatGPT Makes Up Fake Research Papers
It’s now well established that ChatGPT often makes things up; the official disclaimer on its site says “ChatGPT may produce inaccurate information about people, places, or facts”. But an economics professor has explained why the program does this. David Smerdon, an assistant professor at the University of Queensland, has a theory about how […]