
Coffee Lecture: How to deal with the hallucination problem in AI (e)

AI-powered assistants such as ChatGPT can produce responses containing false or misleading information, a phenomenon commonly referred to as hallucination or confabulation. Using the Scite Assistant as an example, we explore how retrieval-augmented generation increases the credibility of AI-generated responses by grounding them in actual scientific literature.
No registration is required for the Coffee Lectures. Log in at the start time at:
Meeting ID: 667 6140 1859, Passcode: 666319