Loki: Open-source solution designed to automate the process of verifying factuality (Python; updated May 15, 2024)
✨✨Woodpecker: Hallucination Correction for Multimodal Large Language Models. The first work to correct hallucinations in MLLMs.
Awesome-LLM-Robustness: a curated list of Uncertainty, Reliability and Robustness in Large Language Models
[ICLR'24] Mitigating Hallucination in Large Multi-Modal Models via Robust Instruction Tuning
RefChecker provides an automatic checking pipeline and a benchmark dataset for detecting fine-grained hallucinations generated by Large Language Models.
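Fine-grained checkers like RefChecker typically decompose a response into atomic claims and verify each one against reference text. The sketch below illustrates that general claim-level pattern only; it is not RefChecker's actual API, and the function names and the token-overlap judge are illustrative stand-ins.

```python
def extract_claims(response: str) -> list[str]:
    # Naive claim extraction: treat each sentence as one atomic claim.
    # Real checkers use an LLM to split text into finer-grained triples.
    return [s.strip() for s in response.split(".") if s.strip()]

def verify_claim(claim: str, reference: str) -> str:
    # Token-overlap heuristic standing in for an NLI model or LLM judge:
    # a claim counts as "supported" if most of its words appear in the reference.
    claim_tokens = set(claim.lower().split())
    ref_tokens = set(reference.lower().split())
    overlap = len(claim_tokens & ref_tokens) / max(len(claim_tokens), 1)
    return "supported" if overlap >= 0.6 else "unverified"

def check_response(response: str, reference: str) -> dict[str, str]:
    # Label every extracted claim independently against the reference.
    return {claim: verify_claim(claim, reference)
            for claim in extract_claims(response)}

reference = "Loki is an open-source tool for automated fact verification."
response = "Loki is an open-source tool. Loki was written in Rust."
print(check_response(response, reference))
```

In a real pipeline, the overlap heuristic would be replaced by an entailment model, but the claim-by-claim structure is what enables fine-grained (rather than whole-response) hallucination reports.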
[CVPR'24] HallusionBench: You See What You Think? Or You Think What You See? An Image-Context Reasoning Benchmark Challenging for GPT-4V(ision), LLaVA-1.5, and Other Multi-modality Models
Benchmarking the Hallucination of Chinese Large Language Models via Unconstrained Generation
😎 up-to-date & curated list of awesome LMM hallucinations papers, methods & resources.
[IJCAI 2024] FactCHD: Benchmarking Fact-Conflicting Hallucination Detection
TruthX: Alleviating Hallucinations by Editing Large Language Models in Truthful Space
This is the official repo for Debiasing Large Visual Language Models, including a Post-Hoc debias method and Visual Debias Decoding strategy.
Code & Data for our Paper "Alleviating Hallucinations of Large Language Models through Induced Hallucinations"
Visual Correspondence Hallucination: Towards Geometric Reasoning (Under Review)
"Enhancing LLM Factual Accuracy with RAG to Counter Hallucinations: A Case Study on Domain-Specific Queries in Private Knowledge-Bases" by Jiarui Li, Ye Yuan, and Zehua Zhang
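The RAG approach named above grounds the model's answer in passages retrieved from a private knowledge base instead of relying on parametric memory. A minimal, library-free sketch of that retrieve-then-prompt pattern, with bag-of-words overlap standing in for embedding similarity (all names and the example knowledge base are illustrative, not taken from the paper):

```python
def score(query: str, doc: str) -> float:
    # Bag-of-words overlap as a cheap stand-in for embedding similarity.
    q, d = set(query.lower().split()), set(doc.lower().split())
    return len(q & d) / max(len(q), 1)

def retrieve(query: str, knowledge_base: list[str], k: int = 2) -> list[str]:
    # Return the k passages most similar to the query.
    return sorted(knowledge_base, key=lambda doc: score(query, doc),
                  reverse=True)[:k]

def build_prompt(query: str, knowledge_base: list[str]) -> str:
    # Instructing the model to answer only from retrieved context
    # is what discourages fabricated (hallucinated) answers.
    context = "\n".join(retrieve(query, knowledge_base))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

kb = [
    "The private wiki states the service SLA is 99.9% uptime.",
    "Deployment runs on Kubernetes in the eu-west cluster.",
    "Lunch is served at noon in building B.",
]
print(build_prompt("What is the service SLA uptime?", kb))
```

Production systems swap the overlap scorer for a dense retriever over a vector index, but the prompt-assembly step is the same.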
[CVPR 2018] Face Super-resolution with Supplementary Attributes
Knowledge Verification to Nip Hallucination in the Bud
An explainable sentence similarity measurement
[NLPCC 2024] Shared Task 10: Regulating Large Language Models
This repository contains the code of our paper 'Skip \n: A simple method to reduce hallucination in Large Vision-Language Models'.