Towards Verifiable Text Generation with Evolving Memory and Self-Reflection
Authors: Hao Sun, Hengyi Cai, Bo Wang, Yingyan Hou, Xiaochi Wei, Shuaiqiang Wang, Yan Zhang, Dawei Yin
Abstract: Despite the remarkable ability of large language models (LLMs) in language comprehension and generation, they often suffer from producing factually incorrect information, also known as hallucination. A promising solution to this issue is verifiable text generation, which prompts LLMs to generate content with citations for accuracy verification. However, verifiable text generation is non-trivial due to the focus-shifting phenomenon, the intricate reasoning needed to align the claim with correct citations, and the dilemma between the precision and breadth of retrieved documents. In this paper, we present VTG, an innovative framework for Verifiable Text Generation with evolving memory and self-reflection. VTG introduces evolving long short-term memory to retain both valuable documents and recent documents. A two-tier verifier equipped with an evidence finder is proposed to rethink and reflect on the relationship between the claim and citations. Furthermore, active retrieval and diverse query generation are utilized to enhance both the precision and breadth of the retrieved documents. We conduct extensive experiments on five datasets across three knowledge-intensive tasks, and the results reveal that VTG significantly outperforms baselines.
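The following is a minimal sketch of how the components described in the abstract (evolving long short-term memory, a two-tier verifier with an evidence finder, and active retrieval with diverse query generation) might fit together in a claim-by-claim generation loop. It is not the authors' implementation; all class and method names (`EvolvingMemory`, `vtg_generate`, `retriever.search`, `llm.next_claim`, `verifier.verify`, etc.) are hypothetical placeholders.

```python
# Illustrative VTG-style loop (hypothetical names; not the paper's released code).
# The llm, retriever, and verifier objects are assumed black boxes.
from dataclasses import dataclass, field


@dataclass
class EvolvingMemory:
    """Long short-term memory: long-term slots keep valuable documents,
    short-term slots keep the most recently retrieved ones."""
    long_term: list = field(default_factory=list)
    short_term: list = field(default_factory=list)
    short_capacity: int = 5

    def update(self, docs, valuable):
        # Recently retrieved documents displace the oldest short-term entries.
        self.short_term = (self.short_term + docs)[-self.short_capacity:]
        # Documents judged valuable (e.g., supporting a verified claim) are promoted.
        self.long_term.extend(d for d in valuable if d not in self.long_term)

    def context(self):
        return self.long_term + self.short_term


def vtg_generate(question, llm, retriever, verifier, max_rounds=3):
    """Generate a cited answer claim by claim, re-retrieving and reflecting
    whenever the verifier rejects a claim-citation pair."""
    memory = EvolvingMemory()
    memory.update(retriever.search(question), valuable=[])
    answer = []
    while not llm.finished(question, answer):
        claim, citations = llm.next_claim(question, answer, memory.context())
        for _ in range(max_rounds):
            # Two-tier check: (1) do the cited documents support the claim,
            # (2) can the evidence finder locate better support in memory?
            supported, evidence = verifier.verify(claim, citations, memory.context())
            if supported:
                break
            # Active retrieval driven by diverse queries derived from the failed claim.
            queries = llm.diverse_queries(claim, evidence)
            new_docs = [d for q in queries for d in retriever.search(q)]
            memory.update(new_docs, valuable=evidence)
            claim, citations = llm.revise_claim(claim, memory.context())
        answer.append((claim, citations))
    return answer
```

Under these assumptions, the short-term slots keep retrieval focused on the current claim (mitigating focus shifting), while the long-term slots preserve documents that earlier claims relied on, and the verification loop triggers fresh retrieval only when the current citations fail.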