The Knowledge Graph Conference
Promotions

Tutorial on Unifying LLMs with KGs for QA at EDBT2025

Chuangtao M.
April 03, 2025

Last week, we gave a tutorial on unifying LLMs with KGs for question answering (QA) at EDBT 2025. The tutorial provides an overview of state-of-the-art approaches to unifying LLMs with knowledge graphs for QA, including joint learning over KGs and text, knowledge-based instruction fine-tuning, RAG, GraphRAG, KG-RAG, Hybrid RAG, and more. The slides and other materials are now available at: https://machuangtao.github.io/LLM-KG4QA/tutorial-edbt25/
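To give a flavor of the KG-RAG idea mentioned above, here is a minimal, hypothetical sketch: retrieve KG triples relevant to a question and assemble them into an LLM prompt. The tiny in-memory KG and the word-overlap retriever are invented for illustration; real systems use entity linking, graph queries, and embedding-based retrieval.

```python
import re

# Toy in-memory knowledge graph (subject, predicate, object) triples.
# These facts are illustrative placeholders, not from any benchmark.
KG = [
    ("Paris", "capital_of", "France"),
    ("France", "located_in", "Europe"),
    ("Berlin", "capital_of", "Germany"),
]

def retrieve(question, kg, k=2):
    """Rank triples by word overlap with the question (toy retriever)."""
    q_words = set(re.findall(r"\w+", question.lower()))
    scored = []
    for s, p, o in kg:
        t_words = set(re.findall(r"\w+", f"{s} {p} {o}".lower()))
        scored.append((len(q_words & t_words), (s, p, o)))
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [t for hits, t in scored[:k] if hits > 0]

def build_prompt(question, triples):
    """Serialize retrieved triples as grounding context for an LLM call."""
    facts = "\n".join(f"- {s} {p} {o}" for s, p, o in triples)
    return f"Answer using only these facts:\n{facts}\n\nQuestion: {question}"

question = "What is the capital of France?"
prompt = build_prompt(question, retrieve(question, KG))
```

The prompt string would then be sent to whatever LLM the pipeline uses; the retrieval and prompt-assembly steps are where KG-RAG variants mainly differ.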


2 comments

  • Gautam K.

    Thanks for sharing; it's helpful. One problem I am working on is extracting triples from text using LLMs, and I am struggling with evaluation: I do not have ground-truth data, so I cannot compute precision or recall. Do you have any suggestions?

  • Chuangtao M.

    Thanks. I am not working on triple extraction myself, but I believe there are benchmark datasets for triple extraction, e.g.: https://aclanthology.org/2024.isa-1.10/
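Once a benchmark with gold triples is available (as the reply suggests), evaluation reduces to set comparison. A minimal scorer might look like the sketch below; the normalization and the example triples are invented for illustration, and real evaluations often add entity aliasing or partial-match credit.

```python
def normalize(triple):
    """Light normalization so trivial casing/whitespace differences don't count as errors."""
    return tuple(part.strip().lower() for part in triple)

def score(predicted, gold):
    """Precision / recall / F1 over exact-match normalized triples."""
    pred = {normalize(t) for t in predicted}
    ref = {normalize(t) for t in gold}
    tp = len(pred & ref)  # true positives: predicted triples found in gold
    precision = tp / len(pred) if pred else 0.0
    recall = tp / len(ref) if ref else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

# Invented example: one correct triple, one spurious, one missed.
gold = [("Marie Curie", "born_in", "Warsaw"), ("Marie Curie", "won", "Nobel Prize")]
pred = [("marie curie", "born_in", "warsaw"), ("Marie Curie", "field", "physics")]
p, r, f = score(pred, gold)  # -> 0.5, 0.5, 0.5
```

Exact matching is strict: a triple counts only if subject, predicate, and object all match after normalization, which is why benchmark-defined canonical forms matter.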