CRAFT: Training-Free Cascaded Retrieval for Tabular QA
arXiv:2505.14984v2 Announce Type: replace
Abstract: Open-Domain Table Question Answering (TQA) involves retrieving relevant tables from a large corpus to answer natural language queries. Traditional dense retrieval models such as DTR and DPR incur high computational costs for large-scale retrieval tasks and require retraining or fine-tuning on new datasets, limiting their adaptability to evolving domains and knowledge. We propose CRAFT, a zero-shot cascaded retrieval approach that first uses a sparse retrieval model to filter a subset of candidate tables before applying more computationally expensive dense models as re-rankers. To improve retrieval quality, we enrich table representations with descriptive titles and summaries generated by Gemini Flash 1.5, enabling richer semantic matching between queries and tabular structures.
Our method outperforms state-of-the-art sparse, dense, and hybrid retrievers on the NQ-Tables dataset. It also demonstrates strong zero-shot performance on the more challenging OTT-QA benchmark, where the task requires multi-hop reasoning across both textual passages and relational tables, achieving competitive results at higher recall thresholds. This work establishes a scalable and adaptable paradigm for table retrieval, bridging the gap between fine-tuned architectures and lightweight, plug-and-play retrieval systems. Code and data are available at https://coral-lab-asu.github.io/CRAFT/
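The two-stage cascade the abstract describes can be sketched in a few lines. The snippet below is a toy illustration, not CRAFT's actual pipeline: the lexical-overlap scorer stands in for a real sparse retriever (e.g., BM25), and the term-frequency "embedding" stands in for a dense neural re-ranker; all function names here are hypothetical.

```python
import math
from collections import Counter

def sparse_scores(query, docs):
    """Cheap lexical-overlap score, a stand-in for a sparse retriever like BM25."""
    q = set(query.lower().split())
    return [len(q & set(doc.lower().split())) for doc in docs]

def embed(text):
    """Toy term-frequency 'embedding', a stand-in for a dense neural encoder."""
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b.get(t, 0) for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def cascaded_retrieve(query, docs, k=10, top=1):
    # Stage 1: the cheap sparse scorer filters the corpus down to k candidates.
    scores = sparse_scores(query, docs)
    candidates = sorted(range(len(docs)), key=lambda i: -scores[i])[:k]
    # Stage 2: the more expensive scorer re-ranks only those k survivors,
    # so its cost is independent of corpus size.
    qv = embed(query)
    reranked = sorted(candidates, key=lambda i: -cosine(qv, embed(docs[i])))
    return reranked[:top]

# Table "documents" here would be the enriched titles/summaries the paper
# generates for each table; these strings are made up for illustration.
tables = [
    "table of country populations by year",
    "table listing olympic medal counts by country",
    "summary of quarterly sales figures",
]
print(cascaded_retrieve("country populations by year", tables, k=2))  # → [0]
```

The key property of the cascade is that the dense model only ever sees the k sparse-filtered candidates, which is what makes the approach tractable at corpus scale without any retraining.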