TaxoAdapt: Aligning LLM-Based Multidimensional Taxonomy Construction to Evolving Research Corpora
Abstract
TaxoAdapt dynamically adapts an LLM-generated taxonomy for scientific literature across multiple dimensions, improving granularity and coherence compared to existing methods.
The rapid evolution of scientific fields introduces challenges in organizing and retrieving scientific literature. While expert-curated taxonomies have traditionally addressed this need, the process is time-consuming and expensive. Furthermore, recent automatic taxonomy construction methods either (1) over-rely on a specific corpus, sacrificing generalizability, or (2) depend heavily on the general knowledge of large language models (LLMs) contained within their pre-training datasets, often overlooking the dynamic nature of evolving scientific domains. Additionally, these approaches fail to account for the multi-faceted nature of scientific literature, where a single research paper may contribute to multiple dimensions (e.g., methodology, new tasks, evaluation metrics, benchmarks). To address these gaps, we propose TaxoAdapt, a framework that dynamically adapts an LLM-generated taxonomy to a given corpus across multiple dimensions. TaxoAdapt performs iterative hierarchical classification, expanding both the taxonomy's width and depth based on the corpus's topical distribution. We demonstrate its state-of-the-art performance across a diverse set of computer science conferences over multiple years, showcasing its ability to structure and capture the evolution of scientific fields. As a multidimensional method, TaxoAdapt generates taxonomies that are 26.51% more granularity-preserving and 50.41% more coherent than the most competitive baselines, as judged by LLMs.
Community
TaxoAdapt: Aligning LLM‑Based Multidimensional Taxonomy Construction to Evolving Research Corpora 📚
We introduce TaxoAdapt, a dynamic framework that constructs and adapts multidimensional taxonomies—organized hierarchically across breadth and depth—by iteratively aligning with the evolving content of a target research corpus.
🌱 Dynamic Taxonomy Growth – Instead of static hierarchies, TaxoAdapt incrementally expands its taxonomy structure (both width and depth) in response to the topical distribution of the incoming corpus
📏 Multidimensional Lens – Recognizes that papers contribute along various axes (e.g., methodology, new tasks, evaluation metrics, benchmarks), and models this complexity explicitly
🤖 LLM‑Guided Classification – Leverages large language models for hierarchical classification, rooted in corpus evidence rather than just pretrained knowledge
📈 Proven Across Time & Domains – Validated on multiple CS conferences’ papers over time, TaxoAdapt yields taxonomies that are 26.5% more granularity‑preserving and 50.4% more coherent compared to strong baselines according to LLM judgments
✨ Efficient & Adaptive – Automatically adapts to new topics and shifts in research trends—cutting down reliance on manual expert curation while keeping structure current and high‑quality
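The width/depth growth idea above can be illustrated with a small sketch. This is a hypothetical simplification, not the paper's implementation: the real system uses an LLM to classify papers into nodes and to name new ones, whereas here classification is stubbed as an exact topic match and a new child node is created once enough unmatched papers share a topic.

```python
# Sketch of TaxoAdapt-style taxonomy growth (hypothetical simplification).
# A paper is routed down the hierarchy; papers that fit no existing child
# accumulate at a node, and once enough of them share a topic, a new child
# node is spawned for that topic (width if siblings exist, depth otherwise).
from collections import Counter

EXPAND_THRESHOLD = 2  # unmatched papers on one topic before a node is added


class Node:
    def __init__(self, label):
        self.label = label
        self.children = []
        self.papers = []  # papers that matched no child of this node


def classify(node, topic):
    """Stand-in for LLM classification: route by exact child-label match."""
    for child in node.children:
        if child.label == topic:
            return child
    return None


def insert(root, topic):
    """Route a paper (represented by its topic) down to the deepest match."""
    node = root
    while (child := classify(node, topic)) is not None:
        node = child
    node.papers.append(topic)
    expand(node)


def expand(node):
    """Spawn a child for any topic with enough unmatched papers."""
    for topic, n in Counter(node.papers).items():
        if n >= EXPAND_THRESHOLD:
            node.children.append(Node(topic))
            node.papers = [t for t in node.papers if t != topic]


root = Node("NLP")
for t in ["parsing", "parsing", "generation", "parsing"]:
    insert(root, t)
# A "parsing" child emerges after two unmatched "parsing" papers;
# the later "parsing" paper is routed into it.
```

The design choice mirrored here is that the taxonomy only grows where the corpus's topical distribution demands it, rather than being fixed up front.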
This is an automated message from the Librarian Bot. I found the following papers similar to this paper.
The following papers were recommended by the Semantic Scholar API
- Scientific Paper Retrieval with LLM-Guided Semantic-Based Ranking (2025)
- Harnessing Large Language Models for Scientific Novelty Detection (2025)
- Science Hierarchography: Hierarchical Organization of Science Literature (2025)
- FoodTaxo: Generating Food Taxonomies with Large Language Models (2025)
- XtraGPT: LLMs for Human-AI Collaboration on Controllable Academic Paper Revision (2025)
- GraphGen: Enhancing Supervised Fine-Tuning for LLMs with Knowledge-Driven Synthetic Data Generation (2025)
- AutoRev: Automatic Peer Review System for Academic Research Papers (2025)