BOLT: Building Open-Source LLMs for Your Target Domain via Automated Hierarchical Knowledge Distillation
Runze Lu, Zhaoyu Fan, Guanjie Wang and Qingjiang Shi
Adapting open-source large language models (LLMs) to specialized domains remains a critical challenge owing to domain knowledge gaps, data scarcity, and reference hallucination. Existing approaches often neglect the structural characteristics of domain...