Retrieval-Augmented Generation (RAG) enhances language model responses by incorporating external knowledge. However, its effectiveness heavily depends on the quality of the retrieved documents. Using a fixed number of retrieved documents
often fails to adapt to varying query complexity, leading either to irrelevant retrievals or to missing crucial evidence. To address this issue, we propose DyRAG, a hybrid retrieval framework that dynamically adjusts the number of retrieved documents
based on query characteristics while maintaining computational efficiency. Our method improves retrieval quality by maximizing the relevant information passed to the generator while minimizing noise from irrelevant documents. We evaluate DyRAG on recommendation, question answering, and fact-checking tasks, where it consistently outperforms fixed-retrieval and hybrid baselines. Compared to these traditional approaches, DyRAG demonstrates greater robustness and adaptability across diverse domains.
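The per-query adaptive retrieval depth described above could be realized in many ways; one minimal sketch is a score-gap heuristic that keeps only candidates scoring close to the top hit. The function name, threshold, and bounds below are illustrative assumptions, not details from the paper:

```python
# Hypothetical sketch of dynamic retrieval-depth selection: instead of a
# fixed k, keep candidates whose relevance score is within a fraction
# (rel_threshold) of the best score, clamped to [k_min, k_max].
# All names and parameter values here are illustrative assumptions.

def select_dynamic_k(scores, rel_threshold=0.8, k_min=1, k_max=10):
    """scores: candidate relevance scores, sorted in descending order.
    Returns how many of the top candidates to pass to the generator."""
    if not scores:
        return 0
    top = scores[0]
    # Count candidates competitive with the best-scoring document.
    k = sum(1 for s in scores if s >= rel_threshold * top)
    # Clamp to keep cost bounded and avoid empty context.
    return max(k_min, min(k, k_max))

# Easy query: one clearly dominant document -> small retrieval depth.
assert select_dynamic_k([0.95, 0.40, 0.35, 0.30]) == 1
# Ambiguous query: many comparably relevant documents -> larger depth.
assert select_dynamic_k([0.70, 0.68, 0.66, 0.60, 0.59]) == 5
```

The two assertions illustrate the intended behavior: a query with one dominant match retrieves little (less noise), while a query with many near-ties retrieves more (less missed evidence).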