Welcome to cosmosage—an advanced AI assistant designed for those curious about the universe we live in. cosmosage was trained on thousands of papers and books and will try its best to answer your questions about cosmology. However, while cosmosage is a powerful tool for brainstorming, expanding your knowledge, and exploring new ideas, keep in mind that it's still an LLM and may sometimes produce inaccurate responses.
Unfortunately, cosmosage is no longer available for free inference. The latest model in the series is AstroSage-70B. Its weights are freely available at https://huggingface.co/AstroMLab/AstroSage-70B, but you will need a GPU with at least ~60 GB of VRAM to run inference. If you know how I could get access to a GPU to make cosmosage available to the public again, please contact me; my contact info is below.
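If you want to try running it yourself, here is a minimal inference sketch using the Hugging Face transformers library. It assumes the repository ships standard causal-LM weights with a chat template, and it uses 4-bit quantization (via the bitsandbytes package) as one illustrative way to fit within the VRAM budget mentioned above; the prompt and generation settings are placeholders, not recommended values.

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

# Assumption: the repo follows the standard Hugging Face causal-LM layout.
model_id = "AstroMLab/AstroSage-70B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    # 4-bit quantization is one way to fit a 70B model on a single large
    # GPU; it requires the bitsandbytes package to be installed.
    quantization_config=BitsAndBytesConfig(
        load_in_4bit=True, bnb_4bit_compute_dtype=torch.bfloat16
    ),
    device_map="auto",  # shard across whatever GPUs are visible
)

messages = [
    {"role": "user", "content": "What sets the angular scale of the first acoustic peak in the CMB?"}
]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=512, do_sample=True, temperature=0.7)
print(tokenizer.decode(output[0, input_ids.shape[-1]:], skip_special_tokens=True))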
Talk to cosmosage

Stay up to date with the latest developments in cosmology and instrumentation. The machine-learning-powered recommendation service was adapted from Andrej Karpathy's arxiv-sanity-lite and allows you to tag papers and receive relevant suggestions. In my experience, after tagging about five papers you like, the recommendations will become relevant to your interests.
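For the curious: as I understand it, arxiv-sanity-lite ranks papers by training a lightweight linear classifier (an SVM over TF-IDF features of abstracts) with your tagged papers as positives, then scoring everything else. A toy scikit-learn version might look like the following; the abstracts and tagged indices are illustrative placeholders, and the production service differs in its details.

import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC

# Placeholder corpus; the real service indexes arXiv abstracts.
abstracts = [
    "Measurement of the CMB lensing power spectrum ...",
    "Kinetic inductance detectors for millimeter-wave astronomy ...",
    "Galaxy cluster counts as a probe of dark energy ...",
    "A transformer architecture for protein structure prediction ...",
]
tagged = {0, 2}  # indices of papers the user tagged as interesting

# TF-IDF features; tagged papers are positives, everything else negatives.
X = TfidfVectorizer(stop_words="english").fit_transform(abstracts)
y = np.array([1 if i in tagged else 0 for i in range(len(abstracts))])

clf = LinearSVC(C=0.1).fit(X, y)
scores = clf.decision_function(X)

# Recommend the highest-scoring papers the user hasn't tagged yet.
recommendations = [i for i in np.argsort(-scores) if i not in tagged]
print(recommendations)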
Get paper recommendations

Learn more about the development and capabilities of cosmosage by reading the following papers.
cosmosage: A natural-language assistant for cosmology
Tijmen de Haan
Published in Astronomy and Computing, Volume 51 (2025).
Achieving GPT-4o level performance in astronomy with a specialized 8B-parameter large language model
Tijmen de Haan, Yuan-Sen Ting, Tirthankar Ghosal, Tuan Dung Nguyen, Alberto Accomazzi, Azton Wells, Nesar Ramachandra, Rui Pan, Zechang Sun
Published in Scientific Reports, Volume 15 (2025).
AstroMLab 4: Benchmark-Topping Performance in Astronomy Q&A with a 70B-Parameter Domain-Specialized Reasoning Model
Tijmen de Haan, Yuan-Sen Ting, Tirthankar Ghosal, Tuan Dung Nguyen, Alberto Accomazzi, Emily Herron, Vanessa Lama, Rui Pan, Azton Wells, Nesar Ramachandra
arXiv:2505.17592 (2025), accepted contribution to Astro4ML (ICML 2025).