
Modal for Academics

Modal is the fastest way for researchers to develop tomorrow's cutting-edge AI models and machine learning methods. Instantly deploy experiments on the most powerful GPUs with just a few lines of code. Iterate faster with Modal.
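As a minimal sketch of what "a few lines of code" looks like with Modal's Python SDK: the app name and function below are illustrative, and the GPU type is one of several Modal offers.

```python
import modal

# Illustrative app name; any name works.
app = modal.App("academic-experiment")

# Attach a GPU to this function; "H100" is one of the GPU types Modal provides.
@app.function(gpu="H100")
def run_experiment():
    import subprocess
    # Show the GPU visible inside the remote container.
    print(subprocess.run(["nvidia-smi"], capture_output=True, text=True).stdout)

@app.local_entrypoint()
def main():
    run_experiment.remote()  # runs in Modal's cloud, not on your machine
```

Invoking this file with `modal run` executes `run_experiment` remotely on the requested GPU; no cluster setup or Dockerfile is required.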

How Credits Work

  • Credits expire after conference submission results are finalized
  • Credits are automatically applied to compute usage, not subscription fees
  • Credits are granted once per conference

Educators looking to use Modal to teach a course: please reach out to partnerships@modal.com

“Verifying an LLM-based approach that relies on test-time compute can be challenging to scale. Through collaboration with ARC Prize, the MIT + Cornell team partnered with Modal to provide both credits and infrastructure to make this possible. Huge thanks to Modal for working with us to spin up an environment that efficiently ran our model for verification.”

Kevin Ellis & Zenna Tavares, Researchers

“Check out Tokasaurus on Modal to make Llama-1B brrr! This repeated sampling example shows off two engine features that are important for serving small models: very low CPU overhead and automatic shared prefix exploitation with Hydragen.”

Jordan Juravsky, Researcher

“We wouldn't have been able to publish Four Over Six on such a tight deadline if it weren't for Modal! We needed to run hundreds of experiments on B200s to fill out our evaluation tables, and Modal made it super easy to run them all in parallel.”

Jack Cook, Researcher

Ready to ship faster with Modal?

Apply Here