Open-source AI for advanced coding and reasoning
Ramanujan Rnj-1 is an open-source large language model family with two variants: Rnj-1 Base and Rnj-1 Instruct. Built on the Gemma 3 architecture, the 8-billion-parameter model uses global self-attention with YaRN to extend the context length to 32k tokens.

Rnj-1 excels at algorithmic code generation, competing with leading models such as GPT OSS 20B on benchmarks like HumanEval+ and MBPP+. Rnj-1 Instruct performs strongly on software engineering tasks, significantly outperforming comparably sized models on SWE-bench, and shows advanced tool-use capabilities on the Berkeley Function Calling Leaderboard. Rnj-1 also demonstrates strong mathematical problem solving, with results comparable to top models on AIME'25 and Minerva-MATH. The model is robust to quantization, maintaining quality across formats, which improves inference efficiency.
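The YaRN context extension mentioned above can be sketched as a per-dimension interpolation of the rotary (RoPE) position frequencies: high-frequency dimensions that encode local detail are left untouched, while low-frequency dimensions are stretched to cover the longer context. The snippet below is a minimal illustration of that ramp, assuming a standard RoPE base of 10000 and an original 8k context stretched 4x to 32k; these constants, and the head dimension, are illustrative assumptions, not Rnj-1's actual configuration.

```python
import math

def rope_inv_freq(head_dim: int, base: float = 10000.0) -> list[float]:
    """Standard RoPE inverse frequencies, one per rotary pair."""
    return [base ** (-2.0 * i / head_dim) for i in range(head_dim // 2)]

def yarn_inv_freq(inv_freq, scale, orig_ctx=8192, beta_fast=32.0, beta_slow=1.0):
    """YaRN-style interpolation of RoPE frequencies (illustrative constants).

    r counts how many full rotations a dimension completes over the
    original context: dimensions rotating many times (high frequency)
    are kept as-is, dimensions rotating less than once are divided by
    `scale`, and dimensions in between are blended linearly.
    """
    out = []
    for f in inv_freq:
        wavelength = 2.0 * math.pi / f
        r = orig_ctx / wavelength
        gamma = min(1.0, max(0.0, (r - beta_slow) / (beta_fast - beta_slow)))
        out.append(gamma * f + (1.0 - gamma) * f / scale)
    return out

inv = rope_inv_freq(128)
scaled = yarn_inv_freq(inv, scale=4.0)  # e.g. 8k -> 32k context
print(scaled[0] == inv[0])                       # fastest dim untouched
print(math.isclose(scaled[-1], inv[-1] / 4.0))   # slowest dim fully interpolated
```

The point of the ramp, compared with naive position interpolation, is that dividing every frequency by the scale factor would blur the short-range positional detail the model was trained on; interpolating only the slow dimensions preserves it.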