NVIDIA has released Nemotron 3 Super, an open-hybrid Mamba-transformer mixture-of-experts model designed for agentic reasoning [1].
This release is significant because it provides the research community with an open model capable of more advanced reasoning tasks. By combining different architectural approaches, NVIDIA seeks to push the boundaries of how AI agents plan and execute complex goals.
The company described Nemotron 3 Super as "an open-hybrid Mamba-transformer MoE for agentic reasoning" [1]. According to a technical report from NVIDIA Research, the model combines a mixture-of-experts (MoE) framework with a hybrid Mamba-transformer architecture [2]. This design is intended to improve both the efficiency and the capability of the model on tasks that require multi-step logic.
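To make the MoE framework concrete, here is a minimal sketch of top-k expert routing, the core mechanism behind mixture-of-experts layers. The expert count, layer sizes, and top-k value below are illustrative assumptions, not figures from the Nemotron report.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def moe_layer(tokens, gate_w, experts, top_k=2):
    """Route each token to its top_k experts and mix their outputs.

    tokens:  (n_tokens, d_model)
    gate_w:  (d_model, n_experts) router weights
    experts: list of (d_model, d_model) expert weight matrices
    """
    logits = tokens @ gate_w                      # (n_tokens, n_experts)
    probs = softmax(logits)
    top = np.argsort(probs, axis=-1)[:, -top_k:]  # indices of top_k experts
    out = np.zeros_like(tokens)
    for t in range(tokens.shape[0]):
        weights = probs[t, top[t]]
        weights = weights / weights.sum()         # renormalize over chosen experts
        for w, e in zip(weights, top[t]):
            out[t] += w * (tokens[t] @ experts[e])
    return out

# Hypothetical toy dimensions for illustration only.
d_model, n_experts, n_tokens = 8, 4, 5
gate_w = rng.normal(size=(d_model, n_experts))
experts = [rng.normal(size=(d_model, d_model)) for _ in range(n_experts)]
y = moe_layer(rng.normal(size=(n_tokens, d_model)), gate_w, experts)
print(y.shape)  # (5, 8)
```

Because each token activates only `top_k` of the experts, an MoE layer can grow its total parameter count without a proportional increase in per-token compute, which is the usual motivation for pairing MoE with an efficiency-oriented backbone.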
While the primary focus of Nemotron 3 Super is on reasoning and research, other NVIDIA AI developments continue to impact different sectors. Some reports have highlighted the company's work in gaming, noting that DLSS-5 represents a "breakthrough in visual fidelity for games" [3]. However, some industry observers said the latest AI push from the company is not centered on raw performance or frame rates [3].
NVIDIA made the model available through its developer website and research labs in the U.S. [1]. The company said the goal of the release is to support the broader research community in developing more capable AI agents [1].
The shift toward 'agentic reasoning' signifies a move from AI that simply predicts the next word to AI that can act as an agent to solve problems. By open-sourcing a hybrid architecture that blends Mamba and Transformer elements, NVIDIA is attempting to solve the efficiency bottlenecks of traditional transformers while maintaining their reasoning power, potentially accelerating the development of autonomous AI software.
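The efficiency argument above rests on how state-space layers like Mamba's process sequences. A minimal linear state-space recurrence, sketched below, runs in time linear in sequence length, whereas self-attention materializes a quadratic matrix of pairwise scores. All shapes and parameter values here are illustrative, not Nemotron's actual configuration.

```python
import numpy as np

rng = np.random.default_rng(1)

def ssm_scan(x, A, B, C):
    """Sequential state-space scan: h_t = A @ h_{t-1} + B @ x_t, y_t = C @ h_t.

    Each step does constant work, so the whole scan is O(seq_len),
    versus self-attention's O(seq_len^2) pairwise score matrix.
    """
    seq_len, _ = x.shape
    h = np.zeros(A.shape[0])
    ys = np.empty((seq_len, C.shape[0]))
    for t in range(seq_len):
        h = A @ h + B @ x[t]   # update hidden state from previous state + input
        ys[t] = C @ h          # read out the output for this step
    return ys

# Hypothetical toy dimensions for illustration only.
d_in, d_state, d_out, seq_len = 4, 8, 4, 16
A = 0.9 * np.eye(d_state)              # stable (contracting) transition matrix
B = rng.normal(size=(d_state, d_in))
C = rng.normal(size=(d_out, d_state))
y = ssm_scan(rng.normal(size=(seq_len, d_in)), A, B, C)
print(y.shape)  # (16, 4)
```

A hybrid architecture in this spirit would interleave such linear-time sequence-mixing layers with a smaller number of attention layers, keeping attention's strength at precise long-range retrieval while cutting the quadratic cost on most of the stack.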