
Canonical
on 28 April 2026

Run NVIDIA Nemotron 3 Nano Omni locally in a single command


Today, NVIDIA introduced the NVIDIA Nemotron™ 3 Nano Omni, a highly efficient multimodal model designed to understand and reason across video, audio, images, and language.

Canonical is enabling immediate access to Nemotron 3 Nano Omni through inference snaps: pre-packaged AI inference runtimes distributed as snap packages for consistent deployment across systems. Developers and enterprises can deploy the model seamlessly across NVIDIA-enabled environments with a single command:

sudo snap install nemotron-3-nano-omni

With Nemotron 3 Nano Omni delivered through inference snaps, deployment shifts from a complex integration task to a repeatable, standardized operation. A single install produces a consistent, production-grade runtime that scales across environments without rework, enabling teams to focus on building and operating agentic applications rather than managing infrastructure.

What is Nemotron 3 Nano Omni?

NVIDIA Nemotron 3 Nano Omni is an open multimodal foundation model that unifies reasoning across text, images, video, audio, and documents within a single architecture. It features a 256K token context window and a hybrid mixture-of-experts (MoE) architecture optimized for high multimodal throughput and accuracy.

Nemotron 3 Nano Omni powers perception sub-agents, giving agentic systems “eyes and ears” while maintaining a unified multimodal context across steps. There is no need to stitch together separate vision, speech, and language models, reducing latency and orchestration complexity.
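As a loose illustration of that unified context, a perception sub-agent can accumulate text, image, and audio observations into one running message history instead of routing each modality through a separate model. The message schema below is a hypothetical OpenAI-style structure, not a documented Nemotron interface:

```python
# Hypothetical sketch: one shared multimodal context across agent steps.
# The message schema is an assumed OpenAI-style format, not a documented API.

class PerceptionContext:
    """Accumulates observations from every modality into one history."""

    def __init__(self):
        self.messages = []

    def observe_text(self, text):
        self.messages.append({"role": "user",
                              "content": [{"type": "text", "text": text}]})

    def observe_image(self, url):
        self.messages.append({"role": "user",
                              "content": [{"type": "image_url",
                                           "image_url": {"url": url}}]})

    def observe_audio(self, url):
        self.messages.append({"role": "user",
                              "content": [{"type": "audio_url",
                                           "audio_url": {"url": url}}]})


ctx = PerceptionContext()
ctx.observe_image("file:///tmp/frame_0001.jpg")
ctx.observe_text("What is happening in this frame?")
print(len(ctx.messages))  # both observations share one context
```

Because every modality lands in the same history, each reasoning step sees all prior observations without cross-model handoffs, which is where the latency and orchestration savings come from.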

Easy deployment with inference snaps

Inference snaps bundle models, dependencies, and execution environments so inference workloads can run reproducibly on edge devices, workstations, or servers – without manual setup.

Canonical inference snaps provide a direct path from local installation to a production deployment. To get started, simply run:

sudo snap install nemotron-3-nano-omni

This installs a fully packaged, production-ready inference stack, including the model, runtime, and optimizations; no manual configuration required.
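Once the snap is installed, a typical next step is sending the model a multimodal request. The helper below is a sketch only: it builds an OpenAI-style chat payload, but the schema and any endpoint details are assumptions — consult the snap's own documentation (for example via `snap info nemotron-3-nano-omni`) for the actual interface:

```python
import json

# Sketch under assumptions: the payload schema is OpenAI-style chat
# completions; the snap's real API may differ. "nemotron-3-nano-omni"
# is the snap name from this post, reused here as a model identifier.
def build_chat_request(prompt, image_url=None, model="nemotron-3-nano-omni"):
    content = [{"type": "text", "text": prompt}]
    if image_url:
        # Attach an image part alongside the text prompt.
        content.append({"type": "image_url", "image_url": {"url": image_url}})
    return {"model": model,
            "messages": [{"role": "user", "content": content}]}


payload = build_chat_request("Summarize the clip.",
                             image_url="file:///tmp/frame.jpg")
print(json.dumps(payload, indent=2))
```

If the snap exposes a local HTTP endpoint, a payload like this could be POSTed with any HTTP client; the URL and port would likewise be assumptions until confirmed against the snap's documentation.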

Inference snaps provide:

  • Zero-friction deployment: no dependency resolution, no environment drift, no custom build pipelines
  • Consistent runtime across environments: identical behavior on cloud, on-premises, and edge systems
  • Secure, confined execution: strict isolation with automatic updates and verified distribution
  • Optimized performance out of the box: pre-tuned for supported hardware
  • Simplified operations: standardized packaging reduces maintenance, patching, and upgrade complexity

For enterprises, this translates into significantly faster deployments – saving weeks of integration and validation work – and a scalable AI infrastructure built from a single portable artifact.

Learn more at: https://github.com/canonical/inference-snaps 

About Canonical 

Canonical, the publisher of Ubuntu, provides open source security, support, and services. Our portfolio covers critical systems, from the smallest devices to the largest clouds, from the kernel to containers, from databases to AI. With customers that include top tech brands, emerging startups, governments and home users, Canonical delivers trusted open source for everyone. 

Learn more at https://canonical.com/ 

Learn more

Find out more about Canonical’s collaboration with NVIDIA.
