Tutorials

This section contains tutorials that help you get started with the NeMo Guardrails library.

Check Harmful Content

Check text inputs and outputs for harmful content using the Nemotron Content Safety NIM.

Check Harmful Content with Llama 3.1 Nemotron Safety Guard 8B V3 NIM

Restrict Topics

Restrict conversations to allowed topics using the Nemotron Topic Control NIM.

Restrict Topics with Llama 3.1 NemoGuard 8B TopicControl NIM

Detect Jailbreak Attempts

Detect and block adversarial prompts and jailbreak attempts using the Nemotron Jailbreak Detect NIM.

Detect Jailbreak Attempts with NVIDIA NemoGuard JailbreakDetect NIM

Add Multimodal Content Safety

Add safety checks for image and text content using a vision model as an LLM-as-a-Judge.

Add Multimodal Content Safety Using a Vision Model as LLM-as-a-Judge
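The tutorials above share a common pattern: each safety NIM is registered as a model in the guardrails configuration and then activated through an input or output rail. As a rough sketch of that pattern (the model identifiers and flow names below are illustrative assumptions; each tutorial gives the exact values to use):

```yaml
# Sketch of a NeMo Guardrails config.yml wiring a content-safety NIM
# into input and output rails. Model names are placeholders; consult
# the individual tutorials for the exact identifiers.
models:
  - type: main
    engine: nvidia_ai_endpoints
    model: meta/llama-3.1-70b-instruct   # assumed application LLM
  - type: content_safety
    engine: nvidia_ai_endpoints
    model: nvidia/llama-3.1-nemoguard-8b-content-safety  # assumed NIM id

rails:
  input:
    flows:
      - content safety check input $model=content_safety
  output:
    flows:
      - content safety check output $model=content_safety
```

Topic control and jailbreak detection follow the same shape: add the corresponding NIM under `models` and reference it from an input rail flow.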