Ultralytics Chat Widget

A lightweight, professional chat client for docs and apps. Streams tokens, supports markdown, and uses Ultralytics brand UI.

Press ⌘ + K to open

Why this widget

  • Centered modal with blurred/lightened backdrop for focus
  • Brand-aligned pill launcher with hover zoom only
  • Copy, like/dislike, share, retry actions (dark-mode friendly)
  • Animated Send/Stop icon swap (arrow up ↔ square)
  • Minimal custom code, easy to embed in any page

Quick start

Add the script and instantiate with your config:

<script src="https://cdn.jsdelivr.net/gh/ultralytics/llm@main/js/chat.js"></script>
<script>
  // Option 1: Full configuration (all optional)
  const chat = new UltralyticsChat({
    apiUrl: '/api/chat',
    branding: {
      name: 'My AI Assistant',
      tagline: 'Ask me anything',
      logo: 'logo.png',
      pillText: 'Chat',
    },
    theme: {
      primary: '#042AFF',
      dark: '#111F68',
      yellow: '#E1FF25',
      text: '#0b0b0f',
    },
  });

  // Option 2: Minimal (uses defaults)
  // const chat = new UltralyticsChat({ apiUrl: '/api/chat' });
</script>

Then trigger with:

chat.toggle(true); // open
chat.toggle(false); // close

Install Ultralytics

Use pip to get the latest package:

pip install -U ultralytics

Tip: Prefer isolated environments (uv, venv, or conda) for reproducibility.
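
You can verify the install from Python; ultralytics.checks() prints the package version and environment details (Python, torch, hardware):

import ultralytics

# Print version and environment diagnostics to confirm the install works
ultralytics.checks()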

Sample Benchmarks (YOLO11)

Model     mAP50-95   Params   FPS (A100)
YOLO11n   —          —        —
YOLO11s   —          —        —
YOLO11m   —          —        —
YOLO11l   —          —        —
YOLO11x   —          —        —

Numbers depend on dataset and hardware. Replace with your own metrics.
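
To generate your own numbers, the package includes a benchmark helper; a minimal sketch (model, dataset, and image size here are illustrative):

from ultralytics.utils.benchmarks import benchmark

# Measures accuracy (mAP) and speed for one model across supported export formats
benchmark(model="yolo11n.pt", data="coco8.yaml", imgsz=640)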

Code Examples

Inference with Python:

from ultralytics import YOLO

model = YOLO("yolo11s.pt")  # load a pretrained YOLO11-small checkpoint
results = model("bus.jpg")  # run inference on a single image
results[0].show()  # display the annotated prediction

Training from CLI:

yolo train model=yolo11s.pt data=coco.yaml epochs=100 imgsz=640

YOLO Inference

Run inference on single images, directories, or video streams. Use the conf and iou arguments to tune the precision/recall tradeoff.
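
For example, a short Python sketch (source paths are placeholders; the conf and iou values shown are just starting points):

from ultralytics import YOLO

model = YOLO("yolo11s.pt")

# Single image with explicit confidence and NMS IoU thresholds
results = model.predict("bus.jpg", conf=0.25, iou=0.7)

# Directory or video stream; stream=True yields results one frame at a time
for r in model.predict("path/to/source", conf=0.25, iou=0.7, stream=True):
    print(r.boxes.xyxy)  # predicted box coordinates for this frame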

Training

Fine-tune on your own datasets. Track runs, compare metrics, and keep the best checkpoint for export.
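
A minimal Python sketch mirroring the CLI command above (dataset YAML and hyperparameters are illustrative):

from ultralytics import YOLO

model = YOLO("yolo11s.pt")  # start from a pretrained checkpoint

# Fine-tune; with default settings the best weights land in runs/detect/train/weights/best.pt
model.train(data="coco8.yaml", epochs=100, imgsz=640)

metrics = model.val()  # evaluate the trained model on the validation split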

Export

Export to formats like ONNX, TensorRT, CoreML:

yolo export model=best.pt format=onnx opset=13 dynamic=True
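
The same export is available from Python; a minimal sketch:

from ultralytics import YOLO

model = YOLO("best.pt")

# Returns the path of the exported file, e.g. best.onnx
onnx_path = model.export(format="onnx", opset=13, dynamic=True)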

FAQ

How does the chat integrate with docs?
It's a self-contained client that calls your API. Pass in RAG context or knowledge base sources on the server side.
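
For illustration, here is a hypothetical /api/chat handler written with FastAPI; the payload shape and plain-text streaming shown are assumptions, not the widget's documented contract, so adapt them to whatever your deployment actually sends and expects:

from fastapi import FastAPI, Request
from fastapi.responses import StreamingResponse

app = FastAPI()

@app.post("/api/chat")
async def chat(request: Request):
    payload = await request.json()  # assumed shape, e.g. {"messages": [...]}

    async def token_stream():
        # Replace with your LLM call; inject RAG context or knowledge base sources here
        for token in ["Hello", " from", " the", " docs", " assistant"]:
            yield token

    return StreamingResponse(token_stream(), media_type="text/plain")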

Does it support dark mode?
Yes. It follows the page's data-theme attribute and inherits CSS variables.