This document covers the video track processors available in @livekit/track-processors.
The `BackgroundProcessor` is a prebuilt video processor that supports blurring the background of a user's local video or replacing it with a virtual background image. It can be switched between modes on the fly.
- `BackgroundProcessor({ mode: 'background-blur', blurRadius: 10 })` — Blur the background with an optional blur radius (defaults to 10)
- `BackgroundProcessor({ mode: 'virtual-background', imagePath: "http://path.to/image.png" })` — Replace the background with an image
- `BackgroundProcessor({ mode: 'disabled' })` — Passthrough mode, no effect applied (useful for avoiding switching artifacts; see below)
Before using `BackgroundProcessor`, check for browser compatibility:
```typescript
import {
  BackgroundProcessor,
  supportsBackgroundProcessors,
  supportsModernBackgroundProcessors,
} from '@livekit/track-processors';

if (!supportsBackgroundProcessors()) {
  throw new Error('This browser does not support background processors');
}

if (supportsModernBackgroundProcessors()) {
  console.log('This browser supports modern APIs that are more performant');
}
```

The simplest approach is to create a processor and attach it to a local video track:
```typescript
import { BackgroundProcessor } from '@livekit/track-processors';

const videoTrack = await createLocalVideoTrack();
const processor = BackgroundProcessor({ mode: 'background-blur' });
await videoTrack.setProcessor(processor);
room.localParticipant.publishTrack(videoTrack);
```

Calling `videoTrack.setProcessor()` / `videoTrack.stopProcessor()` on demand can produce visual artifacts during the switch. A better approach is to initialize the processor in `disabled` mode up front and use `switchTo()` to toggle effects. This avoids artifacts entirely:
```typescript
const videoTrack = await createLocalVideoTrack();
const processor = BackgroundProcessor({ mode: 'disabled' });
await videoTrack.setProcessor(processor);
room.localParticipant.publishTrack(videoTrack);

async function enableBlur(radius: number) {
  await processor.switchTo({ mode: 'background-blur', blurRadius: radius });
}

async function disableBlur() {
  await processor.switchTo({ mode: 'disabled' });
}
```

```mermaid
flowchart LR
    A[Camera<br>MediaStreamTrack] --> B[ProcessorWrapper]
    B -->|VideoFrame| C[Transformer<br>e.g. BackgroundTransformer]
    C -->|Transformed<br>VideoFrame| B
    B --> D[Processed<br>MediaStreamTrack]
    D --> E[Published to SFU]
```
Video processors in this package are built on two layers:
- `ProcessorWrapper` — Handles the plumbing of intercepting a video track's frames, passing them through a transformer, and producing a processed output track. It manages browser compatibility (using `MediaStreamTrackProcessor`/`MediaStreamTrackGenerator` where available, with a `canvas.captureStream()` fallback).
- A transformer (e.g., `BackgroundTransformer`) — Implements the actual frame-by-frame processing logic.

Note: You don't have to follow this `Transformer` + `ProcessorWrapper` pattern. You can implement the `TrackProcessor` interface directly if you prefer. However, using `ProcessorWrapper` is convenient because it abstracts away the `MediaStreamTrack` → `VideoFrame` → transformer → `VideoFrame` → `MediaStreamTrack` conversion, which most use cases don't need to worry about.
To create a custom video processor using `ProcessorWrapper`, instantiate it with your own transformer:

```typescript
import { ProcessorWrapper } from '@livekit/track-processors';

const pipeline = new ProcessorWrapper(new MyCustomTransformer(options));
```

- `BackgroundTransformer` — Can blur the background, replace it with a virtual background image, or operate in a disabled passthrough state.
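To make the custom-transformer idea concrete, here is a minimal sketch of what a transformer's frame-by-frame logic can look like. This is a hypothetical example: `FrameCountingTransformer` and the `FrameSink` interface are illustrative names, and the sketch assumes a TransformStream-style `transform(frame, controller)` method; check the package's type definitions for the exact transformer interface `ProcessorWrapper` expects.

```typescript
// Illustrative sketch only — the real transformer interface should be taken
// from @livekit/track-processors' type definitions.

// Minimal stand-in for the controller that receives processed frames.
interface FrameSink<T> {
  enqueue(frame: T): void;
}

// A passthrough transformer that counts frames. Real processing (e.g.
// drawing each VideoFrame onto an OffscreenCanvas and applying an effect)
// would happen before enqueueing the output frame.
class FrameCountingTransformer<T> {
  framesSeen = 0;

  transform(frame: T, controller: FrameSink<T>): void {
    this.framesSeen += 1;
    // Pass the frame through unchanged.
    controller.enqueue(frame);
  }
}
```

In the browser, you would then wrap it the same way as above, e.g. `new ProcessorWrapper(new FrameCountingTransformer())`, and attach it with `videoTrack.setProcessor(...)`.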