High-performance WASM renderer for graphical subtitles (PGS and VobSub), written in Rust.
This project started as a fork of Arcus92's libpgs-js and has been re-engineered to maximize performance and add support for VobSub, which the original library did not handle. It remains fully backward compatible (for PGS only, obviously). Special thanks to the original project for the inspiration!
- PGS (Blu-ray) subtitle parsing and rendering
- VobSub (DVD) subtitle parsing and rendering
- GPU-accelerated WebGPU rendering with automatic Canvas2D fallback
- High-performance Rust-based rendering engine compiled to WebAssembly
- Zero-copy data transfer between JS and WASM where possible
- Caching for decoded bitmaps to optimize repeated rendering
- TypeScript support with full type definitions
Demo videos: pgs.mp4 (PGS rendering) and vobsub.mp4 (VobSub rendering)
npm / bun
npm install libbitsub
# or
bun add libbitsub

JSR (Deno)

deno add jsr:@altq/libbitsub

For best performance with large subtitle files, copy the WASM files to your public folder so Web Workers can access them:
# For Next.js, Vite, or similar frameworks
mkdir -p public/libbitsub
cp node_modules/libbitsub/pkg/libbitsub_bg.wasm public/libbitsub/
cp node_modules/libbitsub/pkg/libbitsub.js public/libbitsub/

This enables off-main-thread parsing, which prevents the UI from freezing when loading large PGS files.
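As a minimal sketch, you can verify that off-main-thread parsing is active by checking the usingWorker flag from getStats() once loading completes. The paths below assume your framework serves the public/ folder at the site root, and videoElement is your video element:

```typescript
import { PgsRenderer } from 'libbitsub'

const renderer = new PgsRenderer({
  video: videoElement,
  subUrl: '/subtitles/movie.sup',
  workerUrl: '/libbitsub/libbitsub.js', // script copied by the commands above
  onLoaded: () => {
    // usingWorker reports whether rendering runs off the main thread
    console.log('Using Web Worker:', renderer.getStats().usingWorker)
  },
})
```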
To build from source, you need a Rust toolchain with cargo, wasm-pack, and Bun:
# Install wasm-pack
cargo install wasm-pack

# Build WASM module and TypeScript wrapper
bun run build
# Build WASM only (for development)
bun run build:wasm
# Build release version (optimized)
bun run build:wasm:release

Before using any renderer, you must initialize the WASM module:
import { initWasm } from 'libbitsub'
// Initialize WASM (do this once at app startup)
await initWasm()

The high-level API automatically handles video synchronization, canvas overlay, and subtitle fetching.
import { PgsRenderer } from 'libbitsub'
// Create renderer with video element (URL-based loading)
const renderer = new PgsRenderer({
video: videoElement,
subUrl: '/subtitles/movie.sup',
workerUrl: '/libbitsub.js', // Optional, kept for API compatibility
// Lifecycle callbacks (optional)
onLoading: () => console.log('Loading subtitles...'),
onLoaded: () => console.log('Subtitles loaded!'),
onError: (error) => console.error('Failed to load:', error)
})
// Or load directly from ArrayBuffer
const response = await fetch('/subtitles/movie.sup')
const subtitleData = await response.arrayBuffer()
const renderer = new PgsRenderer({
video: videoElement,
subContent: subtitleData, // Load directly from ArrayBuffer
onLoading: () => console.log('Loading subtitles...'),
onLoaded: () => console.log('Subtitles loaded!'),
onError: (error) => console.error('Failed to load:', error)
})
// The renderer automatically:
// - Fetches the subtitle file (if using subUrl) or uses provided ArrayBuffer
// - Creates a canvas overlay on the video
// - Syncs rendering with video playback
// - Handles resize events
// When done:
renderer.dispose()

import { VobSubRenderer } from 'libbitsub'
// Create renderer with video element (URL-based loading)
const renderer = new VobSubRenderer({
video: videoElement,
subUrl: '/subtitles/movie.sub',
idxUrl: '/subtitles/movie.idx', // Optional, defaults to .sub path with .idx extension
workerUrl: '/libbitsub.js', // Optional
// Lifecycle callbacks (optional)
onLoading: () => setIsLoading(true),
onLoaded: () => setIsLoading(false),
onError: (error) => {
setIsLoading(false)
console.error('Subtitle error:', error)
}
})
// Or load directly from ArrayBuffer
const [subResponse, idxResponse] = await Promise.all([
fetch('/subtitles/movie.sub'),
fetch('/subtitles/movie.idx')
])
const subData = await subResponse.arrayBuffer()
const idxData = await idxResponse.text()
const renderer = new VobSubRenderer({
video: videoElement,
subContent: subData, // Load .sub directly from ArrayBuffer
idxContent: idxData, // Load .idx directly from string
onLoading: () => setIsLoading(true),
onLoaded: () => setIsLoading(false),
onError: (error) => {
setIsLoading(false)
console.error('Subtitle error:', error)
}
})
// When done:
renderer.dispose()

Both PgsRenderer and VobSubRenderer support real-time customization of subtitle size and position:
// Get current settings
const settings = renderer.getDisplaySettings()
console.log(settings)
// Output: { scale: 1.0, verticalOffset: 0 }
// Update settings
renderer.setDisplaySettings({
scale: 1.2, // 1.2 = 120% size
verticalOffset: -10 // -10% (move up 10% of video height)
})
// Reset to defaults
renderer.resetDisplaySettings()

Settings Reference:

- scale (number): Scale factor for subtitles.
  - 1.0 = 100% (original size), 0.5 = 50%, 2.0 = 200%
  - Range: 0.1 to 3.0
- verticalOffset (number): Vertical position offset as a percentage of video height.
  - 0 = original position
  - Negative values move up (e.g., -10 moves up by 10% of height)
  - Positive values move down (e.g., 10 moves down by 10% of height)
  - Range: -50 to 50
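For example, these settings can be wired to simple form controls. This is a hypothetical UI sketch, not part of the library: the #subtitle-scale and #subtitle-offset elements are assumptions, and renderer is any existing PgsRenderer or VobSubRenderer instance.

```typescript
// Hypothetical UI wiring for an existing renderer instance.
const scaleInput = document.querySelector<HTMLInputElement>('#subtitle-scale')!
const offsetInput = document.querySelector<HTMLInputElement>('#subtitle-offset')!

scaleInput.addEventListener('input', () => {
  renderer.setDisplaySettings({ scale: Number(scaleInput.value) }) // 0.1 to 3.0
})

offsetInput.addEventListener('input', () => {
  renderer.setDisplaySettings({ verticalOffset: Number(offsetInput.value) }) // -50 to 50
})
```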
Both PgsRenderer and VobSubRenderer provide real-time performance metrics:
// Get performance statistics
const stats = renderer.getStats()
console.log(stats)
// Output:
// {
// framesRendered: 120,
// framesDropped: 2,
// avgRenderTime: 1.45,
// maxRenderTime: 8.32,
// minRenderTime: 0.12,
// lastRenderTime: 1.23,
// renderFps: 60,
// usingWorker: true,
// cachedFrames: 5,
// pendingRenders: 0,
// totalEntries: 847,
// currentIndex: 42
// }
// Example: Display stats in a debug overlay
setInterval(() => {
const stats = renderer.getStats()
debugOverlay.textContent = `
FPS: ${stats.renderFps}
Frames: ${stats.framesRendered} (dropped: ${stats.framesDropped})
Avg render: ${stats.avgRenderTime}ms
Worker: ${stats.usingWorker ? 'Yes' : 'No'}
Cache: ${stats.cachedFrames} frames
`
}, 1000)

Stats Reference:
| Property | Type | Description |
|---|---|---|
| framesRendered | number | Total frames rendered since initialization |
| framesDropped | number | Frames dropped due to slow rendering (>16.67ms) |
| avgRenderTime | number | Average render time in milliseconds (rolling 60-sample window) |
| maxRenderTime | number | Maximum render time in milliseconds |
| minRenderTime | number | Minimum render time in milliseconds |
| lastRenderTime | number | Most recent render time in milliseconds |
| renderFps | number | Current renders per second (based on the last 1 second) |
| usingWorker | boolean | Whether rendering is using a Web Worker (off-main-thread) |
| cachedFrames | number | Number of decoded frames currently cached |
| pendingRenders | number | Number of frames currently being decoded asynchronously |
| totalEntries | number | Total subtitle entries/display sets in the loaded file |
| currentIndex | number | Index of the currently displayed subtitle |
libbitsub uses WebGPU for GPU-accelerated rendering when available, falling back to Canvas2D automatically:
import { PgsRenderer, isWebGPUSupported } from 'libbitsub'
// Check WebGPU support
if (isWebGPUSupported()) {
console.log('WebGPU available - GPU-accelerated rendering enabled')
}
// Configure WebGPU preference
const renderer = new PgsRenderer({
video: videoElement,
subUrl: '/subtitles/movie.sup',
preferWebGPU: true, // default: true
onWebGPUFallback: () => console.log('Fell back to Canvas2D')
})

Options:
- preferWebGPU (boolean): Enable WebGPU rendering if available. Default: true
- onWebGPUFallback (function): Callback invoked when WebGPU is unavailable and rendering falls back to Canvas2D
For more control over rendering, use the low-level parsers directly.
import { initWasm, PgsParser } from 'libbitsub'
await initWasm()
const parser = new PgsParser()
// Load PGS data from a .sup file
const response = await fetch('subtitles.sup')
const data = new Uint8Array(await response.arrayBuffer())
parser.load(data)
// Get timestamps
const timestamps = parser.getTimestamps() // Float64Array in milliseconds
// Render at a specific time
const subtitleData = parser.renderAtTimestamp(currentTimeInSeconds)
if (subtitleData) {
for (const comp of subtitleData.compositionData) {
ctx.putImageData(comp.pixelData, comp.x, comp.y)
}
}
// Clean up
parser.dispose()

import { initWasm, VobSubParserLowLevel } from 'libbitsub'
await initWasm()
const parser = new VobSubParserLowLevel()
// Load from IDX + SUB files
const idxResponse = await fetch('subtitles.idx')
const idxContent = await idxResponse.text()
const subResponse = await fetch('subtitles.sub')
const subData = new Uint8Array(await subResponse.arrayBuffer())
parser.loadFromData(idxContent, subData)
// Or load from SUB file only
// parser.loadFromSubOnly(subData);
// Render
const subtitleData = parser.renderAtTimestamp(currentTimeInSeconds)
if (subtitleData) {
for (const comp of subtitleData.compositionData) {
ctx.putImageData(comp.pixelData, comp.x, comp.y)
}
}
parser.dispose()

For handling both formats with a single API:
import { initWasm, UnifiedSubtitleParser } from 'libbitsub'
await initWasm()
const parser = new UnifiedSubtitleParser()
// Load PGS
parser.loadPgs(pgsData)
// Or load VobSub
// parser.loadVobSub(idxContent, subData);
console.log(parser.format) // 'pgs' or 'vobsub'
const subtitleData = parser.renderAtTimestamp(time)
// ... render to canvas
parser.dispose()

PgsRenderer:

- constructor(options: VideoSubtitleOptions) - Create a video-integrated PGS renderer
- getDisplaySettings(): SubtitleDisplaySettings - Get current display settings
- setDisplaySettings(settings: Partial<SubtitleDisplaySettings>): void - Update display settings
- resetDisplaySettings(): void - Reset display settings to defaults
- getStats(): SubtitleRendererStats - Get performance statistics
- dispose(): void - Clean up all resources
VobSubRenderer:

- constructor(options: VideoVobSubOptions) - Create a video-integrated VobSub renderer
- getDisplaySettings(): SubtitleDisplaySettings - Get current display settings
- setDisplaySettings(settings: Partial<SubtitleDisplaySettings>): void - Update display settings
- resetDisplaySettings(): void - Reset display settings to defaults
- getStats(): SubtitleRendererStats - Get performance statistics
- dispose(): void - Clean up all resources
PgsParser:

- load(data: Uint8Array): number - Load PGS data, returns display set count
- getTimestamps(): Float64Array - Get all timestamps in milliseconds
- count: number - Number of display sets
- findIndexAtTimestamp(timeSeconds: number): number - Find index for a timestamp
- renderAtIndex(index: number): SubtitleData | undefined - Render at index
- renderAtTimestamp(timeSeconds: number): SubtitleData | undefined - Render at time
- clearCache(): void - Clear the decoded bitmap cache
- dispose(): void - Release resources
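As an illustrative sketch of the index-based methods (assuming a loaded parser, a videoElement, and the 2D context ctx of an overlay canvas, none of which are provided by the library), you can skip redundant work by only re-rendering when the active display set changes:

```typescript
// Sketch: re-render only when the active display set index changes.
// Assumes `parser` is a loaded PgsParser, `videoElement` an HTMLVideoElement,
// and `ctx` the 2D context of an overlay canvas.
let lastIndex = -1

videoElement.addEventListener('timeupdate', () => {
  const index = parser.findIndexAtTimestamp(videoElement.currentTime)
  if (index === lastIndex) return
  lastIndex = index

  ctx.clearRect(0, 0, ctx.canvas.width, ctx.canvas.height)
  const subtitleData = parser.renderAtIndex(index)
  if (subtitleData) {
    for (const comp of subtitleData.compositionData) {
      ctx.putImageData(comp.pixelData, comp.x, comp.y)
    }
  }
})
```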
VobSubParserLowLevel:

- loadFromData(idxContent: string, subData: Uint8Array): void - Load IDX + SUB
- loadFromSubOnly(subData: Uint8Array): void - Load SUB only
- Same rendering methods as PgsParser
UnifiedSubtitleParser:

- loadPgs(data: Uint8Array): number - Load PGS data
- loadVobSub(idxContent: string, subData: Uint8Array): void - Load VobSub
- loadVobSubOnly(subData: Uint8Array): void - Load SUB only
- format: 'pgs' | 'vobsub' | null - Current format
- Same rendering methods as above
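A small sketch showing how the unified parser can be fed either format based on file extension. The loadAnySubtitle helper is hypothetical (not part of the library), and it assumes initWasm() has already been called and a UnifiedSubtitleParser has been constructed:

```typescript
import { UnifiedSubtitleParser } from 'libbitsub'

// Hypothetical helper: pick the load method from the subtitle file extension.
async function loadAnySubtitle(parser: UnifiedSubtitleParser, url: string) {
  if (url.endsWith('.sup')) {
    const data = new Uint8Array(await (await fetch(url)).arrayBuffer())
    parser.loadPgs(data)
  } else if (url.endsWith('.sub')) {
    const idxUrl = url.replace(/\.sub$/, '.idx')
    const [idxContent, subBuffer] = await Promise.all([
      fetch(idxUrl).then((r) => r.text()),
      fetch(url).then((r) => r.arrayBuffer()),
    ])
    parser.loadVobSub(idxContent, new Uint8Array(subBuffer))
  } else {
    throw new Error(`Unsupported subtitle format: ${url}`)
  }
  console.log('Loaded format:', parser.format) // 'pgs' or 'vobsub'
}
```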
interface VideoSubtitleOptions {
video: HTMLVideoElement // Video element to sync with
subUrl?: string // URL to subtitle file (provide this OR subContent)
subContent?: ArrayBuffer // Direct subtitle content (provide this OR subUrl)
workerUrl?: string // Worker URL (for API compatibility)
preferWebGPU?: boolean // Prefer WebGPU renderer if available (default: true)
onLoading?: () => void // Called when subtitle loading starts
onLoaded?: () => void // Called when subtitle loading completes
onError?: (error: Error) => void // Called when subtitle loading fails
onWebGPUFallback?: () => void // Called when WebGPU is unavailable
}

interface VideoVobSubOptions extends VideoSubtitleOptions {
idxUrl?: string // URL to .idx file (optional, defaults to subUrl with .idx extension)
idxContent?: string // Direct .idx content (provide this OR idxUrl)
}

interface SubtitleDisplaySettings {
// Scale factor (1.0 = 100%, 0.5 = 50%, 2.0 = 200%)
scale: number
// Vertical offset as % of video height (-50 to 50)
verticalOffset: number
}

interface SubtitleRendererStats {
framesRendered: number // Total frames rendered since initialization
framesDropped: number // Frames dropped due to slow rendering
avgRenderTime: number // Average render time in milliseconds
maxRenderTime: number // Maximum render time in milliseconds
minRenderTime: number // Minimum render time in milliseconds
lastRenderTime: number // Last render time in milliseconds
renderFps: number // Current FPS (renders per second)
usingWorker: boolean // Whether rendering is using web worker
cachedFrames: number // Number of cached frames
pendingRenders: number // Number of pending renders
totalEntries: number // Total subtitle entries/display sets
currentIndex: number // Current subtitle index being displayed
}

interface SubtitleData {
width: number // Screen width
height: number // Screen height
compositionData: SubtitleCompositionData[]
}
interface SubtitleCompositionData {
pixelData: ImageData // RGBA pixel data
x: number // X position
y: number // Y position
}

Licensed under either of
- Apache License, Version 2.0
- MIT license