
Node Reference

Fabric workflows are built by composing nodes — self-contained operations that process inputs and produce outputs. Each node declares its inputs, outputs, runtime requirements, and default configuration.

Nodes that use Docker containers (FFmpeg, Whisper, yt-dlp, Python ML) require the tool images to be built first:

```sh
just docker-build-tools
```
Nodes are grouped into the following categories:

| Category | Description |
| --- | --- |
| AI | Text generation, extraction, classification, and embedding via AI providers |
| Media & Source | Video/audio ingestion, transcoding, and media processing |
| Video Analysis | Scene detection, face reframing, and video analysis |
| Text Processing | Title generation, description writing, and text refinement |
| Workflow Control | Fan-out, fan-in, conditional branching, and human approval |
| State Persistence | Key-value storage for workflow state |
| Other | HTTP calls, asset storage, and publishing |

Every node has:

  • Operation — canonical identifier (e.g., ai.generate, source.ingest)
  • Inputs — typed ports that accept data from upstream nodes or workflow context
  • Outputs — typed ports that produce data for downstream nodes
  • Runtime — how the node executes (AI provider, Docker container, native, webhook, etc.)
  • Policy — timeout, retries, and failure behavior
  • Configuration — node-specific settings (model, provider, format, etc.)
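The properties above can be pictured as one declaration per node. The interface and field names below are a hypothetical sketch for illustration, not the Fabric SDK's actual types:

```typescript
// Hypothetical shape of a node declaration, mirroring the properties listed
// above. Field and type names are assumptions, not Fabric's real schema.
interface NodeDefinition {
  operation: string;                    // canonical identifier, e.g. "ai.generate"
  inputs: Record<string, string>;       // input port name -> port type
  outputs: Record<string, string>;      // output port name -> port type
  runtime: "ai" | "docker" | "native" | "webhook";
  policy: { timeoutMs: number; retries: number; onFailure: "fail" | "continue" };
  config: Record<string, unknown>;      // node-specific settings (model, provider, ...)
}

// Example: an AI text-generation node with a single prompt input.
const generateTitle: NodeDefinition = {
  operation: "ai.generate",
  inputs: { prompt: "string" },
  outputs: { text: "string" },
  runtime: "ai",
  policy: { timeoutMs: 60_000, retries: 2, onFailure: "fail" },
  config: { provider: "openai", model: "gpt-4o" },
};
```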

Nodes are composed into workflows using the SDK or JSON definitions. Each node page includes usage examples in TypeScript, Python, and raw JSON.
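As a rough sketch of how such a composition might look, the object below wires two nodes together by connecting an output port to an input port. The `nodes`/`edges` layout, the node IDs, and the port paths are all assumptions for illustration, not Fabric's actual JSON schema:

```typescript
// Hypothetical workflow definition: a source node feeding an AI node.
// Keys ("nodes", "edges") and port paths are illustrative assumptions.
const workflow = {
  nodes: [
    { id: "ingest", operation: "source.ingest", config: { url: "https://example.com/video.mp4" } },
    { id: "title", operation: "ai.generate", config: { provider: "openai", model: "gpt-4o" } },
  ],
  edges: [
    // Route the ingest node's transcript output into the AI node's prompt input.
    { from: "ingest.transcript", to: "title.prompt" },
  ],
};
```

Expressing the graph as plain data like this is what makes the TypeScript, Python, and raw-JSON forms interchangeable: each is just a different way of producing the same definition.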