# hooks/cluster

Hook Clustering — embed and cluster hooks to find recurring patterns.

Category: hooks
Source: `workflows/hooks/cluster.py`
## Input Schema

| Field | Type | Default | Description |
|---|---|---|---|
| `classified_hooks` | any[] | — | |
| `openai_api_key` | object | — | |
| `regenerate` | object | — | When set, this run is a regeneration. Workflows may read `direction` / `keep` / `extra_instructions` to modulate prompts; the engine persists `parent_run_id` and `parent_variant_index` as run lineage columns. |
| `variants` | integer | 1 | Number of independent variant executions (1–10). When > 1, the engine runs the workflow N times with different sampling, producing N outputs. |
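The `regenerate` and `variants` fields above are universal inputs accepted by every workflow. As a sketch, a run-spec that exercises both might look like this (the `direction` value is an illustrative hint, not a required string):

```yaml
workflow: hooks/cluster
input: {}
variants: 3                     # fan out into 3 independent variant executions
regenerate:
  direction: "favor tighter, more specific clusters"  # hypothetical hint
```

The engine, not the run-spec, records `parent_run_id` and `parent_variant_index` as lineage columns on regeneration runs.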
## Output Schema

No schema defined.
## Task Pipeline

embed_hooks → cluster_hooks

| Task | Description |
|---|---|
| `embed_hooks` | Generate embeddings for classified hooks using OpenAI. |
| `cluster_hooks` | Cluster embedded hooks using K-means to find recurring patterns. |
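To make the `cluster_hooks` step concrete, here is a minimal, self-contained sketch of K-means over embedding vectors. The real workflow embeds hooks via OpenAI and likely uses a library implementation; this toy version uses plain Python and 2-D vectors so the mechanics are visible. All names here are illustrative, not the workflow's actual API.

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Lloyd's algorithm: assign each point to its nearest centroid,
    then recompute each centroid as the mean of its cluster."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)  # pick k distinct points as seeds
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(
                range(k),
                key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centroids[c])),
            )
            clusters[nearest].append(p)
        for c in range(k):
            if clusters[c]:  # keep the old centroid if a cluster emptied
                centroids[c] = tuple(
                    sum(dim) / len(clusters[c]) for dim in zip(*clusters[c])
                )
    return clusters

# Two well-separated blobs stand in for real hook embeddings.
embeddings = [(0.0, 0.0), (0.0, 1.0), (1.0, 0.0),
              (10.0, 10.0), (10.0, 11.0), (11.0, 10.0)]
clusters = kmeans(embeddings, k=2)
```

Each returned cluster is one "recurring pattern": hooks whose embeddings sit near each other in vector space.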
## Run-spec example

Save the YAML below as `my-run.yaml`, edit the values, and run it with the CLI or POST it to the API. Required fields are uncommented; optional knobs are documented above the `input:` block — copy any line under `input:` and uncomment it to set a value.

```yaml
workflow: hooks/cluster
# Optional fields — copy any line(s) under `input:` and uncomment to set:
# classified_hooks: []
# openai_api_key: null
input: {}
```

Run it locally:

```shell
fab-workflow --from-file my-run.yaml
```

Or submit it over the wire — the same file is the request body:

```shell
curl -X POST 'https://gofabric.dev/v1/workflows/runs?name=hooks/cluster' \
  -H 'Authorization: Bearer fab_xxx' \
  -H 'content-type: application/yaml' \
  --data-binary @my-run.yaml
```

Every workflow also accepts the universal `WorkflowInput` fields — `variants` (1–10 fan-out) and `regenerate` (creative-direction hints with run lineage). See Run-specs (YAML / TOML / JSON) for the full top-level shape (`metadata`, `priority`, `bundle`, `parent`, etc.).
## Warnings

- The last user task, `cluster_hooks`, has no Pydantic return type, so the workflow output schema is null. Declare a `WorkflowOutput` subclass and pass it to `Flow(output=…)` for a strict contract.