research/trend-analyst

Trend Analyst — cross-platform social media trend research with scoring.

Category: research
Source: workflows/research/trend_analyst.py

| Field | Type | Default | Description |
| --- | --- | --- | --- |
| content_pillars | string | `""` | Comma-separated content pillars |
| niche | string | required | The niche or industry to analyze |
| platforms | string | `"x,instagram,tiktok,linkedin,youtube"` | Comma-separated platforms to analyze (x, instagram, tiktok, linkedin, youtube, reddit) |
| regenerate | object | (none) | When set, this run is a regeneration. Workflows may read `direction` / `keep` / `extra_instructions` to modulate prompts; the engine persists `parent_run_id` and `parent_variant_index` as run lineage columns. |
| target_audience | string | `""` | Who the content is for |
| trends_per_platform | integer | 5 | Number of trends to return per platform (3-5) |
| variants | integer | 1 | Number of independent variant executions (1–10). When > 1, the engine runs the workflow N times with different sampling, producing N outputs. |

No schema defined.

plan_trend_research → ingest_x_trends → ingest_instagram_trends → ingest_tiktok_trends → ingest_linkedin_trends → ingest_youtube_trends → ingest_reddit_trends → merge_trend_sources → score_trends → format_trend_output
| Task | Description |
| --- | --- |
| plan_trend_research | Parse platforms, validate, and build trend-focused queries per platform. |
| ingest_x_trends | Ingest trending content from X/Twitter. |
| ingest_instagram_trends | Ingest trending content from Instagram. |
| ingest_tiktok_trends | Ingest trending content from TikTok via yt-dlp or web search fallback. |
| ingest_linkedin_trends | Ingest trending content from LinkedIn via web search. |
| ingest_youtube_trends | Ingest trending YouTube content via yt-dlp (metadata + comments). |
| ingest_reddit_trends | Ingest trending Reddit content via subreddit discovery + crawling. |
| merge_trend_sources | Join all platform branches into a unified trend_docs list. |
| score_trends | Identify and score trending topics from collected platform data. |
| format_trend_output | Reshape scored trends into the final output schema, grouped by source. |
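The merge-and-score steps can be sketched in plain Python. This is illustrative only: the real logic lives in workflows/research/trend_analyst.py, and the document field names (`platform`, `engagement`) and the naive top-N-by-engagement ranking below are assumptions for exposition, not the actual scoring formula.

```python
# Hedged sketch of merge_trend_sources + score_trends; field names and
# the scoring heuristic are assumptions, not the shipped implementation.
from typing import Dict, List


def merge_trend_sources(*branches: List[dict]) -> List[dict]:
    """Concatenate per-platform result lists into one trend_docs list."""
    docs: List[dict] = []
    for branch in branches:
        docs.extend(branch)
    return docs


def score_trends(trend_docs: List[dict], per_platform: int = 5) -> Dict[str, List[dict]]:
    """Group docs by platform, rank by a naive engagement signal, keep top N."""
    by_platform: Dict[str, List[dict]] = {}
    for doc in trend_docs:
        by_platform.setdefault(doc["platform"], []).append(doc)
    return {
        platform: sorted(docs, key=lambda d: d.get("engagement", 0), reverse=True)[:per_platform]
        for platform, docs in by_platform.items()
    }
```

Because each ingest task returns an independent list, the join step is a simple concatenation; ordering only matters after scoring.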

Save the YAML below as my-run.yaml, edit the values, and run it with the CLI or POST it to the API. Required fields are uncommented. Optional knobs are documented as commented lines above the input: block; to set one, copy it under input: and uncomment it.

workflow: research/trend-analyst
# Optional fields — copy any line(s) under `input:` and uncomment to set:
# Comma-separated content pillars
# content_pillars: ""
#
# Comma-separated platforms to analyze (x, instagram, tiktok, linkedin, youtube, reddit)
# platforms: "x,instagram,tiktok,linkedin,youtube"
#
# Who the content is for
# target_audience: ""
#
# Number of trends to return per platform (3-5)
# trends_per_platform: 5
#
input:
  # The niche or industry to analyze
  niche: ""

Run it locally:

fab-workflow --from-file my-run.yaml

Or submit over the wire — the same file is the request body:

curl -X POST 'https://gofabric.dev/v1/workflows/runs?name=research/trend-analyst' \
-H 'Authorization: Bearer fab_xxx' \
-H 'content-type: application/yaml' \
--data-binary @my-run.yaml

Every workflow also accepts the universal WorkflowInput fields — variants (1–10 fan-out) and regenerate (creative-direction hints with run lineage). See Run-specs (YAML / TOML / JSON) for the full top-level shape (metadata, priority, bundle, parent, etc.).
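For example, a regeneration run-spec that fans out three variants might look like the sketch below. The `direction` and `keep` keys follow the regenerate description above; treat the specific values, and any keys beyond those documented, as assumptions.

```yaml
workflow: research/trend-analyst
variants: 3            # fan out 3 independent executions with different sampling
regenerate:
  direction: "lean harder into short-form video trends"
  keep: "the platform mix from the parent run"
input:
  niche: "b2b saas marketing"
```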

  • Last user task format_trend_output has no Pydantic return type — workflow output schema is null. Declare a WorkflowOutput subclass and pass it to Flow(output=…) for a strict contract.
  • Task merge_trend_sources has no Pydantic types — contract is opaque to consumers.
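A minimal sketch of the fix suggested above, assuming the `WorkflowOutput` base class and `Flow(output=…)` parameter named in the note; the import path, the `Flow` signature, and every field below are illustrative assumptions, not the real schema.

```python
# Hedged sketch: in the real codebase this would subclass WorkflowOutput and
# be passed to Flow(output=...); here we use plain pydantic models so the
# shape of the contract is visible.
from typing import Dict, List
from pydantic import BaseModel


class TrendItem(BaseModel):
    """One scored trend (illustrative fields, not the shipped schema)."""
    platform: str
    topic: str
    score: float  # e.g. a 0.0-1.0 momentum/relevance score


class TrendAnalystOutput(BaseModel):  # real code: class TrendAnalystOutput(WorkflowOutput)
    trends_by_source: Dict[str, List[TrendItem]]


# Passing the model to the flow gives consumers a strict, typed contract:
# flow = Flow(output=TrendAnalystOutput, ...)
```

With a declared output model, downstream consumers can validate runs instead of treating format_trend_output's result as an opaque dict.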