Direct Reply Node

Sends a predefined or dynamically resolved message directly back to the user, skipping LLM generation and its associated latency.

Functionality

This node is the fastest way to respond to the user. While an LLM Node must wait for a model to generate output, the Direct Reply Node simply sends a predefined string (e.g. "Great! I am now beginning my database indexing, please wait...") straight to the user's chat UI over the WebSocket connection.
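The contrast above can be sketched in a few lines. This is a minimal illustration, not the node's actual implementation; the function name and message are illustrative:

```python
# A direct reply returns its configured message immediately --
# no model call is made, so there is no generation latency.

STATIC_REPLY = "Great! I am now beginning my database indexing, please wait..."

def direct_reply(message: str) -> str:
    """Return the configured message as-is; nothing is generated."""
    return message

print(direct_reply(STATIC_REPLY))
```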

Configuration Parameters

  • Reply Message: The static text to send, which may include {{ variable }} references resolved at runtime.
  • Update Flow State: When enabled, appends the resolved message to the shared $flow.state data map.

Inputs & Outputs

  • Inputs: Raw text or runtime variable references in handlebars syntax.
  • Outputs: The resolved message, sent directly to the client-facing UI; it is not passed on to downstream nodes in the flow.
