A flow, or Talk Track, is a structured interaction made up of steps, each containing a prompt and one or more actions. You use flows when an interaction requires:

  • Collecting user input (e.g. name, phone number, confirmation code)
  • Performing validation or retries
  • Calling internal or external APIs
  • Routing, handoff, or escalation
  • Following a specific sequence to meet business logic or regulatory requirements

Use individual knowledge base entries for short, factual answers. Use a flow when the interaction needs structure, memory, or logic.

How flows work

Each flow is made up of:

  • Steps: Self-contained conversation states, each consisting of:
    • Text prompts, just like knowledge base entries.
    • Global functions and transition functions: Logic blocks that validate input, call APIs, store values, and move the conversation forward.
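
To make the function side of this concrete, here is a minimal sketch of a transition function that validates input, stores a value, and routes to the next step. The signature, state object, and step names are assumptions for illustration only, not the platform's actual API.

```python
# Hypothetical sketch only: the state object and step names below are
# illustrative assumptions, not the platform's real interface.

def check_confirmation_code(flow_state: dict, user_code: str) -> str:
    """Validate the caller's code, store it, and decide which step comes next."""
    code = user_code.strip().upper()

    # Validate the input before moving on.
    if len(code) != 6 or not code.isalnum():
        flow_state["retries"] = flow_state.get("retries", 0) + 1
        if flow_state["retries"] >= 3:
            return "transfer_to_agent"      # escalate after repeated failures
        return "ask_confirmation_code"      # re-prompt on the current step

    # Store the value so later steps can reference it.
    flow_state["confirmation_code"] = code

    # Move the conversation forward to the next step.
    return "lookup_reservation"
```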

As each step is processed in sequence, the LLM sees:

  • The current step’s text prompt.
  • A list of available functions with names, descriptions, and arguments.

What the LLM doesn’t see:

  • Previous step prompts.
  • Any system context, unless it’s surfaced in the prompt or state.

Because previous steps are not visible, each prompt must be self-contained. Provide all necessary context and clear guidance in each step.

LLM interaction model

When the agent is inside a flow step, the LLM receives its input in this order:

  1. System prompt (includes Rules and About agent configuration).
  2. Any relevant knowledge base entries (if applicable).
  3. The conversation history.
  4. The step prompt (which is appended last).
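
As a rough sketch of that ordering, the snippet below assembles the model input in the same sequence. The message format and function name are assumptions for illustration; they are not the platform's internals.

```python
# Illustrative only: mirrors the input order described above, not the real implementation.

def build_llm_input(system_prompt: str,
                    kb_entries: list[str],
                    history: list[dict],
                    step_prompt: str) -> list[dict]:
    messages = [{"role": "system", "content": system_prompt}]     # 1. Rules / About agent

    for entry in kb_entries:                                      # 2. relevant knowledge base entries
        messages.append({"role": "system", "content": entry})

    messages.extend(history)                                      # 3. conversation history

    messages.append({"role": "system", "content": step_prompt})  # 4. step prompt, appended last
    return messages
```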

Key techniques

  • Transition functions control the flow’s routing logic.
  • Use few-shot prompting to clarify expected inputs or edge cases (see the prompt sketch after this list).
  • Set ASR biasing to improve voice transcription for structured or ambiguous values like confirmation codes or personal names.
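
For example, a step prompt that collects a confirmation code might embed a few short examples so the model knows what valid and invalid input looks like. The wording below is only a sketch of the few-shot pattern, not text from a real flow.

```python
# Hypothetical step prompt showing the few-shot pattern; the wording is illustrative.
STEP_PROMPT = """
Ask the caller for their 6-character confirmation code.

Valid examples: "A1B2C3", "ZX98QR".
Invalid examples: "1234" (too short), "I lost my code" (no code given).

If the caller cannot find their code, offer to look up the booking by phone number instead.
"""
```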

Connecting steps

In the Flow Editor:

  • Use /Steps in the prompt to connect to the next step
  • Add named transition functions to manage movement between states
  • Use the Flow Functions modal to see all transitions in one place

Always include an exit step in your flow; flows without one can cause unexpected behavior.

Use descriptive function names like check_reservation_match, not vague ones like step_two — this helps the LLM reason correctly.
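
To see why the name matters, compare the two transition definitions below. The add_transition helper is a made-up stand-in for however your flow registers transitions, shown only to contrast a vague name with a descriptive one.

```python
# Illustrative only: add_transition is a hypothetical stand-in, not a real Flow Editor API.
def add_transition(name: str, description: str) -> None:
    """Stub that stands in for however the Flow Editor registers a transition."""
    print(f"registered transition: {name}")

# Vague: the LLM has to guess when this transition applies.
add_transition(name="step_two", description="Go to step two")

# Descriptive: the name and description tell the LLM exactly when to call it.
add_transition(
    name="check_reservation_match",
    description=(
        "Call this once the caller has given their name and booking date, "
        "to check whether a matching reservation exists."
    ),
)
```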

Advanced features

Conditional logic

Flows can include conditions to dynamically adapt responses based on user input. For example, if a user provides a specific date, the agent can check availability and guide the user accordingly.

Variables in prompts

Flows support the use of variables to dynamically fill in information during interactions. This allows for personalized and context-aware responses.
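
As a sketch of how variables and a condition might combine inside one step, the function below fills a prompt from stored values and branches on availability. The state keys and wording are assumptions for illustration, not platform syntax.

```python
# Illustrative only: the state keys and prompt wording are assumptions, not platform syntax.

def build_availability_prompt(flow_state: dict, available: bool) -> str:
    """Fill variables from stored state and branch on a simple condition."""
    date = flow_state["requested_date"]      # variable captured in an earlier step
    name = flow_state["caller_name"]

    # Conditional logic: adapt the response to the user's input.
    if available:
        return f"Tell {name} that {date} is available and ask whether to confirm the booking."
    return f"Apologise to {name} that {date} is fully booked and offer the nearest alternatives."
```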

Further reading