Helpers

This Langflow feature is currently in public preview. Development is ongoing, and the features and functionality are subject to change. Langflow, and the use of such, is subject to the DataStax Preview Terms.

Helper components provide utility functions to help manage data, tasks, and other components in your flow.

Use a helper component in a flow

Chat memory in Langflow is stored either in local Langflow tables (using LCBufferMemory) or in a connected external database.

The Store Message helper component stores chat memories as Data objects, and the Message History helper component retrieves chat messages as Data objects or strings.

This example flow stores and retrieves chat history from an Astra DB Chat Memory component using the Store Message and Message History components.

[Image: Flow storing and retrieving chat history with the Astra DB Chat Memory component]
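
Once the flow is built, chat memory is grouped by session ID when you call the flow over the Langflow API. The sketch below is a minimal illustration only; it assumes a local Langflow server at http://localhost:7860, a placeholder flow ID, and an API key, so adjust these values for your deployment.

import requests

# Hypothetical values: replace with your own flow ID and API key.
url = "http://localhost:7860/api/v1/run/YOUR_FLOW_ID"
headers = {"x-api-key": "YOUR_API_KEY"}

payload = {
    "input_value": "What did I ask you earlier?",
    "input_type": "chat",
    "output_type": "chat",
    # Messages stored and retrieved by the memory components are keyed
    # by this session ID, so reuse it to continue the same conversation.
    "session_id": "customer-123",
}

response = requests.post(url, json=payload, headers=headers)
print(response.json())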

Batch run

The Batch Run component runs a language model over each row of a DataFrame text column and returns a new DataFrame with the original text and the model's response.

Parameters

Inputs
| Name | Display Name | Info |
|---|---|---|
| model | Language Model | Connect the 'Language Model' output from your LLM component here. |
| system_message | System Message | Multi-line system instruction for all rows in the DataFrame. |
| df | DataFrame | The DataFrame whose column, specified by 'column_name', will be treated as text messages. |
| column_name | Column Name | The name of the DataFrame column to treat as text messages. |

Outputs
| Name | Display Name | Info |
|---|---|---|
| batch_results | Batch Results | A DataFrame with two columns: 'text_input' and 'model_response'. |
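
Because the component code isn't shown above, the following sketch only illustrates the expected shape of the input and output DataFrames; the model responses are placeholders, not real output.

import pandas as pd

# Input: a DataFrame with a text column. Here column_name would be "text".
df = pd.DataFrame({"text": ["Summarize the Q1 report.", "Summarize the Q2 report."]})

# Output: one row per input row, pairing the original text with the model's reply.
batch_results = pd.DataFrame(
    {
        "text_input": df["text"],
        "model_response": ["<model reply 1>", "<model reply 2>"],
    }
)
print(batch_results)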


Create list

This component is in Legacy, which means it is no longer in active development. Use the Structured output component instead.

This component takes a list of text inputs and converts each text into a Data object. The Data objects are collected into a list, which is returned as the output. The component can also return the same list as a DataFrame.

Parameters

Inputs
| Name | Display Name | Info |
|---|---|---|
| texts | Texts | Enter one or more texts. This input accepts multiple text entries. |

Outputs
| Name | Display Name | Info |
|---|---|---|
| list | Data List | A list of Data objects created from the input texts. |
| dataframe | DataFrame | The list of Data objects converted to a DataFrame. |

Component code

create_list.py
from langflow.custom.custom_component.component import Component
from langflow.inputs.inputs import StrInput
from langflow.schema.data import Data
from langflow.schema.dataframe import DataFrame
from langflow.template.field.base import Output


class CreateListComponent(Component):
    display_name = "Create List"
    description = "Creates a list of texts."
    icon = "list"
    name = "CreateList"
    legacy = True

    inputs = [
        StrInput(
            name="texts",
            display_name="Texts",
            info="Enter one or more texts.",
            is_list=True,
        ),
    ]

    outputs = [
        Output(display_name="Data List", name="list", method="create_list"),
        Output(display_name="DataFrame", name="dataframe", method="as_dataframe"),
    ]

    def create_list(self) -> list[Data]:
        data = [Data(text=text) for text in self.texts]
        self.status = data
        return data

    def as_dataframe(self) -> DataFrame:
        """Convert the list of Data objects into a DataFrame.

        Returns:
            DataFrame: A DataFrame containing the list data.
        """
        return DataFrame(self.create_list())

Current date

The Current date component returns the current date and time in a selected timezone. This component provides a flexible way to obtain timezone-specific date and time information within a Langflow pipeline.

Parameters

Inputs
| Name | Display Name | Info |
|---|---|---|
| timezone | Timezone | Select the timezone for the current date and time. |

Outputs
| Name | Display Name | Info |
|---|---|---|
| current_date | Current Date | The resulting current date and time in the selected timezone. |

Component code

current_date.py
from datetime import datetime
from zoneinfo import ZoneInfo, available_timezones

from loguru import logger

from langflow.custom.custom_component.component import Component
from langflow.io import DropdownInput, Output
from langflow.schema.message import Message


class CurrentDateComponent(Component):
    display_name = "Current Date"
    description = "Returns the current date and time in the selected timezone."
    icon = "clock"
    name = "CurrentDate"

    inputs = [
        DropdownInput(
            name="timezone",
            display_name="Timezone",
            options=list(available_timezones()),
            value="UTC",
            info="Select the timezone for the current date and time.",
            tool_mode=True,
        ),
    ]
    outputs = [
        Output(display_name="Current Date", name="current_date", method="get_current_date"),
    ]

    def get_current_date(self) -> Message:
        try:
            tz = ZoneInfo(self.timezone)
            current_date = datetime.now(tz).strftime("%Y-%m-%d %H:%M:%S %Z")
            result = f"Current date and time in {self.timezone}: {current_date}"
            self.status = result
            return Message(text=result)
        except Exception as e:  # noqa: BLE001
            logger.opt(exception=True).debug("Error getting current date")
            error_message = f"Error: {e}"
            self.status = error_message
            return Message(text=error_message)

Custom component

This component is available by clicking New Custom Component in the Components menu.

Use this component as a template to create your custom component.

Component code

custom_component.py
# from langflow.field_typing import Data
from langflow.custom.custom_component.component import Component
from langflow.io import MessageTextInput, Output
from langflow.schema.data import Data


class CustomComponent(Component):
    display_name = "Custom Component"
    description = "Use as a template to create your own component."
    documentation: str = "https://docs.langflow.org/components-custom-components"
    icon = "code"
    name = "CustomComponent"

    inputs = [
        MessageTextInput(
            name="input_value",
            display_name="Input Value",
            info="This is a custom component Input",
            value="Hello, World!",
            tool_mode=True,
        ),
    ]

    outputs = [
        Output(display_name="Output", name="output", method="build_output"),
    ]

    def build_output(self) -> Data:
        data = Data(value=self.input_value)
        self.status = data
        return data
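
As a sketch of how you might adapt the template, the hypothetical component below upper-cases the incoming text. The class name and behavior are illustrative only; they are not part of Langflow.

from langflow.custom.custom_component.component import Component
from langflow.io import MessageTextInput, Output
from langflow.schema.data import Data


class UppercaseTextComponent(Component):
    display_name = "Uppercase Text"
    description = "Example adaptation of the custom component template."
    icon = "code"
    name = "UppercaseText"

    inputs = [
        MessageTextInput(
            name="input_value",
            display_name="Input Value",
            info="Text to convert to uppercase.",
            value="Hello, World!",
        ),
    ]

    outputs = [
        Output(display_name="Output", name="output", method="build_output"),
    ]

    def build_output(self) -> Data:
        # Upper-case the incoming text and return it as a Data object.
        data = Data(value=str(self.input_value).upper())
        self.status = data
        return data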

Hierarchical Task

This component has moved to the Bundles section of the components menu.

This component creates and manages hierarchical tasks for CrewAI agents in a Playground environment.

For more information, see the CrewAI documentation.

Parameters

Inputs
| Name | Display Name | Info |
|---|---|---|
| task_description | Description | Descriptive text detailing task's purpose and execution. |
| expected_output | Expected Output | Clear definition of expected task outcome. |
| tools | Tools | List of tools/resources limited for task execution. Uses the Agent tools by default. |

Outputs
| Name | Display Name | Info |
|---|---|---|
| task_output | Task | The built hierarchical task. |

Component code

hierarchical_task.py
from langflow.base.agents.crewai.tasks import HierarchicalTask
from langflow.custom.custom_component.component import Component
from langflow.io import HandleInput, MultilineInput, Output


class HierarchicalTaskComponent(Component):
    display_name: str = "Hierarchical Task"
    description: str = "Each task must have a description, an expected output and an agent responsible for execution."
    icon = "CrewAI"
    inputs = [
        MultilineInput(
            name="task_description",
            display_name="Description",
            info="Descriptive text detailing task's purpose and execution.",
        ),
        MultilineInput(
            name="expected_output",
            display_name="Expected Output",
            info="Clear definition of expected task outcome.",
        ),
        HandleInput(
            name="tools",
            display_name="Tools",
            input_types=["Tool"],
            is_list=True,
            info="List of tools/resources limited for task execution. Uses the Agent tools by default.",
            required=False,
            advanced=True,
        ),
    ]

    outputs = [
        Output(display_name="Task", name="task_output", method="build_task"),
    ]

    def build_task(self) -> HierarchicalTask:
        task = HierarchicalTask(
            description=self.task_description,
            expected_output=self.expected_output,
            tools=self.tools or [],
        )
        self.status = task
        return task

ID generator

This component generates a unique ID.

Parameters

Outputs
| Name | Display Name | Info |
|---|---|---|
| value | Value | The generated unique ID. |

Component code

id_generator.py
import uuid
from typing import Any

from typing_extensions import override

from langflow.custom.custom_component.component import Component
from langflow.io import MessageTextInput, Output
from langflow.schema.dotdict import dotdict
from langflow.schema.message import Message


class IDGeneratorComponent(Component):
    display_name = "ID Generator"
    description = "Generates a unique ID."
    icon = "fingerprint"
    name = "IDGenerator"
    legacy = True

    inputs = [
        MessageTextInput(
            name="unique_id",
            display_name="Value",
            info="The generated unique ID.",
            refresh_button=True,
            tool_mode=True,
        ),
    ]

    outputs = [
        Output(display_name="ID", name="id", method="generate_id"),
    ]

    @override
    def update_build_config(self, build_config: dotdict, field_value: Any, field_name: str | None = None):
        if field_name == "unique_id":
            build_config[field_name]["value"] = str(uuid.uuid4())
        return build_config

    def generate_id(self) -> Message:
        unique_id = self.unique_id or str(uuid.uuid4())
        self.status = f"Generated ID: {unique_id}"
        return Message(text=unique_id)

Message history

This component was named Chat Memory prior to Langflow version 1.1.0.

This component retrieves and manages chat messages from Langflow tables or an external memory.

Parameters

Inputs
| Name | Display Name | Info |
|---|---|---|
| memory | External Memory | Retrieve messages from an external memory. If empty, it uses the Langflow tables. |
| sender | Sender Type | Filter by sender type. |
| sender_name | Sender Name | Filter by sender name. |
| n_messages | Number of Messages | Number of messages to retrieve. |
| session_id | Session ID | The session ID of the chat. If empty, the current session ID parameter is used. |
| order | Order | Order of the messages. |
| template | Template | The template to use for formatting the data. It can contain the keys {text}, {sender} or any other key in the message data. |

Outputs
| Name | Display Name | Info |
|---|---|---|
| messages | Messages (Data) | Retrieved messages as Data objects. |
| messages_text | Messages (text) | Retrieved messages formatted as text. |
| lc_memory | Memory | The created LangChain-compatible memory object. |
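
To illustrate the Template input, the sketch below shows how the default {sender_name}: {text} template would render retrieved messages. It uses plain Python formatting over example data only, not the component's internal helper.

# Illustrative only: plain Python formatting over example message data.
template = "{sender_name}: {text}"

retrieved = [
    {"sender": "User", "sender_name": "User", "text": "What's your return policy?"},
    {"sender": "Machine", "sender_name": "AI", "text": "Returns are accepted within 30 days."},
]

messages_text = "\n".join(template.format(**row) for row in retrieved)
print(messages_text)
# User: What's your return policy?
# AI: Returns are accepted within 30 days.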

Component code

memory.py
from typing import Any, cast

from langflow.custom.custom_component.component import Component
from langflow.helpers.data import data_to_text
from langflow.inputs.inputs import DropdownInput, HandleInput, IntInput, MessageTextInput, MultilineInput, TabInput
from langflow.memory import aget_messages, astore_message
from langflow.schema.data import Data
from langflow.schema.dataframe import DataFrame
from langflow.schema.dotdict import dotdict
from langflow.schema.message import Message
from langflow.template.field.base import Output
from langflow.utils.component_utils import set_current_fields, set_field_display
from langflow.utils.constants import MESSAGE_SENDER_AI, MESSAGE_SENDER_NAME_AI, MESSAGE_SENDER_USER


class MemoryComponent(Component):
    display_name = "Message History"
    description = "Stores or retrieves stored chat messages from Langflow tables or an external memory."
    icon = "message-square-more"
    name = "Memory"
    default_keys = ["mode", "memory"]
    mode_config = {
        "Store": ["message", "memory", "sender", "sender_name", "session_id"],
        "Retrieve": ["n_messages", "order", "template", "memory"],
    }

    inputs = [
        TabInput(
            name="mode",
            display_name="Mode",
            options=["Retrieve", "Store"],
            value="Retrieve",
            info="Operation mode: Store messages or Retrieve messages.",
            real_time_refresh=True,
        ),
        MessageTextInput(
            name="message",
            display_name="Message",
            info="The chat message to be stored.",
            tool_mode=True,
            dynamic=True,
            show=False,
        ),
        HandleInput(
            name="memory",
            display_name="External Memory",
            input_types=["Memory"],
            info="Retrieve messages from an external memory. If empty, it will use the Langflow tables.",
            advanced=True,
        ),
        DropdownInput(
            name="sender_type",
            display_name="Sender Type",
            options=[MESSAGE_SENDER_AI, MESSAGE_SENDER_USER, "Machine and User"],
            value="Machine and User",
            info="Filter by sender type.",
            advanced=True,
        ),
        MessageTextInput(
            name="sender",
            display_name="Sender",
            info="The sender of the message. Might be Machine or User. "
            "If empty, the current sender parameter will be used.",
            advanced=True,
        ),
        MessageTextInput(
            name="sender_name",
            display_name="Sender Name",
            info="Filter by sender name.",
            advanced=True,
            show=False,
        ),
        IntInput(
            name="n_messages",
            display_name="Number of Messages",
            value=100,
            info="Number of messages to retrieve.",
            advanced=True,
            show=False,
        ),
        MessageTextInput(
            name="session_id",
            display_name="Session ID",
            info="The session ID of the chat. If empty, the current session ID parameter will be used.",
            value="",
            advanced=True,
        ),
        DropdownInput(
            name="order",
            display_name="Order",
            options=["Ascending", "Descending"],
            value="Ascending",
            info="Order of the messages.",
            advanced=True,
            tool_mode=True,
            required=True,
        ),
        MultilineInput(
            name="template",
            display_name="Template",
            info="The template to use for formatting the data. "
            "It can contain the keys {text}, {sender} or any other key in the message data.",
            value="{sender_name}: {text}",
            advanced=True,
            show=False,
        ),
    ]

    outputs = [
        Output(display_name="Message", name="messages_text", method="retrieve_messages_as_text", dynamic=True),
        Output(display_name="Dataframe", name="dataframe", method="retrieve_messages_dataframe", dynamic=True),
    ]

    def update_outputs(self, frontend_node: dict, field_name: str, field_value: Any) -> dict:
        """Dynamically show only the relevant output based on the selected output type."""
        if field_name == "mode":
            # Start with empty outputs
            frontend_node["outputs"] = []
            if field_value == "Store":
                frontend_node["outputs"] = [
                    Output(
                        display_name="Stored Messages",
                        name="stored_messages",
                        method="store_message",
                        hidden=True,
                        dynamic=True,
                    )
                ]
            if field_value == "Retrieve":
                frontend_node["outputs"] = [
                    Output(
                        display_name="Messages", name="messages_text", method="retrieve_messages_as_text", dynamic=True
                    ),
                    Output(
                        display_name="Dataframe", name="dataframe", method="retrieve_messages_dataframe", dynamic=True
                    ),
                ]
        return frontend_node

    async def store_message(self) -> Message:
        message = Message(text=self.message) if isinstance(self.message, str) else self.message

        message.session_id = self.session_id or message.session_id
        message.sender = self.sender or message.sender or MESSAGE_SENDER_AI
        message.sender_name = self.sender_name or message.sender_name or MESSAGE_SENDER_NAME_AI

        stored_messages: list[Message] = []

        if self.memory:
            self.memory.session_id = message.session_id
            lc_message = message.to_lc_message()
            await self.memory.aadd_messages([lc_message])

            stored_messages = await self.memory.aget_messages() or []

            stored_messages = [Message.from_lc_message(m) for m in stored_messages] if stored_messages else []

            if message.sender:
                stored_messages = [m for m in stored_messages if m.sender == message.sender]
        else:
            await astore_message(message, flow_id=self.graph.flow_id)
            stored_messages = (
                await aget_messages(
                    session_id=message.session_id, sender_name=message.sender_name, sender=message.sender
                )
                or []
            )

        if not stored_messages:
            msg = "No messages were stored. Please ensure that the session ID and sender are properly set."
            raise ValueError(msg)

        stored_message = stored_messages[0]
        self.status = stored_message
        return stored_message

    async def retrieve_messages(self) -> Data:
        sender_type = self.sender_type
        sender_name = self.sender_name
        session_id = self.session_id
        n_messages = self.n_messages
        order = "DESC" if self.order == "Descending" else "ASC"

        if sender_type == "Machine and User":
            sender_type = None

        if self.memory and not hasattr(self.memory, "aget_messages"):
            memory_name = type(self.memory).__name__
            err_msg = f"External Memory object ({memory_name}) must have 'aget_messages' method."
            raise AttributeError(err_msg)
        # Check if n_messages is None or 0
        if n_messages == 0:
            stored = []
        elif self.memory:
            # override session_id
            self.memory.session_id = session_id

            stored = await self.memory.aget_messages()
            # langchain memories are supposed to return messages in ascending order
            if order == "DESC":
                stored = stored[::-1]
            if n_messages:
                stored = stored[:n_messages]
            stored = [Message.from_lc_message(m) for m in stored]
            if sender_type:
                expected_type = MESSAGE_SENDER_AI if sender_type == MESSAGE_SENDER_AI else MESSAGE_SENDER_USER
                stored = [m for m in stored if m.type == expected_type]
        else:
            stored = await aget_messages(
                sender=sender_type,
                sender_name=sender_name,
                session_id=session_id,
                limit=n_messages,
                order=order,
            )
        self.status = stored
        return cast(Data, stored)

    async def retrieve_messages_as_text(self) -> Message:
        stored_text = data_to_text(self.template, await self.retrieve_messages())
        self.status = stored_text
        return Message(text=stored_text)

    async def retrieve_messages_dataframe(self) -> DataFrame:
        """Convert the retrieved messages into a DataFrame.

        Returns:
            DataFrame: A DataFrame containing the message data.
        """
        messages = await self.retrieve_messages()
        return DataFrame(messages)

    def update_build_config(
        self,
        build_config: dotdict,
        field_value: Any,  # noqa: ARG002
        field_name: str | None = None,  # noqa: ARG002
    ) -> dotdict:
        return set_current_fields(
            build_config=build_config,
            action_fields=self.mode_config,
            selected_action=build_config["mode"]["value"],
            default_fields=self.default_keys,
            func=set_field_display,
        )

Output Parser

This component is in Legacy, which means it is no longer in active development as of Langflow version 1.3. Use the Structured output component instead.

This component transforms the output of a language model into a specified format. It supports CSV format parsing, which converts LLM responses into comma-separated lists using LangChain's CommaSeparatedListOutputParser.

This component only provides formatting instructions and parsing functionality. It does not include a prompt. You’ll need to connect it to a separate Prompt component to create the actual prompt template for the LLM to use.

Both the Output Parser and Structured Output components format LLM responses, but they have different use cases. The Output Parser is simpler and focused on converting responses into comma-separated lists. Use this when you just need a list of items, for example ["item1", "item2", "item3"]. The Structured Output is more complex and flexible, and allows you to define custom schemas with multiple fields of different types. Use this when you need to extract structured data with specific fields and types.

To use this component:

  1. Create a Prompt component and connect the Output Parser’s format_instructions output to it. This ensures the LLM knows how to format its response.

  2. Write your actual prompt text in the Prompt component, including the {format_instructions} variable. For example, in your Prompt component, the template might look like:

    {format_instructions}
    Please list three fruits.
  3. Connect the output_parser output to your language model component.

  4. The output parser converts this into a Python list: ["apple", "banana", "orange"].
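
The steps above rely on LangChain's CommaSeparatedListOutputParser. The following sketch shows that parser on its own, outside Langflow, so you can see what the format_instructions and output_parser outputs correspond to.

from langchain_core.output_parsers import CommaSeparatedListOutputParser

parser = CommaSeparatedListOutputParser()

# The kind of instructions the format_instructions output adds to your prompt.
print(parser.get_format_instructions())

# Parsing a raw comma-separated LLM reply into a Python list.
print(parser.parse("apple, banana, orange"))
# ['apple', 'banana', 'orange']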

Parameters

Inputs
| Name | Display Name | Info |
|---|---|---|
| parser_type | Parser | Select the parser type. Currently supports "CSV". |

Outputs
| Name | Display Name | Info |
|---|---|---|
| format_instructions | Format Instructions | Pass to a prompt template to include formatting instructions for LLM responses. |
| output_parser | Output Parser | The constructed output parser that can be used to parse LLM responses. |

Sequential task

This component has moved to the Bundles section of the components menu.

This component creates and manages sequential tasks for CrewAI agents. It builds a SequentialTask object with the provided description, expected output, and agent, allowing for the specification of tools and asynchronous execution.

For more information, see the CrewAI documentation.

Parameters

Inputs
| Name | Display Name | Info |
|---|---|---|
| task_description | Description | Descriptive text detailing task's purpose and execution. |
| expected_output | Expected Output | Clear definition of expected task outcome. |
| tools | Tools | List of tools/resources limited for task execution. Uses the Agent tools by default. |
| agent | Agent | CrewAI Agent that will perform the task. |
| task | Task | CrewAI Task that will perform the task. |
| async_execution | Async Execution | Boolean flag indicating asynchronous task execution. |

Outputs
| Name | Display Name | Info |
|---|---|---|
| task_output | Task | The built sequential task or list of tasks. |

Component code

sequential_task.py
from langflow.base.agents.crewai.tasks import SequentialTask
from langflow.custom.custom_component.component import Component
from langflow.io import BoolInput, HandleInput, MultilineInput, Output


class SequentialTaskComponent(Component):
    display_name: str = "Sequential Task"
    description: str = "Each task must have a description, an expected output and an agent responsible for execution."
    icon = "CrewAI"
    inputs = [
        MultilineInput(
            name="task_description",
            display_name="Description",
            info="Descriptive text detailing task's purpose and execution.",
        ),
        MultilineInput(
            name="expected_output",
            display_name="Expected Output",
            info="Clear definition of expected task outcome.",
        ),
        HandleInput(
            name="tools",
            display_name="Tools",
            input_types=["Tool"],
            is_list=True,
            info="List of tools/resources limited for task execution. Uses the Agent tools by default.",
            required=False,
            advanced=True,
        ),
        HandleInput(
            name="agent",
            display_name="Agent",
            input_types=["Agent"],
            info="CrewAI Agent that will perform the task",
            required=True,
        ),
        HandleInput(
            name="task",
            display_name="Task",
            input_types=["SequentialTask"],
            info="CrewAI Task that will perform the task",
        ),
        BoolInput(
            name="async_execution",
            display_name="Async Execution",
            value=True,
            advanced=True,
            info="Boolean flag indicating asynchronous task execution.",
        ),
    ]

    outputs = [
        Output(display_name="Task", name="task_output", method="build_task"),
    ]

    def build_task(self) -> list[SequentialTask]:
        tasks: list[SequentialTask] = []
        task = SequentialTask(
            description=self.task_description,
            expected_output=self.expected_output,
            tools=self.agent.tools,
            async_execution=False,
            agent=self.agent,
        )
        tasks.append(task)
        self.status = task
        if self.task:
            if isinstance(self.task, list) and all(isinstance(task, SequentialTask) for task in self.task):
                tasks = self.task + tasks
            elif isinstance(self.task, SequentialTask):
                tasks = [self.task, *tasks]
        return tasks

Message store

This component stores chat messages or text into Langflow tables or an external memory.

It provides flexibility in managing message storage and retrieval within a chat system.

Parameters

Inputs
| Name | Display Name | Info |
|---|---|---|
| message | Message | The chat message to be stored. (Required) |
| memory | External Memory | The external memory to store the message. If empty, it will use the Langflow tables. |
| sender | Sender | The sender of the message. Can be Machine or User. If empty, the current sender parameter will be used. |
| sender_name | Sender Name | The name of the sender. Can be AI or User. If empty, the current sender parameter will be used. |
| session_id | Session ID | The session ID of the chat. If empty, the current session ID parameter will be used. |

Outputs
| Name | Display Name | Info |
|---|---|---|
| stored_messages | Stored Messages | The list of stored messages after the current message has been added. |

Component code

store_message.py
from langflow.custom.custom_component.component import Component
from langflow.inputs.inputs import (
    HandleInput,
    MessageTextInput,
)
from langflow.memory import aget_messages, astore_message
from langflow.schema.message import Message
from langflow.template.field.base import Output
from langflow.utils.constants import MESSAGE_SENDER_AI, MESSAGE_SENDER_NAME_AI


class MessageStoreComponent(Component):
    display_name = "Message Store"
    description = "Stores a chat message or text into Langflow tables or an external memory."
    icon = "message-square-text"
    name = "StoreMessage"
    legacy = True

    inputs = [
        MessageTextInput(
            name="message", display_name="Message", info="The chat message to be stored.", required=True, tool_mode=True
        ),
        HandleInput(
            name="memory",
            display_name="External Memory",
            input_types=["Memory"],
            info="The external memory to store the message. If empty, it will use the Langflow tables.",
        ),
        MessageTextInput(
            name="sender",
            display_name="Sender",
            info="The sender of the message. Might be Machine or User. "
            "If empty, the current sender parameter will be used.",
            advanced=True,
        ),
        MessageTextInput(
            name="sender_name",
            display_name="Sender Name",
            info="The name of the sender. Might be AI or User. If empty, the current sender parameter will be used.",
            advanced=True,
        ),
        MessageTextInput(
            name="session_id",
            display_name="Session ID",
            info="The session ID of the chat. If empty, the current session ID parameter will be used.",
            value="",
            advanced=True,
        ),
    ]

    outputs = [
        Output(display_name="Stored Messages", name="stored_messages", method="store_message", hidden=True),
    ]

    async def store_message(self) -> Message:
        message = Message(text=self.message) if isinstance(self.message, str) else self.message

        message.session_id = self.session_id or message.session_id
        message.sender = self.sender or message.sender or MESSAGE_SENDER_AI
        message.sender_name = self.sender_name or message.sender_name or MESSAGE_SENDER_NAME_AI

        stored_messages: list[Message] = []

        if self.memory:
            self.memory.session_id = message.session_id
            lc_message = message.to_lc_message()
            await self.memory.aadd_messages([lc_message])

            stored_messages = await self.memory.aget_messages() or []

            stored_messages = [Message.from_lc_message(m) for m in stored_messages] if stored_messages else []

            if message.sender:
                stored_messages = [m for m in stored_messages if m.sender == message.sender]
        else:
            await astore_message(message, flow_id=self.graph.flow_id)
            stored_messages = (
                await aget_messages(
                    session_id=message.session_id, sender_name=message.sender_name, sender=message.sender
                )
                or []
            )

        if not stored_messages:
            msg = "No messages were stored. Please ensure that the session ID and sender are properly set."
            raise ValueError(msg)

        stored_message = stored_messages[0]
        self.status = stored_message
        return stored_message

Structured output

This component transforms LLM responses into structured data formats.

Use the structured output component in a flow

In this example from the Financial Support Parser template, the Structured Output component transforms unstructured financial reports into structured data.

[Image: Structured Output component parsing a financial report in the Financial Support Parser template]

The connected language model is prompted by the Structured Output component's system_prompt parameter to extract structured output from the unstructured text.

In the Structured Output component, click the Open table button to view the output_schema table. The output_schema parameter defines the structure and data types for the model’s output using a table with the following fields:

  • Name: The name of the output field.

  • Description: The purpose of the output field.

  • Type: The data type of the output field. The available types are str, int, float, bool, list, or dict. Default: text.

  • Multiple: Set to True if you expect multiple values for a single field. For example, a features field set to True can contain multiple values, such as ["waterproof", "durable", "lightweight"]. Default: True.

The Parse DataFrame component parses the structured output into a template for orderly presentation in chat output. The template references the values from the output_schema table using curly braces.

For example, the template EBITDA: {EBITDA} , Net Income: {NET_INCOME} , GROSS_PROFIT: {GROSS_PROFIT} presents the extracted values in the Playground as EBITDA: 900 million , Net Income: 500 million , GROSS_PROFIT: 1.2 billion.
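
For comparison, the sketch below reproduces the same idea outside Langflow with a pydantic schema and LangChain's with_structured_output. It is a conceptual illustration only, not the Structured Output component's implementation, and it assumes an OpenAI-backed model and the example field names from above.

from pydantic import BaseModel, Field
from langchain_openai import ChatOpenAI  # assumes langchain-openai is installed

# Roughly equivalent to the output_schema table in the example above.
class FinancialReport(BaseModel):
    EBITDA: str = Field(description="Reported EBITDA")
    NET_INCOME: str = Field(description="Reported net income")
    GROSS_PROFIT: str = Field(description="Reported gross profit")

llm = ChatOpenAI(model="gpt-4o-mini")
structured_llm = llm.with_structured_output(FinancialReport)
report = structured_llm.invoke(
    "EBITDA was 900 million, net income was 500 million, and gross profit was 1.2 billion."
)

# Formatting the extracted values with a template, like the Parse DataFrame step.
template = "EBITDA: {EBITDA} , Net Income: {NET_INCOME} , GROSS_PROFIT: {GROSS_PROFIT}"
print(template.format(**report.model_dump()))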

Parameters

Inputs
| Name | Display Name | Info |
|---|---|---|
| llm | Language Model | The language model to use to generate the structured output. |
| input_value | Input Message | The input message to the language model. |
| system_prompt | Format Instructions | Instructions to the language model for formatting the output. |
| schema_name | Schema Name | The name for the output data schema. |
| output_schema | Output Schema | The structure and data types for the model's output. |
| multiple | Generate Multiple | [Deprecated] Always set to True. |

Outputs
| Name | Display Name | Info |
|---|---|---|
| structured_output | Structured Output | The structured output based on the defined schema. |
| structured_output_dataframe | DataFrame | The structured output converted to a DataFrame format. |

