Langflow API

The API pane provides code templates to integrate your flows into external applications.

Flows are exposed at the /api/v1/run/{flow_id} endpoint, where {flow_id} is the ID of the flow you want to run. You POST a JSON object to the endpoint, with optional flow customizations, and then receive a response.
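As a sketch, the same request in Python with the requests library looks roughly like this (the IDs and token are placeholders you must replace with your own values):

```python
import requests

BASE_API_URL = "https://api.langflow.astra.datastax.com"

def build_run_url(langflow_id: str, flow_id: str) -> str:
    """Build the run endpoint URL for a given flow."""
    return f"{BASE_API_URL}/lf/{langflow_id}/api/v1/run/{flow_id}"

def run_flow(langflow_id: str, flow_id: str, token: str, message: str) -> dict:
    """POST a message to the flow's run endpoint and return the parsed JSON response."""
    payload = {"input_value": message, "output_type": "chat", "input_type": "chat"}
    headers = {"Authorization": f"Bearer {token}", "Content-Type": "application/json"}
    response = requests.post(build_run_url(langflow_id, flow_id), json=payload, headers=headers)
    response.raise_for_status()
    return response.json()

# Example call (requires real IDs and a valid application token):
# print(run_flow("YOUR_LANGFLOW_ID", "YOUR_FLOW_ID", "YOUR_APPLICATION_TOKEN", "hello"))
```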

This example demonstrates how you can use the API pane to interact with the Basic Prompting flow as an API endpoint.

Prerequisites

  • Before you can use the Langflow API, you must have a flow that you want to expose as an API endpoint. If you haven’t already created a flow, create the Basic Prompting flow and ensure it’s working correctly.

    Langflow automatically makes your flows available as API endpoints, and Langflow runs a server process that listens for API requests.

  • Get an application token. You can generate an application token in the API pane in the Langflow Workspace, or use an existing Astra application token with Organization Administrator permissions.

Interact with your flow as an API endpoint

  1. Open your flow in the Workspace, and then click API.

  2. Select the client you want to use to interact with your flow. Langflow generates a code sample to retrieve the flow’s output with the selected client.

    • cURL

    • Python API

    • JS API

    • Python code

    To test the deployed application, send a request to the application’s endpoint:

    curl -X POST \
        "https://api.langflow.astra.datastax.com/lf/YOUR_LANGFLOW_ID/api/v1/run/YOUR_FLOW_ID?stream=false" \
        -H 'Content-Type: application/json' \
        -H "Authorization: Bearer YOUR_APPLICATION_TOKEN" \
        -d '{
            "input_value": "tell me about something interesting",
            "output_type": "chat",
            "input_type": "chat",
            "tweaks": {
                "ChatInput-ivyVt": {},
                "Prompt-lpeoJ": {},
                "ChatOutput-eV0ob": {},
                "OpenAIModel-SzTnW": {}
            }
        }'

    The response content depends on your flow. Make sure the endpoint returns a successful response.
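For chat flows, the reply text is nested several levels deep in the response JSON. As a sketch (assuming the same output structure that the JavaScript example on this page navigates; adjust the keys if your flow differs), it can be extracted like this:

```python
def extract_chat_text(response_json: dict) -> str:
    """Pull the first chat reply out of a run response.

    Assumes the nested chat-output structure; change the key path if
    your flow's outputs are shaped differently.
    """
    first_flow = response_json["outputs"][0]
    first_component = first_flow["outputs"][0]
    return first_component["outputs"]["message"]["message"]["text"]
```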

    To test the endpoint using the Python requests library, paste the generated code into a Python script.

    Langflow auto-populates all necessary parameters for the flow except APPLICATION_TOKEN. You must paste your application token into the script or set it as an environment variable.
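For example, a common pattern is to read the token from an environment variable at startup instead of hard-coding it (a sketch; `APPLICATION_TOKEN` is the variable name this script already uses):

```python
import os

# Prefer an environment variable over hard-coding the token in the script;
# fall back to the placeholder so a missing variable is easy to spot.
APPLICATION_TOKEN = os.environ.get("APPLICATION_TOKEN", "YOUR_APPLICATION_TOKEN")
```

Set it in your shell before running the script, for example with `export APPLICATION_TOKEN="<your token>"`.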

    Python example code
    import argparse
    import json
    from argparse import RawTextHelpFormatter
    import requests
    from typing import Optional
    import warnings
    try:
        from langflow.load import upload_file
    except ImportError:
        warnings.warn("Langflow provides a function to help you upload files to the flow. Please install langflow to use it.")
        upload_file = None
    
    BASE_API_URL = "https://api.langflow.astra.datastax.com"
    LANGFLOW_ID = "YOUR_LANGFLOW_ID"
    FLOW_ID = "YOUR_FLOW_ID"
    APPLICATION_TOKEN = "YOUR_APPLICATION_TOKEN"
    ENDPOINT = "" # You can set a specific endpoint name in the flow settings
    
    # You can tweak the flow by adding a tweaks dictionary,
    # e.g. {"OpenAI-XXXXX": {"model_name": "gpt-4"}}
    TWEAKS = {
      "ChatInput-ivyVt": {},
      "Prompt-lpeoJ": {},
      "ChatOutput-eV0ob": {},
      "OpenAIModel-SzTnW": {}
    }
    
    def run_flow(message: str,
      endpoint: str,
      output_type: str = "chat",
      input_type: str = "chat",
      tweaks: Optional[dict] = None,
      application_token: Optional[str] = None) -> dict:
        """
        Run a flow with a given message and optional tweaks.
    
        :param message: The message to send to the flow
        :param endpoint: The ID or the endpoint name of the flow
        :param tweaks: Optional tweaks to customize the flow
        :return: The JSON response from the flow
        """
        api_url = f"{BASE_API_URL}/lf/{LANGFLOW_ID}/api/v1/run/{endpoint}"
    
        payload = {
            "input_value": message,
            "output_type": output_type,
            "input_type": input_type,
        }
        headers = None
        if tweaks:
            payload["tweaks"] = tweaks
        if application_token:
            headers = {"Authorization": "Bearer " + application_token, "Content-Type": "application/json"}
        response = requests.post(api_url, json=payload, headers=headers)
        return response.json()
    
    def main():
        parser = argparse.ArgumentParser(description="""Run a flow with a given message and optional tweaks.
    Run it like: python <your file>.py "your message here" --endpoint "your_endpoint" --tweaks '{"key": "value"}'""",
            formatter_class=RawTextHelpFormatter)
        parser.add_argument("message", type=str, help="The message to send to the flow")
        parser.add_argument("--endpoint", type=str, default=ENDPOINT or FLOW_ID, help="The ID or the endpoint name of the flow")
        parser.add_argument("--tweaks", type=str, help="JSON string representing the tweaks to customize the flow", default=json.dumps(TWEAKS))
        parser.add_argument("--application_token", type=str, default=APPLICATION_TOKEN, help="Application Token for authentication")
        parser.add_argument("--output_type", type=str, default="chat", help="The output type")
        parser.add_argument("--input_type", type=str, default="chat", help="The input type")
        parser.add_argument("--upload_file", type=str, help="Path to the file to upload", default=None)
        parser.add_argument("--components", type=str, help="Components to upload the file to", default=None)
    
        args = parser.parse_args()
        try:
            tweaks = json.loads(args.tweaks)
        except json.JSONDecodeError:
            raise ValueError("Invalid tweaks JSON string")
    
        if args.upload_file:
            if not upload_file:
                raise ImportError("Langflow is not installed. Please install it to use the upload_file function.")
            elif not args.components:
                raise ValueError("You need to provide the components to upload the file to.")
            tweaks = upload_file(file_path=args.upload_file, host=BASE_API_URL, flow_id=args.endpoint, components=args.components, tweaks=tweaks)
    
        response = run_flow(
            message=args.message,
            endpoint=args.endpoint,
            output_type=args.output_type,
            input_type=args.input_type,
            tweaks=tweaks,
            application_token=args.application_token
        )
    
        print(json.dumps(response, indent=2))
    
    if __name__ == "__main__":
        main()

    Run the script with any necessary arguments for your flow:

    python3.10 bp-python-api.py "tell me about something interesting"

    To test the deployed application, send a request to the application’s endpoint.

    Langflow auto-populates all necessary parameters for the flow except applicationToken. You must paste your application token into the script or set it as an environment variable.

    JavaScript example code
    class LangflowClient {
        constructor(baseURL, applicationToken) {
            this.baseURL = baseURL;
            this.applicationToken = applicationToken;
        }
        async post(endpoint, body, headers = {"Content-Type": "application/json"}) {
            headers["Authorization"] = `Bearer ${this.applicationToken}`;
            headers["Content-Type"] = "application/json";
            const url = `${this.baseURL}${endpoint}`;
            try {
                const response = await fetch(url, {
                    method: 'POST',
                    headers: headers,
                    body: JSON.stringify(body)
                });
    
                const responseMessage = await response.json();
                if (!response.ok) {
                    throw new Error(`${response.status} ${response.statusText} - ${JSON.stringify(responseMessage)}`);
                }
                return responseMessage;
            } catch (error) {
                console.error('Request Error:', error.message);
                throw error;
            }
        }
    
        async initiateSession(flowId, langflowId, inputValue, inputType = 'chat', outputType = 'chat', stream = false, tweaks = {}) {
            const endpoint = `/lf/${langflowId}/api/v1/run/${flowId}?stream=${stream}`;
            return this.post(endpoint, { input_value: inputValue, input_type: inputType, output_type: outputType, tweaks: tweaks });
        }
    
        handleStream(streamUrl, onUpdate, onClose, onError) {
            const eventSource = new EventSource(streamUrl);
    
            eventSource.onmessage = event => {
                const data = JSON.parse(event.data);
                onUpdate(data);
            };
    
            eventSource.onerror = event => {
                console.error('Stream Error:', event);
                onError(event);
                eventSource.close();
            };
    
            eventSource.addEventListener("close", () => {
                onClose('Stream closed');
                eventSource.close();
            });
    
            return eventSource;
        }
    
        async runFlow(flowIdOrName, langflowId, inputValue, inputType = 'chat', outputType = 'chat', tweaks = {}, stream = false, onUpdate, onClose, onError) {
            try {
                const initResponse = await this.initiateSession(flowIdOrName, langflowId, inputValue, inputType, outputType, stream, tweaks);
                console.log('Init Response:', initResponse);
                if (stream && initResponse && initResponse.outputs && initResponse.outputs[0].outputs[0].artifacts.stream_url) {
                    const streamUrl = initResponse.outputs[0].outputs[0].artifacts.stream_url;
                console.log(`Streaming from: ${streamUrl}`);
                    this.handleStream(streamUrl, onUpdate, onClose, onError);
                }
                return initResponse;
            } catch (error) {
                console.error('Error running flow:', error);
                onError('Error initiating session');
            }
        }
    }
    
    async function main(inputValue, inputType = 'chat', outputType = 'chat', stream = false) {
        const flowIdOrName = 'YOUR_FLOW_ID';
        const langflowId = 'YOUR_LANGFLOW_ID';
        const applicationToken = 'YOUR_APPLICATION_TOKEN';
        const langflowClient = new LangflowClient('https://api.langflow.astra.datastax.com',
            applicationToken);
    
        try {
          const tweaks = {
      "ChatInput-ivyVt": {},
      "Prompt-lpeoJ": {},
      "ChatOutput-eV0ob": {},
      "OpenAIModel-SzTnW": {}
    };
          const response = await langflowClient.runFlow(
              flowIdOrName,
              langflowId,
              inputValue,
              inputType,
              outputType,
              tweaks,
              stream,
              (data) => console.log("Received:", data.chunk), // onUpdate
              (message) => console.log("Stream Closed:", message), // onClose
              (error) => console.log("Stream Error:", error) // onError
          );
          if (!stream && response && response.outputs) {
              const flowOutputs = response.outputs[0];
              const firstComponentOutputs = flowOutputs.outputs[0];
              const output = firstComponentOutputs.outputs.message;
    
              console.log("Final Output:", output.message.text);
          }
        } catch (error) {
          console.error('Main Error', error.message);
        }
    }
    
    const args = process.argv.slice(2);
    if (args.length < 1) {
      console.error('Please run the file with the message as an argument: node <YOUR_FILE_NAME>.js "user_message"');
      process.exit(1);
    }
    main(
      args[0], // inputValue
      args[1], // inputType
      args[2], // outputType
      args[3] === 'true' // stream
    );

    Run the script with any necessary arguments for your flow:

    node bp-js-api.js "tell me about something interesting"

    The response content depends on your flow. Make sure the endpoint returns a successful response.

    To use your code in a Python application via the Langflow runtime, you must first download your flow’s JSON file.

    1. In your Workspace, click Settings, and then select Export.

    2. Download the flow to your local machine. Make sure the flow path in the script matches the flow’s location on your machine.

    3. Paste the code from the API tab into a Python script file.

      from langflow.load import run_flow_from_json
      TWEAKS = {
        "ChatInput-kKhri": {},
        "Prompt-KDSi5": {},
        "ChatOutput-Vr3Q7": {},
        "OpenAIModel-4xYtx": {}
      }
      
      result = run_flow_from_json(flow="./basic-prompting-local.json",
                                  input_value="tell me about something interesting",
                                  fallback_to_env_vars=True, # False by default
                                  tweaks=TWEAKS)
      
      print(result)
    4. Run the script:

      python3.10 bp-python-local.py

Customize flows with tweaks at runtime

On the Tweaks tab, you can modify your flow’s parameters. Tweaks are passed as a JSON object in the API call. Langflow automatically updates the client code samples based on your changes.

Tweaks don’t persist in DataStax Langflow.

For example, changing the OpenAI model_name from the default value to gpt-3.5-turbo modifies the cURL request:

curl -X POST \
    "https://api.langflow.astra.datastax.com/lf/YOUR_LANGFLOW_ID/api/v1/run/YOUR_FLOW_ID?stream=false" \
    -H 'Content-Type: application/json' \
    -H "Authorization: Bearer YOUR_APPLICATION_TOKEN" \
    -d '{
        "input_value": "tell me about something interesting",
        "output_type": "chat",
        "input_type": "chat",
        "tweaks": {
            "ChatInput-ivyVt": {},
            "Prompt-lpeoJ": {},
            "ChatOutput-eV0ob": {},
            "OpenAIModel-SzTnW": {
                "model_name": "gpt-3.5-turbo"
            }
        }
    }'
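In the Python script, the same override is just a value in the TWEAKS dictionary (the component IDs below are from this example flow; yours will differ):

```python
# Component IDs are flow-specific; copy them from the Tweaks tab for your flow.
TWEAKS = {
    "ChatInput-ivyVt": {},
    "Prompt-lpeoJ": {},
    "ChatOutput-eV0ob": {},
    "OpenAIModel-SzTnW": {"model_name": "gpt-3.5-turbo"},
}
```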


© 2024 DataStax