Integrate Azure Functions with Astra DB Serverless


Azure Functions is Microsoft Azure’s function-as-a-service offering that provides a serverless execution environment for your code. You can use Azure Functions for actions such as the following:

  • Extend Astra DB Serverless with additional data processing capabilities, such as aggregating, summarizing, and validating data periodically.

  • Connect Astra DB Serverless with other cloud services into data pipelines that move, process, and analyze data.

Prerequisites

Before you begin, you need the following:

  • An active Astra DB Serverless database.

  • An application token for your database, prefixed by AstraCS:.

  • Your database’s Secure Connect Bundle (SCB) zip file.

  • An Azure account with permission to create resources.

  • Python 3.9 installed locally, to match the runtime version deployed in this guide.

Create a local Azure Functions project

Create a local project, based on the Azure Functions Python quickstart, to develop and test your Azure Functions before you deploy them to the cloud.

  1. Install the Azure CLI.

  2. Install Azure Functions Core Tools version 4.x or later.

    • Windows

      These steps use a Windows installer (MSI) to install Core Tools v4.x. For information about other package-based installers, see the Core Tools readme.

      If you previously used a Windows installer (MSI) to install Core Tools, uninstall the old version before installing the latest version.

      Download and run the Core Tools installer for your version of Windows.

    • macOS

      These steps use Homebrew to install Core Tools on macOS:

      brew tap azure/functions
      brew install azure-functions-core-tools@4
      # if upgrading on a machine that has 2.x or 3.x installed:
      brew link --overwrite azure-functions-core-tools@4

    • Linux

      These steps use APT to install Core Tools on an Ubuntu or Debian Linux distribution. For other Linux distributions, see the Core Tools readme.

    1. Install the Microsoft package repository GPG key to validate package integrity:

      curl https://packages.microsoft.com/keys/microsoft.asc | gpg --dearmor > microsoft.gpg
      sudo mv microsoft.gpg /etc/apt/trusted.gpg.d/microsoft.gpg
    2. Set up the APT source list before doing an APT update.

      • Ubuntu:

        sudo sh -c 'echo "deb [arch=amd64] https://packages.microsoft.com/repos/microsoft-ubuntu-$(lsb_release -cs)-prod $(lsb_release -cs) main" > /etc/apt/sources.list.d/dotnetdev.list'
      • Debian:

        sudo sh -c 'echo "deb [arch=amd64] https://packages.microsoft.com/debian/$(lsb_release -rs | cut -d'.' -f 1)/prod $(lsb_release -cs) main" > /etc/apt/sources.list.d/dotnetdev.list'
    3. Check that the /etc/apt/sources.list.d/dotnetdev.list file references the correct version string for your Linux distribution:

      Linux distribution          Version
      Debian 11                   bullseye
      Debian 10                   buster
      Debian 9                    stretch
      Ubuntu 22.04                jammy
      Ubuntu 20.04                focal
      Ubuntu 19.04                disco
      Ubuntu 18.10                cosmic
      Ubuntu 18.04                bionic
      Ubuntu 16.04/Linux Mint 18  xenial

    4. Start the APT source update:

      sudo apt-get update
    5. Install the Core Tools package:

      sudo apt-get install azure-functions-core-tools-4
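
    After you install Core Tools on any platform, you can confirm the installed version:

    func --version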
  3. Create and activate a new Python virtual environment:

    python -m venv .venv
    source .venv/bin/activate
  4. Initialize a new Azure Functions project with the Python runtime:

    func init --python

    This command initializes a project directory with a function_app.py file, a requirements.txt file, and other necessary Python files.
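
    The generated layout typically looks like the following; the exact files can vary by Core Tools version:

    .
    ├── function_app.py
    ├── host.json
    ├── local.settings.json
    ├── requirements.txt
    ├── .funcignore
    └── .gitignore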

  5. Use func new to add a function to your project. The --name argument is the unique name of your function, and the --template argument specifies the function’s trigger.

    In this example, func new adds an HTTP trigger endpoint named HttpExample to the function_app.py file, which is accessible without authentication. For more information, see the Azure Functions Core Tools reference.

    func new --name HttpExample --template "HTTP trigger" --authlevel "anonymous"
  6. Replace the contents of function_app.py with the following code:

    import azure.functions as func
    import datetime
    import json
    import logging
    
    app = func.FunctionApp()
    
    @app.route(route="HttpExample", auth_level=func.AuthLevel.ANONYMOUS)
    def HttpExample(req: func.HttpRequest) -> func.HttpResponse:
        logging.info('Python HTTP trigger function processed a request.')
    
        name = req.params.get('name')
        if not name:
            try:
                req_body = req.get_json()
            except ValueError:
                pass
            else:
                name = req_body.get('name')
    
        if name:
            return func.HttpResponse(f"Hello, {name}. This HTTP triggered function executed successfully.")
        else:
            return func.HttpResponse(
                 "This HTTP triggered function executed successfully. Pass a name in the query string or in the request body for a personalized response.",
                 status_code=200
            )
  7. Run the function locally:

    func start

    From the output, you can visit the HttpExample endpoint, which returns a message like: This HTTP triggered function executed successfully. Pass a name in the query string or in the request body for a personalized response.

    Found Python version 3.9.6 (python3).
    
    Azure Functions Core Tools
    Core Tools Version:       4.0.5907 Commit hash: N/A +807e89766a92b14fd07b9f0bc2bea1d8777ab209 (64-bit)
    Function Runtime Version: 4.834.3.22875
    
    [2024-07-25T14:48:34.922Z] Worker process started and initialized.
    
    Functions:
    
            HttpExample:  http://localhost:7071/api/HttpExample

    You might get an error like No job functions found. Try making your job classes and methods public. If this happens, check your local.settings.json file and make sure AzureWebJobsStorage is set to UseDevelopmentStorage=true. The local.settings.json file is in the root of the project directory you created with func init.

    {
      "IsEncrypted": false,
      "Values": {
        "FUNCTIONS_WORKER_RUNTIME": "python",
        "AzureWebJobsFeatureFlags": "EnableWorkerIndexing",
        "AzureWebJobsStorage": "UseDevelopmentStorage=true"
      }
    }
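
    While func start is running, you can also call the endpoint from another terminal. For example:

    curl "http://localhost:7071/api/HttpExample?name=datastax"

    Response:

    Hello, datastax. This HTTP triggered function executed successfully.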

Deploy the function to the cloud

  1. Sign into your Azure account:

    az login
  2. Create a resource group named AzureFunctionsQuickstart in your chosen region. Replace REGION with a region near you, using an available region code returned from the az account list-locations command.

    az group create --name AzureFunctionsQuickstart --location REGION
  3. Create a general-purpose storage account in your resource group and region. Replace STORAGE_NAME with a name that’s appropriate to you and unique in Azure Storage.

    --sku Standard_LRS specifies a general-purpose account supported by Azure Functions.

    az storage account create --name STORAGE_NAME --location REGION --resource-group AzureFunctionsQuickstart --sku Standard_LRS
  4. Create the function app in Azure. Replace uniqueapplication with a globally unique name for your application, and replace STORAGE_NAME with the storage account name from the previous step. You must supply --os-type linux because Python functions only run on Linux.

    az functionapp create --resource-group AzureFunctionsQuickstart --consumption-plan-location REGION --runtime python --runtime-version 3.9 --functions-version 4 --name uniqueapplication --os-type linux --storage-account STORAGE_NAME

    This command provisions the function app in Azure. Your code isn’t deployed to it yet.

  5. Publish your project code to the function app to make it publicly available:

    func azure functionapp publish uniqueapplication
  6. To test the deployed application, send a request to the application’s endpoint:

    curl "https://uniqueapplication.azurewebsites.net/api/HttpExample?name=datastax"
    Response:
    Hello, NAME. This HTTP triggered function executed successfully.

    You can also view, test, and debug the function in the Azure portal.
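
    Because the function also reads name from a JSON request body (the req.get_json() fallback in the code), you can send a POST request instead:

    curl -X POST "https://uniqueapplication.azurewebsites.net/api/HttpExample" \
      -H "Content-Type: application/json" \
      -d '{"name": "datastax"}'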

Create the Azure function with the Python Cassandra driver

Now that you have a working template application, modify the local application and dependencies to connect to Astra DB with the Python Apache Cassandra® driver.

  1. Set the following environment variables:

    • APP_NAME: The name of your application.

    • RESOURCE_GROUP_NAME: The name of the resource group you created.

    • ASTRA_DB_CLIENT_ID: The literal, all-lowercase word token.

    • ASTRA_DB_CLIENT_SECRET: Your application token, which is prefixed by AstraCS:.

      export APP_NAME=APPLICATION_NAME
      export RESOURCE_GROUP_NAME=AZURE_RESOURCE_GROUP_NAME
      export ASTRA_DB_CLIENT_ID=token
      export ASTRA_DB_CLIENT_SECRET=APPLICATION_TOKEN
  2. Use the Azure CLI to add ASTRA_DB_CLIENT_ID and ASTRA_DB_CLIENT_SECRET to the application settings:

    az functionapp config appsettings set \
      --name ${APP_NAME} \
      --resource-group ${RESOURCE_GROUP_NAME} \
      --settings "ASTRA_DB_CLIENT_ID=${ASTRA_DB_CLIENT_ID}"
    
    az functionapp config appsettings set \
      --name ${APP_NAME} \
      --resource-group ${RESOURCE_GROUP_NAME} \
      --settings "ASTRA_DB_CLIENT_SECRET=${ASTRA_DB_CLIENT_SECRET}"
  3. In the Azure portal, verify the environment variables in your app settings.
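
    Alternatively, you can list the settings with the Azure CLI:

    az functionapp config appsettings list \
      --name ${APP_NAME} \
      --resource-group ${RESOURCE_GROUP_NAME}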

  4. Move your database’s Secure Connect Bundle (SCB) zip file to the root of the project directory.

  5. Add the Python Cassandra driver to the requirements.txt file:

    echo "cassandra-driver" | cat >> requirements.txt
    cat requirements.txt
  6. Replace the function_app.py content with the following code, and then replace secure-connect-bundle-for-your-database.zip with the file name for your database’s SCB:

    import azure.functions as func
    import logging
    import os
    from cassandra.cluster import Cluster
    from cassandra.auth import PlainTextAuthProvider

    # Read the Astra DB credentials from the app settings you configured earlier.
    ASTRA_DB_CLIENT_ID = os.environ.get('ASTRA_DB_CLIENT_ID')
    ASTRA_DB_CLIENT_SECRET = os.environ.get('ASTRA_DB_CLIENT_SECRET')

    if not ASTRA_DB_CLIENT_ID or not ASTRA_DB_CLIENT_SECRET:
        raise ValueError("Environment variables ASTRA_DB_CLIENT_ID and ASTRA_DB_CLIENT_SECRET must be set")

    # The SCB contains the endpoints and certificates needed to reach your database.
    cloud_config = {
        'secure_connect_bundle': 'secure-connect-bundle-for-your-database.zip',
        'use_default_tempdir': True
    }
    auth_provider = PlainTextAuthProvider(ASTRA_DB_CLIENT_ID, ASTRA_DB_CLIENT_SECRET)

    # Create the cluster once at module load so the connection is reused across invocations.
    cluster = Cluster(
        cloud=cloud_config,
        auth_provider=auth_provider,
        protocol_version=4
    )

    app = func.FunctionApp()

    @app.route(route="HttpExample", auth_level=func.AuthLevel.ANONYMOUS)
    def HttpExample(req: func.HttpRequest) -> func.HttpResponse:
        session = cluster.connect()
        session.default_timeout = 60
        # Query a system table to confirm connectivity and return the CQL version.
        row = session.execute("SELECT cql_version FROM system.local WHERE key = 'local';").one()
        cql_version = row[0]
        logging.info(f"{cql_version} Success")
        return func.HttpResponse(f"{cql_version} Success")
  7. Test the function locally:

    func start

    From the output, you can visit the HttpExample endpoint, which returns VERSION Success, where VERSION is your database’s CQL version.

    Found Python version 3.9.6 (python3).
    
    Azure Functions Core Tools
    Core Tools Version:       4.0.5907 Commit hash: N/A +807e89766a92b14fd07b9f0bc2bea1d8777ab209 (64-bit)
    Function Runtime Version: 4.834.3.22875
    
    [2024-07-25T14:48:34.922Z] Worker process started and initialized.
    
    Functions:
    
            HttpExample:  http://localhost:7071/api/HttpExample
  8. Deploy the updated function to Azure:

    func azure functionapp publish uniqueapplication

    A successful build returns Remote build succeeded and an invoke URL for your application.

  9. To test your application, send a GET request to your application’s endpoint:

    curl "https://uniqueapplication.azurewebsites.net/api/HttpExample"
    Response:

    A successful response contains VERSION Success, where VERSION is your Astra DB CQL version.

    3.4.5 Success

Your Azure function is now integrated with Astra DB Serverless through the Python Cassandra driver. You can extend or modify the function to interact with your Astra DB Serverless database, as in the sketch below.
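
For example, a route that queries one of your own tables might look like the following sketch. The keyspace (library), table (books), and author query parameter are illustrative assumptions, not part of this guide; replace them with your own schema.

    import json  # add to the imports at the top of function_app.py

    # Hypothetical route: the keyspace (library) and table (books) are
    # assumptions for illustration; replace them with your own schema.
    @app.route(route="BooksByAuthor", auth_level=func.AuthLevel.ANONYMOUS)
    def BooksByAuthor(req: func.HttpRequest) -> func.HttpResponse:
        session = cluster.connect()
        session.default_timeout = 60
        author = req.params.get('author', 'unknown')
        # Bound parameters (%s) let the driver safely escape user input.
        rows = session.execute(
            "SELECT title, author FROM library.books WHERE author = %s ALLOW FILTERING",
            [author]
        )
        results = [{"title": row.title, "author": row.author} for row in rows]
        return func.HttpResponse(json.dumps(results), mimetype="application/json")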
