Integrate AWS Lambda with Astra DB Serverless

AWS Lambda is a functions-as-a-service (FaaS) offering that provides a serverless execution environment for your code. Lambda functions consist of function code and dependencies bundled into a deployment package.

You can use AWS Lambda with Astra DB Serverless for additional data processing, such as aggregating, summarizing, and validating data, or to connect Astra DB with other cloud services in data pipelines, among other use cases.

If you are new to AWS Lambda, consider creating a simple function first to familiarize yourself with the deployment workflow.

This guide explains how to create, deploy, and test functions that use the DataStax Python and Java drivers to connect to your Astra DB database.

Prerequisites

Before you begin, you need:

  - An active Astra DB Serverless database and its secure connect bundle (SCB) zip file.
  - An application token for the database; its clientId and secret serve as the connection credentials.
  - An AWS account with access to the AWS Lambda console.
  - Python and pip for the Python driver function, or Java and Apache Maven for the Java driver function.

Create a Python driver function

This function’s deployment package includes the DataStax Python driver library, a Python script that connects to your Astra DB database, and your database’s secure connect bundle (SCB), which is used to authenticate the connection.

  1. Create a directory for your Python project:

    mkdir lambda-astra-db-project
    cd lambda-astra-db-project
  2. Move your database’s SCB zip file into the project directory.

  3. Create a Python script file that contains the following code. If you keep the default Lambda handler setting (lambda_function.lambda_handler), name the file lambda_function.py:

    from cassandra.cluster import Cluster
    from cassandra.auth import PlainTextAuthProvider
    import os

    # Read the application token credentials from the function's environment variables.
    ASTRA_DB_CLIENT_ID = os.environ.get('ASTRA_DB_CLIENT_ID')
    ASTRA_DB_CLIENT_SECRET = os.environ.get('ASTRA_DB_CLIENT_SECRET')

    cloud_config = {
        'secure_connect_bundle': 'SCB_FILE_NAME.zip',
        # Lambda's filesystem is read-only outside /tmp, so let the driver
        # extract the bundle into the default temporary directory.
        'use_default_tempdir': True
    }
    auth_provider = PlainTextAuthProvider(ASTRA_DB_CLIENT_ID, ASTRA_DB_CLIENT_SECRET)
    # Connect at module scope so that warm invocations reuse the session
    # instead of reconnecting on every request.
    cluster = Cluster(cloud=cloud_config, auth_provider=auth_provider, protocol_version=4)
    session = cluster.connect()

    def lambda_handler(event, context):
        row = session.execute("SELECT cql_version FROM system.local WHERE key = 'local';").one()
        cql_version = row[0]

        print(cql_version)
        print('Success')

        return cql_version

    Replace SCB_FILE_NAME with the SCB file name, such as secure-connect-DATABASE_NAME.zip.
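The handler above returns the CQL version as a plain string. If you later put the function behind an API Gateway proxy integration, the handler must instead return a dict with a statusCode and a JSON string body. A minimal sketch of that wrapping, assuming the illustrative helper name build_response (not part of the driver or of Lambda):

```python
import json

def build_response(cql_version: str) -> dict:
    # Lambda proxy integrations expect a statusCode and a JSON string body.
    return {
        "statusCode": 200,
        "body": json.dumps({"cql_version": cql_version}),
    }
```

Inside lambda_handler, you would end with `return build_response(cql_version)` instead of returning the raw string.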

  4. Install the Python cassandra-driver library in the project directory:

    pip install --target . cassandra-driver
  5. From inside the project directory, create a zip archive deployment package so that the SCB, the Python script, and the cassandra-driver library all sit at the archive root:

    zip -r lambda-astra-db-deployment-package.zip .
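Before uploading, you can sanity-check that the archive actually contains your script and the SCB at its root. A quick sketch using the standard zipfile module (the helper name and file names are examples, not part of any library):

```python
import zipfile

def missing_from_package(zip_path, required):
    """Return the required file names that are absent from the archive."""
    with zipfile.ZipFile(zip_path) as zf:
        names = set(zf.namelist())
    return [name for name in required if name not in names]
```

If, say, missing_from_package('lambda-astra-db-deployment-package.zip', ['lambda_function.py', 'secure-connect-DATABASE_NAME.zip']) returns a non-empty list, re-create the archive from inside the project directory so the paths stay relative to the root.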
  6. In the AWS Lambda console, go to Functions, and then click Create function.

  7. Click Author from scratch.

  8. Enter a function name, select a Python runtime, select an architecture, and then click Create function.

  9. On the Code tab, in the Code source section, click Upload from, and then select your Python function deployment package.

  10. On the Configuration tab, set the following environment variables:

    ASTRA_DB_CLIENT_ID: The clientId of an application token with access to your database.
    ASTRA_DB_CLIENT_SECRET: The secret of the same application token.

  11. On the Test tab, test your function in the Lambda console.

    A successful result includes the CQL version, such as 3.4.5, and a 200 OK status code.
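Because the credentials come from environment variables, forgetting to set one usually surfaces later as a confusing authentication or connection error. A fail-fast check you could add near the top of the script (the require_env helper is an illustration, not part of the driver):

```python
import os

def require_env(*names):
    """Return the requested environment variables, failing fast if any is unset."""
    missing = [name for name in names if not os.environ.get(name)]
    if missing:
        raise RuntimeError("Missing environment variables: " + ", ".join(missing))
    return [os.environ[name] for name in names]
```

For example, `client_id, client_secret = require_env('ASTRA_DB_CLIENT_ID', 'ASTRA_DB_CLIENT_SECRET')` makes a missing variable fail immediately with a clear message.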

Create a Java driver function

This function’s deployment package includes the function’s dependencies, a Java class that connects to your Astra DB database, and your database’s SCB, which is used for authentication. The dependencies include the DataStax Java driver library and aws-lambda-java-core.

This example uses Apache Maven™ to create the project and build the deployment package.

  1. Use Maven to create a Java project:

    mvn archetype:generate -DgroupId=com.example -DartifactId=AstraDBFunction -DinteractiveMode=false
  2. Rename App.java to AstraDBFunction.java, and then replace the content with the following code:

    package com.example;
    
    import com.amazonaws.services.lambda.runtime.Context;
    import com.amazonaws.services.lambda.runtime.RequestHandler;
    import com.amazonaws.services.lambda.runtime.LambdaLogger;
    
    import com.datastax.oss.driver.api.core.CqlSession;
    import com.datastax.oss.driver.api.core.cql.ResultSet;
    import com.datastax.oss.driver.api.core.cql.Row;
    import java.nio.file.Paths;
    
    import java.util.Map;
    
    public class AstraDBFunction implements RequestHandler<Map<String,String>, String>{
    
        private static final String ASTRA_DB_CLIENT_ID = System.getenv("ASTRA_DB_CLIENT_ID");
        private static final String ASTRA_DB_CLIENT_SECRET = System.getenv("ASTRA_DB_CLIENT_SECRET");
    
        // Build the session in static scope so that warm invocations reuse it
        // instead of reconnecting on every request. Lambda extracts the
        // deployment package into the working directory, so the bundled SCB
        // is readable by its relative path.
        private static CqlSession session = CqlSession.builder()
                .withCloudSecureConnectBundle(Paths.get("SCB_FILE_NAME.zip"))
                .withAuthCredentials(ASTRA_DB_CLIENT_ID, ASTRA_DB_CLIENT_SECRET)
                .build();
    
        public String handleRequest(Map<String,String> event, Context context) {
            LambdaLogger logger = context.getLogger();
    
            ResultSet rs = session.execute("SELECT cql_version FROM system.local WHERE key = 'local';");
            Row row = rs.one();
            String response = row.getString("cql_version");
    
            logger.log(response + " Success \n");
    
            return response;
        }
    }

    Replace SCB_FILE_NAME with the SCB file name, such as secure-connect-DATABASE_NAME.zip.

  3. In the project directory, create a resources directory in /src/main.

  4. Move your database’s SCB zip file into the resources directory.

  5. Make sure that your project has the following directory structure:

    AstraDBFunction
    ├── src
    │   ├── main
    │   │   ├── java
    │   │   │   └── com
    │   │   │       └── example
    │   │   │           └── AstraDBFunction.java
    │   │   └── resources
    │   │       └── secure-connect-DATABASE_NAME.zip
    │   └── test
    └── pom.xml
  6. Add the AWS Lambda and Java driver dependencies to your pom.xml file:

    <dependency>
      <groupId>com.amazonaws</groupId>
      <artifactId>aws-lambda-java-core</artifactId>
      <!-- Use the latest version from https://central.sonatype.dev/artifact/com.amazonaws/aws-lambda-java-core/1.2.2/versions -->
      <version>VERSION</version>
    </dependency>
    <dependency>
      <groupId>com.amazonaws</groupId>
      <artifactId>aws-lambda-java-events</artifactId>
      <!-- Use the latest version from https://central.sonatype.dev/artifact/com.amazonaws/aws-lambda-java-events/3.11.0/versions -->
      <version>VERSION</version>
    </dependency>
    <dependency>
      <groupId>com.amazonaws</groupId>
      <artifactId>aws-lambda-java-log4j2</artifactId>
      <!-- Use the latest version from https://central.sonatype.dev/artifact/com.amazonaws/aws-lambda-java-log4j2/1.5.1/versions -->
      <version>VERSION</version>
    </dependency>
    <dependency>
      <groupId>com.datastax.oss</groupId>
      <artifactId>java-driver-core</artifactId>
      <!-- Use the latest version from https://search.maven.org/artifact/com.datastax.oss/java-driver-core -->
      <version>VERSION</version>
    </dependency>
  7. Add or replace the pom.xml file’s build section with the following code:

    <build>
        <plugins>
            <plugin>
                <artifactId>maven-surefire-plugin</artifactId>
                <version>2.22.2</version>
            </plugin>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-shade-plugin</artifactId>
                <version>3.2.2</version>
                <configuration>
                    <createDependencyReducedPom>false</createDependencyReducedPom>
                    <filters>
                        <filter>
                            <artifact>*:*</artifact>
                            <excludes>
                                <exclude>**/Log4j2Plugins.dat</exclude>
                            </excludes>
                        </filter>
                    </filters>
                </configuration>
                <executions>
                    <execution>
                        <phase>package</phase>
                        <goals>
                            <goal>shade</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <version>3.8.1</version>
                <configuration>
                    <source>11</source>
                    <target>11</target>
                </configuration>
            </plugin>
        </plugins>
    </build>
  8. Use Maven to compile the project into a JAR file:

    mvn clean compile package

    The JAR file is your Java function deployment package. You can find the deployment package, such as AstraDBFunction-1.0-SNAPSHOT.jar, in the project’s target directory:

    AstraDBFunction
    ├── src
    │   ├── main
    │   │   ├── java
    │   │   │   └── com
    │   │   │       └── example
    │   │   │           └── AstraDBFunction.java
    │   │   └── resources
    │   │       └── secure-connect-DATABASE_NAME.zip
    │   └── test
    ├── target
    │   ├── AstraDBFunction-1.0-SNAPSHOT.jar
    │   ├── original-AstraDBFunction-1.0-SNAPSHOT.jar
    │   ├── maven-archiver
    │   ├── surefire-reports
    │   ├── test-classes
    │   ├── generated-test-sources
    │   ├── maven-status
    │   ├── generated-sources
    │   └── classes
    └── pom.xml
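A shaded JAR is a zip archive, and files from src/main/resources land at its root, which is why the SCB is readable by relative path at runtime. You can confirm the bundle was actually packaged with a short check, sketched here with Python's standard zipfile module (the file names are examples):

```python
import zipfile

def jar_contains(jar_path, entry):
    # A JAR is a zip archive; src/main/resources files sit at its root.
    with zipfile.ZipFile(jar_path) as jar:
        return entry in jar.namelist()
```

For example, jar_contains('target/AstraDBFunction-1.0-SNAPSHOT.jar', 'secure-connect-DATABASE_NAME.zip') should return True before you upload the package.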
  9. In the AWS Lambda console, go to Functions, and then click Create function.

  10. Click Author from scratch.

  11. Enter a function name, select a Java runtime, select an architecture, and then click Create function.

  12. On the Code tab, in the Code source section, click Upload from, and then select your Java function deployment package.

  13. In the Runtime settings section, set Handler to com.example.AstraDBFunction::handleRequest.


  14. On the Configuration tab, set the following environment variables:

    ASTRA_DB_CLIENT_ID: The clientId of an application token with access to your database.
    ASTRA_DB_CLIENT_SECRET: The secret of the same application token.

  15. On the Test tab, test your function in the Lambda console.

    A successful result includes the CQL version, such as 3.4.5, and a 200 OK status code.


© 2024 DataStax | Privacy policy | Terms of use
