RAGStack Python packages

RAGStack comes with a set of Python packages that provide the necessary tools to implement the RAG pattern in your applications.

  1. ragstack-ai: All-in-one package that contains all components supported by RAGStack. While this is the most convenient package to use, it may be heavier than necessary for some use cases.

  2. ragstack-ai-langchain: This package is meant for users who want to use RAGStack with the LangChain framework.

  3. ragstack-ai-llamaindex: This package is meant for users who want to use RAGStack with the LlamaIndex framework.

  4. ragstack-ai-langflow: This package is meant for users who want to use RAGStack with the Langflow framework.

  5. ragstack-ai-colbert: This package contains an implementation of ColBERT retrieval.
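
Each package is published on PyPI and can be installed with pip, for example:

  # All-in-one install
  pip install ragstack-ai

  # Or a single framework-specific package
  pip install ragstack-ai-langchain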

Supported integrations for ragstack-ai-langchain

The ragstack-ai-langchain package includes the minimum set of dependencies for using LangChain with Astra DB. LLMs, embeddings, and third-party providers are not included in this package by default, except for OpenAI and Azure OpenAI.
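
As a minimal sketch, the following shows LangChain's Astra DB vector store used with the OpenAI embeddings included with the package. It assumes ragstack-ai-langchain pulls in langchain-astradb and langchain-openai as described above, and that your Astra DB API endpoint, Astra DB application token, and OpenAI API key are set as environment variables; the collection name is illustrative:

  import os

  from langchain_astradb import AstraDBVectorStore
  from langchain_openai import OpenAIEmbeddings

  vector_store = AstraDBVectorStore(
      embedding=OpenAIEmbeddings(),      # reads OPENAI_API_KEY from the environment
      collection_name="ragstack_demo",   # illustrative collection name
      api_endpoint=os.environ["ASTRA_DB_API_ENDPOINT"],
      token=os.environ["ASTRA_DB_APPLICATION_TOKEN"],
  )

  # Index a document and run a similarity search against Astra DB.
  vector_store.add_texts(["RAGStack bundles LangChain and Astra DB integrations."])
  results = vector_store.similarity_search("What does RAGStack bundle?", k=1)
  print(results[0].page_content)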

To use LLMs, embeddings, or third-party providers, you can leverage ragstack-ai-langchain extras:

  1. ragstack-ai-langchain[google] lets you work with Google Vertex AI and Google Gemini API.

  2. ragstack-ai-langchain[nvidia] lets you work with NVIDIA hosted API endpoints for NVIDIA AI Foundation Models.
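
Extras are installed with pip's bracket syntax, for example:

  pip install "ragstack-ai-langchain[google]"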

Additional LangChain packages should work out of the box, although you need to manage the packages and their dependencies yourself.

Supported integrations for ragstack-ai-llamaindex

The ragstack-ai-llamaindex package includes the minimum set of dependencies for using LlamaIndex with Astra DB. LLMs, embeddings, and third-party providers are not included in this package by default, except for OpenAI.
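
A comparable sketch with LlamaIndex, again assuming Astra DB credentials and an OpenAI API key in the environment; the collection name is illustrative, and the embedding dimension matches the default OpenAI embedding model (1536 dimensions):

  import os

  from llama_index.core import Document, StorageContext, VectorStoreIndex
  from llama_index.vector_stores.astra_db import AstraDBVectorStore

  vector_store = AstraDBVectorStore(
      token=os.environ["ASTRA_DB_APPLICATION_TOKEN"],
      api_endpoint=os.environ["ASTRA_DB_API_ENDPOINT"],
      collection_name="ragstack_demo",   # illustrative collection name
      embedding_dimension=1536,          # default OpenAI embedding model dimension
  )

  # Build an index on top of the Astra DB-backed vector store and query it.
  storage_context = StorageContext.from_defaults(vector_store=vector_store)
  index = VectorStoreIndex.from_documents(
      [Document(text="RAGStack bundles LlamaIndex and Astra DB integrations.")],
      storage_context=storage_context,
  )
  print(index.as_query_engine().query("What does RAGStack bundle?"))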

To use LLMs, embeddings, or third-party providers, you can leverage ragstack-ai-llamaindex extras:

  1. ragstack-ai-llamaindex[google] lets you work with Google Vertex AI and Google Gemini API.

  2. ragstack-ai-llamaindex[azure] lets you work with Azure OpenAI.

  3. ragstack-ai-llamaindex[bedrock] lets you work with AWS Bedrock.

Additional LlamaIndex packages should work out of the box, although you need to manage the packages and their dependencies yourself.

Supported integrations for ragstack-ai-langflow

The ragstack-ai-langflow package contains a curated set of dependencies for using Langflow with Astra DB, along with all the integrations supported by ragstack-ai-langchain.

All of Langflow’s built-in integrations are included in the ragstack-ai-langflow package.
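
For example, you can install the package and then start the Langflow visual editor locally (assuming the standard langflow CLI entry point):

  pip install ragstack-ai-langflow
  langflow run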

ColBERT with ragstack-ai-langchain and ragstack-ai-llamaindex

The colbert module provides a vanilla implementation of ColBERT retrieval. It is not tied to any specific framework and can be used with any of the RAGStack packages.

If you want to use ColBERT with LangChain or LlamaIndex, you can use the following extras:

  1. ragstack-ai-langchain[colbert]

  2. ragstack-ai-llamaindex[colbert]
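
For example:

  pip install "ragstack-ai-langchain[colbert]"
  pip install "ragstack-ai-llamaindex[colbert]"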
