Installing DataStax Enterprise 4.5 using the binary tarball

Install DataStax Enterprise 4.5 on any Linux-based platform, including 32-bit platforms.

For a complete list of supported platforms, see DataStax Enterprise Supported Platforms.

Important: DataStax Enterprise 4.5 uses Cassandra 2.0.

Prerequisites

  • All Linux platforms:
    • DataStax Academy registration email address and password.
    • Latest version of Oracle Java SE Runtime Environment 7, not OpenJDK. See Installing Oracle JDK.
    • Java Native Access (JNA). The recommended versions are 3.2.4 to 3.2.7. Do not install version 4.0 and above. See Installing the JNA.
  • Debian/Ubuntu distributions:
  • RedHat-compatible distributions:
    • If installing on a 64-bit Oracle Linux distribution, first install the 32-bit versions of the glibc libraries (example commands follow this list).
    • If you are using an older RHEL-based Linux distribution, such as CentOS-5, you may need to replace the Snappy compression/decompression library; see the DataStax Enterprise 4.5.0 Release Notes.
    • Before installing, make sure EPEL (Extra Packages for Enterprise Linux) is installed. See Installing EPEL on RHEL OS 5.x.
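    For the RedHat-compatible items, the commands are typically along these lines (the glibc package name and the EPEL release RPM shown here are assumptions; see the linked pages for the exact packages for your release):
    $ sudo yum install glibc.i686
    $ sudo rpm -Uvh epel-release-5-4.noarch.rpm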

Also see Recommended production settings and the DataStax Enterprise Reference Architecture white paper.

The binary tarball runs as a stand-alone process.

Procedure

These steps install DataStax Enterprise. After installing, you must configure and start DataStax Enterprise.

In a terminal window:
Note: In the following commands, be sure to change X to an actual version number. To view the available versions, see the Release notes. The latest version of DataStax Enterprise 4.5 is 4.5.9.

  1. Check which version of Java is installed:
    $ java -version

    If not Oracle Java 7, see Installing Oracle JDK.

    Important: Package management tools do not install Oracle Java.
  2. Download the tarball from the Download DataStax Enterprise page.

    You will need the DataStax Academy account credentials from your registration. Be sure to use your registration email address, not your username.

    Note: For production installations, DataStax recommends installing OpsCenter separately from the cluster. See the OpsCenter documentation.
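    If you prefer to download from the command line, a minimal curl sketch looks like the following (the URL and filename are assumptions; copy the actual link from the Download page and substitute your registration email address and password):
    $ curl --user 'registration_email_address:password' -O -L http://downloads.datastax.com/enterprise/dse-4.5.X.tar.gz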
  3. Unpack the distribution:
    $ tar -xzvf dse-4.5.X.tar.gz
  4. If you do not have root access to the default directory locations, either define your own directory locations (as described in the following steps) or change the ownership of the following directories:
    • /var/lib/cassandra
    • /var/log/cassandra
    • /var/lib/spark
    • /var/log/spark
    $ sudo mkdir -p /var/lib/cassandra; sudo chown -R $USER:$GROUP /var/lib/cassandra
    $ sudo mkdir -p /var/log/cassandra; sudo chown -R $USER:$GROUP /var/log/cassandra
    $ sudo mkdir -p /var/lib/spark; sudo chown -R $USER:$GROUP /var/lib/spark
    $ sudo mkdir -p /var/log/spark; sudo chown -R $USER:$GROUP /var/log/spark
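    To verify that the directories now belong to your user, list them:
    $ ls -ld /var/lib/cassandra /var/log/cassandra /var/lib/spark /var/log/spark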
  5. Optional: If you do not want to use the default data and logging directories, you can define your own directory locations:
    1. Create the data and logging directories:
      $ mkdir install_location/dse-data
      $ cd install_location/dse-data
      $ mkdir commitlog
      $ mkdir saved_caches
    2. Go to the directory containing the cassandra.yaml file:
      $ cd install_location/resources/cassandra/conf
    3. Edit the following lines in the cassandra.yaml file (data_file_directories takes a list of directories):
      data_file_directories:
          - install_location/dse-data
      commitlog_directory: install_location/dse-data/commitlog
      saved_caches_directory: install_location/dse-data/saved_caches
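      For example, if the tarball were unpacked in /home/dse/dse-4.5.X (an assumed location), the edited lines would read:
      data_file_directories:
          - /home/dse/dse-4.5.X/dse-data
      commitlog_directory: /home/dse/dse-4.5.X/dse-data/commitlog
      saved_caches_directory: /home/dse/dse-4.5.X/dse-data/saved_caches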
  6. Optional: If you do not want to use the default Spark directories, you can define your own directory locations:
    1. Create the Spark lib and log directories.
    2. Go to the directory containing the spark-env.sh file:
      • Installer-Services and Package installations: /etc/dse/spark/spark-env.sh
      • Installer-No Services and Tarball installations: install_location/resources/spark/conf/spark-env.sh
    3. Edit the spark-env.sh file to match the locations of your Spark lib and log directories, as described in Spark configuration.
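      As a sketch, assuming the directories created in the first substep are /home/dse/spark/lib and /home/dse/spark/log, and that your spark-env.sh uses the standard Spark variables SPARK_LOG_DIR and SPARK_WORKER_DIR for the log and lib locations:
      $ mkdir -p /home/dse/spark/lib /home/dse/spark/log
      Then edit the corresponding lines in spark-env.sh:
      export SPARK_LOG_DIR="/home/dse/spark/log"
      export SPARK_WORKER_DIR="/home/dse/spark/lib"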

Results

DataStax Enterprise is ready for configuration.
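After configuration, a tarball installation is typically started from the installation directory with the dse command; for example, to start the node in real-time (Cassandra) mode:
$ install_location/bin/dse cassandra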

What's next