Portfolio Manager demo using Spark

The Portfolio Manager demo runs an application that is based on a financial use case. You run scripts that create a portfolio of stocks.

About this task

On the OLTP (online transaction processing) side, each portfolio contains a list of stocks, the number of shares purchased, and the purchase price. The demo’s pricer utility simulates real-time stock data, and each portfolio is updated with its overall value and the percentage of gain or loss compared to the purchase price. The utility also generates 100 days of historical market data (the end-of-day price) for each stock. On the DSE OLAP (online analytical processing) side, a Spark job calculates the greatest historical 10-day loss for each portfolio, which is an indicator of the risk associated with that portfolio. This information is then fed back into the real-time application so that customers can better gauge their potential losses.
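
To make the calculation concrete, the following is a minimal Scala sketch of the sliding-window logic: given one portfolio's daily end-of-day values, it finds the largest loss over any 10-day period. The object name, method name, and sample data are illustrative assumptions, not the demo's source code (which is in the src directory).

    // Illustrative sketch only; names and data are assumptions, not the demo's source.
    object TenDayLossSketch {

      // Largest drop in value over any window of `days` consecutive daily values.
      // Returns 0.0 if the series never loses value over such a window.
      def greatestLoss(dailyValues: Seq[Double], days: Int = 10): Double =
        dailyValues
          .sliding(days)                              // every consecutive 10-day window
          .map(window => window.head - window.last)   // loss = starting value minus ending value
          .foldLeft(0.0)((worst, loss) => math.max(worst, loss))

      def main(args: Array[String]): Unit = {
        // 100 days of made-up end-of-day portfolio values
        val history = Seq.tabulate(100)(day => 1000.0 + 50 * math.sin(day / 7.0) - day * 0.5)
        println(f"Greatest 10-day loss: ${greatestLoss(history)}%.2f")
      }
    }

The demo performs the equivalent computation with Spark across all portfolios and the generated 100 days of history, rather than on a single in-memory sequence.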

Procedure

  1. Before you run the demo, verify that LDAP and internal authorization (username/password) are disabled.

    DataStax demos do not work with LDAP or internal authorization enabled.

  2. Install a node.

    For tarball installations, the Portfolio Manager demo is installed as part of the normal installation. For package installations, you must also run the command that installs the demos.

    • Default interface: localhost (127.0.0.1). You must use this IP address for the demo.

  3. Start DataStax Enterprise as a DSE Analytics node:

    • For package installations:

      1. In /etc/default/dse, set:

        SPARK_ENABLED=1
      2. Start the node:

        sudo service dse start
    • For tarball installations:

      installation_location/bin/dse cassandra -k  # Starts the node in analytics mode
  4. Go to the Portfolio Manager demo directory.

    The default location of the Portfolio Manager demo depends on the type of installation:

    • Package installations: /usr/share/dse/demos/portfolio_manager

    • Tarball installations: installation_location/demos/portfolio_manager

  5. Run the bin/pricer utility to generate stock data for the application:

    • To see all of the available options for this utility:

      bin/pricer --help
    • Start the pricer utility:

      bin/pricer -o INSERT_PRICES &&
      bin/pricer -o UPDATE_PORTFOLIOS &&
      bin/pricer -o INSERT_HISTORICAL_PRICES -n 100

    The pricer utility takes several minutes to run.

  6. Start the web service:

    cd website &&
    sudo ./start
  7. Open a browser and go to http://localhost:8983/portfolio.

    The real-time Portfolio Manager demo application is displayed.

  8. Open another terminal.

  9. Run the Spark SQL job in the 10-day-loss.q file.

    dse spark-sql -f 10-day-loss.q
  10. Run the equivalent Spark Scala job in the 10-day-loss.sh script. (A sketch of what such a job might look like appears after this procedure.)

    The Spark application takes several minutes to run.

    ./10-day-loss.sh
  11. Run the equivalent Spark Java job in the 10-day-loss-java.sh script.

    ./10-day-loss-java.sh
  12. After the job completes, refresh the Portfolio Manager web page.

    The results of the Largest Historical 10-Day Loss for each portfolio are displayed.

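The Spark SQL, Scala, and Java jobs in steps 9 through 11 all produce the same result. As a rough illustration of what such a job can look like in Scala, the following sketch reads daily portfolio values from a Cassandra table through the Spark Cassandra Connector and computes the worst 10-day loss per portfolio with a window function. The keyspace, table, and column names are placeholders; the demo's actual schema and queries are defined in 10-day-loss.q and the src directory.

    // Hedged sketch only: keyspace, table, and column names are placeholders,
    // not the demo's actual schema. Requires the Spark Cassandra Connector on the classpath.
    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.expressions.Window
    import org.apache.spark.sql.functions._

    object TenDayLossJobSketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder.appName("10-day-loss-sketch").getOrCreate()

        // Read daily historical portfolio values (placeholder keyspace and table names).
        val history = spark.read
          .format("org.apache.spark.sql.cassandra")
          .options(Map("keyspace" -> "portfolio_demo", "table" -> "portfolio_history"))
          .load() // columns assumed: portfolio, date, value

        // Pair each day's value with the value 9 rows later (a 10-day window, inclusive).
        val byDate = Window.partitionBy("portfolio").orderBy("date")
        val losses = history.withColumn("loss_10d", col("value") - lead("value", 9).over(byDate))

        // Worst (largest) 10-day loss per portfolio.
        val worst = losses.groupBy("portfolio").agg(max("loss_10d").alias("worst_10day_loss"))

        worst.show(20, truncate = false)
        spark.stop()
      }
    }

In the demo itself, the computed losses are fed back into the database so that the web application can display them; this sketch only prints them.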

What’s next

The Scala and Java source code for the demo is in the src directory.
