Warning

This section is currently a work in progress!

The Spark UI is mainly used to monitor the performance of your Spark setup. However, depending on your Spark configuration, it may also expose sensitive data such as the keystore password in some of its UIs. For security purposes, you can disable the Spark UI completely, disable selected UIs, or disable selected properties.

Info

For more information about the Spark UI, see Spark Troubleshooting.

The following sections cover the different methods to disable the Spark UI.


Disabling the Spark UI completely

To disable the Spark UI completely, follow the steps below.

  1. Build a sample servlet filter (a minimal sketch is given after this list).

    Info

    You can use any servlet filter for this purpose. If you want to implement your own servlet filter, you can follow the instructions in Apache Tomcat - Interface Filter.

  2. Place the built JAR in the <DAS_HOME>/repository/components/lib directory.
  3. Add the following properties to the <DAS_HOME>/repository/conf/analytics/spark/spark-defaults.conf file.

    Code Block
    spark.ui.filters com.neolitec.examples.BasicAuthenticationFilter
    spark.com.neolitec.examples.BasicAuthenticationFilter.params username=admin,password=admin
  4. Add the following property to the <DAS_HOME>/repository/conf/analytics/spark/external-spark-classpath.conf file.

    Code Block
    repository/components/lib/basicAuthenticationFilter-0.0.1-SNAPSHOT.jar
  5. Restart the DAS cluster.
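
The sample servlet filter referenced in step 1 is not reproduced in this document. As a rough sketch, a basic authentication filter compatible with the configuration in step 3 could look like the following. The package and class name match the com.neolitec.examples.BasicAuthenticationFilter entry above, but the implementation itself is an illustrative assumption, not the exact sample code.

Code Block
package com.neolitec.examples;

import java.io.IOException;
import java.util.Base64;

import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

/**
 * A minimal HTTP Basic Authentication filter for the Spark UI.
 * The expected credentials come from the filter parameters that
 * Spark reads from the *.params property in spark-defaults.conf.
 */
public class BasicAuthenticationFilter implements Filter {

    private String username;
    private String password;

    @Override
    public void init(FilterConfig filterConfig) {
        // Populated from "...BasicAuthenticationFilter.params username=...,password=...".
        username = filterConfig.getInitParameter("username");
        password = filterConfig.getInitParameter("password");
    }

    @Override
    public void doFilter(ServletRequest request, ServletResponse response, FilterChain chain)
            throws IOException, ServletException {
        HttpServletRequest httpRequest = (HttpServletRequest) request;
        HttpServletResponse httpResponse = (HttpServletResponse) response;

        String header = httpRequest.getHeader("Authorization");
        if (header != null && header.startsWith("Basic ")) {
            // Decode the "user:password" pair sent by the browser.
            String decoded = new String(Base64.getDecoder().decode(header.substring("Basic ".length())));
            int separator = decoded.indexOf(':');
            if (separator > 0
                    && decoded.substring(0, separator).equals(username)
                    && decoded.substring(separator + 1).equals(password)) {
                // Credentials match; let the request through to the Spark UI.
                chain.doFilter(request, response);
                return;
            }
        }
        // No or invalid credentials; challenge the client.
        httpResponse.setHeader("WWW-Authenticate", "Basic realm=\"Spark UI\"");
        httpResponse.sendError(HttpServletResponse.SC_UNAUTHORIZED);
    }

    @Override
    public void destroy() {
        // No resources to release.
    }
}

Spark passes the comma-separated key=value pairs from the *.params property to the filter as init parameters, which is why init() can read them through FilterConfig.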

Disabling selected UIs

Disable the Spark UI for your DAS deployment by setting the spark.ui.enabled property to false in the <DAS_HOME>/repository/conf/analytics/spark/spark-defaults.conf file as shown below.

Code Block
spark.ui.enabled false
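
Note that in stock Spark, the spark.ui.enabled property controls the web UI served by each Spark application (on port 4040 by default); the standalone Spark master and worker UIs are configured separately.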


Disabling selected properties

Exclude the relevant Spark properties from being displayed in the Environment tab of the Spark application UI by editing your Spark properties, as shown in the example below.
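
For example, if a sensitive value is set in the <DAS_HOME>/repository/conf/analytics/spark/spark-defaults.conf file, it is listed under Spark Properties in the Environment tab. One way to keep it out of that tab is to remove or comment out the entry; the property name below is only an illustration.

Code Block
# Commented out so that Spark does not load this value and it is
# therefore not shown in the Environment tab of the application UI.
# spark.ssl.keyStorePassword changeit

Keep in mind that a property excluded this way is no longer applied by Spark, so this approach only suits properties that your deployment does not actually require.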