...

Note

Before you enable Spark UIs, note that they are only supported with HTTP. Also note that, depending on your Spark configuration, the Environment tab of the Application UI may display security-sensitive information such as the keystore password.

If you do not want to expose such information, the following options are available:

  • Exclude the relevant Spark properties from being displayed in the Environment tab by editing your Spark properties.
  • Disable the Spark UIs for your DAS deployment by setting the spark.ui.enabled property to false in the <DAS_HOME>/repository/conf/analytics/spark/spark-defaults.conf file as shown below.

    Code Block
    spark.ui.enabled false
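
Changes to the spark-defaults.conf file are only picked up at server startup. As a minimal sketch, assuming a standard Linux installation where the Carbon startup script supports the restart command, you can restart the node as follows.

Code Block
# Restart the DAS node so that the updated spark-defaults.conf takes effect
sh <DAS_HOME>/bin/wso2server.sh restart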


Apache Spark provides a set of user interfaces (UIs) that allow you to monitor and troubleshoot issues in a Spark cluster. This section helps you understand the information that can be accessed via these UIs.

...

This tab displays detailed information about the SQL queries executed by the selected Spark application.


Spark issues in a production environment

The following are three issues that may occur when you work with Spark in a multi-node DAS cluster:

Info

The following issues only occur when the DAS cluster is running in a Red Hat Linux environment.

  • DAS nodes consuming too much CPU processing power.
  • DAS nodes running out of memory.
  • Too many log directories being created in the <DAS_HOME>/work directory.

All of the above issues can occur as a result of the symbolic link not being correctly resolved by the operating system. To address this, update the <DAS_HOME>/bin/wso2server.sh file with the following entry so that the <DAS_HOME> path is exported.

Code Block
export CARBON_HOME=<symbolic link>
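
For example, assuming the DAS installation is accessed through a symbolic link at /opt/das (a hypothetical path used here only for illustration), the entry would look as follows.

Code Block
# /opt/das is a hypothetical symbolic link pointing to the actual DAS installation directory
export CARBON_HOME=/opt/das

Before restarting the node, you can confirm where the symbolic link resolves to by running readlink -f /opt/das.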