
When you run WSO2 DAS in a clustered setup, information relating to the Spark workers in the cluster is logged by default in the <DAS_HOME>/work directory. Each time a DAS server is started, a Spark application is created, and the relevant 'stderr' and 'stdout' logs can be found in each application folder inside the <DAS_HOME>/work directory.

These logs can be controlled by creating a log4j.properties file in the <DAS_HOME>/repository/conf/analytics/spark directory. To create this file, rename the log4j.properties.template file that is available in that directory to log4j.properties.
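Once renamed, the file can be edited to adjust the verbosity of Spark worker logging. The fragment below is a minimal sketch based on the standard Log4j 1.2 properties format that Spark's template follows; the exact logger names and defaults in your template file may differ, so treat this as illustrative:

```properties
# Set the default logging level and appender for Spark workers
log4j.rootCategory=INFO, console

# Log to the console (captured into the per-application stderr file)
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

# Reduce noise from verbose third-party components (example logger name)
log4j.logger.org.eclipse.jetty=WARN
```

For example, changing `log4j.rootCategory=INFO, console` to `log4j.rootCategory=WARN, console` suppresses informational messages and keeps only warnings and errors in the worker logs.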

This section explains how to configure this file and the other configurations required to manage Spark worker logs.
