When you run WSO2 DAS in a clustered setup, information relating to the Spark workers in the cluster is logged by default in the &lt;DAS_HOME&gt;/work directory. These Spark logs are generated in addition to the default Carbon logs, Audit logs, and HTTP access logs that are available for every WSO2 product.

See Managing Logs in the WSO2 Administration Guide for information on how to configure and use Carbon logs, Audit logs and HTTP access logs.

Each time a DAS server is started, a Spark application is created, and the relevant 'stderr' and 'stdout' logs can be found in the corresponding application folder inside the &lt;DAS_HOME&gt;/work directory.
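The resulting directory layout typically resembles the following sketch. The application and executor IDs shown here are illustrative; actual folder names are generated by Spark at startup.

```
<DAS_HOME>/work/
└── app-20160101120000-0000/     # one folder per Spark application run
    └── 0/                       # one folder per executor
        ├── stderr               # worker error output
        └── stdout               # worker standard output
```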

These logs can be controlled by creating a log4j.properties file in the &lt;DAS_HOME&gt;/repository/conf/analytics/spark directory. Note: To create this file, rename the log4j.properties.template file that is available in the &lt;DAS_HOME&gt;/repository/conf/analytics/spark directory to log4j.properties.
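As an example, a minimal log4j.properties might look like the following. The logger names and levels below are illustrative, based on Spark's standard log4j template; adjust them to suit your deployment.

```
# Sketch of <DAS_HOME>/repository/conf/analytics/spark/log4j.properties
# Set the default logging level and appender
log4j.rootCategory=INFO, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

# Reduce verbosity of noisy third-party components
log4j.logger.org.apache.spark=WARN
log4j.logger.org.eclipse.jetty=WARN
```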

This section explains how to configure this file and the other configurations required to manage Spark worker logs.
