When you run WSO2 DAS in a clustered setup, information relating to the Spark workers in the cluster is logged by default in the <DAS_HOME>/work directory. These Spark logs are generated in addition to the default Carbon logs, Audit logs, and HTTP access logs that are available for every WSO2 product. See Managing Logs in the WSO2 Administration Guide for information on Carbon logs, Audit logs, and HTTP access logs.
Carbon logs in WSO2 DAS
Carbon logs are configured in the log4j.properties file (stored in the <PRODUCT_HOME>/repository/conf directory) for all WSO2 products. However, WSO2 DAS generates some additional Carbon logs that must be configured separately by creating a new log4j.properties file in the <DAS_HOME>/repository/conf/analytics/spark directory. Note: To create this file, rename the log4j.properties.template file that is available in the <DAS_HOME>/repository/conf/analytics/spark directory to log4j.properties.
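The rename described above can be sketched as a short shell snippet. DAS_HOME is a placeholder for your actual installation path; the mktemp fallback and the mkdir/touch lines only simulate the installation layout so the sketch is self-contained, and in a real installation only the final cp is needed.

```shell
# Minimal sketch: enable the Spark-specific Carbon log configuration.
# DAS_HOME is a placeholder; in a real installation the directory and the
# template file already exist, so only the final cp command is required.
DAS_HOME="${DAS_HOME:-$(mktemp -d)}"            # demo fallback path
SPARK_CONF="$DAS_HOME/repository/conf/analytics/spark"
mkdir -p "$SPARK_CONF"                          # present in a real install
touch "$SPARK_CONF/log4j.properties.template"   # shipped with DAS
cp "$SPARK_CONF/log4j.properties.template" "$SPARK_CONF/log4j.properties"
```

Keeping the template file in place means you can diff against it later if you need to revert your logging changes.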
The following sections explain how to configure and manage Spark worker logs. Each time a DAS server is started, a Spark application is created, and the relevant stderr and stdout logs can be found in each application folder inside the <DAS_HOME>/work directory. Spark logs are configured in the spark-defaults.conf file.
Managing the size of Spark logs
By default, a Spark log file is limited to 10 MB in size. This is controlled in the spark-defaults.conf file, which is stored in the <DAS_HOME>/repository/conf/analytics/spark directory. The maximum size of a log file and the number of retained executor log files can be controlled using the two properties given below in the same configuration file.
spark.executor.logs.rolling.maxSize 10000000
spark.executor.logs.rolling.maxRetainedFiles 10
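As a sketch, a complete spark-defaults.conf fragment for size-based log rolling might look as follows. Note that in stock Apache Spark these two properties only take effect when a rolling strategy is also set; the strategy line below is an assumption based on Spark's standard configuration reference, not on WSO2-specific documentation.

spark.executor.logs.rolling.strategy size
spark.executor.logs.rolling.maxSize 10000000
spark.executor.logs.rolling.maxRetainedFiles 10

With these values, each executor log rolls over at roughly 10 MB and the ten most recent rolled files are kept, so a single executor's logs consume at most about 100 MB on disk.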
Removing logs in WSO2 DAS servers
Follow the guidelines given below for managing logs in DAS servers.
Delete old content from the <DAS_HOME>/work directory for non-running DAS nodes. New directories are created each time the server is started, so data from previous runs is unusable after a restart.
Be sure to purge the data stored in DAS instead of deleting log files. Purging removes data from the database as well as the index data stored in the <DAS_HOME>/repository/data directory. Log files themselves should be managed at the log4j level.
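As a sketch, clearing old Spark application directories from a stopped node could look like the following. DAS_HOME is a placeholder, and the mkdir line only simulates a work directory left over from previous runs so the example is self-contained. Only run this against a node that is not running, since a live node is still writing to its current application folder.

```shell
# Sketch: remove stale Spark application folders on a stopped DAS node.
# DAS_HOME is a placeholder; the mkdir below only simulates old run data.
DAS_HOME="${DAS_HOME:-$(mktemp -d)}"
mkdir -p "$DAS_HOME/work/app-00000000000000-0000"   # simulated stale dir
rm -rf "$DAS_HOME/work/"*                           # safe only when the node is stopped
```

Because a fresh application folder is created on every restart, scheduling a cleanup like this immediately before starting the node loses no usable data.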