When you run WSO2 DAS in a clustered setup, information relating to the Spark workers in the cluster is logged by default in the <DAS_HOME>/work directory. These Spark logs are generated in addition to the default Carbon logs, Audit logs, HTTP access logs, and Trace logs that are available for every WSO2 product. See Managing Logs in the WSO2 Administration Guide for information on Carbon logs, Audit logs, HTTP access logs, and Trace logs.

Carbon logs in WSO2 DAS 

Carbon logs are configured in the log4j.properties file (stored in the <PRODUCT_HOME>/repository/conf directory) for all WSO2 products, as explained in the WSO2 Administration Guide. However, WSO2 DAS generates some additional Carbon logs (written to the same Carbon log file) that must be configured separately by creating a new log4j.properties file in the <DAS_HOME>/repository/conf/analytics/spark directory. Note: To create this file, rename the log4j.properties.template file that is available in the <DAS_HOME>/repository/conf/analytics/spark directory to log4j.properties.
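The rename described above can be done from the command line; a minimal sketch, assuming DAS_HOME points at your DAS installation (the mktemp fallback and the mkdir/touch lines are only so the snippet runs standalone, and are not needed against a real installation):

```shell
# Create the Spark-specific log4j.properties from the shipped template.
# DAS_HOME is an assumption; point it at your actual installation directory.
DAS_HOME="${DAS_HOME:-$(mktemp -d)}"
SPARK_CONF="$DAS_HOME/repository/conf/analytics/spark"
mkdir -p "$SPARK_CONF"                         # already exists in a real installation
touch "$SPARK_CONF/log4j.properties.template"  # shipped with the product
mv "$SPARK_CONF/log4j.properties.template" "$SPARK_CONF/log4j.properties"
```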

The following sections explain how to configure and manage Spark worker logs. Each time a DAS server starts, a Spark application is created, and its 'stderr' and 'stdout' logs can be found in the corresponding application folder inside the <DAS_HOME>/work directory. These Spark logs are configured in the spark-defaults.conf file (stored in the <DAS_HOME>/repository/conf/analytics/spark directory).

Managing the size of Spark logs

By default, a Spark log file can grow to a maximum of 10 MB. The maximum size of a log file and the number of retained executor log files can be controlled using the following two properties in the spark-defaults.conf file.

spark.executor.logs.rolling.maxSize=10000000
spark.executor.logs.rolling.maxRetainedFiles=10
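Note that Spark applies these size and retention limits only when a rolling strategy is enabled; if rolling is not already active in your setup, the standard Spark property for size-based rolling is shown below (the value here is illustrative):

spark.executor.logs.rolling.strategy=size

With the values above, each executor retains at most roughly 10 files × 10 MB = 100 MB of rolled logs.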

Removing logs in WSO2 DAS servers

Follow the guidelines given below for managing Spark logs in WSO2 DAS servers.

  • Delete old content from the <DAS_HOME>/work directory of non-running WSO2 DAS nodes. Because new application directories are created each time the server starts, data from previous runs is unusable after a restart.

  • When you want to remove index data, purge the data through DAS instead of deleting the files stored in the <DAS_HOME>/repository/data directory directly. Purging data using this method removes the index data from the database as well as from the /repository/data directory.
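As a hedged sketch of the first guideline above, the following removes Spark application directories older than seven days from a stopped node's work directory (DAS_HOME and the seven-day threshold are assumptions; the sample directory creation lines exist only so the snippet runs standalone):

```shell
# Clean up stale Spark application directories on a non-running DAS node.
DAS_HOME="${DAS_HOME:-$(mktemp -d)}"   # assumption: your DAS installation path

# Sample stale application directory, for demonstration only.
mkdir -p "$DAS_HOME/work/app-20160101000000-0000"
touch -d "30 days ago" "$DAS_HOME/work/app-20160101000000-0000"

# Delete application directories whose modification time is older than 7 days.
find "$DAS_HOME/work" -mindepth 1 -maxdepth 1 -type d -mtime +7 -exec rm -rf {} +
```

Run this only against nodes that are not currently running, since an active server writes to its newest application directory.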
