Note: The contents on this page are currently under review!
Info: Carbon logs in WSO2 DAS are configured in the ...
The following sections explain how to configure and manage Spark worker logs. Each time a DAS server is started, a Spark application is created, and the relevant stderr and stdout logs can be found in the corresponding application folder inside the <DAS_HOME>/work directory. Spark logs are configured in the spark-defaults.conf file (stored in the <DAS_HOME>/repository/conf/analytics/spark directory).
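For example, after two server restarts the <DAS_HOME>/work directory might contain entries similar to the following. The application IDs, timestamps, and executor numbers shown here are illustrative only, not actual DAS output; this simply sketches the standard Spark standalone worker layout.

<DAS_HOME>/work/
  app-20160315101112-0000/
    0/
      stderr
      stdout
  app-20160316093045-0001/
    0/
      stderr
      stdout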
Managing the size of Spark logs
By default, a Spark log file can grow to a maximum of 10 MB in size. The maximum size of a log file and the number of executor log files that are retained can be controlled using the following two properties in the spark-defaults.conf file (stored in the <DAS_HOME>/repository/conf/analytics/spark directory).
spark.executor.logs.rolling.maxSize 10000000
spark.executor.logs.rolling.maxRetainedFiles 10
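Note that in standalone Spark, size-based rolling properties such as the above generally take effect only when a rolling strategy is also enabled; this is standard Spark behaviour rather than anything DAS-specific. If the executor logs do not roll as expected, setting the following property in the same file may be required:

spark.executor.logs.rolling.strategy size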
Removing logs in WSO2 DAS servers
Follow the guidelines given below for managing logs in DAS servers.
- Delete old content from the <DAS_HOME>/work directory for non-running DAS nodes. New directories are created each time the server starts, so the previous data becomes unusable after a restart.
- Be sure to purge data stored in DAS instead of deleting log files. Purging removes data from the database as well as the index data stored in the <DAS_HOME>/repository/data directory. Log files themselves should be managed at the log4j level, as sketched below.
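As a minimal sketch of size-based rotation at the log4j level, assuming the server uses a log4j 1.x configuration in <DAS_HOME>/repository/conf/log4j.properties (the CARBON_LOGFILE appender name follows the usual WSO2 convention; verify it against your own file before applying):

# Roll wso2carbon.log at 10 MB and keep only the ten most recent backups
log4j.appender.CARBON_LOGFILE=org.apache.log4j.RollingFileAppender
log4j.appender.CARBON_LOGFILE.File=${carbon.home}/repository/logs/wso2carbon.log
log4j.appender.CARBON_LOGFILE.MaxFileSize=10MB
log4j.appender.CARBON_LOGFILE.MaxBackupIndex=10
log4j.appender.CARBON_LOGFILE.layout=org.apache.log4j.PatternLayout
log4j.appender.CARBON_LOGFILE.layout.ConversionPattern=[%d] %5p {%c} - %m%n

With this appender, wso2carbon.log rolls over once it reaches 10 MB, and older backup files beyond the ten most recent are discarded automatically.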