Note: The contents on this page are currently under review!
Info: Carbon logs in WSO2 DAS
Carbon logs are configured in the
The following sections explain how to configure and manage Spark worker logs. Each time a DAS server is started, a Spark application is created, and its 'stderr' and 'stdout' logs can be found in the corresponding application folder inside the <DAS_HOME>/work directory. These Spark logs are configured in the spark-defaults.conf file (stored in the <DAS_HOME>/repository/conf/analytics/spark directory).
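To locate these per-application log files quickly, a small shell sketch such as the following can be used. The app-* folder naming and the use of a temporary directory (as a stand-in for <DAS_HOME>/work) are assumptions for illustration; point the variable at your real work directory in practice.

```shell
# Hypothetical helper: list Spark 'stderr'/'stdout' files under the work directory.
# Demonstrated against a mock layout; set DAS_WORK to <DAS_HOME>/work on a real node.
DAS_WORK=$(mktemp -d)                          # stand-in for <DAS_HOME>/work
mkdir -p "$DAS_WORK/app-20160101000000-0000"   # one folder per Spark application
touch "$DAS_WORK/app-20160101000000-0000/stderr" \
      "$DAS_WORK/app-20160101000000-0000/stdout"
find "$DAS_WORK" -type f \( -name stderr -o -name stdout \) | sort
rm -rf "$DAS_WORK"
```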
Managing the size of Spark logs
By default, a Spark log file can grow to a maximum of 10 MB. The maximum size of a log file and the number of retained executor log files can be controlled using the following two properties in the spark-defaults.conf file.
spark.executor.logs.rolling.maxSize           10000000
spark.executor.logs.rolling.maxRetainedFiles  10
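Note that in Apache Spark, size-based rolling of executor logs is governed by the spark.executor.logs.rolling.strategy property; assuming you want rolling by size, a fuller spark-defaults.conf fragment might look like the following (the exact values are illustrative):

```
# Roll executor logs by size; maxSize applies when the strategy is 'size'
spark.executor.logs.rolling.strategy          size
spark.executor.logs.rolling.maxSize           10000000
spark.executor.logs.rolling.maxRetainedFiles  10
```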
Removing logs in WSO2 DAS servers
Follow the guidelines given below for managing Spark logs in WSO2 DAS servers.
- Delete old content from the <DAS_HOME>/work directory for non-running nodes of WSO2 DAS. New directories are created each time the server starts, so the data from previous runs becomes unusable after a restart.
- When you delete index data from Spark logs, be sure to purge the data stored in DAS instead of deleting the contents of the <DAS_HOME>/repository/data directory. Purging data using this method removes the index data from the database as well as from the <DAS_HOME>/repository/data directory.
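The first guideline above can be scripted. The following is a minimal sketch, demonstrated against a mock directory layout rather than a real installation; on a real node, substitute your actual <DAS_HOME> path and run it only while the server is stopped.

```shell
# Hypothetical cleanup sketch: remove old Spark application folders from work/.
# Run ONLY on a stopped node; a mock <DAS_HOME> is created here for demonstration.
DAS_HOME=$(mktemp -d)                        # stand-in for the real <DAS_HOME>
mkdir -p "$DAS_HOME/work/app-old-0001" "$DAS_HOME/work/app-old-0002"
rm -rf "$DAS_HOME"/work/*                    # clear all previous application folders
ls "$DAS_HOME/work"                          # directory is now empty
rm -rf "$DAS_HOME"
```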