...
The following are three issues that may occur when you work with Spark in a multi-node DAS cluster:
| Info |
| --- |
| The following issues only occur when the DAS cluster is running in Red Hat Linux environments. |
- DAS nodes consuming too much CPU processing power.
- DAS nodes running out of memory.
- Too many log directories being created in the <DAS_HOME>/work directory.
All of the above issues can occur as a result of the symbolic link not being resolved correctly by the operating system. To address this, update the <DAS_HOME>/bin/wso2server.sh file with the following entry so that <DAS_HOME> is exported:

export CARBON_HOME=<symbolic link>