Upgrading from the Previous Release
This page takes you through the steps for upgrading from BAM 2.4.0 version to BAM 2.4.1. If you are upgrading from BAM 2.3.0, you must first upgrade to 2.4.0 before upgrading to 2.4.1.
Preparing to upgrade
Configuration upgrades
Make changes to the files inside the <BAM_HOME>/repository/conf/ folder as follows:
Folder path | Description | Required change |
---|---|---|
advanced/hive-site.xml | Change the value of the hive.jar.path property. | <value>${CARBON_HOME}/repository/components/plugins/hive_0.8.1.wso2v9.jar</value> |
advanced/hive-site.xml | Change the value of the hive.aux.jars.path property. | <value>file://${CARBON_HOME}/repository/components/plugins/apache-cassandra_1.2.13.wso2v2.jar, file://${CARBON_HOME}/repository/components/plugins/guava_12.0.0.wso2v1.jar, file://${CARBON_HOME}/repository/components/plugins/json_2.0.0.wso2v1.jar, file://${CARBON_HOME}/repository/components/plugins/commons-dbcp_1.4.0.wso2v1.jar, file://${CARBON_HOME}/repository/components/plugins/commons-pool_1.5.6.wso2v1.jar</value> |
data-bridge/data-bridge-config.xml | Comment out the line with the value CassandraStreamDefinitionStore in the StreamDefinitionStore element. | |
data-bridge/data-bridge-config.xml | Uncomment the line with the value RegistryStreamDefinitionStore in the StreamDefinitionStore element. | org.wso2.carbon.databridge.streamdefn.registry.datastore.RegistryStreamDefinitionStore |
datasources/bam-datasources.xml | This file is newly introduced with BAM 2.4.1. | Move the existing entries in <BAM_HOME>/repository/conf/datasources/master-datasources.xml here. |
etc/ | The cassandra-auth.xml and cassandra-component.xml files are no longer required. | Remove the two files. |
etc/cassandra.yaml | Apply the values of the new cassandra.yaml that ships with BAM 2.4.1. | Update the seeds details, IP addresses, and ports according to your deployment. |
etc/hector-config.xml | This hector-config.xml file is newly introduced with BAM 2.4.1. It contains details previously defined in the <BAM_HOME>/repository/conf/etc/cassandra-component.xml file. | None. |
etc/tasks-config.xml | Add the new entry. | <locationResolverClass>org.wso2.carbon.ntask.core.impl.RandomTaskLocationResolver</locationResolverClass> |
carbon.xml | Use the latest carbon.xml file. | Make the relevant changes according to your configurations of the previous version. |
log4j.properties | Use the latest log4j.properties file. | Make the relevant changes according to your configurations of the previous version. |
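For the data-bridge-config.xml change above, the resulting StreamDefinitionStore element would look roughly as follows. This is a sketch only; the exact layout of the element and the fully qualified class name of the Cassandra-backed store may differ slightly in your file:

```xml
<!-- <BAM_HOME>/repository/conf/data-bridge/data-bridge-config.xml -->
<StreamDefinitionStore>
    <!-- Comment out the Cassandra-backed store (class name shown is indicative): -->
    <!-- org.wso2.carbon.databridge.streamdefn.cassandra.datastore.CassandraStreamDefinitionStore -->
    <!-- ...and uncomment the registry-backed store: -->
    org.wso2.carbon.databridge.streamdefn.registry.datastore.RegistryStreamDefinitionStore
</StreamDefinitionStore>
```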
Cassandra data migration
Take one node at a time and perform the following steps for the Cassandra data migration:
Removing the node temporarily from the active cluster
- Run disablegossip and disablethrift using the NodeTool, to make the node stop accepting further requests from other nodes and external clients.
- Flush/drain the memtables, in order to flush the data written to memory into the disk.
- Run compaction to merge sstables.
- Take snapshots and enable incremental backups.
- Stop Cassandra. This stops all the other nodes/clients from writing to this node, and since the memtables are flushed to disk, startup times are fast because the node need not walk through the commit logs. (Although this node is down, the cluster remains available for reads/writes, so the downtime is zero.)
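The node-removal steps above can be sketched as NodeTool commands. The host and JMX port below are illustrative; adjust them for your deployment:

```shell
# Run against the node being taken out of the cluster.
nodetool -h 127.0.0.1 -p 7199 disablegossip   # stop gossiping with the other nodes
nodetool -h 127.0.0.1 -p 7199 disablethrift   # stop accepting client (Thrift) requests
nodetool -h 127.0.0.1 -p 7199 drain           # flush memtables to disk; node accepts no more writes
nodetool -h 127.0.0.1 -p 7199 compact         # merge sstables
nodetool -h 127.0.0.1 -p 7199 snapshot        # take a snapshot before upgrading
```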
Upgrading SSTables
- Install Cassandra 1.2.13 in the new location.
- Upgrade the sstables to the new storage format using sstableupgrade.
- Copy the upgraded sstable files to the location where they are typically stored (<CARBON_HOME>/repository/database/cassandra/data/[keyspace_name]/).
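The sstable upgrade can be sketched as follows. The installation path, keyspace, and column-family names are placeholders for your deployment:

```shell
# Run from the new Cassandra 1.2.13 installation.
new_cassandra/bin/sstableupgrade [keyspace_name] [column_family]

# Copy the upgraded sstable files back to where BAM expects them:
cp -r new_cassandra/data/[keyspace_name]/* \
      <CARBON_HOME>/repository/database/cassandra/data/[keyspace_name]/
```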
Merging cassandra.yaml related configurations
Compare the cassandra.yaml files shipped with both Cassandra versions and apply the parameter values appropriately. This is required because certain parameters may have been dropped (or retained only to preserve backward compatibility) when moving from one major version to another.
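A simple way to spot the parameters that changed between the two versions is a unified diff of the shipped files (paths below are placeholders for your old and new installations):

```shell
diff -u old_cassandra/conf/cassandra.yaml new_cassandra/conf/cassandra.yaml
```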
Rebooting Cassandra
- Start Cassandra.
- Check whether the nodes have properly joined the cluster via the NodeTool ring command.
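The cluster-membership check above can be run as follows (host and JMX port are illustrative):

```shell
nodetool -h 127.0.0.1 -p 7199 ring   # each node should report status "Up" and state "Normal"
```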
Hadoop cluster configuration
Hadoop cluster configuration settings have not been changed since BAM 2.4.0. Therefore, you can use your existing Hadoop installation.
From BAM 2.4.1 onwards, the stream definition store is moved from Cassandra to the Registry. Therefore, if the streams are added via toolboxes, having the relevant toolboxes in the <BAM_HOME>/repository/deployment/server/bam-toolbox/ directory is sufficient. However, if you don't have the toolboxes deployed, you need to re-define the streams as follows:
- If you use the default BAM data agents of WSO2 servers to publish events to BAM, restart the WSO2 servers so that they re-define the streams.
- If you use custom agents, restart/reset them so that they define the streams again.