...
| Artifact Type in BAM | Replaced in DAS By | Note | Recommended Action |
|---|---|---|---|
| Input Event Adapter and Event Builder | Event Receiver | The Event Receiver artifact in DAS 3.0.0 embeds both the Input Event Adapter and Event Builder artifacts in BAM 2.5.0. | Identify the input event adapters and event builders used in combination in BAM 2.5.0, and redefine them as event receivers. For detailed instructions, see Configuring Event Receivers. |
| Event Stream | N/A | The Event Stream artifact in DAS is different to that in BAM: a DAS event stream can be a real-time event stream or a persistent event stream, whereas BAM event streams are always defined for persistent data. | Redefine the event streams in DAS 3.0.0. Alternatively, you can redefine the complete event stream configuration as described in Understanding Event Streams and Event Tables. |
| Output Event Adapter and Event Formatter | Event Publisher | The Event Publisher artifact in DAS 3.0.0 embeds both the Output Event Adapter and Event Formatter artifacts in BAM 2.5.0. | Identify the output event adapters and event formatters used in combination in BAM 2.5.0, and redefine them as event publishers. For detailed instructions, see Creating Alerts. |
| Execution Plan | N/A | The Siddhi Query Language in DAS 3.0.0 is different from that in BAM 2.5.0. | Redefine the execution plans in DAS 3.0.0. For detailed instructions on defining execution plans, see Creating a Standalone Execution Plan and Creating a STORM Based Distributed Execution Plan. For the modified Siddhi Query Language, see Siddhi Query Language. |
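Because the Siddhi Query Language changed between the two products, each execution plan must be rewritten rather than copied over. As a rough sketch of the DAS 3.0.0 execution plan format (the plan name, stream names, and filter condition below are illustrative, not taken from this guide):

```
/* Hypothetical DAS 3.0.0 execution plan sketch.
   Verify the exact syntax against the Siddhi Query Language documentation. */
@Plan:name('StockAlertPlan')

/* Import a stream defined in DAS and declare its schema. */
define stream StockStream (symbol string, price double);

/* Filter high-priced events into an output stream. */
from StockStream[price > 100.0]
select symbol, price
insert into HighPriceStream;
```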
Migrating events data

Run the `<DAS_HOME>/bin/analytics-migrate.sh` script to migrate events data from WSO2 BAM to WSO2 DAS. For detailed information, see Analytics Migration Tool.
...
Scripts that analyze data in WSO2 BAM are written in the Apache Hive query language, whereas the scripts in WSO2 DAS are written in the Apache Spark SQL query language. Because the syntax of the two languages differs, scripts cannot be directly migrated from WSO2 BAM to WSO2 DAS.
...
Redefine the required scripts in WSO2 DAS in the Spark SQL query language. For detailed information on writing queries in the Spark SQL query language, see Spark Query Language.
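To illustrate the kind of rewrite involved, the following sketch contrasts a Hive-style table definition with a Spark SQL definition of the form used in DAS 3.0.0. The table name, column names, and storage options are assumptions for illustration; check the Spark Query Language documentation for the exact options supported by your setup:

```
-- BAM 2.5.0 (Hive) sketch: an external table over persisted event data.
-- Table and column names here are illustrative only.
CREATE EXTERNAL TABLE IF NOT EXISTS PhoneSales
    (orderID STRING, brandName STRING, quantity INT);

-- DAS 3.0.0 (Spark SQL) sketch: the equivalent table is registered
-- against the analytics data store via a relation provider.
CREATE TEMPORARY TABLE PhoneSales
    USING CarbonAnalytics
    OPTIONS (tableName "PHONE_SALES",
             schema "orderID STRING, brandName STRING, quantity INT");

-- Analysis queries then use standard Spark SQL syntax.
SELECT brandName, SUM(quantity) AS total
    FROM PhoneSales
    GROUP BY brandName;
```

Note that the structure of the analysis query itself often carries over with little change; the main rework is in how tables are declared and mapped to stored data.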
Handling dashboards and gadgets
...
- When the database upgrade scripts are executed, the following are some of the new tables that will be created in the database:
- UM_DOMAIN
- UM_SYSTEM_USER
- UM_SYSTEM_ROLE
- UM_SYSTEM_USER_ROLE
- Deploy the toolbox in DAS.
- Publish data to WSO2 DAS and check whether the events are received by the event receivers configured in DAS.
- Execute the scripts you redefined in the Spark SQL query language, and use the Data Explorer and/or the Analytics Dashboard to view the results.
- Verify that all the required scenarios are working as expected. This confirms that the upgrade is successful.
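The database-upgrade check in the steps above can be sketched as a set of queries, assuming you can connect directly to the user-management database (exact syntax depends on your RDBMS):

```
-- Sanity check after running the database upgrade scripts:
-- each new table should exist and be queryable, even if it is empty.
SELECT COUNT(*) FROM UM_DOMAIN;
SELECT COUNT(*) FROM UM_SYSTEM_USER;
SELECT COUNT(*) FROM UM_SYSTEM_ROLE;
SELECT COUNT(*) FROM UM_SYSTEM_USER_ROLE;
```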