Publishing API Runtime Statistics Using WSO2 DAS
This section explains how to set up WSO2 Data Analytics Server (WSO2 DAS) to collect and analyze runtime statistics from the API Manager.
Prerequisites
- WSO2 Data Analytics Server
- An RDBMS instance (MySQL, MS SQL, Oracle, H2, etc.)
- API_Manager_Analytics_3.0.x.car (for WSO2 DAS 3.0.0 and WSO2 DAS 3.0.1)
- DB scripts (for WSO2 DAS 3.0.0 and WSO2 DAS 3.0.1)
Configuring WSO2 DAS
Download WSO2 Data Analytics Server 3.0.x.
If the API Manager and DAS run on the same machine, open the
<DAS_HOME>/repository/conf/carbon.xml
file and increase the default service port of DAS by setting the offset value as follows:
<Offset>1</Offset>
This increments every port used by the server by 1, so the WSO2 DAS server runs on port 9444 instead of 9443. The port offset avoids port conflicts when multiple WSO2 products run on the same host.
Define the datasource declaration according to your RDBMS in the
<DAS_HOME>/repository/conf/datasources/master-datasources.xml
file. WSO2 DAS pushes the summarized data to this database once analysis is done, and WSO2 API Manager later reads the summary data from it to display on the API Manager dashboard. A MySQL database is used here as an example, but you can also configure H2, Oracle, etc. Note that you must always use
WSO2AM_STATS_DB
as the datasource name. The auto-commit option must be disabled when working with WSO2 DAS. Set it in the JDBC URL or by adding
<defaultAutoCommit>false</defaultAutoCommit>
under the datasource's <configuration> property as shown below:
property as shown below:<datasource> <name>WSO2AM_STATS_DB</name> <description>The datasource used for setting statistics to API Manager</description> <jndiConfig> <name>jdbc/WSO2AM_STATS_DB</name> </jndiConfig> <definition type="RDBMS"> <configuration> <url>jdbc:mysql://localhost:3306/TestStatsDB</url> <username>db_username</username> <password>db_password</password> <driverClassName>com.mysql.jdbc.Driver</driverClassName> <maxActive>50</maxActive> <maxWait>60000</maxWait> <testOnBorrow>true</testOnBorrow> <validationQuery>SELECT 1</validationQuery> <validationInterval>30000</validationInterval> <defaultAutoCommit>false</defaultAutoCommit> </configuration> </definition> </datasource>
Configuring the MySQL database
If you are using MySQL as the database, download the MySQL driver and copy it to the
<DAS_HOME>/repository/components/lib
directory. If your WSO2 DAS version is 3.0.x, import the appropriate schema declaration script from the DB scripts folder into the above database. For example, use
mysql.sql
to create the schema in a MySQL database. As with WSO2 Business Activity Monitor (BAM), WSO2 DAS 3.0.x does not automatically create the table structure in the database, and the
-Dsetup
option does not work, so the schema must be created manually. For a list of all the RDBMS summarized tables, see RDBMS summarized tables.
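Importing the schema script from the command line can be sketched as follows; the script path is an assumption, so adjust it to wherever you extracted the DB scripts archive:

```shell
# Import the schema into the statistics database created earlier.
# dbscripts/mysql.sql is a placeholder path; db_username and TestStatsDB
# are the example credentials and database name used in this guide.
mysql -u db_username -p TestStatsDB < dbscripts/mysql.sql
```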
Uploading the API Manager analytics file
WSO2 DAS uses Spark SQL to analyze the data. All definitions of the data published from WSO2 API Manager, and of how it should be analyzed using Spark, are shipped to WSO2 DAS as a .car file.
- If your WSO2 DAS version is 3.0.x, download the
API_Manager_Analytics_3.0.x.car
file.
- Start the WSO2 DAS server and log in to the Management Console.
- Navigate to the Carbon Applications section under Manage and click Add.
- Point to the downloaded file and upload it.
Configuring WSO2 API Manager
- Start the API Manager server and log in to the Admin Dashboard (
https://<Server Host>:9443/admin-dashboard
).
- Click Configure Analytics under the Settings section.
- Select the Enable check box to enable statistical data publishing.
- Set the event receiver configurations according to the DAS server. The event receivers are the endpoints to which events are published from the API Gateway.
- Click Add URL Group to save the configuration.
- Clear all settings under the Data Analyzer Configurations section (these settings are only required when using WSO2 BAM).
- Under the Statistics Summary Datasource section, give the datasource definition that is used to store summarized statistical data. The tables are created automatically when the analytics scripts run; you only need to create the schema. This is the same datasource configured on the DAS server.
- URL: The connection URL for the RDBMS datasource
- JDBC Driver Class: The fully qualified Java class name of the JDBC driver
- Username/Password: Credentials to be passed to the JDBC driver to establish a connection
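For the example MySQL datasource used in this guide, these fields would look as follows (example values from the earlier datasource definition, not defaults):

```
URL:               jdbc:mysql://localhost:3306/TestStatsDB
JDBC Driver Class: com.mysql.jdbc.Driver
Username:          db_username
Password:          db_password
```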
Click Save. Saving the configuration deploys the Analytics toolbox, which describes the information being collected, how the data is analyzed, and the location of the database where the analyzed data is stored.
Tip: To edit the datasource connection pool parameters, click the Show More Options link.
If you are using MySQL as the database, copy the MySQL driver library to the <AM_HOME>/repository/components/lib directory.
Invoking the sample
Invoke an API to generate traffic and see the statistics.
- Log in to the API Publisher and deploy the sample Weather API.
- Sign up with OpenWeatherMap and obtain an API key.
- Edit the Weather API in the API Publisher and replace the existing Production and Sandbox URLs with the URLs from OpenWeatherMap.
- Save and publish the Weather API.
- Log in to the API Store and subscribe to the Weather API.
- Invoke the API using the API Store or the cURL command and wait a few minutes for the statistics to be generated.
- In the API Publisher, click API Usage under the All Statistics section.
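The invocation step above can be done with cURL. A sketch, assuming the Weather API is exposed on the default Gateway HTTPS port with context /weather and version 1.0.0, and that you have generated an access token in the API Store; all of these values are placeholders:

```shell
# Invoke the Weather API through the API Gateway to generate statistics.
# Host, port, context, version, query parameters, and the token are all
# placeholders; take the actual values from the API Store.
curl -k -H "Authorization: Bearer <access-token>" \
  "https://localhost:8243/weather/1.0.0/current?q=London"
```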
Purging Data (optional)
Data purging is an option for removing historical data in WSO2 DAS. This is important because it is not possible to delete tables or table data in WSO2 DAS. By purging data, you maintain high performance during data analysis without losing the analyzed summary data. Only the data from the event streams fired by the API Manager is purged; it is contained in the following tables:
ORG_WSO2_APIMGT_STATISTICS_DESTINATION
ORG_WSO2_APIMGT_STATISTICS_FAULT
ORG_WSO2_APIMGT_STATISTICS_REQUEST
ORG_WSO2_APIMGT_STATISTICS_RESPONSE
ORG_WSO2_APIMGT_STATISTICS_WORKFLOW
ORG_WSO2_APIMGT_STATISTICS_THROTTLE
There are two ways to purge data in DAS:
Using the admin console
- Navigate to the Data Explorer and select one of the above tables.
- Click Schedule Data Purge.
- In the dialog box that appears, set the time and the number of days of data to purge, and save.
- Repeat the steps for all of the tables above and wait for the data to be purged.
Using the global method
- Open the
<DAS_HOME>/repository/conf/analytics/analytics-config.xml
file and change the contents under the <analytics-data-purging> property as shown below:
<analytics-data-purging>
    <!-- Indicates whether purging is enabled. To enable data purging for a
         cluster, this property must be enabled on all nodes. -->
    <purging-enable>true</purging-enable>
    <cron-expression>0 0 12 * * ?</cron-expression>
    <!-- Tables to include in purging. Use a regular expression to specify
         the table names to include. -->
    <purge-include-table-patterns>
        <table>.*</table>
        <!--<table>.*jmx.*</table>-->
    </purge-include-table-patterns>
    <!-- All records inserted before the specified retention time are
         eligible for purging. -->
    <data-retention-days>365</data-retention-days>
</analytics-data-purging>
- Save your changes.