Publishing API Runtime Statistics Using WSO2 DAS

This section explains how to set up WSO2 Data Analytics Server (WSO2 DAS) to collect and analyze runtime statistics from the API Manager.

Configuring WSO2 DAS

  1. Download WSO2 Data Analytics Server 3.0.x.

  2. If the API Manager and DAS run on the same machine, open the <DAS_HOME>/repository/conf/carbon.xml file and increase the default service port of DAS by setting the offset value as follows:

      <Offset>1</Offset>

    This increments every port used by the server by 1, so the WSO2 DAS server now runs on port 9444 (the default 9443 + 1). The port offset increments all default ports by the given value, which avoids port conflicts when multiple WSO2 products run on the same host.
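
    Once the DAS server is started, you can confirm the offset took effect with a quick check such as the following (a minimal sketch assuming a default localhost installation; the URL is the standard Carbon Management Console login page):

      curl -k https://localhost:9444/carbon/admin/login.jsp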

  3. Define a datasource for your RDBMS in the <DAS_HOME>/repository/conf/datasources/master-datasources.xml file.

    WSO2 DAS pushes the summarized data to this database once the analysis is done, and WSO2 API Manager later fetches that summary data to display on the API Manager dashboard. A MySQL database is used here as an example, but you can also configure H2, Oracle, etc. Note that you must always use WSO2AM_STATS_DB as the datasource name.

  4. The auto-commit option must be disabled when working with WSO2 DAS. Disable it in the JDBC URL or by adding <defaultAutoCommit>false</defaultAutoCommit> under the datasource <configuration> element, as shown below:

    <datasource>
        <name>WSO2AM_STATS_DB</name>
        <description>The datasource used for setting statistics to API Manager</description>
        <jndiConfig>
            <name>jdbc/WSO2AM_STATS_DB</name>
        </jndiConfig>
        <definition type="RDBMS">
            <configuration>
                <url>jdbc:mysql://localhost:3306/TestStatsDB</url>
                <username>db_username</username>
                <password>db_password</password>
                <driverClassName>com.mysql.jdbc.Driver</driverClassName>
                <maxActive>50</maxActive>
                <maxWait>60000</maxWait>
                <testOnBorrow>true</testOnBorrow>
                <validationQuery>SELECT 1</validationQuery>
                <validationInterval>30000</validationInterval>
                <defaultAutoCommit>false</defaultAutoCommit>
            </configuration>
        </definition>
    </datasource>

Configuring the MySQL database

  1. If you are using MySQL as the database, download the MySQL JDBC driver and copy it to the <DAS_HOME>/repository/components/lib directory.
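
    For example (a sketch; the connector version in the file name is an assumption, so use the JAR you actually downloaded):

      cp mysql-connector-java-5.1.38-bin.jar <DAS_HOME>/repository/components/lib/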

  2. If your WSO2 DAS version is 3.0.x, import the appropriate schema declaration script from the DB scripts folder into the database created above. For example, use mysql.sql to create the schema in a MySQL database.

    Similar to WSO2 Business Activity Monitor (BAM), WSO2 DAS 3.0.x does not automatically create the table structure in the database; the -Dsetup option does not work, so you must create the schema manually (a sample import is shown after this list).

    For a list of all the RDBMS summarized tables, see RDBMS summarized tables.
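
A minimal sketch of creating the database and importing the schema from the MySQL command line (the script location <AM_HOME>/dbscripts/stat/sql/mysql.sql is an assumption; use the mysql.sql shipped in your distribution's DB scripts folder):

    mysql -u db_username -p
    mysql> CREATE DATABASE TestStatsDB;
    mysql> USE TestStatsDB;
    mysql> SOURCE <AM_HOME>/dbscripts/stat/sql/mysql.sql;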

Uploading the API Manager analytics file

WSO2 DAS uses Spark SQL to analyze the data. All definitions of the data published from WSO2 API Manager, and the way it should be analyzed using Spark, are shipped to WSO2 DAS as a .car (Carbon application) file.

  1. If your WSO2 DAS version is 3.0.x, download the API_Manager_Analytics_3.0.x.car file.
  2. Start the WSO2 DAS server and log in to the Management Console.
  3. Navigate to the Carbon Applications section under Manage and click Add.
  4. Browse to the downloaded .car file and upload it.

Configuring WSO2 API Manager

  1. Start the API Manager server and log in to the Admin Dashboard (https://<Server Host>:9443/admin-dashboard).
  2. Click Configure Analytics under the Settings section.
  3. Select the Enable check box to enable statistical data publishing.
  4. Set the event receiver configurations according to the DAS server. Event receivers are the endpoints to which the API Gateway publishes events. For example, with the port offset of 1 set earlier, the DAS Thrift receiver URL would be tcp://localhost:7612 (the default 7611 + 1) and the authentication URL would be ssl://localhost:7712.
  5. Click Add URL Group to save the configuration. 
  6. Clear all settings under the Data Analyzer Configurations section (these settings are only required when using WSO2 BAM).
  7. Clear the default settings under the Statistics Summary Datasource section and give the datasource definition that is used to store summarized statistical data. These values must point to the same database you configured in the DAS server earlier (for example, URL jdbc:mysql://localhost:3306/TestStatsDB with driver class com.mysql.jdbc.Driver, matching the master-datasources.xml entry above); the tables are not created automatically, which is why you created the schema manually in the previous section.
    • URL: The connection URL for the RDBMS datasource
    • JDBC Driver Class: The fully qualified Java class name of the JDBC driver
    • Username/Password: Credentials passed to the JDBC driver to establish the connection
  8. Click Save. This deploys the Analytics toolbox, which describes the information that is collected, how to analyze the data, and the location of the database where the analyzed data is stored.

    Tip: To edit the datasource connection pool parameters, click the Show More Options link.

If you are using MySQL as the database, also copy the MySQL driver library to the <AM_HOME>/repository/components/lib directory.

Invoking the sample

Invoke an API to generate traffic and see the statistics. 

  1. Log in to the API Publisher and deploy the sample Weather API.
  2. Sign up to OpenWeatherMap and obtain an API key.
  3. Edit the Weather API in the API Publisher and replace the existing Production and Sandbox URLs with the URLs from OpenWeatherMap.
  4. Save and publish the Weather API.
  5. Log in to the API Store and subscribe to the Weather API. 
  6. Invoke the sample using the API Store or the cURL command (a sample cURL invocation is shown after this list) and wait a few minutes for the statistics to be generated.
  7. In the API Publisher, click API Usage under the All Statistics section.
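
A minimal invocation sketch using cURL (the HTTPS gateway port 8243 is the API Manager default; the API context, resource path, query parameter, and <access-token> placeholder are assumptions, so copy the exact invocation URL and access token from the API Console in the API Store):

    curl -k -H "Authorization: Bearer <access-token>" "https://localhost:8243/weatherapi/1.0.0/current?q=London"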

Purging Data (optional)

Data purging removes historical data in WSO2 DAS. This is important because you cannot otherwise delete tables or table data in WSO2 DAS. Purging keeps data analysis performant without removing the analyzed summary data. Only the raw stream data published by the API Manager is purged; it is contained in the following tables:

  • ORG_WSO2_APIMGT_STATISTICS_DESTINATION
  • ORG_WSO2_APIMGT_STATISTICS_FAULT
  • ORG_WSO2_APIMGT_STATISTICS_REQUEST
  • ORG_WSO2_APIMGT_STATISTICS_RESPONSE
  • ORG_WSO2_APIMGT_STATISTICS_WORKFLOW
  • ORG_WSO2_APIMGT_STATISTICS_THROTTLE

Make sure you do not purge data in tables other than those listed above, as doing so deletes your summarized historical data.

There are two ways to purge data in DAS:

Using the admin console

  1. Navigate to the Data Explorer and select one of the tables listed above.
  2. Click Schedule Data Purge.
  3. In the dialog box that appears, set the time and the number of days for which you want to purge data, and save.
  4. Repeat these steps for all of the tables above and wait for the data to be purged.

Using the global method

Note that this will affect all tenants.
  1. Open the <DAS_HOME>/repository/conf/analytics/analytics-config.xml file.
  2. Change the contents under the <analytics-data-purging> property as shown below:

    <analytics-data-purging>
        <!-- Indicates whether purging is enabled. To enable data purging in a cluster, enable this property on all nodes. -->
        <purging-enable>true</purging-enable>
        <!-- When the purge task runs; this cron expression runs it daily at 12:00 noon. -->
        <cron-expression>0 0 12 * * ?</cron-expression>
        <!-- Tables to include in purging, as regular expressions. Restrict the pattern to the API Manager stream tables so that summarized historical data is not purged. -->
        <purge-include-table-patterns>
            <table>ORG_WSO2_APIMGT_STATISTICS_.*</table>
            <!--<table>.*jmx.*</table>-->
        </purge-include-table-patterns>
        <!-- All records inserted before the specified retention period become eligible for purging. -->
        <data-retention-days>365</data-retention-days>
    </analytics-data-purging>
  3. Save your changes.