
Quick Start Guide

To view a screencast of the Quick Start Guide, click here.

WSO2 Data Analytics Server (DAS) monitors the transactions and activities of an enterprise (referred to as events), analyses them, and presents the results to a range of interfaces in real time. In addition, it combines real-time analytics with batch, interactive, and predictive (via machine learning) analysis of data in one integrated platform to support the multiple demands of Internet of Things (IoT) solutions, as well as mobile and Web apps. It is designed to analyse millions of events per second, and is therefore capable of handling large volumes of data in Big Data and Internet of Things projects. The WSO2 DAS workflow consists of three main phases, as illustrated in the diagram below.


The following sections walk you through the basic features of the DAS to get you started.


Before you begin

  1. Install Oracle Java SE Development Kit (JDK) version 1.8 and set the JAVA_HOME environment variable.
  2. Install Apache Ant.
  3. Download WSO2 DAS.
  4. Start the DAS by navigating to <DAS_HOME>/bin in the command line and executing wso2server.bat (for Windows) or wso2server.sh (for Linux).

Creating a simple event flow

An event flow refers to a specific combination of event streams, event receivers, event publishers and execution plans. The following steps describe how to define these elements of an event flow.

Step 1: Add an event stream

An event stream defines the events that go through a particular flow by specifying each event's attributes and their types. An event stream can be created in the DAS Management Console as follows.

  1. Log into the DAS Management Console and click on the Main tab. Under Manage, click Streams to open the Available Event Streams page. 
  2. Click Add Event Stream to open the Define New Event Stream page. 
  3. Enter information as follows to create a new event stream named org.wso2.event.sensor.stream.

    Event Stream Details

    Parameter Name         Value
    Event Stream Name      org.wso2.event.sensor.stream
    Event Stream Version   1.0.0

    Stream Attributes

    Click Add to add each attribute after entering the attribute name and attribute type.

    Attribute Category   Attribute             Attribute Type
    Meta Data            timestamp             long
                         isPowerSaverEnabled   bool
                         sensorId              int
                         sensorName            string
    Correlation Data     longitude             double
                         latitude              double
    Payload Data         humidity              float
                         sensorValue           double
  4. Click Add Event Stream to save the information.
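To make the stream definition concrete, the sketch below builds a sample JSON event for org.wso2.event.sensor.stream, grouping the attributes by category. The field names and values are taken from the curl examples later in this guide.

```python
import json

# Sample event for org.wso2.event.sensor.stream, grouped by attribute category
# (values borrowed from the curl examples in Step 10 of this guide).
event = {
    "event": {
        "metaData": {
            "timestamp": 1439468145264,    # long
            "isPowerSaverEnabled": False,  # bool
            "sensorId": 701,               # int
            "sensorName": "temperature",   # string
        },
        "correlationData": {
            "longitude": 4.504343,         # double
            "latitude": 20.44345,          # double
        },
        "payloadData": {
            "humidity": 2.3,               # float
            "sensorValue": 96.5,           # double
        },
    }
}
print(json.dumps(event, indent=2))
```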

Step 2: Add an event receiver

Events received by the DAS server have different formats such as XML, JSON and Map. Event receivers transform these events to WSO2 Events that can be directed to an event stream defined in the DAS.

In this step, you will create an event receiver named httpReceiver which directs events to the event stream named  org.wso2.event.sensor.stream that was created in Step 1. The receiver can be created using the Management Console as follows.

  1. Log into the DAS Management Console and click on the Main tab. Under Manage, click Receivers to open the Available Receivers page.
  2. Click Add Event Receiver to open the Create a New Event Receiver page.
  3. Enter information as follows to create the new event receiver named httpReceiver.

    Parameter Name             Value
    Event Receiver Name        httpReceiver
    Input Event Adapter Type   http
    Transports                 all
    Event Stream               org.wso2.event.sensor.stream
    Message Format             json
  4. Click Add Event Receiver to save the information.
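Since httpReceiver accepts JSON events over HTTP, a client request can be sketched with Python's standard library. The endpoint URL below is the default local DAS endpoint used later in this guide; actually sending the request requires a running DAS server.

```python
import json
import urllib.request

# Default httpReceiver endpoint (assumes a local DAS server on port 9763).
RECEIVER_URL = "http://localhost:9763/endpoints/httpReceiver"

def build_event_request(event: dict) -> urllib.request.Request:
    """Build (but do not send) a JSON POST request for the httpReceiver."""
    body = json.dumps(event).encode("utf-8")
    return urllib.request.Request(
        RECEIVER_URL,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# With a running DAS server you would then send the request:
# urllib.request.urlopen(build_event_request(event))
```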

Step 3: Add an Event Publisher

Event publishers publish events processed by the WSO2 servers to external applications. These events are published via HTTP, Kafka, JMS, etc., in JSON, XML, Map, text, and WSO2Event formats to various endpoints and data stores.

In this step, you will create an event publisher named loggerPublisher to publish events from the event stream named org.wso2.event.sensor.stream that was created in Step 1. Since the output event adapter type of this publisher is logger, the published events are logged in the DAS CLI in text format. The publisher can be created using the Management Console as follows.

  1. Log into the DAS Management Console and click on the Main tab. Under Manage, click Publishers to open the Available Publishers page.
  2. Click Add Event Publisher to open the Create a New Event Publisher page.
  3. Enter information as follows to create the new event publisher named  loggerPublisher.

    Parameter Name              Value
    Event Publisher Name        loggerPublisher
    Event Source                org.wso2.event.sensor.stream
    Output Event Adapter Type   logger
    Message Format              text


  4. Click Add Event Publisher to save the information.

Step 4: View the simple event flow

This step involves viewing the event flow you created and understanding how the different elements in it are connected. The event flow can be viewed as follows.

  1. Log into the DAS Management Console if you are not already logged in. 
  2. Click the Main tab and then click Flow to open the DAS Event Flow page. The event flow you created is displayed as follows.
     
    This diagram indicates that httpReceiver forwards events to the org.wso2.event.sensor.stream:1.0.0 stream. These events are then published by the loggerPublisher.

Receive and log events

Step 5: Receive events via HTTP transport

Navigate to <DAS_HOME>/samples/cep/producers/http and run the following command which sends events to the DAS via the HTTP transport.

ant -Durl=http://localhost:9763/endpoints/httpReceiver -Dsn=0001

This builds the HTTP client and sends the events in the <DAS_HOME>/samples/cep/artifacts/0001/httpReceiver.txt file to the httpReceiver endpoint. You can view the details of the events that are sent as shown in the log below. These logs are published by the publisher created in Step 3.

The logs of the JSON events received by the DAS server will be displayed in the CLI as shown in the example below.

Processing events with an execution plan

Step 6: Add another event stream

  1. Log into the DAS Management Console and click on the Main tab. Under Manage, click Streams to open the Available Event Streams page. 
  2. Click Add Event Stream to open the Define New Event Stream page. 
  3. Enter information as follows to create the event stream named org.wso2.event.sensor.filtered.stream.

    Event Stream Details

    Parameter Name         Value
    Event Stream Name      org.wso2.event.sensor.filtered.stream
    Event Stream Version   1.0.0

    Stream Attributes

    Attribute Category   Attribute     Attribute Type
    Meta Data            timestamp     long
                         sensorName    string
    Correlation Data     longitude     double
                         latitude      double
    Payload Data         sensorValue   double
  4. Click Add Event Stream to save the information.

Step 7: Add an execution plan 

An Execution Plan can import one or more streams from the server for processing and push zero or more output streams back to the server. For more information, see Analyzing Data.

  1. Log into the DAS Management Console and click on the Main tab. Under Manage, click Execution Plans to open the Available Execution Plans page. 
  2. Click Add Execution Plan to open the Create a New Execution Plan page.
  3. Enter information as follows to create the new execution plan.

    Parameter Name   Value
    Import Stream    org.wso2.event.sensor.stream:1.0.0
    As               sensorStream
    Value Of         filteredStream
    StreamId         org.wso2.event.sensor.filtered.stream:1.0.0
  4. Click Import and then click Export. The section for query expressions will be updated as shown below.
  5. Add the following query expression.

    from sensorStream [sensorValue > 100]
    select meta_timestamp, meta_sensorName, correlation_longitude, correlation_latitude, sensorValue
    insert into filteredStream;

    This query includes the filter condition sensorValue > 100. Therefore, when the execution plan forwards events from org.wso2.event.sensor.stream:1.0.0 to org.wso2.event.sensor.filtered.stream:1.0.0, events in which the value for the sensorValue attribute is 100 or less are dropped.

  6. Click Validate Query Expressions. Once a message confirms that the queries are valid, click Add Execution Plan.
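As a plain-Python sketch of the filter semantics (a hypothetical stand-in for the Siddhi runtime, not how DAS actually executes the plan):

```python
# Sketch of the Siddhi filter [sensorValue > 100]: events at or below the
# threshold are dropped; the rest pass on to the filtered stream.
def apply_filter(events, threshold=100):
    return [e for e in events if e["sensorValue"] > threshold]

events = [{"sensorValue": 96.5}, {"sensorValue": 156}]
print(apply_filter(events))  # only the event with sensorValue 156 survives
```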

Publish events in dashboard

Step 8: Add a UI event publisher

In this step, you will add another publisher named uiPublisher to publish events from the stream named org.wso2.event.sensor.filtered.stream to the Analytics Dashboard.

  1. Log into the DAS Management Console and click on the Main tab. Under Manage, click Publishers to open the Available Publishers page.
  2. Click Add Event Publisher to open the Create a New Event Publisher page.
  3. Enter information as follows to create the new event publisher named uiPublisher.

    Parameter Name              Value
    Event Publisher Name        uiPublisher
    Event Source                org.wso2.event.sensor.filtered.stream:1.0.0
    Output Event Adapter Type   ui
    Message Format              wso2event
  4. Click Add Event Publisher to save the information.

Step 9: Create a dashboard and a gadget

The WSO2 Analytics Dashboard is used to analyse the output of the event flow you created in this guide. This step creates a dashboard and a gadget that analyse events from the org.wso2.event.sensor.filtered.stream stream published by the uiPublisher publisher.

  1. Log into the DAS Management Console. In the Main tab, click Analytics Dashboard.
  2. Log into the Analytics Dashboard with your username and password.
  3. Click the menu icon and then click Gadgets to open the Gadgets page as demonstrated below.
  4. Click GENERATE GADGET, and enter values in the Generate a Gadget wizard as follows.

    1. In the Select Provider field, select Realtime Data Source. Then click Next.
    2. In the Event Stream field, select org.wso2.event.sensor.filtered.stream:1.0.0. Then click Next.
    3. Configure a chart as follows.

      Parameter Name      Value
      Gadget Name         Sensor Value VS Timestamp
      Select Chart Type   Line Chart
      X-Axis              TIMESTAMP
      X type              time
      Y-Axis              sensorValue
      Y type              default
      Color domain        sensorName
      Max length          30
    4. Click Add to Store, and then click Go to Portal. The Dashboards page appears again.
  5. Click CREATE DASHBOARD to open the Create a Dashboard page. Configure a new dashboard as follows.
    1. Enter a name and a description for the new dashboard as follows, and click Next.

      Parameter Name           Value
      Name of your Dashboard   Sensor Statistics
      Description              This dashboard indicates the sensor value at different times in a particular location.
    2. Select the Single Column layout. A message appears to indicate that the dashboard is successfully created.
    3. Click the gadgets icon. Then select and drag the Sensor Value VS Timestamp gadget to the first column as demonstrated above.
    4. Click View to view the created dashboard.

Step 10: Send Events to the HTTP Receiver via Curl Command

This step sends events to the receiver named httpReceiver using a curl command. These events are processed by the event flow you created, and published in the DAS CLI by the publisher created in Step 3.

  1. Issue the following curl command.

    curl -X POST -d '{"event": {"metaData": {"timestamp": 1439468145264, "isPowerSaverEnabled": false, "sensorId": 701, "sensorName": "temperature"}, "correlationData": {"longitude": 4.504343, "latitude": 20.44345}, "payloadData": {"humidity": 2.3, "sensorValue": 96.5}}}' http://localhost:9763/endpoints/httpReceiver --header "Content-Type:application/json"

    The following log will appear in the DAS CLI.

    Note that the Sensor Statistics dashboard you created does not get updated. This is because the value of the sensorValue attribute in this event is less than 100. Therefore, the event is dropped by the filter you created in the execution plan and, as a result, is not forwarded to the org.wso2.event.sensor.filtered.stream:1.0.0 stream.

  2. Issue another command with a value greater than 100 for the sensorValue attribute as follows.

    curl -X POST -d '{"event": {"metaData": {"timestamp": 1439467524120, "isPowerSaverEnabled": false, "sensorId": 701, "sensorName": "temperature"}, "correlationData": {"longitude": 4.504343, "latitude": 20.44345}, "payloadData": {"humidity": 2.3, "sensorValue": 156}}}' http://localhost:9763/endpoints/httpReceiver --header "Content-Type:application/json"
    
    

    The following log will appear in the DAS CLI.

    The event is forwarded to the org.wso2.event.sensor.filtered.stream:1.0.0 stream since the value for the sensorValue attribute is greater than 100. Therefore, the Sensor Statistics dashboard is updated as shown below.
     

  3. Issue more curl commands as follows with several timestamps and sensor values. 

    curl -X POST -d '{"event": {"metaData": {"timestamp": 1439467524120, "isPowerSaverEnabled": false, "sensorId": 701, "sensorName": "temperature"}, "correlationData": {"longitude": 4.504343, "latitude": 20.44345}, "payloadData": {"humidity": 2.3, "sensorValue": 156}}}' http://localhost:9763/endpoints/httpReceiver --header "Content-Type:application/json"
    curl -X POST -d '{"event": {"metaData": {"timestamp": 1439467890957, "isPowerSaverEnabled": false, "sensorId": 701, "sensorName": "temperature"}, "correlationData": {"longitude": 4.504343, "latitude": 20.44345}, "payloadData": {"humidity": 2.3, "sensorValue": 170}}}' http://localhost:9763/endpoints/httpReceiver --header "Content-Type:application/json"
    curl -X POST -d '{"event": {"metaData": {"timestamp": 1439467951518, "isPowerSaverEnabled": false, "sensorId": 701, "sensorName": "temperature"}, "correlationData": {"longitude": 4.504343, "latitude": 20.44345}, "payloadData": {"humidity": 2.3, "sensorValue": 131}}}' http://localhost:9763/endpoints/httpReceiver --header "Content-Type:application/json"
    curl -X POST -d '{"event": {"metaData": {"timestamp": 1439467992936, "isPowerSaverEnabled": false, "sensorId": 701, "sensorName": "temperature"}, "correlationData": {"longitude": 4.504343, "latitude": 20.44345}, "payloadData": {"humidity": 2.3, "sensorValue": 126}}}' http://localhost:9763/endpoints/httpReceiver --header "Content-Type:application/json"
    curl -X POST -d '{"event": {"metaData": {"timestamp": 1439468050928, "isPowerSaverEnabled": false, "sensorId": 701, "sensorName": "temperature"}, "correlationData": {"longitude": 4.504343, "latitude": 20.44345}, "payloadData": {"humidity": 2.3, "sensorValue": 145}}}' http://localhost:9763/endpoints/httpReceiver --header "Content-Type:application/json"

    Since the value for the sensorValue attribute is greater than 100 in all these events, they are forwarded to the dashboard, which is updated as shown below. These events are also logged in the DAS CLI.
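Replaying the Step 7 filter over the sensor values sent above shows which events reach the dashboard (a plain-Python sketch, not the actual DAS pipeline):

```python
# sensorValue of each event sent via curl in this step, in order.
sent_values = [96.5, 156, 170, 131, 126, 145]

# The execution plan keeps only events with sensorValue > 100.
forwarded = [v for v in sent_values if v > 100]
print(forwarded)  # the 96.5 event is the only one dropped
```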

Deploying execution plans using templates

Step 11: Create and deploy a template

In this step, the configurations of the WSO2 DAS artifacts that you previously created are added as templates via the Template Manager tool. This allows you to reuse the same artifacts for different scenarios where the sensor value differs.

Copy the following template, and save it as SensorStatistics.xml in the <DAS_HOME>/repository/conf/template-manager/domain-template directory.

Attributes prefixed with the $ sign in this template (such as $filteringVal and $sensorType) are defined as configurable parameters. Therefore, for each scenario you create from this template, you can specify a different value based on which the events are to be filtered.

<?xml version="1.0"?>
<!--
  ~ Copyright (c) 2016, WSO2 Inc. (http://www.wso2.org) All Rights Reserved.
  ~
  ~ WSO2 Inc. licenses this file to you under the Apache License,
  ~ Version 2.0 (the "License"); you may not use this file except
  ~ in compliance with the License.
  ~ You may obtain a copy of the License at
  ~
  ~     http://www.apache.org/licenses/LICENSE-2.0
  ~
  ~ Unless required by applicable law or agreed to in writing,
  ~ software distributed under the License is distributed on an
  ~ "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
  ~ KIND, either express or implied. See the License for the
  ~ specific language governing permissions and limitations
  ~ under the License.
  -->

<domain name="SensorStatistics">
    <description>Domain for sensor data analysis</description>
    <scenarios>
        <scenario type="AnalyzeSensorStatistics">
            <description>Configure a sensor analytics scenario to display statistics for a given stream of your choice
            </description>
            <templates>
                <!--Note: These will be deployed in the order they appear here-->
                <!-- Input Event Stream-->
                <template type="eventstream">
                  {
                    "streamId": "org.wso2.event.sensor.stream:1.0.0",
                    "name": "org.wso2.event.sensor.stream",
                    "version": "1.0.0",
                    "nickName": "",
                    "description": "",
                    "metaData": [
                      {
                        "name": "timestamp",
                        "type": "LONG"
                      },
                      {
                        "name": "isPowerSaverEnabled",
                        "type": "BOOL"
                      },
                      {
                        "name": "sensorId",
                        "type": "INT"
                      },
                      {
                        "name": "sensorName",
                        "type": "STRING"
                      }
                    ],
                    "correlationData": [
                      {
                        "name": "longitude",
                        "type": "DOUBLE"
                      },
                      {
                        "name": "latitude",
                        "type": "DOUBLE"
                      }
                    ],
                    "payloadData": [
                      {
                        "name": "humidity",
                        "type": "FLOAT"
                      },
                      {
                        "name": "sensorValue",
                        "type": "DOUBLE"
                      }
                    ]
                  }
                </template>
                <!-- Output Event Stream-->
                <template type="eventstream">
                  {
                    "streamId": "org.wso2.event.sensor.filtered.stream:1.0.0",
                    "name": "org.wso2.event.sensor.filtered.stream",
                    "version": "1.0.0",
                    "nickName": "",
                    "description": "",
                    "metaData": [
                      {
                        "name": "timestamp",
                        "type": "LONG"
                      },
                      {
                        "name": "sensorName",
                        "type": "STRING"
                      }
                    ],
                    "correlationData": [
                      {
                        "name": "longitude",
                        "type": "DOUBLE"
                      },
                      {
                        "name": "latitude",
                        "type": "DOUBLE"
                      }
                    ],
                    "payloadData": [
                      {
                        "name": "sensorValue",
                        "type": "DOUBLE"
                      }
                    ]
                  }

                </template>
                <!-- Realtime Execution Plan-->
                <template type="realtime">
                    <![CDATA[
                    /* Enter a unique ExecutionPlan */
                    @Plan:name('ExecutionPlan')
                    /* Enter a unique description for ExecutionPlan */
                    -- @Plan:description('ExecutionPlan')
                    /* define streams/tables and write queries here ... */
                    @Import('org.wso2.event.sensor.stream:1.0.0')
                    define stream sensorStream (meta_timestamp long, meta_isPowerSaverEnabled bool, 
                        meta_sensorId int, meta_sensorName string, correlation_longitude double, correlation_latitude double, 
                        humidity float, sensorValue double);
                    @Export('org.wso2.event.sensor.filtered.stream:1.0.0')
                    define stream filteredStream (meta_timestamp long, meta_sensorName string, correlation_longitude double, 
                        correlation_latitude double, sensorValue double);
                    from sensorStream [sensorValue > $filteringVal]
                    select meta_timestamp, meta_sensorName, correlation_longitude, correlation_latitude, sensorValue
                    insert into filteredStream
                     ]]>
                </template>
                <template type="eventreceiver">
                  <![CDATA[
                  
                    <eventReceiver name="httpReceiver" statistics="disable" trace="disable" xmlns="http://wso2.org/carbon/eventreceiver">
                        <from eventAdapterType="http">
                            <property name="basicAuthEnabled">true</property>
                            <property name="transports">all</property>
                        </from>
                        <mapping customMapping="disable" type="json"/>
                        <to streamName="org.wso2.event.sensor.stream" version="1.0.0"/>
                    </eventReceiver>]]>
                  </template>
                <!-- Event Publisher-->
                <template type="eventpublisher">
                    <![CDATA[
                    
                    <eventPublisher name="loggerPublisher" statistics="disable"
                      trace="disable" xmlns="http://wso2.org/carbon/eventpublisher">
                      <from streamName="org.wso2.event.sensor.stream" version="1.0.0"/>
                      <mapping customMapping="disable" type="text"/>
                      <to eventAdapterType="logger"/>
                    </eventPublisher>
                        ]]>
                </template>
                <template type="eventpublisher">
                    <![CDATA[
                    
                    <eventPublisher name="uiPublisher" statistics="disable" trace="disable" xmlns="http://wso2.org/carbon/eventpublisher">
                      <from streamName="org.wso2.event.sensor.filtered.stream" version="1.0.0"/>
                      <mapping customMapping="disable" type="wso2event"/>
                      <to eventAdapterType="ui"/>
                    </eventPublisher>
                        ]]>
                </template>
                <!-- Gadget line chart -->
                <template type="gadget">
                    <config>
                        <properties>
                            <property name="directoryName">$sensorType-line-chart</property>
                            <property name="templateDirectory">lineChart</property>
                        </properties>
                        <artifacts>
                            <artifact file="gadget.json">
                                <![CDATA[
                                	{
                                    	"id": "$sensorType-line-chart",
                                    	"title": "$sensorType-line-chart",
                                    	"type": "gadget",
                                    	"thumbnail": "gadget/$sensorType-line-chart/thumbnail.png",
                                    	"data": {
                                        	"url": "gadget/$sensorType-line-chart/gadget.xml"
                                    	}
                                	}                        
								]]>
                            </artifact>
                            <artifact file="conf.json">
                                <![CDATA[
                                	{
										"provider-conf" : {
											"streamName" : "org.wso2.event.sensor.filtered.stream:1.0.0", 
											"provider-name" : "realtime"
										}, 
										"chart-conf" : {
											"x" : "TIMESTAMP", 
											"xType" : "time", 
											"y" : "sensorValue", 
											"yType" : "default", 
											"color" : "sensorName", 
											"maxLength" : "30", 
											"gadget-name" : "$sensorType-line-chart", 
											"chart-name" : "line-chart"
										}
									}
                                ]]>
                            </artifact>
                            <artifact file="js/core/gadget-util.js">
                                <![CDATA[
                                        var getGadgetLocation = function (callback) {
                                        var gadgetLocation = "/portal/store/carbon.super/fs/gadget/$sensorType-line-chart";
                                        var PATH_SEPERATOR = "/";
                                        if (gadgetLocation.search("store") != -1) {
                                            wso2.gadgets.identity.getTenantDomain(function (tenantDomain) {
                                                var gadgetPath = gadgetLocation.split(PATH_SEPERATOR);
                                                var modifiedPath = '';
                                                for (var i = 1; i < gadgetPath.length; i++) {
                                                    if (i === 3) {
                                                        modifiedPath = modifiedPath.concat(PATH_SEPERATOR, tenantDomain);
                                                    } else {
                                                        modifiedPath = modifiedPath.concat(PATH_SEPERATOR, gadgetPath[i])
                                                    }
                                                }
                                                callback(modifiedPath);
                                            });
                                        } else {
                                            callback(gadgetLocation);
                                        }
                                    }
                             
                                ]]>
                            </artifact>
                        </artifacts>
                    </config>
                </template>
                <!-- Gadget line chart -->
                <!-- Dashboard -->
                <template type="dashboard">
                    <config>
                        <properties>
                            <property name="dashboardId">analytics-$sensorType-dashboard</property>
                        </properties>
                        <content>
						<![CDATA[
                            {
  								"id": "analytics-$sensorType-dashboard",
  								"title": "Analytics $sensorType Dashboard",
  								"description": "This dashboard indicates the sensor value at different times in a particular location",
  								"permissions": {
    								"viewers": [
      									"Internal\/sensor-statistics-viewer"
    								],
    								"editors": [
      									"Internal\/sensor-statistics-editor"
    								],
    								"owners": [
      									"Internal\/sensor-statistics-owner"
    								]
  								},
  								"pages": [
    								{
      									"id": "landing",
      									"title": "Home",
      									"layout": {
        									"content": {
          										"loggedIn": {
            										"blocks": [
              											{
                											"id": "90dfe9100dc10dca1ae562e7f7451a4a",
                											"x": 0,
                											"y": 0,
                											"width": 12,
                											"height": 3,
                											"banner": false
              											}
            										]
          										}
        									},
        								"fluidLayout": false
      								},
      								"isanon": false,
      								"content": {
        								"default": {
          									"90dfe9100dc10dca1ae562e7f7451a4a": [
            									{
              										"id": "$sensorType-line-chart-0",
              										"content": {
                										"id": "$sensorType-line-chart",
                										"title": "$sensorType-line-chart",
                										"type": "gadget",
                										"thumbnail": "fs:\/\/gadget\/$sensorType-line-chart\/thumbnail.png",
                										"data": {
                  											"url": "fs:\/\/gadget\/$sensorType-line-chart\/gadget.xml"
                										},
                										"styles": {
                  											"title": "$sensorType-line-chart",
                  											"borders": true
                										},
                										"options": {
                  											"windowSize": {
                    											"type": "STRING",
                    											"title": "Window Size",
                    											"value": "10",
                    											"options": [
                      	
                    											],
                    											"required": false
                  											}
                										},
                										"locale_titles": {
                  
                										}
              										}
            									}
          									]
        								},
        								"anon": {
          
        								}
      								}
    							}
  							],
  							"menu": [
    							{
      								"id": "landing",
      								"isanon": false,
      								"ishidden": false,
      								"title": "Home",
      								"subordinates": [
        
      								]
    							}
  							],
  							"hideAllMenuItems": false,
  							"identityServerUrl": "",
  							"accessTokenUrl": "",
  							"apiKey": "",
  							"apiSecret": "",
  							"theme": "Default Theme",
  							"shareDashboard": false,
  							"isUserCustom": false,
  							"isEditorEnable": true,
  							"banner": {
    							"globalBannerExists": false,
    							"customBannerExists": false
  							},
  							"landing": "landing",
  							"isanon": false
						}
                         ]]>						
						</content>
                    </config>
                </template>
            </templates>
            <parameters>
                <parameter name="filteringVal" type="string">
                    <displayName>Filtering Value</displayName>
                    <description>Events with a sensor value at or below the filtering value are dropped</description>
                    <defaultValue>100</defaultValue>
                </parameter>
				<parameter name="sensorType" type="string">
                    <displayName>Sensor Type Name</displayName>
                    <description>The name of the sensor type</description>
                    <defaultValue>temperature</defaultValue>
                </parameter>
            </parameters>
        </scenario>
    </scenarios>
</domain>
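The $-prefixed placeholders in this template ($filteringVal and $sensorType) are filled in when a scenario is created. The mechanism can be sketched with Python's string.Template; this is an illustration of parameter substitution, not the Template Manager's actual implementation.

```python
from string import Template

# Fragment of the realtime template above, with its configurable parameter.
query = "from sensorStream [sensorValue > $filteringVal] insert into filteredStream"

# Creating a scenario supplies concrete values for the parameters.
print(Template(query).substitute(filteringVal="120"))
# from sensorStream [sensorValue > 120] insert into filteredStream
```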


Step 12: Configure a template

Before you carry out this step

  1. Copy the Sensor Value VS Timestamp gadget directory from the <DAS_HOME>/repository/deployment/server/jaggeryapps/portal/store/carbon.super/fs/gadget/ directory to the <DAS_HOME>/repository/conf/template-manager/gadget-templates directory, and rename it lineChart.
  2. Delete the following artifacts that you have already configured in steps 1 - 9.
    • org.wso2.event.sensor.stream event stream
    • org.wso2.event.sensor.filtered.stream event stream
    • httpReceiver event receiver
    • ExecutionPlan execution plan
    • loggerPublisher event publisher
    • uiPublisher event publisher
    • Sensor Value VS Timestamp gadget
    • Sensor Statistics dashboard

This step involves adding the execution plan and stream configurations using the template you created and deployed in the previous step.

  1. If the DAS server was running when you created and deployed the template, restart the DAS server. 
  2. Log into the DAS Management Console. Click the Main tab and then click Template Manager. Create a new scenario as demonstrated below.
    1. Click on SensorStatistics to open the Deployed Scenarios page. Then click Create New Scenario to open the Edit Scenario page.
    2. Enter information as shown in the table below, and click Add scenario.

      Parameter Name    Value
      Scenario Type     AnalyzeSensorStatistics
      Scenario Name     FilterSensorValues
      Description       Filter events with a sensor value greater than 120
      Sensor Value      120

       A message appears to inform you that the scenario is successfully created. Close the message. The scenario you configured is displayed in the Deployed Scenarios page.

Step 13: View the elements of the event flow

Log into the DAS Management Console and click the Main tab. Then click Flow to open the DAS Event Flow page. The complete event flow you created in this guide is displayed as follows.

This event flow is displayed when you create the artifacts manually, as well as when you deploy them in a template and then create a scenario.


The following is a summary of each element in the event flow.

  • httpReceiver (Event Receiver): Receives events in multiple formats and converts them all into the WSO2 Event format before forwarding them to the org.wso2.event.sensor.stream:1.0.0 event stream.
  • org.wso2.event.sensor.stream:1.0.0 (Event Stream): Defines the attributes on which the selection of events to be processed by the event flow is based.
  • loggerPublisher (Event Publisher): Logs events from the org.wso2.event.sensor.stream:1.0.0 event stream in the DAS CLI.
  • SensorStatistics-FilterSensorValues (Execution Plan): Applies filter criteria to events in the org.wso2.event.sensor.stream:1.0.0 event stream and directs the filtered events to the org.wso2.event.sensor.filtered.stream:1.0.0 event stream.
  • org.wso2.event.sensor.filtered.stream:1.0.0 (Event Stream): Imports attributes from the org.wso2.event.sensor.stream:1.0.0 event stream and receives events filtered from that stream by the execution plan.
  • uiPublisher (Event Publisher): Publishes events from the org.wso2.event.sensor.filtered.stream:1.0.0 event stream to the Analytics Dashboard.

Deploying the sample C-App

You can deploy artifacts (event streams, event receivers, Spark scripts, event publishers, dashboards, etc.) as composite Carbon Applications (C-Apps) in WSO2 DAS. This guide uses the SMART_HOME.car file as the toolbox, which packages all the artifacts required for this guide. For more information on C-Apps, see Packaging Artifacts as a C-App Archive. Follow the steps below to deploy and use a sample C-App in WSO2 DAS.

  1. Log in to the DAS management console using the following URL: https://<DAS_HOST>:<DAS_PORT>/carbon/
  2. Click Main, and then click Add in the Carbon Applications menu.
  3. Click  Choose File, and upload the <DAS_HOME>/capps/Smart_Home.car file as shown below.
    adding the new C-App
  4. Click Main, then click Carbon Applications, and then click List view to see the uploaded Carbon application as shown below.
    list of all available C-Apps


You can use the Event Flow feature of WSO2 DAS to visualize how the components that you created above are connected to each other. You can also use it for verification, i.e., to validate the flow of events within DAS, as shown below.


Publishing events 

Once you develop the complete Event Flow, you can test the flow by publishing events to DAS. There are several methods of publishing events to DAS. In this section, events are published using a sample Java client.

Navigate to <DAS_HOME>/samples/smart-home directory in a new CLI tab, and execute the following command to run the data publisher.

ant

This executes a Java client based on the <DAS_HOME>/samples/smart-home/src/main/java/org/wso2/carbon/das/smarthome/sample/SmartHomeAgent.java file. This Java client generates random events and sends them to the event stream that is deployed through the Smart_Home.car file.
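
The actual publisher source ships with the sample; conceptually, such a client simply generates random readings and sends them to the deployed stream. The following Python sketch illustrates the idea only; the field names and value ranges are hypothetical, not the real event schema used by SmartHomeAgent.java.

```python
import random
import time

def generate_sensor_event(house_id):
    """Generate one random smart-home reading (hypothetical fields,
    sketching what a data publisher like SmartHomeAgent emits)."""
    return {
        "timestamp": int(time.time() * 1000),  # epoch milliseconds
        "houseId": house_id,
        "power": round(random.uniform(0.0, 500.0), 2),  # random reading
    }

# Simulate three houses reporting one event each.
events = [generate_sensor_event(h) for h in range(1, 4)]
```

In the real sample, the generated events are sent over the network to the stream deployed by the Smart_Home.car file rather than collected in a list.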

Viewing the output

Follow the steps below to view the presentation of the output in the Analytics Dashboard.

  1. Log in to the Management console, if you are not already logged in.

  2. Click Main, and then click Analytics Dashboard in the Dashboard menu. 

  3. Log in to the Analytics Dashboard, using admin/admin credentials. 

  4. Click the DASHBOARDS button in the top menu. The dashboard deployed by the C-App is displayed as shown below.  

  5. Click the View button of the corresponding Dashboard. The following charts are displayed.

     


 Before proceeding to the next sections, follow the steps below to undeploy the C-App that you uploaded earlier in this section.

  1. Log in to the DAS Management Console using admin/admin credentials, if you are not already logged in.
  2. Click Main, then click Carbon Applications, and then click List view to see the uploaded Carbon application.
  3. Click on the Delete option to delete the Carbon application as shown below.
  4. Refresh the Web browser screen, and verify that the SMART_HOME.car file has been removed from the list of all available C-Apps. 

Batch and interactive analytics

You can perform batch analytics when event streams are configured to be persisted, enabling later batch processing scenarios such as data aggregation and summarization. The WSO2 DAS batch analytics engine is powered by Apache Spark, which accesses the underlying data storage and executes programs to process the event data. DAS provides a SQL-like query language for creating these jobs as scripts that can be executed.

Step 1: Persist event stream

In this step, the org.wso2.event.sensor.stream and org.wso2.event.sensor.filtered.stream event streams that you previously created are persisted so that the data received by them are stored in the databases configured for WSO2 DAS by default.
 

  1. Log in to the WSO2 DAS Management Console if you are not already logged in.
  2. In the Main tab, click Streams to open the Available Event Streams page.
  3. Click Edit for the event stream org.wso2.event.sensor.stream. This opens the Edit Event Stream page.
  4. Click Next [Persist Event].
  5. Select the Persist Event Stream check box.
  6. Select the Persist Attribute check box for all the available attributes.
  7. Select the Index Column check box for the sensorid attribute.
  8. Click Save Event Stream to save the changes.
  9. Similarly, persist the org.wso2.event.sensor.filtered.stream event stream as shown below.
     

Step 2: Simulate events

In this step, multiple events are simulated and stored in the database for batch analytics.

  1. Download and save this file in a preferred location on your machine.
  2. Log into the DAS management console using the following URL: https://<DAS_HOST>:<DAS_PORT>/carbon/.
  3. Click Tools, and then click Event Simulator.
  4. Select org.wso2.event.sensor.stream in the Event Stream Name field.
     
  5. Click Choose File, and then browse and upload the CSV file you downloaded and saved. Click Upload and refresh the page to view the uploaded file.
  6. Click Configure to open the Event Mapping Configuration dialog box. Enter a comma ( , ) in the Field delimiter field, and click Configure.

  7. Click Play to start sending the events in the file. Click OK in the message that appears to confirm that the system has started sending events from the file. The events sent are logged in the CLI as shown below.
     
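
Conceptually, the Event Simulator maps each delimited row of the uploaded CSV file to one event on the selected stream. The Python sketch below illustrates that mapping; the attribute order and sample values are hypothetical, since the real mapping is determined by the stream definition and the Event Mapping Configuration you entered.

```python
import csv
import io

# Hypothetical attribute order for org.wso2.event.sensor.stream;
# the actual stream definition determines the real mapping.
ATTRIBUTES = ["timestamp", "sensorId", "humidity", "sensorValue"]

def rows_to_events(csv_text, delimiter=","):
    """Map each delimited row to an event dict, the way the
    Event Simulator replays a file into a stream."""
    reader = csv.reader(io.StringIO(csv_text), delimiter=delimiter)
    return [dict(zip(ATTRIBUTES, row)) for row in reader]

# Two illustrative rows using the comma field delimiter.
sample = "1439185597486,501,40.3,120\n1439185597487,502,45.1,90"
events = rows_to_events(sample)
```

Each parsed event would then be sent into the stream and logged in the CLI, as described in the step above.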

Step 3: Create and execute Spark scripts

In this exercise, a Spark query is written to process the data received by WSO2 DAS and stored in the databases.

  1. Log in to the DAS Management Console, if you are not already logged in.
  2. Click Main, and then click Scripts in the Batch Analytics menu to open the Available Analytics Scripts page.
  3. Click Add New Analytics Script.
  4. Enter BATCH_ANALYTICS_SCRIPT in the Script Name parameter. Enter the following Spark SQL script in the Spark SQL Queries parameter as shown below.
     

    CREATE TEMPORARY TABLE sensorData USING CarbonAnalytics OPTIONS (tableName "ORG_WSO2_EVENT_SENSOR_STREAM", schema "humidity FLOAT, sensorValue DOUBLE");

    CREATE TEMPORARY TABLE highSensorVal USING CarbonAnalytics OPTIONS (tableName "highSensorVal", schema "humidity FLOAT, sensorValue DOUBLE");

    INSERT OVERWRITE TABLE highSensorVal SELECT humidity, sensorValue FROM sensorData WHERE sensorValue > 100;

    SELECT * FROM highSensorVal;

      The above script does the following:   

    • Loads data from the DAS Data Access Layer (DAL), and registers temporary tables in the Spark environment.

    • Performs batch processing by fetching the sensor values greater than 100 from the sensorData table.

    • Writes the results back to a new DAL table named highSensorVal.

    • Executes the following query to verify the results:
      SELECT * FROM highSensorVal

     

  5. Click Add to save the script. Click OK in the message that appears to confirm that the script was successfully saved.

  6. Click Execute for the script.

    The following is displayed to indicate that the queries were successfully executed.
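
The filtering step of the Spark script above can be pictured in plain Python: copying only the rows whose sensorValue exceeds 100 into a second table. This is an in-memory illustration of the query's effect, not how Spark actually executes it; the sample rows are made up.

```python
# Rows as they might look in the sensorData temporary table
# (illustrative values only).
sensor_data = [
    {"humidity": 40.3, "sensorValue": 120.0},
    {"humidity": 45.1, "sensorValue": 90.0},
    {"humidity": 42.0, "sensorValue": 150.5},
]

# Equivalent of: INSERT OVERWRITE TABLE highSensorVal
#                SELECT humidity, sensorValue FROM sensorData
#                WHERE sensorValue > 100;
high_sensor_val = [row for row in sensor_data if row["sensorValue"] > 100]
```

Running the final SELECT over highSensorVal would then return only the high readings.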

You can obtain faster results by executing ad hoc queries on the indexed attributes through an interactive Web console named the Interactive Analytics Console. Follow the procedure below to run an ad hoc query using the Interactive Analytics Console.

  1. Log into the DAS Management Console if you are not already logged in.
  2. In the Main tab, click Console to open the Interactive Analytics Console.
  3. Enter the following query in the console and press the Enter key.
    SELECT * FROM highSensorVal

    The output of the query is displayed as follows. 

Step 4: Search for data

This step involves performing interactive analytics for data received by DAS and stored in the configured databases. Searching is supported by the indexing functionality of DAS powered by Apache Lucene. In this step, we search for a specific record by the sensorid attribute. This is possible because the sensorid attribute was persisted as an index column when you persisted the event stream.

  1. Log into the DAS Management Console if you are not already logged in.
  2. In the Main tab, click Data Explorer to open the Data Explorer page.
  3. In the Table Name parameter, select ORG_WSO2_EVENT_SENSOR_STREAM.
  4. Select the By Query option, and enter meta_sensorId:501 in the data field displayed.

     
  5. Click Search. The following records are displayed in the Results section.
     
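
Indexing the sensorid column is what makes this search fast: instead of scanning every record, DAS can look up matching record positions directly in the Lucene index. The toy Python sketch below illustrates the idea with an in-memory inverted index; the records and field names are illustrative only, not the actual DAS storage format.

```python
# Illustrative persisted records (not the real storage format).
records = [
    {"meta_sensorId": 501, "sensorValue": 120.0},
    {"meta_sensorId": 502, "sensorValue": 90.0},
    {"meta_sensorId": 501, "sensorValue": 150.5},
]

# Build an inverted index: indexed value -> list of record positions.
index = {}
for pos, rec in enumerate(records):
    index.setdefault(rec["meta_sensorId"], []).append(pos)

# Resolve a query like meta_sensorId:501 via the index,
# without scanning the whole table.
hits = [records[pos] for pos in index.get(501, [])]
```

A full scan would touch every record; the index lookup touches only the matching ones, which is why only indexed attributes are searchable this way.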



Predictive Analytics

WSO2 DAS allows you to build predictive models that analyse data and make predictions.

In this exercise, a dataset is analyzed and an ML model is trained to predict whether a person is likely to have breast cancer, based on data describing a set of bodily characteristics.

The dataset used in this scenario contains 10 features providing data on bodily characteristics, and a response variable with the following labels.

Label    Prediction
2        Indicates that the condition is benign.
4        Indicates that the condition is malignant.
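
The response variable therefore takes only two values. A small helper, sketched below, shows how a predicted label maps back to a human-readable result (the mapping comes from the table above; the function itself is illustrative).

```python
# Label meanings from the dataset's response variable.
LABELS = {2: "benign", 4: "malignant"}

def interpret(label):
    """Translate a predicted class label into its meaning."""
    return LABELS.get(label, "unknown")
```

For example, a model output of 4 would be read as a malignant prediction.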

The following is a summary of the procedure to make a prediction via WSO2 DAS.

Step 1: Create a dataset

Step 2: Create a project

Step 3: Create an analysis and train a model

Step 4: Predict using the model

Step 1: Create a dataset

Follow the procedure below to upload the dataset based on which the training model is created.

  1. Access the WSO2 Machine Learner wizard embedded within DAS via the https://<DAS_HOST>:<DAS_PORT>/ml/site/home/login.jag URL, and log in with your credentials.
  2. Click ADD DATASET to open the Create Dataset page.
  3. In the Data Source field, click Choose File and browse for the <DAS_HOME>/samples/ml/tuned/naive-bayes/breastCancerWisconsin.csv file. Enter values for the rest of the parameters as shown below.

    Parameter Name            Value
    Dataset Name              Breast_Cancer_Dataset
    Version                   1.0.0
    Description               Breast cancer data in Wisconsin
    Source Type               File
    Data Format               CSV
    Column Header Available   Yes
  4. Click CREATE DATASET to save your changes. The Datasets page opens and the dataset you entered is displayed as follows.


 Step 2: Create a project

Follow the procedure below to create a project for the dataset uploaded.

  1. Log in to the Machine Learner wizard of WSO2 DAS via the https://<DAS_HOST>:<DAS_PORT>/ml/site/home/login.jag URL, if you are not already logged in.
  2. Click ADD PROJECT.
     
     
  3. In the Create Project page, enter information as shown below.
     

    Parameter Name   Value
    Project Name     Breast_Cancer_data_analytics_project
    Description      This project performs predictive analysis on the breast cancer data in Wisconsin.
    Dataset          Breast_Cancer_Dataset


     
  4. Click Create Project to save the information. The project is displayed in the Projects page as follows.
     

Step 3: Create an analysis and train a model

Follow the procedure below to analyse the Breast_Cancer_Dataset dataset, and then create a training model based on that analysis.

  1. Log in to the Machine Learner wizard via the https://<DAS_HOST>:<DAS_PORT>/ml/site/home/login.jag URL, if you are not already logged in.
  2. Click the You have X projects link as shown below.
     
  3. Click on the Breast_Cancer_data_analytics_project project to expand it.
  4. Enter breast_cancer_analysis_1 as the analysis name and click CREATE ANALYSIS. The following page appears displaying the summary statistics.
     
  5. Click Next without making any changes to the summary statistics.
     
    The Explore view opens. Note that Parallel Sets and Trellis Chart visualizations are enabled, and Scatter Plot and Cluster Diagram visualizations are disabled. This is determined by the feature types of the dataset.  
  6. Click Next. The Algorithms view is displayed. Enter values as shown below.
     

    Parameter            Value
    Algorithm name       LOGISTIC REGRESSION L_BFGS
    Response variable    Class
    Train data fraction  0.7

     

  7. Click Next. The Parameters view appears. Enter L2 as the reg type.
     
  8. Click Next. The Model view appears. Select Breast_Cancer_Dataset-1.0.0 as the dataset version.
     
  9. Click RUN to train the model.
     
    The model is trained and displayed as shown below.
     
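
The Train data fraction of 0.7 set above means 70% of the dataset rows are used to train the model and the remaining 30% are held out for evaluation. A minimal Python sketch of such a shuffle-and-split (not WSO2 ML's actual implementation):

```python
import random

def split_dataset(rows, train_fraction=0.7, seed=42):
    """Shuffle rows and split them into train/test sets,
    mirroring a 'Train data fraction' of 0.7."""
    rng = random.Random(seed)  # fixed seed for reproducibility
    shuffled = rows[:]
    rng.shuffle(shuffled)
    cut = int(len(shuffled) * train_fraction)
    return shuffled[:cut], shuffled[cut:]

# 100 illustrative rows split 70/30.
train, test = split_dataset(list(range(100)))
```

Holding out a test fraction is what lets the wizard report unbiased accuracy figures for the trained model.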

Step 4: Predict using the model

Follow the procedure below to make a prediction based on the training model you created.

  1. Log in to the Machine Learner wizard via the https://<DAS_HOST>:<DAS_PORT>/ml/site/home/login.jag URL, if you are not already logged in.
  2. Click the You have X projects link as shown below to open the Projects window.
     
  3. Click MODELS for the breast_cancer_analysis_1  analysis.
     
  4. Click Predict on the model displayed.
     
  5. Enter values in the Predict page as shown below.
     

    Parameter Name            Value
    Prediction Source         Feature values
    SampleCodeNumber          1018561
    ClumpThickness            2
    UniformityOfCellSize      1
    UniformityOfCellShape     1
    MarginalAdhesion          1
    SingleEpithelialCellSize  2
    BareNuclei                1
    BlandChromatin            1
    NormalNucleoli            1
    Mitoses                   5
  6. Click Predict. The prediction is displayed as follows.
     
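
At prediction time, a logistic regression model such as the one trained above computes a weighted sum of the feature values, passes it through the sigmoid function, and thresholds the resulting probability to choose a class label. The sketch below illustrates that scoring step only; the weights and bias are made up for illustration and are not taken from the trained DAS model.

```python
import math

def predict_logistic(features, weights, bias):
    """Score one sample with logistic regression: sigmoid over a
    weighted sum, thresholded at 0.5. Returns the dataset's class
    labels: 4 (malignant) or 2 (benign)."""
    z = bias + sum(w * x for w, x in zip(weights, features))
    prob = 1.0 / (1.0 + math.exp(-z))
    return 4 if prob >= 0.5 else 2

# Nine feature values as entered in the Predict page (excluding the
# sample code number, which is an identifier rather than a feature).
features = [2, 1, 1, 1, 2, 1, 1, 1, 5]
weights = [0.5] * 9   # hypothetical coefficients
label = predict_logistic(features, weights, bias=-10.0)
```

With these illustrative weights the low feature values yield a probability well under 0.5, i.e. a benign (label 2) prediction, matching the intuition that small values of these features indicate a benign case.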


Where to go next

This concludes your first hands-on experience with DAS and its core functionality.

  • For more information on the features and architecture of WSO2 DAS, see About DAS.
  • For more information on how to download, install, run and get started with WSO2 DAS, see Getting Started.
  • For more information on the main functionalities of WSO2 DAS, see User Guide.
  • For more information on various product deployment scenarios and other topics useful for system administrators, see Admin Guide.
  • For more information on several business use case samples of WSO2 DAS, see Samples.