Info |
---|
To view a screencast of the Quick Start Guide, click here. |
...
Tip |
---|
Before you begin:
- Install Oracle Java SE Development Kit (JDK) version 1.7* or 1.8, and set the JAVA_HOME environment variable.
- Install Apache Ant.
- Download WSO2 DAS.
- Start DAS by navigating to <DAS_HOME>/bin on the command line and executing wso2server.bat (for Windows) or wso2server.sh (for Linux).
|
...
- Log into the DAS Management Console and click on the Main tab. Under Manage, click Execution Plans to open the Available Execution Plans page.
- Click Add Execution Plan to open the Create a New Execution Plan page.
Enter information as follows to create the new execution plan.
Parameter Name | Value |
---|
Import Stream | org.wso2.event.sensor.stream:1.0.0 |
As | sensorStream |
Value Of | filteredStream |
StreamId | org.wso2.event.sensor.filtered.stream:1.0.0 |
- Click Import and then click Export. The section for query expressions will be updated as shown below.
Add the following query expression.
Code Block |
---|
|
from sensorStream [sensorValue > 100]
select meta_timestamp, meta_sensorName, correlation_longitude, correlation_latitude, sensorValue
insert into filteredStream; |
This query includes the filter condition sensorValue > 100. Therefore, when the execution plan forwards events from org.wso2.event.sensor.stream:1.0.0 to org.wso2.event.sensor.filtered.stream:1.0.0, events in which the value of the sensorValue attribute is 100 or less are dropped.
- Click Validate Query Expressions. Once you get a message to confirm that the queries are valid, click Add Execution Plan.
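The filter semantics of the query above can be sketched in plain Python. The sample events and their values here are illustrative only, not part of the DAS API:

```python
# Illustrative sketch of the Siddhi filter: only events with
# sensorValue > 100 pass from sensorStream to filteredStream.
sensor_stream = [
    {"meta_timestamp": 1418802469075, "meta_sensorName": "temperature",
     "correlation_longitude": 2.32, "correlation_latitude": 40.05,
     "sensorValue": 120.5},
    {"meta_timestamp": 1418802469080, "meta_sensorName": "temperature",
     "correlation_longitude": 2.32, "correlation_latitude": 40.05,
     "sensorValue": 95.0},
]

# Equivalent of: from sensorStream [sensorValue > 100] ... insert into filteredStream
filtered_stream = [e for e in sensor_stream if e["sensorValue"] > 100]

print(len(filtered_stream))  # only the first event passes
```

Events that fail the condition are simply never emitted to the filtered stream; they are not buffered or retried.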
...
- Log into the DAS Management Console and click on the Main tab. Under Manage, click Publishers to open the Available Publishers page.
- Click Add Event Publisher to open the Create a New Event Publisher page.
Enter information as follows to create the new event publisher named uiPublisher.
Parameter Name | Value |
---|
Event Publisher Name | uiPublisher |
Event Source | org.wso2.event.sensor.filtered.stream:1.0.0 |
Output Event Adapter Type | ui |
Message Format | wso2event |
Click Add Event Publisher to save the information.
Step 9: Create a dashboard and a gadget
WSO2 Analytics Dashboard is used to analyse the output of the event flow you created in this guide. This step creates a dashboard and a gadget that analyses events from the org.wso2.event.sensor.filtered.stream stream published by the uiPublisher publisher.
- Log into the DAS Management Console. In the Main tab, click Analytics Dashboard.
- Log into the Analytics Dashboard with your username and password.
- Click the menu icon and then click Gadgets to open the Gadgets page as demonstrated below.
- Click GENERATE GADGET, and enter values in the Generate a Gadget wizard as follows.
- In the Select Provider field, select Realtime Data Source. Then click Next.
- In the Event Stream field, select org.wso2.event.sensor.filtered.stream:1.0.0. Then click Next.
Configure a chart as follows.
Parameter Name | Value |
---|
Gadget Name | Sensor Value VS Timestamp |
Select Chart Type | Line Chart |
X-Axis | TIMESTAMP |
X type | time |
Y-Axis | sensorValue |
Y type | default |
Color domain | sensorName |
Max length | 30 |
- Click Add to Store, and then click Go to Portal. The Dashboards page appears again.
- Click CREATE DASHBOARD to open the Create a Dashboard page. Enter a name and a description for the new dashboard as follows, and click Next.
Parameter Name | Value |
---|
Name of your Dashboard | Sensor Statistics |
Description | This dashboard indicates the sensor value at different times in a particular location. |
- Select the Single Column layout. A message appears to indicate that the dashboard is successfully created.
- Click the icon for gadgets. Then select and drag the Sensor Value VS Timestamp gadget to the first column as demonstrated above.
- Click View to view the created dashboard.
Step 10: Send Events to the HTTP Receiver via Curl Command
...
Panel |
---|
Deploying the sample C-App

You can deploy artifacts (i.e., event streams, event receivers, Spark scripts, event publishers, dashboards, etc.) as composite Carbon Applications (C-Apps) in WSO2 DAS. This guide uses the SMART_HOME.car file as the toolbox that contains all the artifacts required for this guide in a single package. For more information on C-Apps, see Packaging Artifacts as a C-App Archive. Follow the steps below to deploy and use a sample C-App in WSO2 DAS.
- Log in to the DAS management console using the following URL: https://<DAS_HOST>:<DAS_PORT>/carbon/
- Click Main, and then click Add in the Carbon Applications menu.
- Click Choose File, and upload the <DAS_HOME>/capps/Smart_Home.car file as shown below.
- Click Main, then click Carbon Applications, and then click List view to see the uploaded Carbon application as shown below.
Tip |
---|
You can use the Event Flow feature of WSO2 DAS to visualize how the components that you created above are connected with each other. You can also use it for verification purposes, i.e., to validate the flow of events within DAS as shown below.
|
Publishing events

Once you develop the complete event flow, you can test it by publishing events to DAS. There are several methods of publishing to DAS; in this section, the events are published via a log file. Navigate to the <DAS_HOME>/samples/smart-home directory in a new CLI tab, and execute the following command to run the data publisher:
Info |
---|
This executes a Java client based on the <DAS_HOME>/samples/smart-home/src/main/java/org/wso2/carbon/das/smarthome/sample/SmartHomeAgent.java file. This Java client generates random events and sends them to the event stream that is deployed through the Smart_Home.car file. |
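The behaviour of that Java client can be sketched in Python as follows. The field names and value ranges below are illustrative assumptions, not taken from SmartHomeAgent.java:

```python
import random
import time

def generate_event():
    """Generate one random smart-home sensor event (illustrative fields)."""
    return {
        "meta_timestamp": int(time.time() * 1000),
        "meta_sensorName": random.choice(["temperature", "humidity"]),
        "sensorValue": round(random.uniform(0.0, 200.0), 2),
    }

# The real client sends each event to the deployed stream; here we just
# collect a batch to show the shape of the generated data.
events = [generate_event() for _ in range(5)]
for e in events:
    print(e)
```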
Viewing the output

Follow the steps below to view the presentation of the output in the Analytics Dashboard.
- Log in to the Management Console, if you are not already logged in.
- Click Main, and then click Analytics Dashboard in the Dashboard menu.
- Log in to the Analytics Dashboard using admin/admin credentials.
- Click the DASHBOARDS button in the top menu. The dashboard deployed by the C-App is displayed as shown below.
- Click the View button of the corresponding dashboard. The following charts are displayed.
Tip |
---|
Follow the steps below to undeploy the C-App that you uploaded in this section before proceeding to the next sections.
- Log in to the DAS Management Console using admin/admin credentials, if you are not already logged in.
- Click Main, then click Carbon Applications, and then click List view to see the uploaded Carbon application.
- Click the Delete option to delete the Carbon application as shown below.
- Refresh the Web browser screen, and check that the SMART_HOME.car file has been removed from the list of all available C-Apps.
|
|
Panel |
---|
Batch and interactive analytics

You can perform batch analytics when event streams are configured to be persisted for later batch processing scenarios such as data aggregation and summarization. The WSO2 DAS batch analytics engine is powered by Apache Spark, which accesses the underlying data storage and executes programs to process the event data. DAS provides an SQL-like query language to create the jobs through scripts that need to be executed.

Step 1: Persist event stream

In this step, the org.wso2.event.sensor.stream and org.wso2.event.sensor.filtered.stream event streams that you previously created are persisted so that the data received by them is stored in the databases configured for WSO2 DAS by default.
- Log in to the WSO2 DAS Management Console if you are not already logged in.
- In the Main tab, click Streams to open the Available Event Streams page.
- Click Edit for the org.wso2.event.sensor.stream event stream. This opens the Edit Event Stream page.
- Click Next [Persist Event].
- Select the Persist Event Stream check box.
- Select the Persist Attribute check box for all the available attributes.
- Select the Index Column check box for the sensorid attribute.
- Click Save Event Stream to save the changes.
- Similarly, persist the org.wso2.event.sensor.filtered.stream event stream as shown below.
Step 2: Simulate events

In this step, multiple events are simulated to be stored in the database for batch analytics.
- Download and save this file in a preferred location on your machine.
- Log into the DAS management console using the following URL: https://<DAS_HOST>:<DAS_PORT>/carbon/
- Click Tools, and then click Event Simulator.
- Select org.wso2.event.sensor.stream in the Event Stream Name field.
- Click Choose File, and then browse and upload the CSV file you downloaded and saved. Click Upload, and refresh the page to view the uploaded file.
- Click Configure to open the Event Mapping Configuration dialog box. Enter a comma (,) in the Field delimiter field, and click Configure.
- Click Play to start sending the events in the file. Click OK in the message that appears to confirm that the system has started sending events from the file. The events sent are logged in the CLI as shown below.
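A comma-delimited event file of the kind the simulator consumes can be sketched with Python's csv module. The rows and the attribute order below are hypothetical examples, assumed to follow the stream's attribute order (meta_timestamp, meta_sensorName, correlation_longitude, correlation_latitude, sensorValue):

```python
import csv
import io

# Hypothetical sample rows, one simulated event per row.
rows = [
    [1418802469075, "temperature", 2.32, 40.05, 120.5],
    [1418802469080, "humidity", 2.32, 40.05, 45.0],
]

buf = io.StringIO()
csv.writer(buf).writerows(rows)          # comma is the field delimiter

# Reading it back the way the simulator would parse it:
events = list(csv.reader(io.StringIO(buf.getvalue())))
print(events[0][4])  # 120.5 (the sensorValue field)
```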
Step 3: Create and execute Spark scripts

In this exercise, a Spark query is written to process the data received by WSO2 DAS and stored in the databases.
- Log in to the DAS Management Console, if you are not already logged in.
- Click Main, and then click Scripts in the Batch Analytics menu to open the Available Analytics Scripts page.
- Click Add New Analytics Script.
Enter BATCH_ANALYTICS_SCRIPT in the Script Name parameter, and enter the following Spark SQL script in the Spark SQL Queries parameter as shown below.
Code Block |
---|
| CREATE TEMPORARY TABLE sensorData USING CarbonAnalytics OPTIONS (tableName "ORG_WSO2_EVENT_SENSOR_STREAM", schema "humidity FLOAT, sensorValue DOUBLE");
CREATE TEMPORARY TABLE highSensorVal USING CarbonAnalytics OPTIONS (tableName "highSensorVal", schema "humidity FLOAT, sensorValue DOUBLE");
INSERT OVERWRITE TABLE highSensorVal SELECT humidity, sensorValue FROM sensorData WHERE sensorValue > 100;
SELECT * FROM highSensorVal; |
The above script does the following:
- Loads data from the DAS Data Access Layer (DAL), and registers temporary tables in the Spark environment.
- Performs batch processing by fetching the sensor values greater than 100 from the sensorData table, and writes them back to a new DAL table named highSensorVal.
- Executes the following query: SELECT * FROM highSensorVal;
Click Add to save the script. Click OK in the message that appears to confirm that the script was successfully saved.
- Click Execute for the script.
The following is displayed to indicate that the queries were successfully executed.
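The filtering the script performs can be sketched with standard SQL, here via Python's built-in sqlite3, purely to illustrate the query logic. The CarbonAnalytics table options are DAS-specific and omitted, and the sample rows are made up:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sensorData (humidity REAL, sensorValue REAL)")
conn.executemany(
    "INSERT INTO sensorData VALUES (?, ?)",
    [(55.0, 120.5), (60.0, 95.0), (58.0, 150.0)],
)

# Equivalent of: INSERT OVERWRITE TABLE highSensorVal
#                SELECT humidity, sensorValue FROM sensorData WHERE sensorValue > 100
conn.execute(
    "CREATE TABLE highSensorVal AS "
    "SELECT humidity, sensorValue FROM sensorData WHERE sensorValue > 100"
)

rows = conn.execute("SELECT * FROM highSensorVal").fetchall()
print(rows)  # [(55.0, 120.5), (58.0, 150.0)]
```

Only the rows whose sensorValue exceeds 100 land in the derived table, which is exactly what the Spark script materializes in the DAL.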
You can obtain faster results by executing ad hoc queries on the indexed attributes through an interactive Web console named the Interactive Analytics Console. Follow the procedure below to perform a batch analytics operation using the Interactive Analytics Console.
- Log in to the DAS Management Console if you are not already logged in.
- In the Main tab, click Console to open the Interactive Analytics Console.
- Enter the following query in the console, and press the Enter key.
SELECT * FROM highSensorVal
The output of the query is displayed as follows.
Step 4: Search for data

This step involves performing interactive analytics on data received by DAS and stored in the configured databases. Searching is supported by the indexing functionality of DAS, powered by Apache Lucene. In this step, we search for a specific record by the sensorid attribute. This is possible because the sensorid attribute was persisted as an index column when you persisted the event stream.
- Log in to the DAS Management Console if you are not already logged in.
- In the Main tab, click Data Explorer to open the Data Explorer page.
- In the Table Name parameter, select ORG_WSO2_EVENT_SENSOR_STREAM.
- Select the By Query option, and enter meta_sensorid:501 in the data field displayed.
- Click Search. The following records are displayed in the Results section.
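Conceptually, the Lucene-backed search is an exact-match lookup on the indexed column. A minimal Python analogue follows; the record layout and values are assumptions for illustration only:

```python
# Hypothetical persisted records from ORG_WSO2_EVENT_SENSOR_STREAM.
records = [
    {"meta_sensorid": 500, "sensorValue": 120.5},
    {"meta_sensorid": 501, "sensorValue": 95.0},
    {"meta_sensorid": 502, "sensorValue": 150.0},
]

# Build an index over the persisted column, then query it -- analogous
# to a Lucene field query on the indexed sensorid attribute.
index = {r["meta_sensorid"]: r for r in records}
match = index[501]
print(match["sensorValue"])  # 95.0
```

Because the column is indexed, the lookup avoids scanning every record, which is why indexed searches return faster than full-table queries.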
|
...