Siddhi applications are files that define the Siddhi logic to process the events sent to WSO2 SP. They are written in the Siddhi Query Language using the Stream Processor Studio tool shipped with WSO2 SP.
A Siddhi file contains the following configurations:
Configuration | Description
---|---
Stream | A logical series of events ordered in time, with a uniquely identifiable name and a set of defined attributes with specific data types that define its schema.
Source | Consumes data from external sources (such as TCP, Kafka, HTTP, etc.) in the form of events, converts each event (which can be in XML, JSON, binary, etc. format) to a Siddhi event, and passes it to a stream for processing.
Sink | Takes events arriving at a stream, maps them to a predefined data format (such as XML, JSON, binary, etc.), and publishes them to external endpoints (such as E-mail, TCP, Kafka, HTTP, etc.).
Executional Element | An executional element can be a query or a partition.
A Siddhi application can be created from the source view or the design view of the WSO2 SP Stream Processor Studio.
Creating a Siddhi application in the source view
To create a Siddhi application via the source view of the WSO2 SP Stream Processor Studio, follow the steps below:
- Start WSO2 SP in the editor mode and access the Stream Processor Studio. For detailed instructions, see Starting Stream Processor Studio. The Stream Processor Studio opens as shown below.
- Click New to start defining a new Siddhi application. A new file opens as shown below.
- Add the following sample Siddhi application to the file.

```
@App:name("SweetProductionAnalysis")

@Source(type = 'tcp', context='SweetProductionData', @map(type='binary'))
define stream SweetProductionStream (name string, amount double);

@sink(type='log', @map(type='json'))
define stream ProductionAlertStream (name string, amount double);

from SweetProductionStream
select *
insert into ProductionAlertStream;
```
Note the following in this Siddhi application:

Stream: This Siddhi application contains two stream definitions.

- `define stream SweetProductionStream (name string, amount double);`
  This is the input stream that defines the schema based on which events are selected to be processed by the SweetProductionAnalysis Siddhi application. Events received via the source in this application are directed to this stream.
- `define stream ProductionAlertStream (name string, amount double);`
  This is the output stream from which the sink configured in this application takes events to be published as the output.

Source: `@Source(type = 'tcp', context='SweetProductionData', @map(type='binary'))`

This source configuration has the following sections:

- `@Source(type = 'tcp', context='SweetProductionData')`
  This configuration defines `tcp` as the transport via which events are received to be processed by the SweetProductionAnalysis Siddhi application.
- `@map(type='binary')`
  This configuration defines the input mapping. In this scenario, the binary mapper is used, which converts the incoming binary events into Siddhi events and feeds them into the stream.
The source types and map types are available as Siddhi extensions, and you can find them via the Operator Finder as follows:

- Click the Operator Finder icon to open the Operator Finder.
- Move the cursor to the location in the Siddhi application where you want to add the source.
- Search for the required transport type. Once it appears in the search results, click the Add to Source icon on it.
- Similarly, search for the mapping type you want to include in the source configuration, and add it.
The source annotation is now displayed as follows. You can add the other properties as required, and save your changes.
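For example, after adding the `tcp` source type and the `binary` map type, the inserted annotation would look roughly like this (a sketch; the exact snippet the Operator Finder inserts may differ):

```
@source(type = 'tcp', @map(type = 'binary'))
```

You can then add the `context` option and any other properties the transport requires before saving.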
Sink: `@sink(type='log', @map(type='json'))`

This sink configuration has the following sections:

- `@sink(type='log')`
  This configuration defines `log` as the transport via which the processed events are published from the ProductionAlertStream output stream. The log sink simply publishes events to the console.
- `@map(type='json')`
  This configuration defines the output mapping. Events are published with the `json` mapping type. The JSON mapper converts the events in the ProductionAlertStream stream to JSON format.
You can select the sink type and the map type from the Operator Finder.
Executional Elements: `from SweetProductionStream select * insert into ProductionAlertStream;`

This is where the logic of the Siddhi application is defined. In this scenario, all the events received in the SweetProductionStream input stream are inserted into the ProductionAlertStream output stream.

- To save this Siddhi application, click File, and then click Save. By default, Siddhi applications are saved in the `<SP_HOME>/wso2/editor/deployment/workspace` directory.
directory.- To export the Siddhi application to your preferred location, click File, and then click Export File.
- To see a graphical view of the event flow you defined in your Siddhi application, click Design View.
The event flow is displayed as follows.
Creating a Siddhi application in the design view
To create a Siddhi application via the design view of the WSO2 SP Stream Processor Studio, follow the steps below:
- Start WSO2 SP in the editor mode and access the Stream Processor Studio. For detailed instructions, see Starting Stream Processor Studio. The Stream Processor Studio opens as shown below.
- Click New to start defining a new Siddhi application. A new file opens as shown below.
- To open the design view, click Design View.
- To define the input stream into which the events to be processed via the Siddhi application should be received, drag and drop the stream icon (shown below) into the grid.
As a result, the Stream Configuration form appears below the grid as shown below. Fill this form as follows to define a stream named `SweetProductionStream` with two attributes named `name` and `amount`:
- In the Name field, enter `SweetProductionStream`.
- In the Attributes table, enter two attributes as follows. You can click +Attribute to add a new row in the table to define a new attribute.

Attribute Name | Attribute Type
---|---
name | string
amount | double
- Click Submit to save the new stream definition. As a result, the stream is displayed on the grid with the SweetProductionStream label as shown below.
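In the source view, this stream configuration corresponds to a definition like the following (a sketch; the code generated by the Design View may be formatted differently):

```
define stream SweetProductionStream (name string, amount double);
```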
- To define the output stream to which the processed events need to be directed, drag and drop the stream icon again. Place it after the SweetProductionStream stream. This stream should be named ProductionAlertStream and have the following attributes.

Attribute Name | Attribute Type
---|---
name | string
totalProduction | long
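This corresponds to a stream definition roughly like the following in the source view (a sketch based on the attributes entered above):

```
define stream ProductionAlertStream (name string, totalProduction long);
```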
- To add the source from which events are received, drag and drop the source icon (shown below) into the grid.
As a result, the Source Configuration form appears below the grid. Enter values in this form as follows to configure a source via which events can be received. In the Name field, enter a name for the source, and then click Save to add the source.
The name must always be the name of the transport via which the events are received, and it must be entered in lower case (e.g., `tcp`).
- To configure the source by entering values for the rest of the parameters, the source must be connected to a stream. This `tcp` source needs to be connected to the SweetProductionStream input stream so that the events received via TCP can be directed there. To connect the source to the stream, draw an arrow from the source to the stream by dragging the cursor as demonstrated below.
Then open the Source Configuration form again by clicking the Settings icon of the `tcp` source.
Click Properties. Based on the properties you want to configure for the source, select the relevant check box to select the required annotation type. In this example, let's select both Options and Map.
As a result, the Options and Map sections appear as shown below.
- Enter `context='SweetProductionData'` in the Options field to indicate the context. You can add multiple options by clicking + Option to add more fields under Options.
- For this example, assume that events are received in the `binary` format. To allow this, enter `binary` in the Name field under Type in the Map section.
- Click Save to save the source configuration.
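At this point, the saved source configuration corresponds to an annotation roughly like the following in the source view (a sketch; the generated code may differ slightly):

```
@source(type = 'tcp', context = 'SweetProductionData', @map(type = 'binary'))
define stream SweetProductionStream (name string, amount double);
```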
- To add a query that defines the execution logic, drag and drop the projection query icon (shown below) to the grid.
- The query uses the events in the SweetProductionStream input stream as its input and directs the processed events (its output) to the ProductionAlertStream output stream. Therefore, create two connections as demonstrated below.
- To define the execution logic, move the cursor over the query in the grid, and click on the settings icon that appears.
This opens the Query Configuration form. Enter information in it as follows:
- To specify that the SweetProductionStream stream is the input stream from which the events to be processed are taken, enter it in the Stream/Trigger field in the Input section. To define how each attribute is processed, enter the following in the User Defined Attributes table.

Expression | As | Purpose
---|---|---
name | name | When creating the output events, the name attribute is output with the same attribute name.
sum(amount) | amount | With each event that arrives in the SweetProductionStream stream, the query calculates the sum of the amount attribute, and outputs the result with amount as the attribute name.
- The output events generated need to be inserted into the ProductionAlertStream output stream. To achieve this, enter information in the Output section as follows:
  - In the Operation field, select Insert. This value is selected because the output events need to be inserted into the ProductionAlertStream output stream.
  - In the Into field, enter `ProductionAlertStream`.
  - In the For field, select all events.
- Click Save to save the entries.
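The query configured through this form corresponds to a Siddhi query roughly like the following (a sketch based on the values entered above; the exact code generated by the Design View may differ):

```
from SweetProductionStream
select name, sum(amount) as amount
insert all events into ProductionAlertStream;
```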
- To add a sink to publish the output events that are directed to the ProductionAlertStream output stream, drag and drop the sink icon (shown below) into the grid.
As a result, the Sink Configuration form appears below the grid. Enter values in this form as follows to configure a sink via which events can be published. In the Name field, enter a name for the sink.
The name must always be the name of the transport via which the events are published, and it must be entered in lower case (e.g., `log`).
Click Properties. Based on the properties you want to configure for the sink, select the relevant check box to select the required annotation type. In this example, assume that the events need to be published in the `json` format. Therefore, let's select the Map check box and enter `type=json` to specify `json` as the mapping type.
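The resulting sink configuration corresponds to an annotation roughly like the following in the source view (a sketch based on the stream attributes defined earlier; the generated code may differ slightly):

```
@sink(type = 'log', @map(type = 'json'))
define stream ProductionAlertStream (name string, totalProduction long);
```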
- Connect the sink you added to the ProductionAlertStream output stream by dragging the cursor from the ProductionAlertStream output stream to the sink.
- To align the Siddhi components that you have added to the grid, click Edit and then click Auto-Align. As a result, all the components are horizontally aligned as shown below.
- Click Source View. The Siddhi application is displayed as follows.
- Click File and then click Save as. The Save to Workspace dialog box appears. In the File Name field, enter `SweetProductionAnalysis` and click Save.