
Creating a Siddhi Application

Siddhi applications are files that define the Siddhi logic to process the events sent to WSO2 SP. They are written in the Siddhi Query Language using the Stream Processor Studio tool shipped with WSO2 SP.

A Siddhi file contains the following configurations:

  • Stream: A logical series of events ordered in time, with a uniquely identifiable name and a set of defined attributes with specific data types that define its schema.
  • Source: Consumes data from external sources (such as TCP, Kafka, HTTP, etc.) in the form of events, converts each event (which can be in XML, JSON, binary, etc. format) to a Siddhi event, and passes it to a stream for processing.
  • Sink: Takes events arriving at a stream, maps them to a predefined data format (such as XML, JSON, binary, etc.), and publishes them to external endpoints (such as email, TCP, Kafka, HTTP, etc.).
  • Executional element: An executional element can be one of the following (a minimal sketch of each type follows this list):

      • Stateless query: A query that considers only the currently incoming event when generating its output, e.g., a filter.
      • Stateful query: A query that considers both the currently incoming event and past events when generating its output, e.g., windows, sequences, patterns, etc.
      • Partition: A collection of stream definitions and Siddhi queries separated from each other within a Siddhi application for the purpose of processing events in parallel and in isolation.

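The following sketch illustrates each type of executional element in the Siddhi Query Language. The streams and attributes used here (TemperatureStream, deviceId, etc.) are hypothetical and shown for illustration only:

    define stream TemperatureStream (deviceId string, temp double);

    /* Stateless query: a filter that considers only the current event. */
    from TemperatureStream[temp > 50.0]
    select deviceId, temp
    insert into HighTempStream;

    /* Stateful query: a sliding time window that also considers past events. */
    from TemperatureStream#window.time(1 min)
    select deviceId, avg(temp) as avgTemp
    insert into AvgTempStream;

    /* Partition: processes the events of each device in parallel and in isolation. */
    partition with (deviceId of TemperatureStream)
    begin
        from TemperatureStream#window.length(10)
        select deviceId, max(temp) as maxTemp
        insert into MaxTempStream;
    end;
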
A Siddhi application can be created from the source view or the design view of the WSO2 SP Stream Processor Studio.

Creating a Siddhi application in the source view

To create a Siddhi application via the source view of the WSO2 SP Stream Processor Studio, follow the steps below:

  1. Start WSO2 SP in the editor mode and access the Stream Processor Studio. For detailed instructions, see Starting Stream Processor Studio. The Stream Processor Studio opens as shown below.
  2. Click New to start defining a new Siddhi application. A new file opens as shown below.
  3. Add the following sample Siddhi application to the file.

    @App:name("SweetProductionAnalysis")
    
    @Source(type = 'tcp', context='SweetProductionData', @map(type='binary'))
    define stream SweetProductionStream (name string, amount double);
    
    @sink(type='log', @map(type='json'))
    define stream ProductionAlertStream (name string, amount double);
    
    from SweetProductionStream
    select *
    insert into ProductionAlertStream;

    Note the following in this Siddhi application:

    Stream

    This Siddhi application contains the following two stream definitions:

    • SweetProductionStream

      define stream SweetProductionStream (name string, amount double);

      This is the input stream that defines the schema based on which events are selected to be processed by the SweetProductionAnalysis Siddhi application. Events received via the source in this application are directed to this stream.

    • ProductionAlertStream

      define stream ProductionAlertStream (name string, amount double);

      This is the output stream from which the sink configured in this application takes events to be published as the output.

    Source

    @Source(type = 'tcp', context='SweetProductionData', @map(type='binary'))

    This source configuration has the following sections:

    • @Source(type = 'tcp', context='SweetProductionData')

      This defines tcp as the transport via which events are received for processing by the SweetProductionAnalysis Siddhi application.

    • @map(type='binary')

      This defines the input mapping. In this scenario, the binary mapper is used; it converts each incoming binary event into a Siddhi event and feeds it to the SweetProductionStream input stream.
    Sink

    @sink(type='log', @map(type='json'))

    This sink configuration has the following sections:

    • @sink(type='log')

      This defines log as the transport via which the processed events are published from the ProductionAlertStream output stream. The log sink simply prints the published events to the console.

    • @map(type='json')

      This defines the output mapping. Events are published with the json mapping type; the JSON mapper converts the events taken from the ProductionAlertStream to the JSON format.
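      For example, assuming an output event with name "toffee" and amount 10.5, the JSON mapper's default format would produce output similar to the following sketch:

      {"event":{"name":"toffee","amount":10.5}}
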
    Executional Elements
    from SweetProductionStream
    select *
    insert into ProductionAlertStream;

    This is where the logic of the Siddhi application is defined. In this scenario, all the events received in the SweetProductionStream input stream are inserted into the ProductionAlertStream output stream.
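
    For instance, this passthrough query could be replaced with a stateless filter query (a hypothetical variant, not part of this sample) so that only productions exceeding a threshold reach the output stream:

    from SweetProductionStream[amount > 100.0]
    select name, amount
    insert into ProductionAlertStream;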

  4. To save this Siddhi application, click File, and then click Save. By default, Siddhi applications are saved in the <SP_HOME>/wso2/editor/deployment/workspace directory.

  5. To export the Siddhi application to your preferred location, click File, and then click Export File.
  6. To see a graphical view of the event flow you defined in your Siddhi application, click Design View.

    The event flow is displayed as follows.
     

Creating a Siddhi application in the design view

To create a Siddhi application via the design view of the WSO2 SP Stream Processor Studio, follow the steps below:

  1. Start WSO2 SP in the editor mode and access the Stream Processor Studio. For detailed instructions, see Starting Stream Processor Studio. The Stream Processor Studio opens as shown below.
  2. Click New to start defining a new Siddhi application. A new file opens as shown below.
  3. To open the design view, click Design View.
  4. To define the input stream into which the events to be processed via the Siddhi application should be received, drag and drop the stream icon (shown below) into the grid.
     
    As a result, the Stream Configuration form appears below the grid as shown below.
     
    Fill this form as follows to define a stream named SweetProductionStream with two attributes named name and amount:

    1. In the Name field, enter SweetProductionStream.
    2. In the Attributes table, enter two attributes as follows. You can click +Attribute to add a new row in the table to define a new attribute.

      Attribute Name    Attribute Type
      name              string
      amount            double
    3. Click Submit to save the new stream definition. As a result, the stream is displayed on the grid with the SweetProductionStream label as shown below.
       
  5. To define the output stream to which the processed events need to be directed, drag and drop the stream icon again. Place it after the SweetProductionStream stream. This stream should be named ProductionAlertStream and have the following attributes.

    Attribute Name     Attribute Type
    name               string
    totalProduction    long
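
    In the source view, these two streams correspond to definitions equivalent to the following sketch:

    define stream SweetProductionStream (name string, amount double);
    define stream ProductionAlertStream (name string, totalProduction long);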
  6. To add the source from which events are received, drag and drop the source icon (shown below) into the grid.

    As a result, the Source Configuration form appears below the grid. Enter values in this form as follows to configure a source via which events can be received.
    1. In the Name field, enter a name for the source. Then click Save to add the source.

      The name must always be the name of the transport via which the events are received. This must be entered in lower case (e.g., tcp).

    2. Before you can configure the rest of the source parameters, the source must be connected to a stream. This tcp source needs to be connected to the SweetProductionStream input stream so that the events received via TCP can be directed there. To connect the source to the stream, draw an arrow from the source to the stream by dragging the cursor as demonstrated below.

      Then open the Source Configuration form again by clicking the Settings icon of the tcp source.
       
    3. Click Properties. Based on the properties you want to configure for the source, select the relevant check box to select the required annotation type. In this example, let's select both Options and Map.

      As a result, the Options and Map sections appear as shown below.

      1. Enter context='SweetProductionData' in the Options field to indicate the context. You can add multiple options by clicking + Option to add more fields under Options.
      2. For this example, assume that events are received in the binary format. To allow this, enter binary in the Name field under Type in the Map section.
      Now the Source Configuration form looks as follows:
       
    4. Click Save to save the source configuration.
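
    Assuming the source name (tcp), option (context='SweetProductionData'), and map type (binary) entered above, the saved source corresponds to a source-view configuration equivalent to this sketch:

    @source(type='tcp', context='SweetProductionData', @map(type='binary'))
    define stream SweetProductionStream (name string, amount double);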
  7. To add a query that defines the execution logic, drag and drop the projection query icon (shown below) to the grid.
  8. The query uses the events in the SweetProductionStream input stream as inputs and directs the processed events (which are its output) to the ProductionAlertStream output stream. Therefore, create two connections as demonstrated below.
     
  9. To define the execution logic, move the cursor over the query in the grid, and click on the settings icon that appears.

    This opens the Query Configuration form. Enter information in it as follows:
    1. To specify that the SweetProductionStream stream is the input stream from which the events to be processed are taken, enter it in the Stream/Trigger field in the Input section.
    2. To define how each attribute is processed, enter the following in the User Defined Attributes table.

      Expression     As                 Purpose
      name           name               When creating the output events, the name attribute is output with the same attribute name.
      sum(amount)    totalProduction    With each event that arrives in the SweetProductionStream stream, the query calculates the running sum of the amount attribute, and outputs the result as the totalProduction attribute to match the output stream's schema.
    3. The output events generated need to be inserted into the ProductionAlertStream output stream. To achieve this, enter information in the Output section as follows:
      1. In the Operation field, select Insert. This value is selected because the output events need to be inserted into the ProductionAlertStream output stream.
      2. In the Into field, enter ProductionAlertStream.
      3. In the For field, select all events.
    4. Click Save to save the entries.
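
    With these entries, the saved query corresponds to source-view logic equivalent to the following sketch (note that sum() returns a double here, so if validation rejects the long attribute, totalProduction can be defined as double instead):

    from SweetProductionStream
    select name, sum(amount) as totalProduction
    insert all events into ProductionAlertStream;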
  10. To add a sink to publish the output events that are directed to the ProductionAlertStream output stream, drag and drop the sink icon (shown below) into the grid.
     
    As a result, the Sink Configuration form appears below the grid. Enter values in this form as follows to configure a sink via which events can be published.
    1. In the Name field, enter a name for the sink.

      The name must always be the name of the transport via which the events are published. This must be entered in lower case (e.g., log ).

    2. Click Properties. Based on the properties you want to configure for the sink, select the relevant check box to select the required annotation type. In this example, assume that the events need to be published in the json format. Therefore, let's select the Map check box, and enter type=json to specify json as the mapping type.
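
    Assuming the sink name (log) and map type (json) entered above, the saved sink corresponds to a source-view configuration equivalent to this sketch:

    @sink(type='log', @map(type='json'))
    define stream ProductionAlertStream (name string, totalProduction long);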

  11. Connect the sink you added to the ProductionAlertStream output stream by dragging and dropping the cursor from the ProductionAlertStream output stream to the sink.
  12. To align the Siddhi components that you have added to the grid, click Edit and then click Auto-Align. As a result, all the components are horizontally aligned as shown below.
  13. Click Source View. The Siddhi application is displayed as follows.
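
    Based on the configurations entered in the preceding steps, the displayed application should be equivalent to the following sketch (the @App:name annotation reflects the file name chosen in the next step; the same double/long caveat for totalProduction applies):

    @App:name("SweetProductionAnalysis")

    @source(type='tcp', context='SweetProductionData', @map(type='binary'))
    define stream SweetProductionStream (name string, amount double);

    @sink(type='log', @map(type='json'))
    define stream ProductionAlertStream (name string, totalProduction long);

    from SweetProductionStream
    select name, sum(amount) as totalProduction
    insert all events into ProductionAlertStream;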
  14. Click File and then click Save as. The Save to Workspace dialog box appears. In the File Name field, enter SweetProductionAnalysis and click Save.
     
