
Creating a Siddhi Application

Siddhi applications are files that define the Siddhi logic to process the events sent to WSO2 SP. They are written in the Siddhi Query Language using the Stream Processor Studio tool shipped with WSO2 SP.

A Siddhi file contains the following configurations:

  • Stream: A logical series of events ordered in time, with a uniquely identifiable name and a set of defined attributes with specific data types defining its schema.
  • Source: Consumes data from external sources (such as TCP, Kafka, HTTP, etc.) in the form of events, converts each event (which can be in XML, JSON, binary, etc. format) to a Siddhi event, and passes it to a stream for processing.
  • Sink: Takes events arriving at a stream, maps them to a predefined data format (such as XML, JSON, binary, etc.), and publishes them to external endpoints (such as e-mail, TCP, Kafka, HTTP, etc.).
  • Executional Element: An executional element can be one of the following:
      • Stateless query: A query that only considers currently incoming events when generating an output, e.g., filters.
      • Stateful query: A query that considers both currently incoming events and past events when generating an output, e.g., windows, sequences, patterns, etc.
      • Partition: A collection of stream definitions and Siddhi queries separated from other elements within a Siddhi application for the purpose of processing events in parallel and in isolation.
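For illustration, the following sketch shows one query of each kind. The stream names other than SweetProductionStream, the attribute values, and the window sizes are assumptions made for this example, not part of the sample application below:

    -- Stateless query: a filter that only passes high-volume events.
    from SweetProductionStream[amount > 100]
    select name, amount
    insert into HighVolumeStream;

    -- Stateful query: a one-minute time window aggregating production per sweet.
    from SweetProductionStream#window.time(1 min)
    select name, sum(amount) as totalAmount
    group by name
    insert into ProductionSummaryStream;

    -- Partition: processes each sweet's events in isolation.
    partition with (name of SweetProductionStream)
    begin
        from SweetProductionStream#window.length(10)
        select name, avg(amount) as avgAmount
        insert into AverageProductionStream;
    end;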

Creating a Siddhi application

To create a Siddhi application, follow the steps below:

  1. Start WSO2 SP in the editor mode and access the Stream Processor Studio. For detailed instructions, see Starting Stream Processor Studio.
  2. Click New to start defining a new Siddhi application. A new file opens.
  3. Add the following sample Siddhi application to the file.

    @App:name("SweetProductionAnalysis")
    
    @Source(type = 'tcp', context='SweetProductionData', @map(type='binary'))
    define stream SweetProductionStream (name string, amount double);
    
    @sink(type='log', @map(type='json'))
    define stream ProductionAlertStream (name string, amount double);
    
    from SweetProductionStream
    select *
    insert into ProductionAlertStream;

    Note the following about this Siddhi application:

    Stream

    This application contains the following two stream definitions:

    • SweetProductionStream

      define stream SweetProductionStream (name string, amount double);

      This is the input stream that defines the schema based on which events are selected to be processed by the SweetProductionAnalysis Siddhi application. Events received via the source in this application are directed to this stream.

    • ProductionAlertStream

      define stream ProductionAlertStream (name string, amount double);

      This is the output stream from which the sink configured in this application takes events to be published as the output.

    Source
    @Source(type = 'tcp', context='SweetProductionData', @map(type='binary'))


    This source configuration has the following sections:

    • @Source(type = 'tcp', context='SweetProductionData')

      This configuration defines tcp as the transport via which events are received to be processed by the SweetProductionAnalysis Siddhi application.

    • @map(type='binary')

      This configuration defines the input mapping. In this case, the binary mapper is used: it converts incoming binary events into Siddhi events and feeds them into Siddhi for processing.
    Sink
    @sink(type='log', @map(type='json'))

    This sink configuration has the following sections:

    • @sink(type='log')

      This configuration defines log as the transport via which the processed events are published from the ProductionAlertStream output stream. The log sink simply publishes events to the console.

    • @map(type='json')

      This configuration defines the output mapping. Events are published with the json mapping type: the JSON mapper converts the events in the ProductionAlertStream to the JSON format.
    Executional Elements
    from SweetProductionStream
    select *
    insert into ProductionAlertStream;

    This is where the logic of the Siddhi application is defined. In this scenario, all the events received in the SweetProductionStream input stream are inserted into the ProductionAlertStream output stream.
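    As an illustrative variation (an assumption made for this guide, not part of the sample above), a filter condition could be added to this query so that only selected events reach the output stream. The threshold value below is chosen purely for illustration:

      from SweetProductionStream[amount > 100]
      select *
      insert into ProductionAlertStream;

    With this filter in place, only events whose amount attribute exceeds 100 are inserted into the ProductionAlertStream.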

  4. To save this Siddhi application, click File, and then click Save. By default, Siddhi applications are saved in the <SP_HOME>/wso2/editor/deployment/workspace directory.

  5. To export the Siddhi application to your preferred location, click File, and then click Export File.