
Publishing Events Using Apache Spark

You can publish events from WSO2 DAS using Spark SQL queries. This allows any interested party to be notified whenever the content of an existing Spark table changes. For example, a scheduled Spark script can periodically check whether a Spark table meets specific conditions and, if it does, publish an event downstream to notify the interested parties. The following sections describe how to publish events from WSO2 DAS using Apache Spark.

Creating the event stream to publish events from Spark

To publish events from Spark, first define an event stream with the required stream attributes (i.e., the attributes for which values are published from Spark). A sample event stream definition is shown below. For more information on event streams, see Understanding Event Streams and Event Tables.

{
  "streamId": "TestEventStream:1.0.0",
  "name": "TestEventStream",
  "version": "1.0.0",
  "nickName": "TestStream",
  "description": "Test Stream",
  "metaData": [],
  "correlationData": [],
  "payloadData": [
    {
      "name": "ip",
      "type": "STRING"
    },
    {
      "name": "name",
      "type": "STRING"
    },
    {
      "name": "testMessage",
      "type": "STRING"
    }
  ]
}
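If you prefer working with deployable artifacts instead of the Management Console, a stream definition such as the one above can typically be saved as a JSON file and hot-deployed. As an assumption based on the usual DAS 3.x artifact conventions (the path and the <name>_<version>.json naming may vary by version), the file would be placed at:

<DAS_HOME>/repository/deployment/server/eventstreams/TestEventStream_1.0.0.json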

Creating the event receiver

Once you define the event stream, you need to create an event receiver of the WSO2Event type to receive events from Spark. For more information, see WSO2Event Event Receiver.
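As a minimal, illustrative sketch (the receiver name and adapter properties below are assumptions, and the exact artifact schema may differ across DAS versions), a WSO2Event receiver bound to the TestEventStream defined above could look like this:

<eventReceiver name="TestEventStreamReceiver" statistics="disable"
               trace="disable" xmlns="http://wso2.org/carbon/eventreceiver">
    <!-- Receive events over the WSO2Event (Thrift) transport -->
    <from eventAdapterType="wso2event">
        <property name="events.duplicated.in.cluster">false</property>
    </from>
    <!-- No custom mapping: incoming attributes map directly to the stream attributes -->
    <mapping customMapping="disable" type="wso2event"/>
    <!-- Deliver received events to TestEventStream version 1.0.0 -->
    <to streamName="TestEventStream" version="1.0.0"/>
</eventReceiver>

Such an artifact is typically deployed under <DAS_HOME>/repository/deployment/server/eventreceivers/.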

Publishing events to the event stream

Use the following Spark SQL queries to create a virtual table in the Spark table space to hold the events to be published, and to publish its rows to the defined event stream as events. The org.wso2.carbon.analytics.spark.event.EventStreamProvider class (referenced in the USING clause below) acts as the bridge between the existing Spark table and the DAS event stream: it fetches data from the Spark table and publishes it to the defined event stream.

CREATE TEMPORARY TABLE <table_name>
USING org.wso2.carbon.analytics.spark.event.EventStreamProvider
OPTIONS (receiverURL "<das_receiver_url>",
         authURL "<das_receiver_auth_url>",
         username "<user_name>",
         password "<password>",
         streamName "<stream_name>",
         version "<stream_version>",
         description "<description>",
         nickName "<nick_name>",
         payload "<payload>"
);

INSERT OVERWRITE TABLE <table_name> <select_query>;

The parameters of the above query are described below.

<table_name>
The name of the Spark table that maps to the created event stream.

<das_receiver_url>
The DAS Thrift receiver URL set (e.g., tcp://10.100.0.40:7611).

<das_receiver_auth_url>
The DAS Thrift receiver auth URL set. This is an optional parameter; if no value is specified, the default auth URL set is derived from the receiver URL set (e.g., ssl://10.100.0.40:7711).

<user_name>
The username used to connect to the DAS receiver node.

<password>
The password used to connect to the DAS receiver node.

<stream_name>
The name of the stream to which events are published.

<stream_version>
The version of the stream to which events are published (e.g., 1.0.0).

<description>
A description of the stream being created. This is an optional parameter.

<nick_name>
A display name for the stream being created. This is an optional parameter.

<payload>
A string listing the stream attributes as comma-separated pairs of attribute name and data type, in the format <attribute_name><space><attribute_type> (e.g., payload "ip STRING, name STRING, testMessage STRING").

<select_query>
The SELECT query that filters the fields to be published to the event stream from the existing Spark table (e.g., select ip_address, name, message from EventStream).
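To make the template above concrete, the following sketch fills in the placeholders for the TestEventStream defined earlier. The host, credentials, and source table (10.100.0.40, admin/admin, SourceTable) are hypothetical values used for illustration only:

CREATE TEMPORARY TABLE test_event_publisher
USING org.wso2.carbon.analytics.spark.event.EventStreamProvider
OPTIONS (receiverURL "tcp://10.100.0.40:7611",
         authURL "ssl://10.100.0.40:7711",
         username "admin",
         password "admin",
         streamName "TestEventStream",
         version "1.0.0",
         description "Test Stream",
         nickName "TestStream",
         payload "ip STRING, name STRING, testMessage STRING"
);

-- Publish the selected rows of the existing Spark table as events;
-- the selected columns map positionally to ip, name, and testMessage.
INSERT OVERWRITE TABLE test_event_publisher
SELECT ip_address, name, message FROM SourceTable;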

Once these steps are completed, you can attach an event publisher (such as an email or JMS publisher) to the stream to which the events are published, and have them delivered to a preferred destination. For detailed instructions on configuring publishers, see Creating Alerts.
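As a final, hedged illustration (not part of the steps above; the artifact schema and names are assumptions), a simple logger-type publisher attached to the same stream could look like the following; for email or JMS delivery, the eventAdapterType and its properties would differ:

<eventPublisher name="TestEventStreamLogger" statistics="disable"
                trace="disable" xmlns="http://wso2.org/carbon/eventpublisher">
    <!-- Consume events from the stream that Spark publishes to -->
    <from streamName="TestEventStream" version="1.0.0"/>
    <mapping customMapping="disable" type="text"/>
    <!-- Write each event to the server log; replace with email/jms as needed -->
    <to eventAdapterType="logger"/>
</eventPublisher>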
