
Publishing Events Using Apache Spark

You can publish events from WSO2 DAS using Spark SQL queries. This feature alerts any interested party whenever the content of an existing Spark table changes. For example, a scheduled Spark script can periodically check whether a Spark table meets specific conditions, and if they are satisfied, publish an event downstream to notify the interested parties. The following sections describe the functions carried out to publish events from WSO2 DAS using Apache Spark.

Creating the event stream to publish events from Spark

To publish events from Spark, first define an event stream with the required stream attributes (i.e., the attributes for which values are published from Spark). A sample event stream definition is given below. For more information on event streams, see Understanding Event Streams and Event Tables.

{
  "streamId": "TestEventStream:1.0.0",
  "name": "TestEventStream",
  "version": "1.0.0",
  "nickName": "TestStream",
  "description": "Test Stream",
  "metaData": [],
  "correlationData": [],
  "payloadData": [
    {
      "name": "ip",
      "type": "STRING"
    },
    {
      "name": "name",
      "type": "STRING"
    },
    {
      "name": "testMessage",
      "type": "STRING"
    }
  ]
}
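
Note: the exact deployment mechanism depends on your DAS version. Typically, a stream definition such as this can be created via the Event Stream management page in the DAS Management Console, or deployed by saving it as a JSON artifact (e.g., TestEventStream_1.0.0.json) in the <DAS_HOME>/repository/deployment/server/eventstreams/ directory. Verify the path and file naming convention against your installation.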

Creating the event receiver

Once you define the event stream, create an event receiver of the WSO2Event type to receive the events published from Spark. The receiver should be configured for the stream defined above (TestEventStream:1.0.0 in this example). For more information, see WSO2Event Event Receiver.

Publishing events to the event stream

Use the following Spark SQL queries to create a virtual table in the Spark table space to hold the events to be published, and to publish its rows to the defined event stream as events. The org.wso2.carbon.analytics.spark.event.EventStreamProvider class (referenced in the USING clause below) acts as the bridge between the Spark table space and the DAS event stream: each row inserted into the virtual table is published as an event to the defined stream.

CREATE TEMPORARY TABLE <table_name>
USING org.wso2.carbon.analytics.spark.event.EventStreamProvider
OPTIONS (streamName "<stream_name>",
         version "<stream_version>",
         payload "<payload>"
);
INSERT OVERWRITE TABLE <table_name> <select_query>;

The parameters of the above query are described below.

<table_name>
The name of the virtual Spark table that is mapped to the created event stream.

<stream_name>
The name of the event stream to which the events are published.

<stream_version>
The version of the event stream to which the events are published.

<payload>
A string containing the stream attributes as comma-separated pairs of attribute name and attribute type, in the format <attribute_name><space><attribute_type>.
(e.g., payload "ip STRING, name STRING, testMessage STRING")

<select_query>
The SELECT query that filters the fields to be published to the event stream from an existing Spark table.
(e.g., SELECT ip_address, name, message FROM ExistingSparkTable)
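
For example, the following queries publish rows to the TestEventStream stream defined above. This is a minimal sketch: the source table AccessLogTable and its columns (ip_address, name, message) are illustrative and assumed to already exist in the Spark table space.

CREATE TEMPORARY TABLE test_stream_publisher
USING org.wso2.carbon.analytics.spark.event.EventStreamProvider
OPTIONS (streamName "TestEventStream",
         version "1.0.0",
         payload "ip STRING, name STRING, testMessage STRING"
);

-- Each selected row is published to TestEventStream:1.0.0 as one event.
-- The WHERE clause shows how a scheduled script can publish only when
-- rows satisfy a condition (AccessLogTable and its columns are assumed).
INSERT OVERWRITE TABLE test_stream_publisher
SELECT ip_address, name, message FROM AccessLogTable
WHERE message LIKE '%ERROR%';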

Once these functions are carried out, you can attach an event publisher (e.g., email or JMS) to the event stream and have the events delivered to a preferred destination. For detailed instructions on configuring publishers, see Creating Alerts.
