...

This sample demonstrates how you can send notifications through events published from WSO2 DAS using Apache Spark. The notifications alert you about records of an existing table in the Spark environment that satisfy a defined condition. This sample involves creating a table with a few product names and quantities in the DAS, and sending notifications when the quantity of a product falls below a defined value.
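The alerting condition this sample implements can be sketched as follows. This is a minimal illustration, not the DAS implementation itself; the function and variable names are made up here, and the threshold of 50 comes from the sample Spark query described later in this page.

```python
# Illustrative sketch of the alert condition: flag any product whose
# quantity has fallen below a threshold (50 in this sample's query).
THRESHOLD = 50

def low_stock_alerts(products):
    """Return the (name, quantity) pairs whose quantity is below THRESHOLD."""
    return [(name, qty) for name, qty in products if qty < THRESHOLD]

# Hypothetical inventory rows, shaped like the sample's product table.
inventory = [("Boots", 120), ("Sandals", 32), ("Sneakers", 48)]
print(low_stock_alerts(inventory))  # → [('Sandals', 32), ('Sneakers', 48)]
```

In the actual sample, this filtering is done by a Spark SELECT query over the persisted event table, and each matching row is published as an alert event rather than printed.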

...

  1. Log in to the DAS management console using the following URL: https://<DAS_HOST>:<DAS_PORT>/carbon/
  2. Click Main, and then click Streams.
  3. Click Add Event Stream.
  4. Enter the values as shown below to create an event stream named PRODUCTS_STREAM with two attributes, product name and quantity. For more information on creating event streams, see Understanding Event Streams and Event Tables.
    create the receiving event stream
  5. Click Next (Persist Event).
  6. Enter the values as shown below in the next screen to persist the created event stream. For more information, see Persisting Event Streams.
    persisting the created event stream
  7. Click Save Event Stream.
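A stream of the shape created above could be expressed as the following WSO2 event stream definition. This is a hedged sketch: the attribute names and types are assumptions based on the description (a product name and a quantity), and the version number is illustrative.

```json
{
  "name": "PRODUCTS_STREAM",
  "version": "1.0.0",
  "payloadData": [
    {"name": "name", "type": "STRING"},
    {"name": "quantity", "type": "INT"}
  ]
}
```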

Sending events to the receiving event stream

...

  1. Log in to the DAS management console using the following URL, if you are not already logged in: https://<DAS_HOST>:<DAS_PORT>/carbon/
  2. Click Tools, and then click Event Simulator.
  3. Upload the events to be sent to the stream in a CSV file (e.g., footwear.csv), and click Configure as shown below.
    simulating of sending events to the receiving event stream
  4. Enter the details as shown below, and click Configure. 
    entering the event mapping configurations
  5. Click Play in the next screen as shown below.
    play events to simulate the sending of events
  6. Click Main, and then click Data Explorer. Select the name of the table to view it as shown below.
    view the products master table in the data explorer
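A CSV file for the simulator, such as the footwear.csv mentioned above, could look like the following. The rows are purely illustrative; the columns must match the stream's attributes (here assumed to be product name and quantity) in order.

```
Boots,120
Sandals,32
Sneakers,48
```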

Creating the corresponding table in Apache Spark

...

When this query executes, the output of the SELECT query is inserted into the PRODUCT_ALERTS table. The query reads all the products in the PRODUCTS_STREAM Spark table whose quantity is less than 50. During query execution, the individual rows returned by the SELECT query are published to the PRODUCT_ALERTS stream as events.
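A Spark query along these lines could implement the behavior described above. The table and stream names come from this sample, but the relation provider (`CarbonAnalytics`) and its options are assumptions based on how DAS typically exposes persisted event tables to Spark; consult the DAS Spark documentation for the exact syntax in your version.

```sql
-- Map the persisted event table into the Spark environment (provider
-- name and options are assumptions, not taken from this page).
CREATE TEMPORARY TABLE products
  USING CarbonAnalytics
  OPTIONS (tableName "PRODUCTS_STREAM");

-- Publish every low-stock row into the alert table/stream.
INSERT INTO TABLE PRODUCT_ALERTS
  SELECT * FROM products WHERE quantity < 50;
```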

  1. You can view the events published to the CEP server in its logs in the CLI as shown below.
    output logs of the logger publisher