
Setting Up Real Time Samples

For information on the general flow of the DAS real time samples, see WSO2 CEP Samples. The following sections provide the generic setup instructions for executing the samples.

Prerequisites

The following applications are required for running the DAS real time samples in this documentation.

  • All samples: a Java runtime and Apache Ant (the sample producers and consumers are run with ant, as described below).
  • JMS related samples: a running JMS provider such as Apache ActiveMQ, Apache Qpid, or WSO2 Message Broker (see Setting up JMS for JMS sample clients below).
  • MQTT related samples: a running MQTT-supported server (see Setting up MQTT for MQTT sample clients below).
  • Kafka related samples: a running Kafka broker (see Setting up Kafka for Kafka sample clients below).
  • WebSocket related samples: Java Development Kit / JRE version 1.7.*
  • Apache Storm related samples: Apache Storm version 0.9.3 or later (to run the Storm samples).

Starting sample configurations

To start WSO2 DAS with a sample configuration, run the following command with the -sn <n> option, where <n> denotes the number assigned to the sample.

On Linux:	./wso2cep-samples.sh -sn <n>
On Windows:	wso2cep-samples.bat -sn <n> 

For example, to start WSO2 DAS with the configuration of sample 0101, run the following command from the <DAS_HOME>/bin directory:

On Linux: 	./wso2cep-samples.sh -sn 0101 
On Windows:	wso2cep-samples.bat -sn 0101

The <DAS_HOME>/samples/cep/artifacts directory contains the sample configurations of DAS. Each configuration is in a subdirectory named after the sample number <n>. For example, the DAS artifacts for sample 0101 are in the <DAS_HOME>/samples/cep/artifacts/0101 directory.

In the normal mode, the <DAS_HOME>/bin/wso2server.sh or <DAS_HOME>/bin/wso2server.bat script starts a DAS instance using the configuration files in the <DAS_HOME>/repository/deployment/server directory, and any sample configuration passed in with -sn <n> is ignored.

When you start DAS with a sample configuration, the default Axis2 repository is pointed to the <DAS_HOME>/samples/cep/artifacts/<SAMPLE_NUMBER>/ directory instead of the usual <DAS_HOME>/repository/deployment/server/ directory.

Starting sample consumers

Each sample consumer service is saved in a separate directory, <DAS_HOME>/samples/cep/consumers/<consumer_name>.

  1. To start a sample consumer, go to its directory, <DAS_HOME>/samples/cep/consumers/<consumer_name>, and type ant. For example,

    user@host:/tmp/wso2das-3.1.0/samples/consumers/wso2-event$ ant
    Buildfile: /home/user/tmp/wso2das-3.1.0/samples/consumers/wso2-event/build.xml
    ...
    run:[echo] To configure host, port and events use -Dhost=xxxx -Dport=xxx -Devents=xx
     [echo] Sending to : http://localhost:7661
                
     [java] Test Server starting on 10.100.0.75
     [java] Thrift Server started at 10.100.0.75
     [java] Thrift SSL port : 7761
     [java] Thrift port : 7661
     [java] Test Server Started

    To write a custom wso2Event data publisher (Thrift data publisher), use the pom file given here; a minimal sketch of such a publisher is shown after this list.

  2. Deploy the log service sample consumer, which is a Web service, by specifying the sample number as follows:

    ant -DsampleNo=<sample no>

    Running the ant script with the -DsampleNo argument deploys the log service in the Axis2 repository relevant to the specified sample. Once it is deployed, the Web service can receive messages from the DAS server.

    user@host:/tmp/wso2das-3.1.0/samples/consumers/logService$ ant -DsampleNo=0102
    Buildfile: /home/user/tmp/wso2das-3.1.0/samples/consumers/logService/build.xml
    -folder.check:
    -assign.sample:
     [echo] Sample No : 0102
     [echo] Services Dir : ../../../samples/artifacts/0102/axis2services
    -assign.main:
    folder.set:
    clean:
    ...
    [jar] Building jar: /tmp/wso2das-3.1.0/repository/deployment/server/webapps/logService.war
    BUILD SUCCESSFUL
    Total time: 0 seconds
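
If you want to write a custom wso2Event (Thrift) data publisher, as mentioned in step 1 above, the following is a minimal sketch of such a client. It assumes the org.wso2.carbon.databridge.agent API shipped with DAS, the default Thrift receiver ports (7611 for TCP, 7711 for SSL), admin/admin credentials, and a placeholder stream and payload; adjust the URLs, trust store path, stream name and attribute arrays to match your own setup.

    // Minimal sketch of a custom wso2Event (Thrift) publisher. All URLs, ports,
    // credentials, the trust store path and the stream definition below are
    // examples only and must match your DAS setup.
    import org.wso2.carbon.databridge.agent.DataPublisher;
    import org.wso2.carbon.databridge.commons.Event;
    import org.wso2.carbon.databridge.commons.utils.DataBridgeCommonsUtils;

    public class CustomWso2EventPublisher {

        public static void main(String[] args) throws Exception {
            // Trust the DAS server certificate (example path and password).
            System.setProperty("javax.net.ssl.trustStore",
                    "<DAS_HOME>/repository/resources/security/client-truststore.jks");
            System.setProperty("javax.net.ssl.trustStorePassword", "wso2carbon");

            // Receiver URL, authentication URL and credentials of the DAS data receiver.
            DataPublisher publisher = new DataPublisher(
                    "tcp://localhost:7611", "ssl://localhost:7711", "admin", "admin");

            // The stream must already be defined in DAS; the payload array below is
            // a placeholder and must match that stream's attribute definition.
            String streamId = DataBridgeCommonsUtils.generateStreamId(
                    "org.wso2.event.sensor.stream", "1.0.0");
            Event event = new Event(streamId, System.currentTimeMillis(),
                    null,                               // meta attributes
                    null,                               // correlation attributes
                    new Object[]{"sensor-1", 23.45d});  // payload attributes (placeholder)

            publisher.publish(event);
            publisher.shutdown();
        }
    }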

Starting sample producers

Starting a sample producer is similar to starting a consumer.

  1. Go to the sample producer's directory, <DAS_HOME>/samples/cep/producers/<producer_name>, and type ant with the relevant input arguments. For example,

    user@host:/home/user/wso2das-3.1.0/samples/producers/pizza-shop$  ant pizzaOrderClient -Dservice=WSEventLocalAdaptorService 
    -DtopicName=BatchedPizzaOrder -DbatchedEvents=true
    Buildfile: /home/user/tmp/wso2das-3.1.0/samples/producers/pizza-shop/build.xml
    init:
    compile:
    [copy] Copying 1 file to /home/user/tmp/wso2das-3.1.0/samples/producers/pizza-shop/temp/classes
    pizzaOrderClient:
    [echo] To configure host and port use -Dhost=xxxx -Dport=xxx -Dservice=xxx -DtopicName=xxx
    [echo] Sending to : http://localhost:9763/services/WSEventLocalAdaptorService/BatchedPizzaOrder
                
    [echo] To send events in batches use -DbatchedEvents=true
    [echo] Sending events in batches : true
    BUILD SUCCESSFUL
    Total time: 1 second

Passing arguments to sample clients

Some sample clients take extra arguments. The following list describes these arguments: the purpose of each, the syntax in which it is passed, an example, and the default value where one applies.

  • To specify the publishing topic for the producer client.
    Syntax: -DtopicName=XXXX
    Example: ant -DtopicName=AllStockQuotes
    Default: AllStockQuotes

  • To publish to a specific host (an IP address or host name).
    Syntax: -Dhost=XXXX
    Example: ant -Dhost=org.test.domain
    Default: localhost

  • To publish to a specific port.
    Syntax: -Dport=XXXX
    Example: ant pizzaOrderClient -Dport=9764
    Default: 9763

  • To publish to a specific Web service.
    Syntax: -Dservice=XXXX
    Example: ant pizzaOrderClient -Dservice=wsInAdaptorService
    Default: WSEventLocalAdaptorService

  • To send events in batches (i.e., the adapter receives a batch of events).
    Syntax: -DbatchedEvents={true|false}
    Example: ant -DbatchedEvents=true

  • To publish events to a specific client URL.
    Syntax: -Durl='client url'
    Example: -Durl=http://localhost:9763/endpoints/httpReceiver

  • To subscribe to events from a JMS topic (consumer).
    Syntax: -DtopicName=XXXXXX
    Example: ant topicConsumer -DtopicName=TestTopic
    Default: TestTopic

  • To subscribe to events from a JMS queue (consumer).
    Syntax: -Dqueue=XXXXX
    Example: ant queueConsumer -Dqueue=DelayedFlightStats
    Default: DelayedFlightStats

  • To read events in a specific format from the input text document (producer).
    Syntax: -Dformat=xxxx (csv, text, json, xml)
    Example: ant -Dformat=csv

  • To specify the JMS broker to which the DAS server listens.
    Syntax: -Dbroker=xxxx (activemq, mb, qpid)
    Example: ant -Dbroker=activemq

  • To publish events to a specific event stream (producer).
    Syntax: -DstreamId=xxxx:x.x.x
    Example: ant -DstreamId=org.wso2.event.sensor.stream:1.0.0

  • To publish events from a specific sample folder (producer).
    Syntax: -Dsn='sample number' or -DfilePath=xxxx
    Example: ant -Dsn=00

  • To specify whether events are received over the Thrift or the binary protocol.
    Syntax: -Dprotocol='thrift/binary'
    Example: ant -Dprotocol=binary

  • To specify the username when an action performed by a sample requires user credentials.
    Syntax: -Dusername=xxxx
    Example: -Dusername=admin

  • To specify the password when an action performed by a sample requires user credentials.
    Syntax: -Dpassword=xxxx
    Example: -Dpassword=admin

  • When doing a performance test, to specify the number of events with which the test should be carried out.
    Syntax: -Devents=xx or -DnoOfEvents=xxxx or -DeventCount=xxxx
    Example: -Devents=2000000
    Default: 10000

  • When doing a performance test, to specify the delay between events in milliseconds.
    Syntax: -Ddelay='delay between events in ms'
    Example: -Ddelay=1000

  • When doing a performance test, to specify the number of events after which the throughput/latency should be calculated.
    Syntax: -DelapsedCount=xxxx
    Example: -DelapsedCount=10000
    Default: 10000

  • When doing a performance test, to specify the number of publishers that should be used to publish events.
    Syntax: -DnoOfPublishers=xxxx
    Example: -DnoOfPublishers=50

  • When doing a performance test, to specify the number of events that should be sent to the event flow for the DAS server to warm up and stabilize.
    Syntax: -DwarmUpCount=xxxx
    Example: -DwarmUpCount=200000
    Default: 10000

  • When doing a performance test, to specify whether to calculate the throughput or the latency.
      • Throughput: the number of events processed concurrently at a given time by an event flow.
      • Latency: the time taken by the event flow to process a single event.
    Syntax: -DcalcType='throughput/latency'
    Example: -DcalcType=throughput
    Default: throughput
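
For example, a performance test run could combine several of these arguments in a single command. The exact ant target and the set of properties a client actually supports depend on the individual sample client; the values below are simply taken from the examples above.

    ant -Devents=2000000 -DwarmUpCount=200000 -DelapsedCount=10000 -DcalcType=throughput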

Setting up JMS for JMS sample clients

Before you run the JMS samples, set up and start a JMS provider. Configure the JMS sample clients by copying the relevant JMS client libraries to the <DAS_HOME>/samples/cep/lib directory as mentioned below.

For Apache ActiveMQ, the relevant JAR files are:

  • <ActiveMQ_HOME>/activemq-all-5.7.0.jar
  • <ActiveMQ_HOME>/lib/geronimo-jms_1.1_spec-1.1.1.jar

Older Apache ActiveMQ versions may not contain the SLF4J related files in the client JAR. Therefore, if you get an error, add the SLF4J related JAR files to the <DAS_HOME>/samples/cep/lib/ directory.

For Apache Qpid, the relevant JAR files are:

  • <QPID-CLIENT_HOME>/lib/geronimo-jms_1.1_spec-1.1.1.jar   
  • <QPID-CLIENT_HOME>/lib/qpid-client-0.32.jar
  • <QPID-CLIENT_HOME>/lib/qpid-common-0.32.jar

For WSO2 Message Broker (MB), the relevant JAR files are:

  • <MB_HOME>/client-lib/andes-client-3.1.1.jar
  • <MB_HOME>/client-lib/log4j-1.2.13.jar
  • <MB_HOME>/client-lib/slf4j-1.5.10.wso2v1.jar
  • <MB_HOME>/client-lib/geronimo-jms_1.1_spec-1.1.0.wso2v1.jar
  • <MB_HOME>/client-lib/org.wso2.securevault-1.0.0-wso2v2.jar
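
The JMS sample producers and consumers only need these provider libraries on the classpath because they are ordinary JMS clients. As an illustration of what such a client does, the following is a minimal sketch of a JMS topic consumer written against Apache ActiveMQ, assuming the default broker URL tcp://localhost:61616; the broker URL and the TestTopic topic name are placeholders.

    // Minimal sketch of a JMS topic consumer (ActiveMQ); the broker URL and
    // topic name are placeholders.
    import javax.jms.Connection;
    import javax.jms.Message;
    import javax.jms.MessageConsumer;
    import javax.jms.MessageListener;
    import javax.jms.Session;
    import javax.jms.TextMessage;
    import javax.jms.Topic;

    import org.apache.activemq.ActiveMQConnectionFactory;

    public class JmsTopicConsumerSketch {

        public static void main(String[] args) throws Exception {
            ActiveMQConnectionFactory factory =
                    new ActiveMQConnectionFactory("tcp://localhost:61616");
            Connection connection = factory.createConnection();
            connection.start();

            Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
            Topic topic = session.createTopic("TestTopic");
            MessageConsumer consumer = session.createConsumer(topic);

            consumer.setMessageListener(new MessageListener() {
                public void onMessage(Message message) {
                    try {
                        if (message instanceof TextMessage) {
                            System.out.println("Received: " + ((TextMessage) message).getText());
                        }
                    } catch (Exception e) {
                        e.printStackTrace();
                    }
                }
            });

            // Keep the consumer alive long enough to receive a few events.
            Thread.sleep(60000);
            connection.close();
        }
    }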

Setting up MQTT for MQTT sample clients

Before you run the MQTT samples, set up and start an MQTT-supported server. Configure the MQTT sample clients by copying the relevant MQTT client libraries to the <DAS_HOME>/samples/cep/lib directory as mentioned below.

  • Download the MQTT client library (mqtt-client-0.4.0.jar) and add it to the <DAS_HOME>/samples/cep/lib directory.
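
The MQTT sample clients use this library to talk to the broker. As a rough illustration, the following sketch publishes a single message with the Eclipse Paho MQTT v3 client API, assuming a broker such as Mosquitto on the default port 1883; the broker URL, client id, topic and payload are placeholders.

    // Minimal sketch of an MQTT publisher using the Eclipse Paho client; the
    // broker URL, client id, topic and payload are placeholders.
    import org.eclipse.paho.client.mqttv3.MqttClient;
    import org.eclipse.paho.client.mqttv3.MqttMessage;

    public class MqttPublisherSketch {

        public static void main(String[] args) throws Exception {
            MqttClient client = new MqttClient("tcp://localhost:1883", "das-sample-publisher");
            client.connect();

            MqttMessage message = new MqttMessage("{\"event\":\"test\"}".getBytes());
            message.setQos(1);
            client.publish("sensor/data", message);

            client.disconnect();
        }
    }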

Setting up Kafka for Kafka sample clients

Before you run the Kafka samples, set up and start a Kafka broker. Configure the Kafka sample clients by copying the relevant Kafka client libraries to the <DAS_HOME>/samples/cep/lib directory as mentioned below.

  • Copy all the JAR files in the <KAFKA_HOME>/libs/ directory to the <DAS_HOME>/samples/cep/lib/ directory.
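
The Kafka sample clients use these libraries to talk to the broker. As a rough illustration, the following sketch publishes a single message; it assumes a reasonably recent Kafka client (one that provides the org.apache.kafka.clients.producer API) among the copied libraries and a broker on the default port 9092, and the topic name and payload are placeholders.

    // Minimal sketch of a Kafka producer; the broker address, topic and payload
    // are placeholders, and the org.apache.kafka.clients.producer API is assumed.
    import java.util.Properties;

    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;

    public class KafkaPublisherSketch {

        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092");
            props.put("key.serializer",
                    "org.apache.kafka.common.serialization.StringSerializer");
            props.put("value.serializer",
                    "org.apache.kafka.common.serialization.StringSerializer");

            KafkaProducer<String, String> producer = new KafkaProducer<String, String>(props);
            producer.send(new ProducerRecord<String, String>("sensor-topic",
                    "{\"event\":\"test\"}"));
            producer.close();
        }
    }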