
Apache Spark is used as the core analytics engine in WSO2 DAS 3.0.0. For information on writing Spark queries to analyze the collected data, see Data analysis using SQL.

Spark scripts

Spark scripts are used when you need to execute a set of Spark queries in sequence. You can also schedule a Spark script so that it executes automatically at a given time or interval (e.g., at 12 noon every day, or every minute starting at 2 p.m. and ending at 2:59 p.m. every day). You configure this schedule using a cron expression. For more information about cron expressions, see Cron Trigger Tutorial.
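
For example, the following Quartz cron expressions (illustrative only) correspond to the two schedules mentioned above:

    0 0 12 * * ?    Fires at 12:00 p.m. (noon) every day
    0 * 14 * * ?    Fires every minute, starting at 2:00 p.m. and ending at 2:59 p.m., every day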

You can add, edit, and delete scripts, and you can also specify the schedule on which a script executes, as described below.

Adding a new script

Follow the steps below to add a new Spark script.

  1. Log in to the WSO2 DAS management console.
  2. Click Main, and then click Add in the Scripts menu as shown below.
    add new script menu
  3. Enter the following details related to your script, as shown in the example below. (A further example query against the same table is given after this procedure.)
    add new Spark script

    Script Name: MyFirstSparkScript
    Spark SQL Queries:

    define table Log (server_name STRING, ip STRING, tenant INTEGER, sequence LONG, summary STRING);

    SELECT ip FROM Log;

    SELECT server_name, count(*) FROM Log GROUP BY server_name;

    SELECT COUNT(*) FROM Log WHERE summary LIKE '%Joe%';

    SELECT substr(summary, 1, 5) FROM Log;

    SELECT LAST(ip) FROM Log;

    Cron Expression:

    0 * * * * ?

    This cron expression schedules the script to execute every minute. From the time you save the script, it is executed at the beginning of every minute (e.g., 10:21:00, 10:22:00, 10:23:00, and so on).

  4. Click Execute to execute the provided queries. The results are displayed as shown below.
    executing the added new script
  5. Click Add to add the configured script.
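
The sample queries above use only the columns declared in the Log table definition. As a further illustrative sketch (the tenant value 1 below is an arbitrary example, not part of the sample script), the same table supports combining filtering and grouping in a single Spark SQL query:

    SELECT server_name, count(*) AS matching_entries
    FROM Log
    WHERE tenant = 1 AND summary LIKE '%Joe%'
    GROUP BY server_name;

This counts, per server, the log entries belonging to tenant 1 whose summary contains "Joe".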

Editing a script

Follow the steps below to edit an existing Spark script.

  1. Log in to the WSO2 DAS management console.
  2. Click Main, and then click Scripts in the Spark menu.
  3. Click List, and click the Edit link of the corresponding script as shown below.
    edit option of the script in list view

  4. Change the content of the script as required. You can also update the scheduling information; see the example after this procedure.

    If you do not enter a value for the scheduling time, the script is not scheduled to execute. However, if you want to ensure that the script is still valid, click Execute; this executes the queries given in the queries window.

    For example, you can edit the script created above and clear the cron expression to unschedule it, as shown below.

    update the edited script

  5. Click Update to save the changes as shown above.
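
Alternatively, instead of clearing the cron expression, you can replace it to change the schedule. For example, the following Quartz expression (illustrative only) would run the script once a day at midnight instead of every minute:

    0 0 0 * * ?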

Deleting a script

Follow the steps below to delete a Spark script.

  1. Log in to the WSO2 DAS management console.
  2. Click Main, and then click Scripts in the Spark menu.
  3. Click List, and click the Delete link of the corresponding script as shown below.
    delete script
  4. Click Yes in the dialog box that appears, to confirm the deletion.

    Deleting a script cannot be undone; the script is completely removed from the system. Deleting a script also deletes the scheduled task associated with it.

Executing a script

You can execute a script manually while adding or editing it, without using a scheduled task. This triggers the execution of the script content in the queries window at that moment. You can also execute the content of a saved script outside edit mode, as shown below. During this operation, WSO2 DAS fetches the script content and passes it to Spark, which executes all the queries in the script. Once the execution is completed, the results are displayed.

Follow the steps below to execute the script content.

  1. Log in to the WSO2 DAS management console.
  2. Click Main, and then click Scripts in the Spark menu.
  3. Click List, and click the Execute link of the corresponding script as shown below.
    execute option for the script list
  4. The script execution job is immediately dispatched to the Spark engine. The results are displayed once the job is completed, as shown below.
    execution results of the script