
FAQ

I see an exception stating java.io.IOException: Cannot run program "null/bin/java" when running DAS. What is going wrong?

This happens when you have not set the JAVA_HOME environment variable to point to the installed JRE location. It needs to be set explicitly in your environment.
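For example, on a Linux system you could set it like this before starting DAS (the JDK path below is an illustration; substitute your actual install location):

```shell
# Set JAVA_HOME explicitly before starting DAS. The path is an example;
# use the location of your own JDK/JRE installation.
JAVA_HOME="${JAVA_HOME:-/usr/lib/jvm/java-8-openjdk-amd64}"
export JAVA_HOME
# DAS resolves the Java binary as $JAVA_HOME/bin/java:
echo "DAS will launch: $JAVA_HOME/bin/java"
```

With JAVA_HOME unset, the server resolves the binary path to the literal "null/bin/java" seen in the exception.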

How can I scale up DAS?

If you want to scale up DAS to receive a large amount of data, you can set up multiple receiver nodes fronted by a load balancer. Similarly, to scale up the dashboards (presentation layer), you can set up multiple dashboard nodes fronted by a load balancer.

How do I completely clear the state of a DAS installation?

Clearing the current state of a given DAS installation involves removing the following:

  • The databases/schemata defined as ANALYTICS_EVENT_STORE and ANALYTICS_PROCESSED_STORE
  • The <DAS_HOME>/repository/data directory
  • The <DAS_HOME>/work directory
  • The <DAS_HOME>/tmp directory
  • Possibly also the node's /tmp directory, if it exists.
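The file-system part of the steps above can be sketched as a small script. This is a sketch only: it assumes the server is stopped, and DAS_HOME defaults here to a scratch directory purely for demonstration. The ANALYTICS_EVENT_STORE and ANALYTICS_PROCESSED_STORE databases must be dropped separately in your RDBMS.

```shell
# Sketch: remove the local state directories of a stopped DAS node.
# Point DAS_HOME at your real installation before using this; the default
# below is a throwaway demo directory.
DAS_HOME="${DAS_HOME:-/tmp/das-demo}"
mkdir -p "$DAS_HOME/repository/data" "$DAS_HOME/work" "$DAS_HOME/tmp"  # demo layout only

rm -rf "$DAS_HOME/repository/data" "$DAS_HOME/work" "$DAS_HOME/tmp"
echo "Cleared local state under $DAS_HOME"
```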
I only see one DAS distribution. How do I create a DAS receiver node, analyzer node, or dashboard node?

The DAS distribution contains all the features you need. Each node is prepared by uninstalling the features it does not require, using the feature installation/uninstallation capability that comes with WSO2 DAS. For example, to create a receiver node, uninstall the analytics feature and the dashboard feature; what remains is a receiver node.

Can I send custom data/events to DAS?

Yes, you can. There is an SDK provided for this. For a detailed article on how to send custom data to DAS using this SDK, see Creating Custom Data Publishers to BAM/CEP.

You can also send data to DAS using the REST API. For information on sending data to DAS using the REST API, see REST APIs for Analytics Data Service.
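As a hedged sketch, publishing a record over the REST API could look like the following. The endpoint path, credentials, and payload shape are assumptions, not confirmed specifics; check them against REST APIs for Analytics Data Service for your DAS version. TABLE is a hypothetical table name.

```shell
# Assumed endpoint and payload shape -- verify against the Analytics REST
# API documentation for your DAS version before relying on this.
DAS_HOST="localhost:9443"
TABLE="SALES_KPI_TABLE"   # hypothetical table name
curl -k -u admin:admin \
     -H "Content-Type: application/json" \
     -X POST "https://$DAS_HOST/analytics/tables/$TABLE" \
     -d '[{"values": {"region": "EU", "total": 4200}}]' \
  || echo "DAS not reachable (expected outside a live deployment)"
```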

How do I define a custom KPI in DAS?

The model for this is to first publish the custom data. After you send custom data to DAS, define the analytics that match your KPI. To visualize the result of this KPI, you can write a gadget using HTML and JS, or use the Gadget generation tool. For all artifacts related to defining a custom KPI, see KPI definition and monitoring sample.

Can DAS do real time analytics?

WSO2 DAS can process large data volumes in real time. This is done via the WSO2 CEP components that are installed in WSO2 DAS by default. The WSO2 CEP server is a powerful real-time analytics engine capable of evaluating queries based on temporal windows, pattern matching, and much more.

I see that the DAS samples write their results to an RDBMS. Why is this?

DAS does this for two reasons. The first is to promote a polyglot data architecture: results can be stored in an RDBMS or any other data store. The second is that many third-party reporting tools, such as Jasper and Pentaho, already have extensive support for RDBMSs. With this sort of support for a polyglot data architecture, any reporting engine or dashboard can be plugged into DAS without any extra customization effort.

I get a read timeout in the analytics UI after executing a query. Why?

This happens when there is a large amount of data to analyze. The UI times out after 10 minutes if processing the data takes longer than that.

I am getting the error "Thrift error occurred during processing of message..." Any idea why?

If you get this error while trying to publish data from the DAS mediator data agent, check whether you have specified the receiver and authentication ports correctly and have not mixed them up. The default values are:

  • receiver port = 7611
  • authentication port = 7711

TID: [0] [BAM] [2012-11-28 22:46:40,102] ERROR {org.apache.thrift.server.TThreadPoolServer} -  Thrift error occurred during processing of message. {org.apache.thrift.server.TThreadPoolServer}
    org.apache.thrift.protocol.TProtocolException: Bad version in readMessageBegin
        at org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:208)
        at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:22)
        at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:176)
        at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
        at java.lang.Thread.run(Thread.java:619)
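A quick way to keep the two ports straight: in WSO2 data agents the receiver URL is conventionally a tcp:// endpoint on the receiver port, and the authentication URL an ssl:// endpoint on the authentication port. The host below is a placeholder; the ports are the defaults quoted above.

```shell
# Sanity check: keep the receiver and authentication ports distinct and in
# the right slots. DAS_HOST is a placeholder; ports are the DAS defaults.
DAS_HOST="das.example.com"
RECEIVER_PORT=7611
AUTH_PORT=7711
echo "receiver URL:       tcp://$DAS_HOST:$RECEIVER_PORT"
echo "authentication URL: ssl://$DAS_HOST:$AUTH_PORT"
```

Pointing the agent's receiver URL at 7711 (or the auth URL at 7611) produces exactly the "Bad version in readMessageBegin" Thrift error shown above.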

I am getting the error "OpenAjax.hub.SecurityAlert.LoadTimeout". Any idea why?

The OpenAjax.hub.SecurityAlert.LoadTimeout error can occur when you have pub-sub gadgets in your dashboard. Shindig uses the pubsub-2 feature of OpenAjax.hub, and OpenAjax.hub uses a timeout to guard against frame phishing attacks. If the iframe containing the gadget is not loaded within a particular time period, it throws this security exception and immediately stops the iframe loading process.

The load timeout is 15 seconds for all pub-sub gadgets, regardless of their complexity. As Shindig does not provide a method to configure this timeout, WSO2 has extended the Shindig features to provide this facility.

WSO2 has changed the default timeout to 60 seconds, because 15 seconds is not sufficient in a production environment. Furthermore, WSO2 allows you to configure the timeout per gadget using either of the following methods.

  • Include a timeoutInterval (in milliseconds) under settings in the gadget.json file.

    "settings":{  
       "timeoutInterval":100000
    }
  • In the gadget configuration panel, set the timeout interval by manually entering it in milliseconds.