This extension allows you to use the machine learning models you generate with WSO2 ML within WSO2 Complex Event Processor (CEP) to make predictions. It integrates WSO2 ML with WSO2 CEP so that realtime predictions can be performed on an event stream by applying a model generated by WSO2 ML. An input event stream received by an event receiver of WSO2 CEP is processed by an execution plan within WSO2 CEP. This execution plan, written in the Siddhi language, processes the input event stream by applying the model generated with WSO2 ML. The output of this processing, which includes the prediction, is published to an output stream through an event publisher of WSO2 CEP. For more information on WSO2 CEP, see the WSO2 CEP Documentation.
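As an illustration of this flow, a minimal execution plan skeleton is sketched below. The plan name, stream names, attributes, model path, and the response variable name (prediction) are all hypothetical; the exact syntax of the ml:predict extension is described in the next section.

```
-- Hypothetical execution plan: events arrive on SensorStream via a CEP event
-- receiver, the ML model adds a prediction, and ResultStream is picked up by
-- a CEP event publisher.
@Plan:name('MLPredictionPlan')

@Import('SensorStream:1.0.0')
define stream SensorStream (temperature double, humidity double);

@Export('ResultStream:1.0.0')
define stream ResultStream (temperature double, humidity double, prediction double);

from SensorStream#ml:predict('registry:/_system/governance/mlmodels/sample-model', 'double')
select *
insert into ResultStream;
```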
Info: Machine Learner models are not backward compatible. For example, models generated using WSO2 ML 1.0.0 cannot be used with WSO2 ML 1.1.0.
Siddhi syntax for the extension
There are two possible Siddhi query syntaxes for using the extension in an execution plan, as follows.
<double|float|long|int|string|boolean> predict(<string> pathToMLModel, <string> dataType)

- Extension Type: StreamProcessor
- Description: Returns an output event with an additional attribute that carries the response variable name of the model, set to the predicted value calculated from the feature values extracted from the input event.
- Parameter: pathToMLModel: The file path or the registry path where the ML model is located. If the model is stored in the registry, the value of this parameter should have the prefix "registry:".
- Parameter: dataType: The data type of the predicted value (double, float, long, integer/int, string, boolean/bool).
- Parameter: percentileValue: The percentile value for the prediction. It should be a double value between 0 and 100. This parameter is only relevant when the algorithm of the model used for the prediction is of the Anomaly Detection type.
- Example: predict('registry:/_system/governance/mlmodels/indian-diabetes-model', 'double') (a complete query using this form is sketched below)
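As a sketch of this two-argument form, the query below applies the model from the example above to a hypothetical input stream. It assumes that the stream's attributes correspond exactly to the model's features and that the model's response variable is named prediction; adjust the stream definitions to match your own model.

```
@Import('PatientDataStream:1.0.0')
define stream PatientDataStream (NumPregnancies double, TSFT double, DPF double, BMI double, DBP double, PG2 double, Age double, SI2 double);

@Export('DiabetesPredictionStream:1.0.0')
define stream DiabetesPredictionStream (NumPregnancies double, TSFT double, DPF double, BMI double, DBP double, PG2 double, Age double, SI2 double, prediction double);

from PatientDataStream#ml:predict('registry:/_system/governance/mlmodels/indian-diabetes-model', 'double')
select *
insert into DiabetesPredictionStream;
```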
<double|float|long|int|string|boolean> predict(<string> pathToMLModel, <string> dataType, <double> input)

- Extension Type: StreamProcessor
- Description: Returns an output event with an additional attribute that carries the response variable name of the model, set to the predicted value calculated from the feature values extracted from the input event.
- Parameter: pathToMLModel: The file path or the registry path where the ML model is located. If the model is stored in the registry, the value of this parameter should have the prefix "registry:".
- Parameter: dataType: The data type of the predicted value (double, float, long, integer/int, string, boolean/bool).
- Parameter: input: A variable attribute of the input stream whose value is sent to the ML model as a feature value for the prediction. The function does not accept constant values as input parameters. You can specify multiple input parameters.
- Parameter: percentileValue: The percentile value for the prediction. It should be a double value between 0 and 100. This parameter is only relevant when the algorithm of the model used for the prediction is of the Anomaly Detection type.
- Example: predict('registry:/_system/governance/mlmodels/indian-diabetes-model', 'double', NumPregnancies, TSFT, DPF, BMI, DBP, PG2, Age, SI2) (see the sketch below)
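A sketch of this form with explicit feature attributes follows. It reuses the hypothetical stream definitions from the previous sketch and passes each feature attribute of the input stream to the model in order.

```
from PatientDataStream#ml:predict('registry:/_system/governance/mlmodels/indian-diabetes-model', 'double',
    NumPregnancies, TSFT, DPF, BMI, DBP, PG2, Age, SI2)
select *
insert into DiabetesPredictionStream;
```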
...
```
@Import('InputStream:1.0.0')
define stream InputStream (NumPregnancies double, TSFT double, DPF double, BMI double, DBP double, PG2 double, Age double, SI2 double);

@Export('PredictionStream:1.0.0')
define stream PredictionStream (NumPregnancies double, TSFT double, DPF double, BMI double, DBP double, PG2 double, Age double, SI2 double, prediction string);

from InputStream#ml:predict('registry:/_system/governance/ml/indian-diabetes-model', 'string', 95.0)
select *
insert into PredictionStream;
```
...
Info: In the above examples, the path to the ...
...