
Introduction

This sample guide demonstrates how to use a downloaded ML model, generated using WSO2 ML, in plain Java code. It is based on the sample Java client implementation in the <ML_HOME>/samples/ml-model-usage directory.

Prerequisites

Before you run this sample, do the following:

...

  1. Create a Maven project.
  2. Include the following dependencies in the pom.xml file of the project.

    Code Block
    languagexml
    <dependency>
    	<groupId>org.wso2.carbon.ml</groupId>
    	<artifactId>org.wso2.carbon.ml.core</artifactId>
    	<version>1.0.1-SNAPSHOT</version>
    </dependency>
    <dependency>
    	<groupId>org.wso2.carbon.ml</groupId>
    	<artifactId>org.wso2.carbon.ml.commons</artifactId>
    	<version>1.0.1-SNAPSHOT</version>
    </dependency>
    Also include the following runtime dependencies.
    Code Block
    languagexml
    <!--Dependencies required for MLModel deserialization-->
    <dependency>
    	<groupId>org.wso2.orbit.org.apache.spark</groupId>
    	<artifactId>spark-core_2.10</artifactId>
    	<version>1.4.1.wso2v1</version>
    </dependency>
    <dependency>
    	<groupId>org.wso2.orbit.org.apache.spark</groupId>
    	<artifactId>spark-mllib_2.10</artifactId>
    	<version>1.4.1.wso2v1</version>
    </dependency>
    <dependency>
    	<groupId>org.scala-lang</groupId>
    	<artifactId>scala-library</artifactId>
    	<version>2.10.4</version>
    </dependency>
    <dependency>
    	<groupId>org.wso2.orbit.org.scalanlp</groupId>
    	<artifactId>breeze_2.10</artifactId>
    	<version>0.11.1.wso2v1</version>
    </dependency>
    <dependency>
    	<groupId>org.wso2.orbit.github.fommil.netlib</groupId>
    	<artifactId>core</artifactId>
    	<version>1.1.2.wso2v1</version>
    </dependency>
    <!--Dependencies required for predictions-->
    <dependency>
    	<groupId>org.wso2.orbit.sourceforge.f2j</groupId>
    	<artifactId>arpack_combined</artifactId>
    	<version>0.1.wso2v1</version>
    </dependency>
  3. The downloaded model contains a serialized MLModel object. Deserialize the MLModel as follows.

    Code Block
    languagejava
    private MLModel deserializeMLModel() throws IOException, ClassNotFoundException, URISyntaxException {
      // Path to the downloaded ML model
      String pathToDownloadedModel = "/tmp/downloaded-ml-model";
      FileInputStream fileInputStream = new FileInputStream(pathToDownloadedModel);
      ObjectInputStream in = new ObjectInputStream(fileInputStream);
      MLModel mlModel = (MLModel) in.readObject();
      in.close();
    
      logger.info("Algorithm Type : " + mlModel.getAlgorithmClass());
      logger.info("Algorithm Name : " + mlModel.getAlgorithmName());
      logger.info("Response Variable : " + mlModel.getResponseVariable());
      logger.info("Features : " + mlModel.getFeatures());
      return mlModel;
    }
  4. Use org.wso2.carbon.ml.core.impl.Predictor to make predictions with the downloaded ML model. Predictor accepts a list of string arrays as feature values. Within each array, the feature values must appear in the order of their feature indices, as shown below.

    Code Block
    languagejava
    MLModelUsageSample modelUsageSample = new MLModelUsageSample();
    // Deserialize
    MLModel mlModel = modelUsageSample.deserializeMLModel();
    // Predict
    String[] featureValueArray1 = new String[] { "6", "148", "72", "35", "0", "33.6", "0.627", "50" };
    String[] featureValueArray2 = new String[] { "0", "101", "80", "40", "0", "26", "0.5", "33" };
    ArrayList<String[]> list = new ArrayList<String[]>();
    list.add(featureValueArray1);
    list.add(featureValueArray2);
    modelUsageSample.predict(list, mlModel);
    public void predict(List<String[]> featureValueLists, MLModel mlModel) throws MLModelHandlerException {
    	Predictor predictor = new Predictor(0, mlModel, featureValueLists);
    	List<?> predictions = predictor.predict();
    	logger.info("Predictions : " + predictions);
    }
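
    The deserialization in step 3 is standard Java object serialization: the downloaded file is simply a serialized object read back with ObjectInputStream. The sketch below illustrates that same round trip with a plain Serializable class so it can run without the WSO2 ML libraries; the DemoModel class is hypothetical and stands in for MLModel.

```java
import java.io.*;

public class SerializationSketch {
    // Hypothetical stand-in for MLModel; any Serializable class round-trips the same way.
    static class DemoModel implements Serializable {
        private static final long serialVersionUID = 1L;
        String algorithmName = "LOGISTIC REGRESSION";
    }

    public static void main(String[] args) throws IOException, ClassNotFoundException {
        File file = File.createTempFile("demo-model", ".ser");
        file.deleteOnExit();

        // Serialize the object to a file, as WSO2 ML does when you download a model
        try (ObjectOutputStream out = new ObjectOutputStream(new FileOutputStream(file))) {
            out.writeObject(new DemoModel());
        }

        // Deserialize it back, mirroring deserializeMLModel() in step 3
        try (ObjectInputStream in = new ObjectInputStream(new FileInputStream(file))) {
            DemoModel model = (DemoModel) in.readObject();
            System.out.println("Algorithm Name : " + model.algorithmName);
        }
    }
}
```

    The try-with-resources blocks close the streams automatically; when reading a real downloaded model, the cast target is org.wso2.carbon.ml.commons.domain.MLModel instead of the demo class.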

The complete project created above can be found in the <ML_HOME>/samples/ml-model-usage directory. To build it, execute the following command from that directory.

...

Executing the sample

To execute this sample, run the following command from the <ML_HOME>/samples/ml-model-usage directory.

...

Analysing the output

The following output is displayed in the ML console once you execute the sample.

...