Workflow: Predicting a binary variable with the MLP block

IanBD
Altair Employee
edited October 2022 in Altair RapidMiner

The MLP block enables you to apply a multilayer perceptron (MLP) neural network model to a dataset.
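For context, an MLP stacks fully connected layers, each applying a weighted sum followed by a nonlinear activation; for a binary target, the final layer typically produces a probability via a sigmoid. The following NumPy snippet is a minimal sketch of a single-hidden-layer forward pass. It is purely illustrative and does not reflect the MLP block's internals; all names and sizes here are made up for the example:

```python
import numpy as np

def mlp_forward(x, W1, b1, W2, b2):
    """One hidden layer with ReLU, then a sigmoid output that
    yields a probability for the positive binary class."""
    h = np.maximum(0, x @ W1 + b1)       # hidden layer (ReLU)
    z = h @ W2 + b2                      # output layer (logit)
    return 1.0 / (1.0 + np.exp(-z))      # sigmoid -> P(class = 1)

# Toy dimensions: 5 input features (e.g. angle, distance_feet,
# height, position, weight), 8 hidden units, 1 output.
rng = np.random.default_rng(0)
x = rng.normal(size=(1, 5))
W1, b1 = rng.normal(size=(5, 8)), np.zeros(8)
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)
print(mlp_forward(x, W1, b1, W2, b2))    # a probability in (0, 1)
```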

The following steps demonstrate how to use the MLP block to predict a dependent variable in the input dataset basketball_shots.csv (each observation describes a shot taken in a professional basketball game and the player who took it) from other, independent variables in the same dataset (an illustrative code equivalent is sketched after the steps):

  1. Import the basketball_shots.csv dataset onto a Workflow canvas using the Text File Import block.
  2. Expand the Model Training group in the Workflow palette, then click and drag an MLP block onto the Workflow canvas.
  3. Click the Output port of the basketball_shots dataset block and drag a connection to the Input port of the MLP block.
  4. Double-click the MLP block to display the Multilayer Perceptron view and the MLP Preferences dialog box.
  5. In the MLP Preferences dialog box:
    1. In the Train drop-down list, select Working Dataset.
    2. Click Variable Selection to display the Variable Selection panel:
      1. In the Dependent variable drop-down list, select Score.
      2. In the Unselected Independent Variables list, press and hold CTRL and select the angle, distance_feet, height, position, and weight variables.
      3. Click Select to move the specified variables to the Selected Independent Variables list.
    3. Click Optimiser, and in the Optimiser drop-down list select ADAM.
    4. Click Stopping Criteria, and in the Max training time (s) field, enter 10.
    5. Click OK to save the configuration and close the MLP Preferences dialog box.
  6. In the Multilayer Perceptron view, click Train to train the MLP model.
  7. Close the Multilayer Perceptron view and save the configuration when prompted.
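Outside the Workflow canvas, the configuration above can be approximated in code. The scikit-learn sketch below mirrors the choices made in the dialog (ADAM optimiser, Score as the binary dependent variable, the five selected independent variables). The column names come from basketball_shots.csv as described; everything else (hidden-layer size, train/test split, treating position as categorical, max_iter as a rough stand-in for the 10-second training limit, which scikit-learn does not support directly) is an assumption for illustration, not the MLP block's behaviour:

```python
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Load the dataset used in the workflow above.
df = pd.read_csv("basketball_shots.csv")

# Dependent and independent variables as selected in the dialog;
# position is one-hot encoded on the assumption it is categorical.
X = pd.get_dummies(
    df[["angle", "distance_feet", "height", "position", "weight"]],
    columns=["position"],
)
y = df["Score"]  # binary target

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

# solver="adam" matches the Optimiser setting; the hidden-layer
# size and iteration cap are assumptions, not the block's defaults.
model = make_pipeline(
    StandardScaler(),
    MLPClassifier(solver="adam", hidden_layer_sizes=(16,),
                  max_iter=500, random_state=0),
)
model.fit(X_train, y_train)
print(f"Held-out accuracy: {model.score(X_test, y_test):.3f}")
```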

A green execution status is displayed on the Output port of the MLP block. The trained model output by the MLP block can then be used with a Score block to make predictions on a dataset.
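The Score block itself is configured on the canvas. As a code-level analogue, scoring the hypothetical scikit-learn model from the sketch above amounts to calling predict (or predict_proba for class probabilities) on new observations in the same column layout; the feature values below are made up for the example:

```python
# New shots to score, using the same columns as the training data.
new_shots = pd.DataFrame(
    {"angle": [45.0], "distance_feet": [22.5], "height": [78.0],
     "position": ["guard"], "weight": [205.0]}
)
# Align the one-hot columns with those seen during training.
new_shots = pd.get_dummies(new_shots, columns=["position"]).reindex(
    columns=X.columns, fill_value=0
)
print(model.predict(new_shots))        # predicted class (0 or 1)
print(model.predict_proba(new_shots))  # class probabilities
```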