In this problem, we continue with the computer experiment described in Problem 6.25 on support vector machines, page 312. Specifically, we address a difficult pattern-classification experiment involving the tightly fisted multicircular structure of Fig. P6.25, reproduced here as Fig. P15.18 for convenience of presentation. This time, however, we study the supervised training of a multilayer perceptron, based on the extended Kalman filtering (EKF) algorithm, along the lines described in Section 15.10.
For the multilayer perceptron, use the following structure:
• Two hidden layers, with four neurons in the first hidden layer and three in the second hidden layer; the activation function φ() = tanh() is to be used for all the hidden neurons.
• A linear output layer.
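The prescribed 2–4–3–1 network can be sketched as a plain forward pass. The layer sizes and activation functions follow the problem statement; the small random weight initialization is an illustrative assumption, not part of the problem.

```python
import numpy as np

rng = np.random.default_rng(0)

# Layer sizes from the problem statement: 2 inputs (the x, y coordinates of a
# point), 4 tanh neurons, 3 tanh neurons, 1 linear output.
sizes = [2, 4, 3, 1]

# Small random initial weights and zero biases (an assumed initialization).
weights = [rng.normal(0.0, 0.1, (m, n)) for n, m in zip(sizes[:-1], sizes[1:])]
biases = [np.zeros(m) for m in sizes[1:]]

def forward(x, weights, biases):
    """Forward pass: tanh hidden layers, linear output layer."""
    a = np.asarray(x, dtype=float)
    for W, b in zip(weights[:-1], biases[:-1]):
        a = np.tanh(W @ a + b)
    return weights[-1] @ a + biases[-1]  # linear output

y = forward([0.3, -0.5], weights, biases)
print(y.shape)  # (1,)
```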
To perform the pattern classification, generate 100 epochs, each consisting of 200 randomly distributed training examples and an equal-sized set of test data for the two regions of Fig. P15.18. Hence, do the following:
1. For a varying number of epochs, construct the decision boundary computed by the EKF algorithm so as to determine the “best” classification performance.
2. For the classification performance considered to be the “best,” determine the misclassification error.
Finally, compare the results of your findings for the EKF algorithm against the corresponding results obtained for the support vector machine in Problem 6.25.
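A minimal sketch of the EKF weight update along the lines of Section 15.10, assuming a scalar network output. The Jacobian of the output with respect to the weights is approximated here by finite differences to keep the sketch short (the book derives the analytic linearization); the toy linear "network," for which the EKF reduces to recursive least squares, serves only as a convergence check.

```python
import numpy as np

def ekf_step(w, P, x, d, f, R=10.0, Q=1e-6):
    """One EKF weight update for a scalar-output network.
    w: flat weight vector; P: weight covariance; f(w, x): network output;
    d: desired response. R and Q are assumed noise settings."""
    y = f(w, x)
    eps = 1e-6
    # Finite-difference approximation of H = dy/dw (a sketch-level shortcut).
    H = np.array([(f(w + eps * np.eye(len(w))[i], x) - y) / eps
                  for i in range(len(w))])
    S = R + H @ P @ H            # innovation variance (scalar output)
    K = P @ H / S                # Kalman gain
    w_new = w + K * (d - y)      # weight update driven by the innovation
    P_new = P - np.outer(K, H @ P) + Q * np.eye(len(w))
    return w_new, P_new

# Convergence check on a linear "network" y = w.x (EKF reduces to RLS here).
f = lambda w, x: w @ x
w, P = np.zeros(2), 100.0 * np.eye(2)
true_w = np.array([1.5, -0.7])
rng = np.random.default_rng(0)
for _ in range(200):
    x = rng.normal(size=2)
    w, P = ekf_step(w, P, x, true_w @ x, f, R=1.0, Q=0.0)
print(np.round(w, 2))  # ≈ [ 1.5 -0.7]
```

For the multilayer perceptron of this problem, `f` would be the forward pass of the 2–4–3–1 network with its weights flattened into `w`.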
Among the supervised-learning algorithms studied thus far, the support vector machine stands out as the most powerful. In this problem, the performance of the support vector machine is to be challenged by using it to classify the two multicircular regions that constitute the “tightly fisted” structure shown in Fig. P6.24. The radii of the three concentric circles in this figure are d1 = 0.2, d2 = 0.5, and d3 = 0.8.
(a) Generate 100 epochs, each of which consists of 200 randomly distributed training examples, and an equal number of test data for the two regions of Fig. P6.24.
(b) Train a support vector machine, assigning the value C = 500. Hence, construct the decision boundary computed by the machine.
(c) Test the network and thereby determine the classification error rate.
(d) Repeat the experiment for C = 100 and C = 2,500.
Comment on your results.
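Steps (a) through (d) can be sketched as follows, assuming scikit-learn's `SVC` with an RBF kernel; the kernel choice and the labeling of the two regions (here taken, for illustration, as the annulus d1 < r < d2 versus the annulus d2 < r < d3) are assumptions, since the exact regions are defined by Fig. P6.24.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(2)

def sample_annulus(n, r_min, r_max):
    """Sample n points uniformly over the annulus r_min < r < r_max.
    Drawing r**2 uniformly yields a uniform density over the area."""
    theta = rng.uniform(0.0, 2.0 * np.pi, n)
    r = np.sqrt(rng.uniform(r_min**2, r_max**2, n))
    return np.column_stack([r * np.cos(theta), r * np.sin(theta)])

d1, d2, d3 = 0.2, 0.5, 0.8  # radii given in the problem

# 100 points per region -> 200 training examples, and an equal-sized test set.
X_train = np.vstack([sample_annulus(100, d1, d2), sample_annulus(100, d2, d3)])
y_train = np.hstack([np.ones(100), -np.ones(100)])
X_test = np.vstack([sample_annulus(100, d1, d2), sample_annulus(100, d2, d3)])
y_test = np.hstack([np.ones(100), -np.ones(100)])

for C in (100, 500, 2500):
    clf = SVC(C=C, kernel="rbf", gamma="scale").fit(X_train, y_train)
    err = 1.0 - clf.score(X_test, y_test)
    print(f"C={C:5d}: test error rate = {err:.3f}")
```

A full solution would repeat this over 100 epochs and plot the decision boundary (e.g., by evaluating `clf.decision_function` on a grid) for each value of C.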