Cross Validation

In some experiments, the number of cases available to solve a classification problem using an ANN is limited. There are several options for handling this kind of problem. In these situations, it is not recommended to split the available cases into two sets: a training set and a validation set. Instead, all cases may be used for training, and validation is performed using a procedure called cross-validation.


One specific form of cross-validation is called Leave-One-Out Cross-Validation (LOOCV); it is illustrated in the figure below. The process has n steps, one for each case in the data set. In step 1, the first case (row) is removed and the ANN is trained using the remaining cases. Once training has been completed, the trained network is used to classify the case that was left out (the first case). The process continues, removing a different row from the original data set at each step; the ANN is always trained using the remaining cases. In the last step, the last row is removed, the ANN is trained using all rows except that one, and the validation is, of course, performed using the last row of the original data set. Finally, the total number of classification errors across all n steps is reported.
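The n-step procedure above can be sketched in a few lines of code. This is a minimal, illustrative sketch only: a 1-nearest-neighbour rule stands in for the ANN (training an actual network at each step follows the same loop), and the data set and all names are assumptions, not part of the original exercise.

```python
# LOOCV sketch: leave each case out once, train on the rest, count errors.
# A 1-nearest-neighbour classifier stands in for the ANN (illustrative only).

def predict_1nn(train_rows, train_labels, row):
    """Classify `row` with the label of its closest training row."""
    dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b))
    best = min(range(len(train_rows)), key=lambda i: dist(train_rows[i], row))
    return train_labels[best]

def loocv_errors(rows, labels):
    """Run the n LOOCV steps and return the total number of errors."""
    errors = 0
    for i in range(len(rows)):
        train_rows = rows[:i] + rows[i + 1:]        # remove case i (row i)
        train_labels = labels[:i] + labels[i + 1:]
        # "Train" on the remaining cases, then classify the left-out case
        if predict_1nn(train_rows, train_labels, rows[i]) != labels[i]:
            errors += 1
    return errors

# Toy two-class data set (illustrative assumption)
rows = [(0.0, 0.1), (0.2, 0.0), (0.1, 0.2), (1.0, 0.9), (0.9, 1.1), (1.1, 1.0)]
labels = [0, 0, 0, 1, 1, 1]
print(loocv_errors(rows, labels))  # total number of LOOCV errors
```

The loop structure is the point here: each case serves as the validation set exactly once, so every case contributes to the reported error count.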


Problem 1
Search the Internet to find and download a database for classification (you may use keywords such as: database, classification, class data, classification text file, and classification data). Design a complete experiment using LOOCV to solve the classification problem with an ANN; use a project name that best describes the experiment (you may select the Cross Validation option when creating the project). The figure below shows the main differences between the traditional use of an ANN and LOOCV. You may use one of these databases:


When the validation has been completed, you must train the network using all cases before putting it into production.
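This final step can be sketched as follows. Again the classifier is an illustrative stand-in for the ANN (here a nearest-centroid rule), and the data set and names are assumptions; the point is that the production model is fitted on every available case, not on an n−1 subset.

```python
# After LOOCV validation, train the production model on ALL cases.
# A nearest-centroid classifier stands in for the ANN (illustrative only).

def fit_centroids(rows, labels):
    """Compute one centroid per class from the complete data set."""
    sums, counts = {}, {}
    for row, label in zip(rows, labels):
        acc = sums.setdefault(label, [0.0] * len(row))
        for j, x in enumerate(row):
            acc[j] += x
        counts[label] = counts.get(label, 0) + 1
    return {c: tuple(v / counts[c] for v in sums[c]) for c in sums}

def predict(centroids, row):
    """Classify `row` by its closest class centroid."""
    dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda c: dist(centroids[c], row))

rows = [(0.0, 0.1), (0.2, 0.0), (1.0, 0.9), (0.9, 1.1)]
labels = [0, 0, 1, 1]
model = fit_centroids(rows, labels)   # production model uses every case
print(predict(model, (0.1, 0.0)))     # classify a new, unseen case
```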

© Copyright 2000-2019 Wintempla. All Rights Reserved. Sep 05 2019.