SVM training - memory (Bug #1430)
I have a scenario where I train a set of SVM classifiers using the "one against one" strategy. Since I have 62 classes to classify, that means training one SVM per pair of classes, i.e. 62*61/2 = 1891 SVMs. So far so good.
I train the SVMs using the train_auto function, and I notice that the memory consumption grows really fast, even though the SVMs themselves are not very large (200 variables, 100 examples per class => at most 200 support vectors per pairwise SVM).
I have created a small program that illustrates the problem. I noticed that when I save the SVM model to a file, delete the instance and then load it from the file again, the memory consumption is much lower. Running the sample program as is, memory use reached 407,776 kB at the end, while with the save/load section uncommented, the program stopped at 35,560 kB.
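For reference, the save/delete/load workaround I am describing looks roughly like the sketch below. This is a minimal illustration, assuming the OpenCV 2.x CvSVM API (train_auto, save, clear, load inherited from CvStatModel); the training matrices and file name are placeholders, not my actual test program.

```cpp
#include <opencv2/ml/ml.hpp>
#include <opencv2/core/core.hpp>

void trainOnePairwiseSvm(const cv::Mat& trainData, const cv::Mat& responses)
{
    CvSVMParams params;
    params.svm_type = CvSVM::C_SVC;
    params.kernel_type = CvSVM::RBF;

    CvSVM svm;
    // train_auto cross-validates over the parameter grids; this is where
    // memory consumption appears to climb and not come back down.
    svm.train_auto(trainData, responses, cv::Mat(), cv::Mat(), params);

    // Workaround: round-trip the model through a file. After clear() and
    // load(), the process holds far less memory than right after training.
    svm.save("pairwise_model.xml");   // placeholder file name
    svm.clear();                      // drop the trained state (and, it seems,
                                      // the training-time scratch buffers)
    svm.load("pairwise_model.xml");   // reload only the model itself
}
```

With the save/clear/load lines commented out, the process in my test keeps far more memory per trained SVM, which is what the numbers above show.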
Is it possible that the SVM does not release all memory allocated during training unless clear or delete is called explicitly? Or am I just missing something? Shouldn't the memory consumption be the same whether I load the model from a file or use it directly after training?