GridSearchCV with Leave-One-Out and (Spearman) Correlation Scoring
I am working on an ML regression task with only a few data points and would like to use GridSearchCV for hyperparameter optimization with a Leave-One-Out (LOO) cross-validation scheme. However, the final model evaluation in this project is based on the Spearman correlation coefficient between predictions and ground-truth labels. Since the Spearman correlation cannot be computed from a single data point (one left-out prediction and its corresponding ground-truth label), I am not sure how to implement this kind of scoring in GridSearchCV with LOO. For every hyperparameter combination in the grid, the model should first produce all predictions under the LOO scheme and only then compute a single score: the correlation between all of these predictions and their corresponding labels. Is this possible with GridSearchCV? A manual sketch of what I mean is shown below.
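To make the intended workflow concrete, here is a minimal manual sketch that bypasses GridSearchCV entirely (the Ridge estimator, the toy data, and the alpha grid are just placeholders for illustration):

```python
# Manual sketch of the intended workflow, bypassing GridSearchCV:
# for each hyperparameter setting, collect all LOO predictions first,
# then compute a single Spearman correlation against the labels.
import numpy as np
from scipy.stats import spearmanr
from sklearn.linear_model import Ridge  # placeholder estimator
from sklearn.model_selection import LeaveOneOut, ParameterGrid, cross_val_predict

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 5))                      # placeholder toy data
y = X @ rng.normal(size=5) + rng.normal(scale=0.1, size=30)

param_grid = {"alpha": [0.1, 1.0, 10.0]}          # placeholder grid

best_score, best_params = -np.inf, None
for params in ParameterGrid(param_grid):
    # one out-of-sample prediction per data point, aggregated over all LOO splits
    preds = cross_val_predict(Ridge(**params), X, y, cv=LeaveOneOut())
    score, _ = spearmanr(preds, y)                # single correlation over all predictions
    if score > best_score:
        best_score, best_params = score, params

print(best_params, best_score)
```

The key difference to GridSearchCV's usual behavior is that the score is computed once per hyperparameter setting over all LOO predictions together, rather than once per test fold and then averaged across folds.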
P.S.: I used the sklearn.metrics.make_scorer function to implement the correlation-coefficient scorer, but this yields an error with LOO CV. It works fine with ordinary k-fold CV, though.
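For reference, the scorer setup looks roughly like this (again, the Ridge estimator, the toy data, and the alpha grid are just placeholders for illustration):

```python
# Roughly the scorer setup described above: it works with ordinary k-fold CV,
# but with LeaveOneOut every test fold contains a single sample, so the
# Spearman correlation inside the scorer is undefined and scoring fails.
import numpy as np
from scipy.stats import spearmanr
from sklearn.linear_model import Ridge  # placeholder estimator
from sklearn.metrics import make_scorer
from sklearn.model_selection import GridSearchCV, KFold, LeaveOneOut

def spearman_score(y_true, y_pred):
    return spearmanr(y_true, y_pred)[0]

spearman_scorer = make_scorer(spearman_score, greater_is_better=True)

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 5))                      # placeholder toy data
y = X @ rng.normal(size=5) + rng.normal(scale=0.1, size=30)

grid = GridSearchCV(Ridge(), {"alpha": [0.1, 1.0, 10.0]},
                    scoring=spearman_scorer, cv=KFold(n_splits=5))
grid.fit(X, y)                                    # works: several samples per test fold

# grid_loo = GridSearchCV(Ridge(), {"alpha": [0.1, 1.0, 10.0]},
#                         scoring=spearman_scorer, cv=LeaveOneOut())
# grid_loo.fit(X, y)                              # fails: one sample per test fold
```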
Sources
Source: Stack Overflow, licensed under CC BY-SA 3.0.