Yes, it is possible to obtain precision and recall scores for the negative class in a k-fold cross-validation using the `classification_report` function from `sklearn.metrics`.
Here's an example code snippet:
```python
from sklearn.metrics import classification_report
from sklearn.model_selection import KFold
from sklearn.linear_model import LogisticRegression

X = ...  # feature matrix (e.g. a NumPy array)
y = ...  # target variable (e.g. a NumPy array)

kf = KFold(n_splits=5, shuffle=True, random_state=42)
lr = LogisticRegression()

for train_index, test_index in kf.split(X):
    X_train, X_test = X[train_index], X[test_index]
    y_train, y_test = y[train_index], y[test_index]

    lr.fit(X_train, y_train)
    y_pred = lr.predict(X_test)

    # labels=[0] restricts the report to the negative class only
    report = classification_report(y_test, y_pred, digits=4, labels=[0])
    print(report)
```
In this example, we perform 5-fold cross-validation with logistic regression as the classifier. Inside the loop, we fit the model on the training data and obtain predictions on the test data. We then pass the true and predicted labels to `classification_report`, along with the desired number of digits and the label of interest (here `0`, the negative class). The output includes precision, recall, and F1-score for the negative class only, printed once per fold.
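If you would rather get a single set of scores over all folds instead of one report per fold, one option is to collect out-of-fold predictions with `cross_val_predict` and score them once. The sketch below uses a synthetic dataset from `make_classification` purely for illustration; substitute your own `X` and `y`:

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import KFold, cross_val_predict
from sklearn.metrics import precision_score, recall_score

# Synthetic binary-classification data, for illustration only
X, y = make_classification(n_samples=500, random_state=42)

kf = KFold(n_splits=5, shuffle=True, random_state=42)

# Each sample is predicted exactly once, by the fold that held it out
y_pred = cross_val_predict(LogisticRegression(max_iter=1000), X, y, cv=kf)

# pos_label=0 treats the negative class as the class being scored
print("negative-class precision:", precision_score(y, y_pred, pos_label=0))
print("negative-class recall:", recall_score(y, y_pred, pos_label=0))
```

This gives one precision and one recall value for the negative class computed over the whole dataset, which is often easier to report than five per-fold numbers.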