Neural Network for Regression with tflearn
My question is about coding a neural network which does regression (and NOT classification) using tflearn.
Dataset:
fixed acidity volatile acidity citric acid ... alcohol quality
7.4 0.700 0.00 ... 9.4 5
7.8 0.880 0.00 ... 9.8 5
7.8 0.760 0.04 ... 9.8 5
11.2 0.280 0.56 ... 9.8 6
7.4 0.700 0.00 ... 9.4 5
I want to build a neural network which takes in 11 features (chemical values in wine) and outputs or predicts a score, i.e., quality (out of 10). I DON'T want to classify the wine like quality_1, quality_2, ...; I want the model to perform a regression over my features and predict a value out of 10 (it could even be a float).
The quality column in my data only has values = [3, 4, 5, 6, 7, 8, 9]. It does not contain 1, 2, and 10.
Due to my lack of experience, I could only code a neural network that CLASSIFIES the wine into classes like [score_3, score_4, ...], and I used one-hot encoding to do so.
Processed Data:
Features:
[[ 7.5999999 0.23 0.25999999 ..., 3.02999997 0.44
9.19999981]
[ 6.9000001 0.23 0.34999999 ..., 2.79999995 0.54000002
11. ]
[ 6.69999981 0.17 0.37 ..., 3.25999999 0.60000002
10.80000019]
...,
[ 6.30000019 0.28 0.47 ..., 3.11999989 0.50999999
9.5 ]
[ 5.19999981 0.64499998 0. ..., 3.77999997 0.61000001
12.5 ]
[ 8. 0.23999999 0.47999999 ..., 3.23000002 0.69999999
10. ]]
Labels:
[[ 0. 1. 0. ..., 0. 0. 0.]
[ 0. 0. 0. ..., 0. 0. 0.]
[ 0. 0. 0. ..., 1. 0. 0.]
...,
[ 0. 0. 0. ..., 0. 0. 0.]
[ 0. 0. 0. ..., 0. 0. 0.]
[ 0. 0. 1. ..., 0. 0. 0.]]
Code for a neural network which CLASSIFIES into different classes:
import pandas as pd
import numpy as np
import tflearn
from tflearn.layers.core import input_data, fully_connected
from tflearn.layers.estimator import regression
from sklearn.model_selection import train_test_split
def preprocess():
    data_source_red = r'F:\Gautam\...\Datasets\winequality-red.csv'
    data_red = pd.read_csv(data_source_red, index_col=False, sep=';')
    # one-hot encode the quality column into score_3 ... score_9
    data = pd.get_dummies(data_red, columns=['quality'], prefix=['score'])
    x = data[data.columns[0:11]].values   # 11 chemical features
    y = data[data.columns[11:18]].values  # 7 one-hot score columns
    x = np.float32(x)
    y = np.float32(y)
    return (x, y)
x, y = preprocess()
train_x, test_x, train_y, test_y = train_test_split(x, y, test_size = 0.2)
network = input_data(shape=[None, 11], name='Input_layer')
network = fully_connected(network, 10, activation='relu', name='Hidden_layer_1')
network = fully_connected(network, 10, activation='relu', name='Hidden_layer_2')
network = fully_connected(network, 7, activation='softmax', name='Output_layer')
network = regression(network, batch_size=2, optimizer='adam', learning_rate=0.01)
model = tflearn.DNN(network)
model.fit(train_x, train_y, show_metric=True, run_id='wine_regression',
          validation_set=0.1, n_epoch=1000)
The neural network above is a poor one (accuracy = 0.40). Moreover, it classifies the data into different classes. I would like to know how to code a regression neural network which gives a score out of 10 for the input features (and NOT CLASSIFICATION). I would also prefer tflearn, as I'm quite comfortable with it.
Solution 1:[1]
This is the line in your code that makes your network a classifier with seven categories instead of a regressor:
network = fully_connected(network, 7, activation='softmax', name='Output_layer')
I don't use TFLearn any more; I have switched over to Keras (which is similar and has better support). However, I would suggest that you want the following output layer instead:
network = fully_connected(network, 1, activation='linear', name='Output_layer')
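On top of that, if I remember the TFLearn API correctly, the regression estimator defaults to a categorical cross-entropy loss, which only makes sense for a softmax output. For a true regression you would also want a squared-error loss, something along these lines:
network = regression(network, optimizer='adam', loss='mean_square',
                     learning_rate=0.01, batch_size=2)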
Also, your training data will need to change. If you want to perform a regression, you want a one-dimensional scalar label instead. I assume that you still have the original data, which you say you altered? If not, the UCI Machine Learning Repository has the wine quality dataset with a single, numerical quality column.
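Putting those pieces together, here is a rough, untested sketch of how the regression version might look in TFLearn. It assumes the same semicolon-separated CSV as in your question, with the quality column kept as a plain numeric label rather than one-hot scores; adjust the path and column names to your data.
import numpy as np
import pandas as pd
import tflearn
from tflearn.layers.core import input_data, fully_connected
from tflearn.layers.estimator import regression
from sklearn.model_selection import train_test_split

def preprocess(path):
    data = pd.read_csv(path, index_col=False, sep=';')
    x = data[data.columns[0:11]].values        # the 11 chemical features
    y = data['quality'].values.reshape(-1, 1)  # scalar quality label, shape (n, 1)
    return np.float32(x), np.float32(y)

x, y = preprocess('winequality-red.csv')       # adjust the path to your file
train_x, test_x, train_y, test_y = train_test_split(x, y, test_size=0.2)

net = input_data(shape=[None, 11], name='Input_layer')
net = fully_connected(net, 10, activation='relu', name='Hidden_layer_1')
net = fully_connected(net, 10, activation='relu', name='Hidden_layer_2')
net = fully_connected(net, 1, activation='linear', name='Output_layer')
net = regression(net, optimizer='adam', loss='mean_square', learning_rate=0.01)

model = tflearn.DNN(net)
model.fit(train_x, train_y, validation_set=0.1, show_metric=True,
          run_id='wine_regression', n_epoch=100)

print(model.predict(test_x[:5]))               # continuous scores such as 5.3, 6.1, ...
Two things that usually help a small network like this: scale the inputs (for example with sklearn's StandardScaler fitted on the training set), and judge the model by mean squared or mean absolute error on the held-out set rather than by accuracy, which is not meaningful for a regression.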
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
Solution | Source |
---|---|
Solution 1 |