Issue
I have a model in Keras:
import tensorflow as tf
import numpy as np
import pandas as pd
from tensorflow import keras
from tensorflow.keras import layers
from sklearn.model_selection import train_test_split
import random
df = pd.read_csv('/home/Datasets/creditcard.csv')
output = df['Class']
features = df.drop('Class', axis=1)
train_features, test_features, train_labels, test_labels = train_test_split(df, output, test_size = 0.2, random_state = 42)
train_features = train_features.to_numpy()
test_features = test_features.to_numpy()
train_labels = train_labels.to_numpy()
test_labels = test_labels.to_numpy()
model = tf.keras.Sequential()
num_nodes = [1]
act_functions = [tf.nn.relu]
optimizers = ['SGD']
loss_functions = ['categorical_crossentropy']
epochs_count = ['10']
batch_sizes = ['500']
act = random.choice(act_functions)
opt = random.choice(optimizers)
ep = random.choice(epochs_count)
batch = random.choice(batch_sizes)
loss = random.choice(loss_functions)
count = random.choice(num_nodes)
model.add(tf.keras.layers.Dense(31, activation = act, input_shape=(31,)))
model.add(tf.keras.layers.Dense(count, activation = act))
model.add(tf.keras.layers.Dense(1, activation = act))
model.compile(loss = loss,
              optimizer = opt,
              metrics = ['accuracy'])
epochs = int(ep)
batch_size = int(batch)
model.fit(train_features, train_labels, epochs=epochs, batch_size=batch_size)
The train labels are binary:
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, ...]
But the output of:
z = model.predict(test_features)
is:
array([[ 4574.6   ],
       [ 4896.158 ],
       [ 3867.8225],
       ...,
       [15516.117 ],
       [ 6441.43  ],
       [ 5453.437 ]], dtype=float32)
Why would it be predicting these values?
Thanks
Solution
Use a sigmoid activation on your last layer. A ReLU output is unbounded above, so the network is free to produce arbitrarily large values like the ones you are seeing; a sigmoid squashes the output into (0, 1), which matches your binary labels:
model.add(tf.keras.layers.Dense(1, activation='sigmoid'))
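For context, a minimal sketch of the corrected output layer, compile step, and prediction is below, assuming the rest of the pipeline from the question is unchanged. Note that swapping the loss to binary_crossentropy is an additional suggestion beyond the original answer: categorical_crossentropy on a single output unit does not train a binary classifier properly, while binary_crossentropy is the standard pairing with one sigmoid unit and 0/1 labels.

# Sketch only; train_features, train_labels, test_features, count, epochs and
# batch_size are the variables defined in the question's code.
model = tf.keras.Sequential()
model.add(tf.keras.layers.Dense(31, activation='relu', input_shape=(31,)))
model.add(tf.keras.layers.Dense(count, activation='relu'))
model.add(tf.keras.layers.Dense(1, activation='sigmoid'))  # output bounded to (0, 1)
model.compile(loss='binary_crossentropy',  # matches a single sigmoid output with 0/1 labels
              optimizer='SGD',
              metrics=['accuracy'])
model.fit(train_features, train_labels, epochs=epochs, batch_size=batch_size)

# Predictions are now probabilities; threshold at 0.5 to recover class labels.
z = model.predict(test_features)
predicted_classes = (z > 0.5).astype(int)

With the sigmoid head, model.predict returns values in (0, 1) rather than the unbounded outputs shown in the question.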
Answered By - ferdy