TensorFlow
Layers
Module: tf.keras.layers
InputLayer:
tf.keras.layers.InputLayer(input_shape=(10,))
Dense:
tf.keras.layers.Dense(units, activation=None)
Flatten:
tf.keras.layers.Flatten()
Dropout:
tf.keras.layers.Dropout(rate)
Conv2D:
tf.keras.layers.Conv2D(filters, kernel_size, activation=None, input_shape=None)
input_shape is expected to be a 3D tuple, with the last dimension being the colour channels, e.g. (28, 28, 3) for 28 x 28 pixels and 3 colour channels (RGB). An image with an alpha channel (RGBA) has a fourth channel, e.g. (28, 28, 4).
MaxPooling2D:
tf.keras.layers.MaxPooling2D(pool_size=(2, 2))
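A minimal sketch combining these layers into a small convolutional classifier; the filter counts, layer sizes and the 10-class output are illustrative assumptions, not part of the notes above.

import tensorflow as tf

model = tf.keras.Sequential([
    # 28 x 28 pixels, 3 colour channels (RGB)
    tf.keras.layers.Conv2D(32, (3, 3), activation="relu", input_shape=(28, 28, 3)),
    tf.keras.layers.MaxPooling2D(pool_size=(2, 2)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(10),  # 10 output classes, raw logits (no activation)
])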
Activation Functions
Module: tf.keras.activations
The activation functions are also available as individual layers, e.g. when the model should output logits and a separate probability model is built by appending an additional softmax layer (see the sketch after the list below).
ReLU:
"relu", tf.keras.activations.relu
Softmax:
"softmax", tf.keras.activations.softmax
Sigmoid:
"sigmoid", tf.keras.activations.sigmoid
Normally, the number of output neurons should match the number of classes in a classification problem. One exception is binary classification, where a single output neuron with sigmoid activation can be used.
Optimizers
Module: tf.keras.optimizers
SGD:
optimizer='sgd', tf.keras.optimizers.SGD(learning_rate=0.01, momentum=0.0)
Adam:
optimizer='adam', tf.keras.optimizers.Adam(learning_rate=0.001)
RMSProp:
tf.keras.optimizers.RMSprop(learning_rate=0.001)
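Either form can be passed to model.compile: the string alias uses the default hyperparameters, the optimizer object allows e.g. a custom learning rate. The one-layer model and the loss below are placeholders, just to make the sketch runnable.

import tensorflow as tf

model = tf.keras.Sequential([tf.keras.layers.Dense(10, input_shape=(784,))])

# String alias: default hyperparameters
model.compile(optimizer='adam', loss='sparse_categorical_crossentropy')

# Optimizer object: explicit hyperparameters such as the learning rate
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=0.001),
              loss='sparse_categorical_crossentropy')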
Loss Functions
Module: tf.keras.losses
Binary Crossentropy:
tf.keras.losses.BinaryCrossentropy(from_logits=False)
Categorical Crossentropy:
tf.keras.losses.CategoricalCrossentropy(from_logits=False)
Sparse Categorical Crossentropy:
tf.keras.losses.SparseCategoricalCrossentropy(from_logits=False)
Mean Squared Error:
tf.keras.losses.MeanSquaredError()
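A sketch of the difference between the two categorical crossentropies, using made-up tensors: CategoricalCrossentropy expects one-hot labels, SparseCategoricalCrossentropy expects integer class indices, and from_logits=True tells the loss that the predictions are raw logits rather than probabilities.

import tensorflow as tf

y_true_onehot = [[0.0, 1.0, 0.0]]   # one-hot label for class 1
y_true_sparse = [1]                 # the same label as an integer index
y_pred_logits = [[0.5, 2.0, -1.0]]  # raw logits from a model

cce = tf.keras.losses.CategoricalCrossentropy(from_logits=True)
scce = tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True)

# Both compute the same value for equivalent labels
print(cce(y_true_onehot, y_pred_logits).numpy())
print(scce(y_true_sparse, y_pred_logits).numpy())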
Metrics
"accuracy"
Inspection / Information about the Model
model.summary()
model.input_shape
model.output_shape
model.layers
Regularisation
Module: tf.keras.regularizers
L1 / lasso:
kernel_regularizer='l1', tf.keras.regularizers.L1
L2 / ridge:
kernel_regularizer='l2', tf.keras.regularizers.L2
L1L2:
kernel_regularizer='l1_l2', tf.keras.regularizers.L1L2
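Both forms can be passed to a layer's kernel_regularizer: the string alias uses the default regularisation factor, the regularizer object takes an explicit one. The factor 0.01 below is an illustrative assumption.

import tensorflow as tf

# String alias with the default factor
layer_a = tf.keras.layers.Dense(64, activation="relu", kernel_regularizer='l2')

# Regularizer object with an explicit factor
layer_b = tf.keras.layers.Dense(64, activation="relu",
                                kernel_regularizer=tf.keras.regularizers.L2(0.01))

# Combined L1 and L2 penalties
layer_c = tf.keras.layers.Dense(64, activation="relu",
                                kernel_regularizer=tf.keras.regularizers.L1L2(l1=0.01, l2=0.01))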
Treatment of Inputs
Split into training and testing datasets
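The note does not prescribe a tool; one common sketch uses scikit-learn's train_test_split (the 80/20 split and the random data are assumptions), while the built-in tf.keras.datasets already come pre-split.

import numpy as np
from sklearn.model_selection import train_test_split

X = np.random.rand(1000, 10).astype("float32")
y = np.random.randint(0, 2, size=(1000,))

# Hold out 20 % of the samples as a test set
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# The built-in Keras datasets are already split, e.g.:
# (x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()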