Making neural network training reproducible using RStudio's Keras interface
I'm trying to make neural network training reproducible using RStudio's Keras interface. Setting a seed in the R script (set.seed(42)) doesn't seem to work. Is it possible to pass a seed as an argument to layer_dense()? I can choose RandomUniform as an initializer, but I'm having difficulty passing a seed argument along with it. The following line throws an error:
model %>% layer_dense(units = 12, activation = 'relu', input_shape = c(8), kernel_initializer = "RandomUniform(seed=1)")
But the layer can be added if I don't try to pass a seed argument:
model %>% layer_dense(units = 12, activation = 'relu', input_shape = c(8), kernel_initializer = "RandomUniform")
According to the Keras initializer documentation, RandomUniform is supposed to take a seed argument.
Solution 1:[1]
The kernel_initializer argument should be given an initializer object rather than a string, like this: kernel_initializer = initializer_random_uniform(minval = -0.05, maxval = 0.05, seed = 104)
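Applied to the failing line from the question, that would look roughly like the following (a minimal sketch; the minval/maxval values shown are just the Keras defaults):
model %>% layer_dense(units = 12, activation = 'relu', input_shape = c(8),
                      kernel_initializer = initializer_random_uniform(minval = -0.05, maxval = 0.05, seed = 1))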
Try these steps.
1) Set a seed for the R environment before importing keras/tensorflow.
2) Configure the TensorFlow session to use a single thread.
3) Set the TensorFlow random seed.
4) Create a TensorFlow session with this configuration and seed, and assign it to the Keras backend.
5) Finally, in your model layers, if you use random initializers such as random_uniform (the default) or random_normal, set the seed argument to a fixed integer. Below is an example.
# Set R random seed
set.seed(104)
library(keras)
library(tensorflow)
# TensorFlow session configuration that uses only a single thread. Multiple threads are a
# potential source of non-reproducible results, see: https://stackoverflow.com/questions/42022950/which-seeds-have-to-be-set-where-to-realize-100-reproducibility-of-training-res
session_conf <- tf$ConfigProto(intra_op_parallelism_threads = 1L,
                               inter_op_parallelism_threads = 1L)
# Set TF random seed (see: https://www.tensorflow.org/api_docs/python/tf/set_random_seed)
tf$set_random_seed(104)
# Create the session using the custom configuration
sess <- tf$Session(graph = tf$get_default_graph(), config = session_conf)
# Instruct Keras to use this session
K <- backend()
K$set_session(sess)
# Then, in your model architecture, set the seed for all random initializers.
model <- keras_model_sequential()
model %>%
layer_dense(units = n_neurons, activation = 'relu', input_shape = c(100),
            kernel_initializer = initializer_random_uniform(minval = -0.05, maxval = 0.05, seed = 104)) %>%
layer_dense(units = n_neurons, activation = 'relu',
            kernel_initializer = initializer_random_uniform(minval = -0.05, maxval = 0.05, seed = 104)) %>%
layer_dense(units = 100,
            kernel_initializer = initializer_random_uniform(minval = -0.05, maxval = 0.05, seed = 104))
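As a quick check (not part of the original answer), the freshly initialized weights can be saved and compared across two separate runs of the script; with the seeds fixed as above they should come out identical. get_weights() is the standard keras helper, and the file name here is arbitrary:
# Save the initial weights from the first run
saveRDS(get_weights(model), "init_weights_run1.rds")
# In a second, fresh R session running the same script, compare:
# identical(readRDS("init_weights_run1.rds"), get_weights(model))  # should return TRUE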
References:
https://rstudio.github.io/keras/articles/faq.html#how-can-i-obtain-reproducible-results-using-keras-during-development
https://rstudio.github.io/keras/reference/initializer_random_normal.html#arguments
Solution 2:[2]
library(keras)
use_session_with_seed(42)
The use_session_with_seed() function establishes a common random seed for R, Python, NumPy, and TensorFlow. For further details, see https://keras.rstudio.com/articles/faq.html
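Note that, per the FAQ linked above, use_session_with_seed() also disables GPU usage and CPU parallelism by default, since both are sources of non-determinism. A minimal sketch of re-enabling them (at the cost of full reproducibility), assuming the disable_gpu and disable_parallel_cpu arguments of the tensorflow package version in use:
library(keras)
# Seed R/Python/NumPy/TensorFlow but keep GPU and parallel CPU enabled
use_session_with_seed(42, disable_gpu = FALSE, disable_parallel_cpu = FALSE)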
Solution 3:[3]
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
Solution | Source
---|---
Solution 1 | Narahari B M |
Solution 2 | BRCN |
Solution 3 | dfrankow |