Differential Privacy in Tensorflow Federated

I am trying to run mnist_dpsgd_tutorial.py from TensorFlow Privacy and want to check the number of dimensions of the gradient. I think the gradient is calculated by dp_optimizer. Is there a way to inspect and operate on the gradient?



Solution 1:[1]

This is an optimizer question. For a loss optimizer, the basic pattern is shown below; you can adapt it to your own method.

import os
import tensorflow as tf

# The tf.compat.v1 session API used below requires graph mode.
tf.compat.v1.disable_eager_execution()

logdir = './logs'
savedir = './checkpoints'

# Use the v1 Adam optimizer: its minimize() accepts a loss tensor,
# matching the v1 session workflow below.
optimizer = tf.compat.v1.train.AdamOptimizer(
    learning_rate=0.001, beta1=0.9, beta2=0.999, epsilon=1e-07,
    name='Adam'
)

var1 = tf.Variable(10.0)
var2 = tf.Variable(10.0)
X_var = tf.compat.v1.get_variable('X', dtype=tf.float32,
                                  initializer=tf.random.normal((1, 10, 1)))
y_var = tf.compat.v1.get_variable('Y', dtype=tf.float32,
                                  initializer=tf.random.normal((1, 10, 1)))
Z = tf.nn.l2_loss((var1 - X_var) ** 2 + (var2 - y_var) ** 2, name="loss")

cosine_loss = tf.keras.losses.CosineSimilarity(axis=1)
loss = tf.reduce_mean(input_tensor=tf.square(Z))
training_op = optimizer.minimize(cosine_loss(X_var, y_var))

init = tf.compat.v1.global_variables_initializer()
loss_summary = tf.compat.v1.summary.scalar('LOSS', loss)
merge_summary = tf.compat.v1.summary.merge_all()
file_writer = tf.compat.v1.summary.FileWriter(logdir, tf.compat.v1.get_default_graph())

with tf.compat.v1.Session() as sess:
    init.run()
    # Restore after init so the restored weights are not overwritten.
    checkpoint_dir = os.path.join(savedir, 'invader_001')
    if os.path.exists(checkpoint_dir):
        saver = tf.compat.v1.train.Saver()
        saver.restore(sess, tf.train.latest_checkpoint(checkpoint_dir))
        print("model load: " + checkpoint_dir)

    train_loss, summary_str, _ = sess.run([loss, merge_summary, training_op])
    file_writer.add_summary(summary_str, 0)

print(train_loss)

The loss being optimized here is a mean of squares (L2-style) loss.
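As for inspecting the gradient itself: the v1 optimizers that TensorFlow Privacy's DP optimizers wrap expose compute_gradients(), which returns (gradient, variable) pairs you can examine or modify before calling apply_gradients(). The sketch below shows the pattern with a plain GradientDescentOptimizer as a stand-in (the variable x_demo and its loss are made-up placeholders, not from the tutorial); with a DP optimizer the gradients returned by compute_gradients() already include clipping and noise.

```python
import tensorflow as tf

# compute_gradients() belongs to the v1 optimizer API, which needs graph mode.
tf.compat.v1.disable_eager_execution()
tf.compat.v1.reset_default_graph()

x = tf.compat.v1.get_variable('x_demo', dtype=tf.float32,
                              initializer=tf.random.normal((3, 4)))
loss = tf.reduce_mean(tf.square(x))

opt = tf.compat.v1.train.GradientDescentOptimizer(learning_rate=0.1)

# List of (gradient_tensor, variable) pairs; inspect or rewrite them here,
# then pass the (possibly modified) list to opt.apply_gradients().
grads_and_vars = opt.compute_gradients(loss)

with tf.compat.v1.Session() as sess:
    sess.run(tf.compat.v1.global_variables_initializer())
    for grad, var in grads_and_vars:
        g = sess.run(grad)
        # Each gradient has the same shape as its variable.
        print(var.name, g.shape)
```

The same loop works if opt is replaced by a DP optimizer such as one from tensorflow_privacy, since those subclass the v1 optimizer interface.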

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution Source
Solution 1: Martijn Pieters