This directory contains code examples that apply the attack methods to models built with Keras (with TensorFlow as the backend, of course). A couple of things need tweaking before Keras will work with raw TensorFlow tensors; you can refer to the official documentation on using Keras with TensorFlow. Essentially, we need to notify Keras that we want to manage the session ourselves:
```python
import tensorflow as tf
from keras import backend as K

sess = tf.Session()
# sess = tf.InteractiveSession()
K.set_session(sess)
```

The model is defined using the usual Keras syntax:
```python
from keras.models import Sequential
from keras.layers import Conv2D, MaxPooling2D, Flatten, Dense, Activation

model = Sequential([
    Conv2D(filters=32, kernel_size=(3, 3), padding='same',
           input_shape=input_shape),
    MaxPooling2D(pool_size=(2, 2), padding='same'),
    Flatten(),
    Dense(10),
    Activation('softmax')])

model.compile(optimizer='adam', loss='categorical_crossentropy',
              metrics=['accuracy'])
```

And the adversarial crafting graph is created via:
```python
x = tf.placeholder(tf.float32, (None, img_rows, img_cols, img_chas))

def _model_fn(x, logits=False):
    ybar = model(x)
    # recover the pre-softmax logits from the softmax op's input
    logits_, = ybar.op.inputs
    if logits:
        return ybar, logits_
    return ybar

x_adv = fgsm(_model_fn, x, epochs=9, eps=0.02)
```

The `_model_fn` needs to have the following signature:

- The first parameter must be the network input variable, usually a `tf.placeholder`.
- The second parameter must be `logits`, with default value `False`. If `True`, return both the output tensor and the logits; otherwise, return the output tensor only. The rationale is that the attack methods use `tf.nn.softmax_cross_entropy_with_logits` to calculate the loss.
- You may pass other parameters to your function, provided that they all have default values.
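For intuition, the iteration `fgsm` performs — repeatedly nudging the input by `eps` times the sign of the loss gradient — can be sketched in pure Python on a hypothetical toy loss (`fgsm_sketch` and `grad_fn` below are illustrative stand-ins, not the repository's implementation):

```python
def sign(v):
    # -1, 0, or 1 depending on the sign of v
    return (v > 0) - (v < 0)

def fgsm_sketch(grad_fn, x, epochs=9, eps=0.02):
    """Iteratively apply x <- x + eps * sign(dL/dx)."""
    for _ in range(epochs):
        x = [xi + eps * sign(g) for xi, g in zip(x, grad_fn(x))]
    return x

# Toy loss L(x) = sum(x_i^2), gradient 2*x_i: the sketch pushes each
# coordinate away from 0 by eps per epoch, increasing the loss.
x_adv = fgsm_sketch(lambda x: [2 * xi for xi in x], [0.1, -0.1])
```

The real attack builds this loop as a `tf.while_loop` over tensors, with the loss coming from `tf.nn.softmax_cross_entropy_with_logits`.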
Currently there is a bug (keras/issues/5469) when using the Dropout layer in Keras with TensorFlow as the backend. If Dropout is used, the number of epochs for `tf.while_loop` cannot exceed 10 (the default value of `parallel_iterations`; I do not know why), otherwise the program will hang. Refer to the bug report for more information. To work around the bug, all Dropout layers are commented out. I tried the equivalent code in pure TensorFlow and it works fine, so the problem appears to be in Keras's internal implementation.