I have a stored TensorFlow model that was trained for a long time. For the dropout ops, I used a fixed value for the "keep_prob" parameter instead of a placeholder, which now turns out to have been a mistake, since for evaluation I would like to set "keep_prob" to 1.0.
Is there a way to make this op operate with keep_prob = 1.0 in the stored model? In other words, is it possible to change a fixed dropout rate in a stored TensorFlow model?
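For readers unfamiliar with why this matters: dropout (in the inverted form that `tf.nn.dropout` uses) zeroes each unit with probability 1 - keep_prob and scales the survivors by 1/keep_prob, so with keep_prob = 1.0 it reduces to the identity, which is what you want at evaluation time. A minimal NumPy sketch of that behavior (the `dropout` helper here is illustrative, not TensorFlow's implementation):

```python
import numpy as np

def dropout(x, keep_prob, rng=np.random.default_rng(0)):
    # Inverted dropout: keep each unit with probability keep_prob and
    # scale the kept units by 1/keep_prob so the expected activation
    # is unchanged between training and evaluation.
    mask = rng.random(x.shape) < keep_prob
    return np.where(mask, x / keep_prob, 0.0)

x = np.ones((4, 4))
train_out = dropout(x, 0.5)  # some units zeroed, the rest scaled to 2.0
eval_out = dropout(x, 1.0)   # identity: no units are dropped
```

Because my graph baked keep_prob in as a constant, I cannot simply feed 1.0 at evaluation the way I could with a placeholder.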
Specifically, I used the fixed value in these ops: