4 TensorFlow Model Persistence

TensorFlow provides a simple API for saving and restoring a neural network model: the tf.train.Saver class. The following code shows how to save a TensorFlow computation graph.

import tensorflow as tf

# Declare two variables and compute their sum.
v1 = tf.Variable(tf.constant(1.0, shape=[1]), name="v1")
v2 = tf.Variable(tf.constant(2.0, shape=[1]), name="v2")
result = v1 + v2

init_op = tf.global_variables_initializer()
# Declare a tf.train.Saver to save the model.
saver = tf.train.Saver()

with tf.Session() as sess:
    sess.run(init_op)
    # Save the model to the /path/to/model/model.ckpt file.
    saver.save(sess, "/path/to/model/model.ckpt")

The code above persists a simple TensorFlow model. The saver.save function writes the model to the specified path. Although the program specifies only a single file path, several files appear in that directory, because TensorFlow stores the structure of the computation graph separately from the values of the parameters on that graph.

The first file, model.ckpt.meta, holds the structure of the TensorFlow computation graph, which can be understood simply as the network structure of the neural network. The second file, model.ckpt, holds the value of every variable in the TensorFlow program (in newer TensorFlow versions these values are split across model.ckpt.index and model.ckpt.data-* files). The last file, checkpoint, keeps a list of all model files in the directory.
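The checkpoint file itself is plain text: each line is a key, a colon, and a quoted path. Assuming that standard format, reading the most recent checkpoint path can be sketched in plain Python (the helper name here is hypothetical, not part of any TensorFlow API):

```python
# A sketch of reading the plain-text "checkpoint" file that tf.train.Saver
# writes. Assumed format, one entry per line:
#   model_checkpoint_path: "/path/to/model/model.ckpt"
#   all_model_checkpoint_paths: "/path/to/model/model.ckpt"
def latest_checkpoint_path(checkpoint_text):
    """Return the path recorded under model_checkpoint_path, or None."""
    for line in checkpoint_text.splitlines():
        key, _, value = line.partition(":")
        if key.strip() == "model_checkpoint_path":
            return value.strip().strip('"')
    return None

sample = ('model_checkpoint_path: "/path/to/model/model.ckpt"\n'
          'all_model_checkpoint_paths: "/path/to/model/model.ckpt"\n')
print(latest_checkpoint_path(sample))  # /path/to/model/model.ckpt
```

In real code, tf.train.latest_checkpoint performs this lookup for you; the sketch only shows what the file contains.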

The following code shows how to load this saved TensorFlow model.

import tensorflow as tf

# Declare the variables in the same way as in the code that saved the model.
v1 = tf.Variable(tf.constant(1.0, shape=[1]), name="v1")
v2 = tf.Variable(tf.constant(2.0, shape=[1]), name="v2")
result = v1 + v2

saver = tf.train.Saver()

with tf.Session() as sess:
    # Load the saved model and compute the addition from the restored
    # variable values.
    saver.restore(sess, "/path/to/model/model.ckpt")
    print(sess.run(result))

The only difference between the two pieces of code is that the loading code does not run variable initialization; instead, the variable values are loaded from the saved model.

You can also load the persisted graph directly, without redefining the variables.

import tensorflow as tf

# Load the persisted graph directly.
saver = tf.train.import_meta_graph("/path/to/model/model.ckpt.meta")
with tf.Session() as sess:
    saver.restore(sess, "/path/to/model/model.ckpt")
    # Get a tensor by its name.
    print(sess.run(tf.get_default_graph().get_tensor_by_name("add:0")))

The programs above save and load every variable defined on the TensorFlow computation graph by default. Sometimes, however, you only need to save or load some of the variables; in that case, pass a list when constructing the tf.train.Saver class to specify which variables to save or load, for example tf.train.Saver([v1]).

The tf.train.Saver class also supports renaming variables when they are saved or loaded.

import tensorflow as tf

# The variable names declared here differ from the variable names in the
# saved model.
v1 = tf.Variable(tf.constant(1.0, shape=[1]), name="other-v1")
v2 = tf.Variable(tf.constant(2.0, shape=[1]), name="other-v2")

# Loading the model directly with tf.train.Saver() would fail with a
# "variable not found" error, because the checkpoint contains no variables
# named "other-v1" or "other-v2".

# Renaming variables through a dictionary lets the original model load: the
# variable saved under the name "v1" is restored into v1 (now named
# "other-v1"), and the one saved as "v2" into v2 (named "other-v2").
saver = tf.train.Saver({"v1": v1, "v2": v2})
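The renaming mechanism can be mimicked in plain Python: a checkpoint is essentially a map from saved names to values, and the Saver dictionary maps each saved name to the in-memory variable that should receive the value (a simulation of the idea, not the TensorFlow implementation):

```python
# A plain-Python simulation of restoring with a rename dictionary.
# The checkpoint maps saved variable names to saved values.
checkpoint = {"v1": 1.0, "v2": 2.0}

# In-memory "variables", keyed by their current (renamed) names.
variables = {"other-v1": None, "other-v2": None}

# The Saver dictionary: saved name -> variable that receives the value.
rename_map = {"v1": "other-v1", "v2": "other-v2"}

for saved_name, current_name in rename_map.items():
    variables[current_name] = checkpoint[saved_name]

print(variables)  # {'other-v1': 1.0, 'other-v2': 2.0}
```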

One of the main uses of renaming is working with moving averages of variables. In TensorFlow, the moving average of each variable is maintained by a shadow variable, so reading a variable's moving average really means reading the value of its shadow variable. If, when loading the model, the shadow variable is mapped directly onto the variable itself, then the trained model can be used without calling any function to fetch the moving average.
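The shadow-variable update performed by ExponentialMovingAverage follows a simple rule, shadow = decay * shadow + (1 - decay) * value, which can be checked in plain Python (a sketch of the arithmetic only, not the TensorFlow implementation):

```python
def ema_update(shadow, value, decay):
    """One step of the exponential-moving-average update rule."""
    return decay * shadow + (1 - decay) * value

# Matches the example that follows: v starts at 0 and is assigned 10, with
# decay 0.99, so the shadow variable becomes 0.99 * 0 + 0.01 * 10 = 0.1.
print(round(ema_update(0.0, 10.0, 0.99), 6))  # 0.1
```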

The following example saves a model that maintains moving averages.

import tensorflow as tf

v = tf.Variable(0, dtype=tf.float32, name="v")
# Before the moving-average model is applied there is only one variable, v,
# so the following loop prints only "v:0".
for variables in tf.global_variables():
    print(variables.name)

ema = tf.train.ExponentialMovingAverage(0.99)
maintain_average_op = ema.apply(tf.global_variables())
# After the moving-average model is applied, TensorFlow automatically
# generates a shadow variable v/ExponentialMovingAverage, so the following
# loop prints both "v:0" and "v/ExponentialMovingAverage:0".
for variables in tf.global_variables():
    print(variables.name)

saver = tf.train.Saver()
with tf.Session() as sess:
    init_op = tf.global_variables_initializer()
    sess.run(init_op)

    sess.run(tf.assign(v, 10))
    sess.run(maintain_average_op)
    # TensorFlow saves both the v:0 and v/ExponentialMovingAverage:0
    # variables.
    saver.save(sess, "/path/to/model/model.ckpt")
    print(sess.run([v, ema.average(v)]))  # Output: [10.0, 0.099999905]
    

The following code shows how to read the moving average of a variable directly through renaming. As the program's output shows, the value read into variable v is in fact the moving average of variable v from the code above. With this approach, the forward-propagation results of the moving-average model can be computed with exactly the same code.

import tensorflow as tf

v = tf.Variable(0, dtype=tf.float32, name="v")
# Variable renaming assigns the moving average of the original variable v
# to v.
saver = tf.train.Saver({"v/ExponentialMovingAverage": v})
with tf.Session() as sess:
    saver.restore(sess, "/path/to/model/model.ckpt")
    # Output: 0.099999905, the moving average of v in the original model.
    print(sess.run(v))

Using tf.train.Saver saves all the information needed to rerun the TensorFlow program, but sometimes not all of it is required. TensorFlow therefore provides the convert_variables_to_constants function, which saves the variables and their values in the computation graph as constants, so that the entire graph can be stored in a single file.
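The idea behind convert_variables_to_constants can be illustrated with a toy graph: every variable node is replaced by a constant node holding its current value, so the structure and the values travel together in one object (a plain-Python analogy with hypothetical names, not the TensorFlow implementation):

```python
# A toy "graph": nodes are dicts; Variable nodes reference external state.
graph = [
    {"name": "v1", "op": "Variable"},
    {"name": "v2", "op": "Variable"},
    {"name": "add", "op": "Add", "inputs": ["v1", "v2"]},
]
session_values = {"v1": 1.0, "v2": 2.0}  # current variable values

def convert_to_constants(graph, values):
    """Replace every Variable node by a Const node holding its value."""
    frozen = []
    for node in graph:
        if node["op"] == "Variable":
            frozen.append({"name": node["name"], "op": "Const",
                           "value": values[node["name"]]})
        else:
            frozen.append(dict(node))
    return frozen

frozen = convert_to_constants(graph, session_values)
print(frozen[0])  # {'name': 'v1', 'op': 'Const', 'value': 1.0}
```

After this transformation nothing in the frozen graph depends on a session holding variable state, which is why the real function lets the whole model live in a single .pb file.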

import tensorflow as tf
from tensorflow.python.framework import graph_util

v1 = tf.Variable(tf.constant(1.0, shape=[1]), name="v1")
v2 = tf.Variable(tf.constant(2.0, shape=[1]), name="v2")
result = v1 + v2

init_op = tf.global_variables_initializer()
with tf.Session() as sess:
    sess.run(init_op)
    # Export the GraphDef part of the current computation graph; only this
    # part is needed to compute from the input layer to the output layer.
    graph_def = tf.get_default_graph().as_graph_def()

    # Convert the variables and their values in the graph to constants and
    # remove unnecessary nodes. The last parameter, ['add'], names the node
    # to keep; the add node is the addition of the two variables above.
    output_graph_def = graph_util.convert_variables_to_constants(
        sess, graph_def, ['add'])
    # Save the exported model to a file.
    with tf.gfile.GFile("/path/to/model/combined_model.pb", "wb") as f:
        f.write(output_graph_def.SerializeToString())
        

The following program computes the result of the addition operation defined above directly; this is a simpler method when you only need the value of one node in the computation graph.

import tensorflow as tf
from tensorflow.python.platform import gfile

with tf.Session() as sess:
    model_filename = "/path/to/model/combined_model.pb"
    # Read the saved model file and parse it into the corresponding
    # GraphDef protocol buffer.
    with gfile.FastGFile(model_filename, 'rb') as f:
        graph_def = tf.GraphDef()
        graph_def.ParseFromString(f.read())

    # Load the graph saved in graph_def into the current graph.
    # return_elements = ["add:0"] names the tensor to return. When saving we
    # gave the name of the computation node, "add"; when loading we give the
    # name of the tensor, so it is "add:0".
    result = tf.import_graph_def(graph_def, return_elements=["add:0"])
    print(sess.run(result))
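The node-name versus tensor-name distinction used above follows a simple convention: a tensor name is the op name plus a colon and an output index, so "add:0" is output 0 of the op "add". A plain-Python sketch of that convention (the helper name is hypothetical, not TensorFlow API):

```python
def split_tensor_name(tensor_name):
    """Split a TensorFlow-style tensor name into (op_name, output_index).

    "add:0" refers to output 0 of the op named "add"; a bare op name
    defaults to output 0.
    """
    if ":" in tensor_name:
        op_name, index = tensor_name.rsplit(":", 1)
        return op_name, int(index)
    return tensor_name, 0

print(split_tensor_name("add:0"))  # ('add', 0)
print(split_tensor_name("v/ExponentialMovingAverage:0"))
```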


Added by Rhiknow on Wed, 24 Jul 2019 21:40:37 +0300