TensorFlow notes: several small implementation examples

Use TensorFlow to complete the following small functions and get familiar with the basic usage:

  • 1. Implement an accumulator and output its value at each step.
import tensorflow as tf

# Define a variable
x = tf.Variable(1, dtype=tf.int32, name='x')

# Update variables
assign_op = tf.assign(ref=x, value=x+1)
'''
tf.assign():
Args:
  ref: a mutable tensor, expected to come from a Variable node; the node may be uninitialized.
  value: a tensor with the same type as ref; the value to assign to the variable.
  validate_shape: an optional bool, default True. If True, the operation verifies that the shape
      of value matches the shape of ref; if False, ref takes on the shape of value.
  use_locking: an optional bool, default True. If True, the assignment is protected by a lock;
      otherwise the behavior is undefined, but it may exhibit less contention.
  name: a name for the operation (optional).
Returns:
  A tensor holding the new value of ref after the assignment.
'''

# Initialization of variables
init_op = tf.global_variables_initializer()

# Start session
with tf.Session() as sess:
    # Variable initialization
    sess.run(init_op)
    
    # Simulation iteration update accumulator
    for i in range(5):
        # Perform update operation
        sess.run(assign_op)
        print("x:{}".format(sess.run(x)))

Output:

x:2
x:3
x:4
x:5
x:6
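If you are on TensorFlow 2.x, where eager execution is the default and tf.Session/tf.assign are gone, the same accumulator can be sketched with Variable.assign_add (a minimal sketch, not part of the original notes):

```python
import tensorflow as tf  # assumes TensorFlow 2.x with eager execution

# Define the variable; no session or explicit initializer is needed in eager mode
x = tf.Variable(1, dtype=tf.int32, name='x')

# Simulate the iterative accumulator update
for i in range(5):
    x.assign_add(1)  # in-place update, replacing tf.assign(x, x + 1)
    print("x:{}".format(x.numpy()))
```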

  • 2. Write a piece of code that dynamically updates the shape of a variable.
import tensorflow as tf

# Define a variable of indefinite shape
x = tf.Variable(
        initial_value=[],     # start with an empty value
        dtype=tf.int32,
        trainable=False,      # exclude this variable from the trainable-variables collection (usually True)
        validate_shape=False) # skip shape checking when the variable is updated; the default is True

# Update the variable
concat = tf.concat([x, [0, 0]], axis=0) # tf.concat joins tensors; values: the tensors to concatenate, axis: the dimension along which to concatenate
assign_op = tf.assign(x, concat, validate_shape=False)

# Initialization of variables
init_op = tf.global_variables_initializer()

# Start session
with tf.Session() as sess:
    # Variable initialization
    sess.run(init_op)
    
    # Dynamically update variable dimensions
    for i in range(5):
        sess.run(assign_op)
        print("x:{}".format(sess.run(x)))

Output:

x:[0 0]
x:[0 0 0 0]
x:[0 0 0 0 0 0]
x:[0 0 0 0 0 0 0 0]
x:[0 0 0 0 0 0 0 0 0 0]
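In TensorFlow 2.x the same dynamic-shape trick can be sketched by constructing the variable with shape=tf.TensorShape(None), which declares the shape as unknown so later assignments may change it (a minimal sketch under that assumption):

```python
import tensorflow as tf  # assumes TensorFlow 2.x

# shape=tf.TensorShape(None) leaves the shape unspecified,
# so assign() is allowed to change the variable's size
x = tf.Variable([], dtype=tf.int32, shape=tf.TensorShape(None), trainable=False)

for i in range(5):
    # append two zeros on each step, growing the variable dynamically
    x.assign(tf.concat([x, [0, 0]], axis=0))
    print("x:{}".format(x.numpy()))
```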

  • 3. Implement code that computes a factorial.
import tensorflow as tf

# Define a variable
sum = tf.Variable(1, dtype=tf.int32)

# Define a placeholder
i = tf.placeholder(dtype=tf.int32)

# update operation
tmp_sum = sum * i
assign_op = tf.assign(sum, tmp_sum, name='factorial')

# Variable initialization operation
init_op = tf.global_variables_initializer()

# Start session
with tf.Session() as sess:
    # initialize variable
    sess.run(init_op)
    
    # Iterative factorial
    for j in range(2,6):
        sess.run(assign_op, feed_dict={i: j})
    print("5!={}".format(sess.run(sum)))

Output:

5!=120
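Placeholders are also gone in TensorFlow 2.x; the feed loop becomes a plain Python loop over an eager variable (a minimal sketch of the same computation; the variable is renamed to avoid shadowing the built-in sum):

```python
import tensorflow as tf  # assumes TensorFlow 2.x

total = tf.Variable(1, dtype=tf.int32)  # 'total' instead of 'sum' to avoid shadowing the built-in

for j in range(2, 6):
    total.assign(total * j)  # replaces the tf.assign + placeholder feed

print("5!={}".format(total.numpy()))
```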

Conclusion:

  • Graph is used to represent computing tasks;
  • Execute the graph in the context of the session;
  • Use tensor to represent data;
  • Maintain state through variable;
  • With feeds and fetches, you can assign values to, or get data from, any operation (Op).

The above five points are the most basic concepts of TensorFlow. When programming under the TensorFlow framework, we must keep the dataflow Graph in mind: tensors or variables flow through the graph, and at each node an Operation is performed on them. The connected nodes form the Graph we want to compute, similar to a mesh computing framework. Finally, by building a Session we can execute the nodes in the Graph and obtain the results we want. (Personal understanding; corrections welcome!)


Added by webstyler on Sat, 09 Nov 2019 20:38:42 +0200