prefetch_queue.py

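# Assumed import, not shown in the original excerpt; the function signature and
# the examples below use the TF 1.x API under this conventional alias.
import tensorflow as tf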
def prefetch_queue(tensors,
                   capacity=8,
                   num_threads=1,
                   dynamic_pad=False,
                   shared_name=None,
                   name=None):

  """Creates a queue to prefetch tensors from `tensors`.



  A queue runner for enqueuing tensors into the prefetch_queue is automatically

  added to the TF QueueRunners collection.



  Example:

  This is for example useful to pre-assemble input batches read with

  `tf.train.batch()` and enqueue the pre-assembled batches.  Ops that dequeue

  from the pre-assembled queue will not pay the cost of assembling the batch.



  images, labels = tf.train.batch([image, label], batch_size=32, num_threads=4)

  batch_queue = prefetch_queue([images, labels])

  images, labels = batch_queue.dequeue()

  logits = Net(images)

  loss = Loss(logits, labels)



  Args:

    tensors: A list or dictionary of `Tensors` to enqueue in the buffer.

    capacity: An integer. The maximum number of elements in the queue.

    num_threads: An integer.  Number of threads running the enqueue op.

    dynamic_pad: Boolean.  Whether to allow variable dimensions in input shapes.

    shared_name: (optional). If set, this queue will be shared under the given

      name across multiple sessions.

    name: (Optional) A name for the operations.



  Returns:

    A queue from which you can dequeue tensors with the same type and shape

    as `tensors`.

  """

This function prefetches the tensors in `tensors` into a queue, so that ops which
dequeue from that queue receive pre-assembled batches without paying the assembly
cost themselves. Walking through the docstring example (a fuller runnable sketch
follows below):

images, labels = tf.train.batch([image, label], batch_size=32, num_threads=4)
1. tf.train.batch assembles individual examples into batches of 32.

batch_queue = prefetch_queue([images, labels])
2. prefetch_queue enqueues the pre-assembled batches into a prefetch queue.

images, labels = batch_queue.dequeue()
3. dequeue pulls one pre-assembled batch back out of the queue.

logits = Net(images)
loss = Loss(logits, labels)
4. Finally, compute the logits and the loss from the dequeued batch.
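For completeness, here is a hypothetical end-to-end sketch of that pipeline in
TF 1.x graph mode. `Net`, `Loss`, and the random `image`/`label` tensors are
stand-ins introduced only for illustration; they are not part of the original
file. Note that the queue runners registered by tf.train.batch and
prefetch_queue must be started before anything can be dequeued.

import tensorflow as tf

def Net(images):
  # Stand-in model: flatten the images and apply one dense layer.
  flat = tf.layers.flatten(images)
  return tf.layers.dense(flat, 10)

def Loss(logits, labels):
  # Stand-in loss: mean softmax cross-entropy over the batch.
  return tf.reduce_mean(
      tf.nn.sparse_softmax_cross_entropy_with_logits(labels=labels,
                                                     logits=logits))

# Stand-in single-example input; a real pipeline would read from files here.
image = tf.random_uniform([28, 28, 1])
label = tf.random_uniform([], maxval=10, dtype=tf.int32)

images, labels = tf.train.batch([image, label], batch_size=32, num_threads=4)
batch_queue = prefetch_queue([images, labels])
images, labels = batch_queue.dequeue()
logits = Net(images)
loss = Loss(logits, labels)

with tf.Session() as sess:
  sess.run(tf.global_variables_initializer())
  # Start the queue runners registered by tf.train.batch and prefetch_queue.
  coord = tf.train.Coordinator()
  threads = tf.train.start_queue_runners(sess=sess, coord=coord)
  print(sess.run(loss))
  coord.request_stop()
  coord.join(threads)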