How To Make The Generated Data In Remote Worker Span Iterations In In-graph Replica In Distributed Tensorflow?

I use in-graph replication in TensorFlow for distributed training. To reduce communication cost, I need to hold some generated data (such as the cell states of an LSTM) on the remote worker so that it spans iterations, instead of pulling it back to the client and re-feeding it with feed_dict every step.

Solution 1:

As Yaroslav proposed, Variable objects or persistent tensors can hold the generated data on the worker and replace feed_dict. Thanks, Yaroslav.
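A minimal sketch of the Variable approach, assuming a TF 1.x in-graph setup; the cluster addresses, job names, and sizes below are hypothetical placeholders. The LSTM state lives in non-trainable variables placed on the remote worker, so it persists between session.run() calls without round-tripping through feed_dict:

import numpy as np
import tensorflow as tf  # TensorFlow 1.x API

# Hypothetical cluster spec; replace job names and addresses with your own.
cluster = tf.train.ClusterSpec({
    "worker": ["worker0.example.com:2222"],
    "ps": ["ps0.example.com:2222"],
})

batch_size, state_size = 32, 128

with tf.device("/job:worker/task:0"):
    # Non-trainable variables keep the LSTM state resident on the worker,
    # so it spans iterations instead of being fed back from the client.
    c_state = tf.get_variable("c_state", [batch_size, state_size],
                              initializer=tf.zeros_initializer(),
                              trainable=False)
    h_state = tf.get_variable("h_state", [batch_size, state_size],
                              initializer=tf.zeros_initializer(),
                              trainable=False)

    inputs = tf.placeholder(tf.float32, [batch_size, state_size])
    cell = tf.nn.rnn_cell.LSTMCell(state_size)
    output, new_state = cell(
        inputs, tf.nn.rnn_cell.LSTMStateTuple(c_state, h_state))

    # Assign ops write the new state back into the worker-resident variables;
    # running `update_state` each step carries the state to the next iteration.
    update_state = tf.group(c_state.assign(new_state.c),
                            h_state.assign(new_state.h))

with tf.Session("grpc://worker0.example.com:2222") as sess:
    sess.run(tf.global_variables_initializer())
    for step in range(100):
        batch = np.random.rand(batch_size, state_size).astype(np.float32)
        sess.run([output, update_state], feed_dict={inputs: batch})

Only the model inputs still go through feed_dict here; the generated cell states never leave the worker. Persistent tensors (via session handles) are an alternative when you prefer not to allocate variables for the state.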
