min

paddle.distributed.fleet.metrics.metric.min(input, scope=None, util=None) [source]

Compute the distributed min in fleet: reduce the input across all trainers with an element-wise minimum.
Parameters

- input (numpy.array|Variable|string) – output of a layer
- scope (Scope) – specific scope; if None, the global scope is used
- util (UtilBase) – distributed util used for the reduction; if None, fleet's default util is used
Returns

- the element-wise min array across all trainers

Return type

- global_metric (numpy.array)
Example

```python
import numpy as np
import paddle
import paddle.fluid as fluid

# in model.py
input = fluid.layers.cast(some_input, dtype='float32')
cnt = fluid.layers.reduce_sum(input)
global_cnt = fluid.layers.create_global_var(
    persistable=True, dtype='float32', shape=[1], value=0)
tmp = fluid.layers.elementwise_min(cnt, global_cnt)
fluid.layers.assign(tmp, global_cnt)

# in train.py, after train or infer
res = np.array(scope.find_var(global_cnt.name).get_tensor())
print("min array: ", paddle.distributed.fleet.min(res))
```
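Conceptually, each trainer holds a local metric array and the distributed min reduces them element-wise across all trainers. The following is a minimal NumPy sketch of that semantics only; `local_results` is a hypothetical stand-in for per-trainer values and none of this calls the fleet API:

```python
import numpy as np
from functools import reduce

# Hypothetical local metric arrays from three trainers (stand-ins,
# not produced by Paddle).
local_results = [
    np.array([3.0]),
    np.array([1.0]),
    np.array([5.0]),
]

# A distributed min is an element-wise minimum reduction across trainers,
# analogous to an all_reduce with a min op.
global_min = reduce(np.minimum, local_results)

print(global_min)  # the smallest value seen by any trainer: [1.]
```

In an actual fleet job each trainer would pass its local array to `paddle.distributed.fleet.min`, which performs this reduction across workers and returns the same result on every trainer.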