memory_reserved
- paddle.device.cuda.memory_reserved(device=None) [source]
- 
         Return the current size of GPU memory that is held by the allocator of the given device.
- Parameters
- 
           device (paddle.CUDAPlace or int or str) – The device, the id of the device, or the string name of the device like 'gpu:x'. If device is None, the current device is used. Default: None.
- Returns
- 
           The current size of GPU memory that is held by the allocator of the given device, in bytes. 
- Return type
- 
           int 
 Examples

    # required: gpu
    import paddle

    memory_reserved_size = paddle.device.cuda.memory_reserved(paddle.CUDAPlace(0))
    memory_reserved_size = paddle.device.cuda.memory_reserved(0)
    memory_reserved_size = paddle.device.cuda.memory_reserved("gpu:0")
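Since the return value is a plain byte count, it is often convenient to format it for logging or monitoring. A minimal sketch of such a helper; the `format_bytes` function is purely illustrative and not part of the Paddle API:

```python
def format_bytes(n):
    # Convert a byte count, such as the int returned by
    # paddle.device.cuda.memory_reserved, into a readable string.
    units = ["B", "KiB", "MiB", "GiB", "TiB"]
    value = float(n)
    for unit in units:
        if value < 1024 or unit == units[-1]:
            return f"{value:.2f} {unit}"
        value /= 1024

# Example: formatting a 3 GiB reserved-memory figure
print(format_bytes(3 * 1024**3))  # → 3.00 GiB
```

On a GPU machine, the same helper could wrap a live reading, e.g. `format_bytes(paddle.device.cuda.memory_reserved(0))`.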
