TensorFlow 2.0 basic operations
2019-08-04 17:29:39
Broadcasting
- Why?
  - Concise code
  - Saves memory
- Example
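A minimal sketch of broadcasting, with arbitrary example shapes: the smaller tensor is implicitly expanded to match, so no extra memory is allocated for the copies.

```python
import tensorflow as tf

# A [4, 3] tensor plus a [3] tensor: the [3] shape is implicitly
# expanded to [4, 3] during the op, saving an explicit copy.
x = tf.ones([4, 3])
b = tf.constant([1.0, 2.0, 3.0])
y = x + b            # b is broadcast to shape [4, 3]
print(y.shape)       # (4, 3)

# broadcast_to makes the same expansion explicit
b2 = tf.broadcast_to(b, [4, 3])
print(b2.shape)      # (4, 3)
```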
Computation
Outline
- +, -, *, /
- **, pow, square
- sqrt
- //, % (floor division, modulo)
- exp, log
- @, matmul (matrix multiplication)
- linear layer
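The ops in the outline above can be sketched as follows; the shapes and fill values are arbitrary examples:

```python
import tensorflow as tf

a = tf.fill([2, 2], 4.0)
b = tf.ones([2, 2])

# element-wise arithmetic
print(a + b, a - b, a * b, a / b)
print(a // 3, a % 3)               # floor division and modulo
print(a ** 2, tf.pow(a, 2), tf.square(a))
print(tf.sqrt(a))
print(tf.exp(b), tf.math.log(a))   # log is the natural log (base e)

# matrix multiplication: @ is shorthand for tf.matmul
print(a @ b)                       # each entry: 4*1 + 4*1 = 8

# a linear layer: y = x @ W + bias (bias is broadcast over the batch)
x = tf.random.normal([4, 3])
W = tf.random.normal([3, 2])
bias = tf.zeros([2])
y = x @ W + bias
print(y.shape)                     # (4, 2)
```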
Merging and splitting
- tf.concat([], axis=)
  - Merges
  - Does not create a new dimension
- tf.stack([], axis=)
  - Merges
  - Creates a new dimension
- tf.unstack([], axis=)
  - Inverse of tf.stack()
- tf.split([], axis=, num_or_size_splits=[])
  - num_or_size_splits: how to divide the pieces
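The four ops above, sketched with small example tensors:

```python
import tensorflow as tf

a = tf.ones([2, 3])
b = tf.zeros([2, 3])

# concat joins along an existing axis — no new dimension
c = tf.concat([a, b], axis=0)
print(c.shape)                  # (4, 3)

# stack creates a new dimension
s = tf.stack([a, b], axis=0)
print(s.shape)                  # (2, 2, 3)

# unstack is the inverse of stack
a2, b2 = tf.unstack(s, axis=0)
print(a2.shape)                 # (2, 3)

# split divides along an axis; num_or_size_splits sets the piece sizes
p1, p2 = tf.split(c, num_or_size_splits=[1, 3], axis=0)
print(p1.shape, p2.shape)       # (1, 3) (3, 3)
```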
Statistics
- tf.norm([], ord=, axis=)  # tensor norm
  - L2 norm: square root of the sum of squares
  - L1 norm: sum of absolute values
  - ord: 1, 2
- tf.reduce_min/max/mean
- tf.argmax/argmin
  - Returns the index of the extreme value
- tf.equal(a, b)
- tf.unique
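A sketch of the statistics ops above on a small example tensor:

```python
import tensorflow as tf

a = tf.constant([[1.0, -2.0], [3.0, -4.0]])

# norms: ord=2 (the default) is the square root of the sum of
# squares, ord=1 is the sum of absolute values
print(float(tf.norm(a)))            # sqrt(1+4+9+16) ~= 5.477
print(float(tf.norm(a, ord=1)))     # 1+2+3+4 = 10.0
print(tf.norm(a, ord=2, axis=1))    # row-wise L2 norms

# reductions, and argmax/argmin (index of the extreme value)
print(float(tf.reduce_min(a)), float(tf.reduce_max(a)))
print(tf.argmax(a, axis=0))         # per-column index of the max

# element-wise comparison and unique values
print(tf.equal(a, a))
vals, idx = tf.unique(tf.constant([1, 2, 2, 3]))
print(vals)                         # [1 2 3]
```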
Sorting
- sort / argsort
  - tf.sort([], direction='')
  - tf.argsort: returns the indices
- top_k
  - tf.math.top_k()
  - Application: top-k accuracy
    - top-5 acc.
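A sketch of sort/argsort, top_k, and the top-k accuracy idea; the logits and labels are made-up illustration values:

```python
import tensorflow as tf

a = tf.constant([3, 1, 2])

# sort returns the values, argsort the indices
print(tf.sort(a, direction='DESCENDING'))     # [3 2 1]
print(tf.argsort(a, direction='DESCENDING'))  # [0 2 1]

# top_k returns the k largest values with their indices
res = tf.math.top_k(a, k=2)
print(res.values, res.indices)                # [3 2], [0 2]

# top-k accuracy: a prediction counts as correct if the true label
# appears among the k highest-scoring classes
logits = tf.constant([[0.1, 0.6, 0.3], [0.5, 0.2, 0.3]])
labels = tf.constant([2, 0])
topk = tf.math.top_k(logits, k=2).indices     # shape [2, 2]
correct = tf.reduce_any(tf.equal(topk, labels[:, None]), axis=1)
acc = tf.reduce_mean(tf.cast(correct, tf.float32))
print(float(acc))                             # 1.0
```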
Padding and copying
- tf.pad
  - Pads data
  - tf.pad(a, [[1, 1], [1, 1]])
- tf.tile
  - Copies data
  - tf.tile(a, [2, 2])
- broadcast_to
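The three ops above, sketched on a small tensor. For `tf.pad`, `[[1, 1], [1, 1]]` means one row of padding before and after, and one column before and after:

```python
import tensorflow as tf

a = tf.ones([2, 2])

# pad with zeros: one row/column on each side
p = tf.pad(a, [[1, 1], [1, 1]])
print(p.shape)                  # (4, 4)

# tile: repeat 2x along each axis (the data is physically copied)
t = tf.tile(a, [2, 2])
print(t.shape)                  # (4, 4)

# broadcast_to expands a shape without copying the data up front
b = tf.broadcast_to(tf.constant([1.0, 2.0]), [3, 2])
print(b.shape)                  # (3, 2)
```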
Tensor clipping
- clip_by_value
  - tf.maximum(a, 2)
  - tf.minimum(a, 8)
  - tf.clip_by_value(a, 2, 8)
  - ReLU
- clip_by_norm
  - gradient clipping
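A sketch of the clipping ops above. `clip_by_value` clamps element-wise; `clip_by_norm` rescales the whole tensor so its L2 norm does not exceed a limit, which is the usual trick for keeping gradients from exploding:

```python
import tensorflow as tf

a = tf.range(10.0)

# element-wise clamping
print(tf.maximum(a, 2.0))               # lower bound 2
print(tf.minimum(a, 8.0))               # upper bound 8
print(tf.clip_by_value(a, 2.0, 8.0))    # both bounds at once

# ReLU is clipping at zero: max(x, 0)
print(tf.nn.relu(a - 5.0))

# clip_by_norm: rescale so the L2 norm is at most the limit
g = tf.constant([3.0, 4.0])             # norm 5
print(tf.clip_by_norm(g, 1.0))          # [0.6, 0.8], norm 1
```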
Higher-order OP
Source: www.cnblogs.com/dhp-2016/p/11184213.html