Torch Containers && Table Containers

1
A Container operates directly on an input Tensor, while a Table Container operates on an input table of Tensors.
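To make the distinction concrete, here is a minimal sketch using nn.ParallelTable, a typical Table Container: it consumes a table of Tensors and returns a table of Tensors (the shapes below are illustrative assumptions).

```lua
require 'nn'

-- A Table Container: input is a table of Tensors, output is a table of Tensors
mlp = nn.ParallelTable()
mlp:add(nn.Linear(10, 2))  -- applied to the 1st Tensor in the input table
mlp:add(nn.Linear(5, 3))   -- applied to the 2nd Tensor in the input table

out = mlp:forward({torch.randn(10), torch.randn(5)})
-- out is a table: out[1] has size 2, out[2] has size 3
```

Contrast this with nn.Sequential below, which takes and returns a single Tensor.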

2

1) The most common container is nn.Sequential. It connects modules in series, feeding the output of each module into the input of the next.

mlp = nn.Sequential()
mlp:add(nn.Linear(10, 25)) -- Linear module (10 inputs, 25 hidden units)
mlp:add(nn.Tanh())         -- apply hyperbolic tangent transfer function on each hidden units
mlp:add(nn.Linear(25, 1))  -- Linear module (25 inputs, 1 output)

2) The second common container is nn.Parallel(inputDimension, outputDimension). It slices the input along inputDimension, applies its i-th child module to the i-th slice, and finally concatenates the outputs along outputDimension.

mlp = nn.Parallel(2,1);   -- Parallel container will associate a module to each slice of dimension 2
                           -- (column space), and concatenate the outputs over the 1st dimension.

mlp:add(nn.Linear(10,3)); -- Linear module (input 10, output 3), applied on 1st slice of dimension 2
mlp:add(nn.Linear(10,2))  -- Linear module (input 10, output 2), applied on 2nd slice of dimension 2

                                  -- After going through the Linear module the outputs are
                                  -- concatenated along the unique dimension, to form 1D Tensor
> mlp:forward(torch.randn(10,2)) -- of size 5.
-0.5300
-1.1015
 0.7764
 0.2819
-0.6026

The input torch.randn(10,2) is sliced along the second dimension into two vectors of size 10. Each slice is fed to the corresponding submodule: the first goes to nn.Linear(10,3) and the second to nn.Linear(10,2), producing outputs of size 3 and size 2 respectively, which are then concatenated along the first dimension into a single Tensor of size 5.
3) nn.Concat was described previously: it applies each child module to the same input and concatenates their outputs along a given dimension.
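For completeness, a minimal nn.Concat sketch (the layer sizes are illustrative assumptions):

```lua
require 'nn'

-- nn.Concat applies every child module to the SAME input and
-- concatenates their outputs along the given dimension.
mlp = nn.Concat(1)          -- concatenate outputs along dimension 1
mlp:add(nn.Linear(5, 3))    -- produces an output of size 3
mlp:add(nn.Linear(5, 7))    -- produces an output of size 7

out = mlp:forward(torch.randn(5))
-- out is a 1D Tensor of size 10 (3 + 7)
```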

Properties of a container:
1) Container inherits from Module, so it has all of Module's properties.
2) get(index): returns the module at position index in the container.
3) size(): returns the number of modules in the container.
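A short sketch of size() and get() on the Sequential network from earlier:

```lua
require 'nn'

mlp = nn.Sequential()
mlp:add(nn.Linear(10, 25))
mlp:add(nn.Tanh())

print(mlp:size())   -- 2: the container holds two modules
print(mlp:get(1))   -- the first module, nn.Linear(10 -> 25)
```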

On the difference between self.model:listModules() and self.model:get(i) for a network built with nngraph: the nngraph model is itself a container. self.model:get(i) returns the i-th Node of the graph and does not decompose it further; if that Node is an nn.Sequential containing many submodules, it is still returned as a single Node. self.model:listModules(), by contrast, recursively unrolls the whole model down to its smallest units (elementary containers or single modules), so it generally returns far more entries, because its purpose is to decompose the network into its smallest modules.
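The same contrast can be seen on a plain nested container (a sketch; the inner layers are illustrative assumptions):

```lua
require 'nn'

net = nn.Sequential()
inner = nn.Sequential()
inner:add(nn.Linear(10, 5))
inner:add(nn.ReLU())
net:add(inner)

print(net:size())          -- 1: the inner Sequential counts as a single module
print(#net:listModules())  -- larger: listModules() flattens the hierarchy,
                           -- returning the containers themselves as well as
                           -- the leaf modules (Linear, ReLU)
```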

        threshold_nodes, container_nodes = self.model:findModules('cudnn.SpatialConvolution')
        for i = 1, #threshold_nodes do
            print(threshold_nodes[i])
            print(container_nodes[i])
        end

self.model:findModules returns both the matched modules and their corresponding parent containers; a module without a container parent is returned as its own container. Since this function must locate every matching module in the network, it fully unrolls the model, just like self.model:listModules(). Both are methods of Module, the parent class of all containers, and are inherited by the model.

3 To print the network, the usual approach is:

        for iii = 1,self.model:size(),1 do 
            print(iii,self.model:get(iii))
        end

This iterates over the index iii of each module in the container and prints the corresponding module.
