LSTM in a few pictures

Conditional entropy:

\[\begin{equation} H(Y \mid X) = \sum_{x \in X} p(x)\, H(Y \mid X = x) \end{equation}\]

Information gain is entropy minus conditional entropy:

\[ IG(Y, X) = H(Y) - H(Y \mid X) \]
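These definitions can be sketched in a few lines of Python; the helper names `entropy`, `conditional_entropy`, and `information_gain` are my own, not from the original post:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy H(Y) of a label sequence, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def conditional_entropy(xs, ys):
    """H(Y|X) = sum over x of p(x) * H(Y | X = x)."""
    n = len(xs)
    groups = {}
    for x, y in zip(xs, ys):
        groups.setdefault(x, []).append(y)
    return sum(len(g) / n * entropy(g) for g in groups.values())

def information_gain(xs, ys):
    """IG(Y, X) = H(Y) - H(Y|X)."""
    return entropy(ys) - conditional_entropy(xs, ys)

# Toy data: X perfectly determines Y, so H(Y|X) = 0
# and the gain equals H(Y) = 1 bit.
xs = ['a', 'a', 'b', 'b']
ys = [0, 0, 1, 1]
print(information_gain(xs, ys))  # → 1.0
```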

Original blog

\[\begin{aligned} i_{t} &= \sigma\left(W_{ii} x_{t} + b_{ii} + W_{hi} h_{t-1} + b_{hi}\right) \\ f_{t} &= \sigma\left(W_{if} x_{t} + b_{if} + W_{hf} h_{t-1} + b_{hf}\right) \\ g_{t} &= \tanh\left(W_{ig} x_{t} + b_{ig} + W_{hg} h_{t-1} + b_{hg}\right) \\ o_{t} &= \sigma\left(W_{io} x_{t} + b_{io} + W_{ho} h_{t-1} + b_{ho}\right) \\ c_{t} &= f_{t} * c_{t-1} + i_{t} * g_{t} \\ h_{t} &= o_{t} * \tanh\left(c_{t}\right) \end{aligned}\]
Question: in the end, are the two vectors \(x_t\) and \(h_{t-1}\) spliced (concatenated) into one vector and multiplied by a single weight matrix, or is each multiplied by its own weight matrix and the results summed? The two formulations are equivalent: \([W \mid U][x; h] = Wx + Uh\).
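A minimal pure-Python sketch of one LSTM step following the equations above, plus a check of the concatenation question. The parameter layout (per-gate dicts `W`, `U`, `b`) is my own assumption, and the two per-gate biases \(b_{i\cdot}\) and \(b_{h\cdot}\) are folded into one vector:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def matvec(W, v):
    """Multiply a matrix (list of rows) by a vector."""
    return [sum(w * x for w, x in zip(row, v)) for row in W]

def lstm_step(x, h, c, W, U, b):
    """One LSTM step. For each gate g in 'ifgo', W[g] maps the input x,
    U[g] maps the previous hidden state h, and b[g] is the combined bias."""
    gate = lambda g, act: [act(wx + uh + bg) for wx, uh, bg in
                           zip(matvec(W[g], x), matvec(U[g], h), b[g])]
    i = gate('i', sigmoid)          # input gate
    f = gate('f', sigmoid)          # forget gate
    g = gate('g', math.tanh)        # candidate cell state
    o = gate('o', sigmoid)          # output gate
    c_new = [fv * cv + iv * gv for fv, cv, iv, gv in zip(f, c, i, g)]
    h_new = [ov * math.tanh(cv) for ov, cv in zip(o, c_new)]
    return h_new, c_new

# Splicing vs. separate matrices: multiplying x and h by separate
# matrices and summing equals concatenating [x; h] and multiplying
# by the horizontally stacked matrix [W | U].
Wg = [[0.1, 0.2], [0.3, 0.4]]
Ug = [[0.5, 0.6], [0.7, 0.8]]
x, h = [1.0, 2.0], [3.0, 4.0]
separate = [a + b for a, b in zip(matvec(Wg, x), matvec(Ug, h))]
stacked = matvec([wr + ur for wr, ur in zip(Wg, Ug)], x + h)
assert all(abs(a - b) < 1e-12 for a, b in zip(separate, stacked))
```

So the answer to the question above is that both views describe the same computation; frameworks differ only in how they store the parameters.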


Origin www.cnblogs.com/qizhien/p/11838738.html