Merged
2 changes: 1 addition & 1 deletion doc/lenet.txt
@@ -216,7 +216,7 @@ one of Figure 1. The input consists of 3 features maps (an RGB color image) of s
# but also to insert new ones along which the tensor will be
# broadcastable;
# dimshuffle('x', 2, 'x', 0, 1)
# This will work on 3d tensors whith no broadcastable
# This will work on 3d tensors with no broadcastable
# dimensions. The first dimension will be broadcastable,
# then we will have the third dimension of the input tensor as
# the second of the resulting tensor, etc. If the tensor has
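The ``dimshuffle`` behaviour described in the comment above can be sketched in plain NumPy, which uses the same layout ideas (this is an illustrative analogue, not Theano's implementation: ``'x'`` inserts a new broadcastable axis of length 1, and each integer selects an axis of the input):

```python
import numpy as np

# NumPy analogue of Theano's dimshuffle('x', 2, 'x', 0, 1):
# integers reorder the input axes, 'x' inserts a broadcastable axis.
def dimshuffle_x_2_x_0_1(t):
    # t has shape (a, b, c); the result has shape (1, c, 1, a, b)
    return t.transpose(2, 0, 1)[np.newaxis, :, np.newaxis, :, :]

t = np.arange(24).reshape(2, 3, 4)   # a 3d tensor, shape (2, 3, 4)
out = dimshuffle_x_2_x_0_1(t)
print(out.shape)                     # (1, 4, 1, 2, 3)
```

The length-1 axes produced by ``'x'`` are exactly the ones NumPy (and Theano) will broadcast against larger dimensions.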
2 changes: 1 addition & 1 deletion doc/mlp.txt
@@ -175,7 +175,7 @@ both upward (activations flowing from inputs to outputs) and backward
self.b = theano.shared(value=b_values, name='b')


Note that we used a given non linear function as the activation function of the hidden layer. By default this is ``tanh``, but in many cases we might want
Note that we used a given non-linear function as the activation function of the hidden layer. By default this is ``tanh``, but in many cases we might want
to use something else.

.. code-block:: python
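The hidden-layer computation the mlp.txt passage describes, with ``tanh`` as the default but swappable activation, can be sketched in NumPy (a minimal illustration; the layer sizes and initialization range follow the tutorial's tanh heuristic, everything else is hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hidden layer h = activation(x.W + b); tanh is the default activation.
n_in, n_out = 4, 3
bound = np.sqrt(6.0 / (n_in + n_out))        # tanh init heuristic
W = rng.uniform(-bound, bound, size=(n_in, n_out))
b = np.zeros(n_out)

def hidden(x, activation=np.tanh):
    return activation(x @ W + b)

x = rng.standard_normal((5, n_in))
h = hidden(x)                 # tanh squashes outputs into (-1, 1)
print(h.shape)                # (5, 3)
```

Passing a different callable (e.g. a sigmoid) as ``activation`` is all that "using something else" amounts to.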
8 changes: 4 additions & 4 deletions doc/rbm.txt
@@ -663,17 +663,17 @@ log-PL:
where the expectation is taken over the uniform random choice of index :math:`i`,
and :math:`N` is the number of visible units. In order to work with binary
units, we further introduce the notation :math:`\tilde{x}_i` to refer to
:math:`x` with bit-i being flipped (1->0, 0->1). The log-PL for an RBM with binary unit is
:math:`x` with bit-i being flipped (1->0, 0->1). The log-PL for an RBM with binary units is
then written as:

.. math::
\log PL(x) &\approx N \cdot \log
\frac {e^{-FE(x)}} {e^{-FE(x)} + e^{-FE(\tilde{x}_i)}} \\
&\approx N \cdot \log[ sigm (FE(\tilde{x}_i) - FE(x)) ]

We therefore return this cost as well as the RBM updates in the `get_cost_updates` function of the `RBM` class.
We therefore return this cost as well as the RBM updates in the ``get_cost_updates`` function of the ``RBM`` class.
Notice that we modify the updates dictionary to increment the
index of bit :math:`i`. This will result in bit i cycling over all possible
index of bit :math:`i`. This will result in bit :math:`i` cycling over all possible
values :math:`\{0,1,...,N\}`, from one update to another.
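The stochastic log-PL estimator above can be sketched numerically. The sketch below uses a toy binary-RBM free energy with hypothetical small random weights (the RBM sizes and parameters are illustrative, not from the tutorial code):

```python
import numpy as np

def sigm(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
N, n_hidden = 6, 4                       # N visible units (toy sizes)
W = 0.1 * rng.standard_normal((N, n_hidden))
vbias = np.zeros(N)
hbias = np.zeros(n_hidden)

def free_energy(v):
    # FE(v) = -v.vbias - sum_j log(1 + exp(v.W_j + hbias_j))
    return -v @ vbias - np.sum(np.logaddexp(0.0, v @ W + hbias))

def log_pseudo_likelihood(x, i):
    # flip bit i (1->0, 0->1) to get x_tilde
    x_tilde = x.copy()
    x_tilde[i] = 1.0 - x_tilde[i]
    # log PL(x) ~ N * log sigm(FE(x_tilde) - FE(x))
    return N * np.log(sigm(free_energy(x_tilde) - free_energy(x)))

x = rng.integers(0, 2, size=N).astype(float)
cost = log_pseudo_likelihood(x, i=0)     # i cycles over bits across updates
print(cost < 0.0)                        # True: it is N times a log-probability
```

Cycling ``i`` from one update to the next, as the updates dictionary does, ensures every visible bit contributes to the estimate over time.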

Note that for CD training the cross-entropy cost between the input and the
@@ -721,7 +721,7 @@ himself with the function ``tile_raster_images`` (see :ref:`how-to-plot`). Since
RBMs are generative models, we are interested in sampling from them and
plotting/visualizing these samples. We also want to visualize the filters
(weights) learnt by the RBM, to gain insights into what the RBM is actually
doing. Bare in mind however, that this does not provide the entire story,
doing. Bear in mind however, that this does not provide the entire story,
since we neglect the biases and plot the weights up to a multiplicative
constant (weights are converted to values between 0 and 1).

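The rescaling mentioned in the rbm.txt passage, mapping each filter's weights to values between 0 and 1 before tiling them into an image, can be sketched as follows (an assumption-laden illustration; the helper name and filter shapes here are hypothetical stand-ins for what ``tile_raster_images`` does internally):

```python
import numpy as np

# Map one filter's weights to [0, 1]; biases are ignored and the scale
# (a multiplicative constant) is lost, as the text cautions.
def scale_to_unit_interval(w, eps=1e-8):
    w = w - w.min()
    return w / (w.max() + eps)

rng = np.random.default_rng(0)
filters = rng.standard_normal((16, 28 * 28))   # one row per hidden unit
scaled = np.apply_along_axis(scale_to_unit_interval, 1, filters)
print(scaled.min() >= 0.0 and scaled.max() <= 1.0)   # True
```

Each row of ``scaled`` could then be reshaped to 28x28 and tiled into a single image for plotting.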