2 changes: 1 addition & 1 deletion 01.大语言模型基础/1.激活函数/1.激活函数.md
@@ -119,7 +119,7 @@ $$
\operatorname{PReLU}(z)=\left\{\begin{array}{ll}\alpha \cdot z & \text { if } z \leqslant 0 \\ z & \text { if } z>0\end{array}\right.
$$

- RReLU's formula is as follows; here $\alpha$ is drawn at random from a Gaussian distribution, and during training this $\alpha$ is different every time; at inference time this $\alpha$ is different every time; at inference time this
+ RReLU's formula is as follows; here $\alpha$ is drawn at random from a Gaussian distribution, and during training this $\alpha$ is different every time

$$
\operatorname{RReLU}(z)=\left\{\begin{array}{ll}\alpha \cdot z & \text { if } z \leqslant 0 \\ z & \text { if } z>0\end{array}\right.
$$
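
For readers skimming the hunk, here is a minimal NumPy sketch of the two activations it touches. The PReLU slope, the RReLU sampling range, and the inference-time average are illustrative assumptions rather than values taken from the file; the diff text describes the random draw as Gaussian, while this sketch uses a bounded uniform range as in common RReLU implementations, so treat that as a substitution.

```python
import numpy as np

def prelu(z, alpha=0.25):
    # PReLU: alpha is a learnable parameter; a fixed value stands in for it here.
    return np.where(z > 0, z, alpha * z)

def rrelu(z, lower=1/8, upper=1/3, training=True, rng=None):
    # RReLU: during training alpha is redrawn at random on every call, so the
    # negative-side slope differs from step to step; at inference a fixed alpha
    # (here the midpoint of the sampling range) is used instead.
    if rng is None:
        rng = np.random.default_rng()
    alpha = rng.uniform(lower, upper) if training else (lower + upper) / 2
    return np.where(z > 0, z, alpha * z)

z = np.array([-2.0, -0.5, 0.0, 1.5])
print(prelu(z))                  # fixed slope on the negative side
print(rrelu(z, training=True))   # random slope, changes per call
print(rrelu(z, training=False))  # deterministic slope at inference
```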