2 changes: 1 addition & 1 deletion docs/chapter2/第二章 Transformer架构.md
@@ -6,7 +6,7 @@

As NLP moved from statistical machine learning to deep learning, text representation, the core problem of NLP, likewise shifted from statistical learning methods to deep learning methods. As introduced in Chapter 1, text representation evolved from the vector space model and language models computed with statistical learning, through the single-layer neural network of Word2Vec, into the era of learning text representations with neural networks. However, the neural networks that originated in Computer Vision (CV) have three core architectures:

- - Fully connected neural network (Feedforward Neural Network, FNN), i.e. every neuron in each layer is fully connected to every neuron in the layers above and below, as shown in Figure 2.1:
+ - Fully connected neural network (Fully Connected Neural Network, FCNN), i.e. every neuron in each layer is fully connected to every neuron in the layers above and below, as shown in Figure 2.1:
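
The "fully connected" structure described above can be sketched as a stack of dense layers, where each layer is just a matrix multiply connecting every input unit to every output unit. This is a minimal illustrative sketch, not code from the chapter; the layer sizes, ReLU activation, and random initialization are assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def dense(x, w, b):
    # Full connectivity between layers is a single matrix multiply:
    # every input unit contributes to every output unit.
    return x @ w + b

# Two-layer FCNN with illustrative sizes: 4 inputs -> 8 hidden -> 3 outputs
w1, b1 = rng.normal(size=(4, 8)), np.zeros(8)
w2, b2 = rng.normal(size=(8, 3)), np.zeros(3)

def fcnn(x):
    h = np.maximum(dense(x, w1, b1), 0.0)  # hidden layer with ReLU
    return dense(h, w2, b2)                # output layer (logits)

out = fcnn(rng.normal(size=(2, 4)))  # forward pass on a batch of 2 inputs
```

Each weight matrix has `inputs × outputs` entries, which is exactly the "every neuron connects to every neuron" property the bullet describes.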

<div align="center">
<img src="https://raw.githubusercontent.com/datawhalechina/happy-llm/main/docs/images/2-figures/1-0.png" alt="图片描述" width="90%"/>