Basic knowledge of natural language processing models and machine learning algorithms.

PearlCoastal/NLP_Note

Natural Language Processing

Basic knowledge of a natural language processing model: the Transformer.

  1. Attention mechanism: why attention is needed.
  2. Kinds of attention: soft & hard attention mechanisms.
  3. Difference between attention and self-attention.
  4. How self-attention works.
  5. About multi-head self-attention.
  6. Some classifier functions: Sigmoid, Softmax, Tanh and ReLU.
  7. A little about CNNs and RNNs.
  8. Vanishing gradient problem & exploding gradient problem.
  9. How the Transformer solves the long-distance dependency problem.
  10. Description of Transformer models: Transformer Encoder & Transformer Decoder.
  11. Difference between Transformer Encoder & Transformer Decoder.

👉 Note
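To make items 1 and 4 above concrete, here is a minimal single-head scaled dot-product self-attention in NumPy. This is a sketch with toy dimensions and random weights, not code from the notes:

```python
import numpy as np

def softmax(x, axis=-1):
    # Subtract the row max before exponentiating for numerical stability.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Single-head scaled dot-product self-attention.

    Every token attends to every other token in one step, which is
    how the Transformer handles long-distance dependencies.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])   # (seq_len, seq_len)
    weights = softmax(scores, axis=-1)        # each row sums to 1
    return weights @ V, weights

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))                   # 4 tokens, model dim 8
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
out, w = self_attention(X, Wq, Wk, Wv)
```

Multi-head attention simply runs several such heads with separate projection matrices and concatenates the outputs.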

Machine Learning Note

  1. About logistic regression & linear regression.
  2. Some supervised learning methods: KNN, SVM, Kernel-SVM, Decision Tree and Naive Bayes.
  3. Ensemble Learning: Bagging & Boosting.
  4. Comparison of Bagging & Boosting.
  5. Difference between logistic regression & linear regression.

👉 Note
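As a small illustration of items 1 and 5, both models compute the same linear score w·x, but linear regression fits a continuous target by least squares while logistic regression squashes the score through a sigmoid and trains on cross-entropy. A NumPy sketch with made-up toy data:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Linear regression: closed-form least squares (normal equation).
X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])  # bias + feature
y = np.array([1.0, 3.0, 5.0, 7.0])                               # y = 1 + 2x
w_lin = np.linalg.lstsq(X, y, rcond=None)[0]

# Logistic regression: same linear score, but passed through a sigmoid
# and trained by gradient descent on the cross-entropy loss.
y_cls = np.array([0.0, 0.0, 1.0, 1.0])
w_log = np.zeros(2)
for _ in range(2000):
    p = sigmoid(X @ w_log)                          # predicted probabilities
    w_log -= 0.5 * X.T @ (p - y_cls) / len(y_cls)   # gradient step
```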

Optimization Function

  1. Framework of optimization functions.
  2. SGD.
  3. SGD with Momentum.
  4. SGD with Nesterov Acceleration (NAG).
  5. AdaGrad.
  6. AdaDelta.
  7. Adam.
  8. Nadam.
  9. Two shortcomings of Adam.
  10. Adam + SGD.

👉 Note
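To make the update rules concrete, here is one Adam step in NumPy, applied to a one-dimensional quadratic. This is a sketch using the common default hyperparameters, not values from the notes; note how Adam combines a Momentum-style first moment with an AdaGrad-style second moment:

```python
import numpy as np

def adam_step(w, g, m, v, t, lr=0.1, b1=0.9, b2=0.999, eps=1e-8):
    """One Adam update: momentum estimate m plus per-parameter scale v."""
    m = b1 * m + (1 - b1) * g        # first moment (like SGD with Momentum)
    v = b2 * v + (1 - b2) * g * g    # second moment (like AdaGrad/AdaDelta)
    m_hat = m / (1 - b1 ** t)        # bias correction for the zero init
    v_hat = v / (1 - b2 ** t)
    return w - lr * m_hat / (np.sqrt(v_hat) + eps), m, v

# Minimise f(w) = (w - 3)^2; the gradient is 2 * (w - 3).
w, m, v = np.array([0.0]), np.zeros(1), np.zeros(1)
for t in range(1, 501):
    g = 2.0 * (w - 3.0)
    w, m, v = adam_step(w, g, m, v, t)
```

Dropping the second-moment scaling recovers SGD with Momentum; dropping both moments recovers plain SGD.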

RNN and LSTM

  1. Why RNN.
  2. How RNN works.
  3. About LSTM.
  4. How LSTM solves the long-distance dependency problem.
  5. Gate control in LSTM:
    • forget gate
    • input gate
    • output gate
    • sigmoid & tanh
  6. A simpler LSTM: the GRU
    • reset gate
    • update gate

👉 Note
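The gate structure in item 5 can be sketched as a single LSTM time step in NumPy. This is an illustrative toy implementation with random weights, not code from the notes:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h, c, W, U, b, H):
    """One LSTM time step with forget / input / output gates.

    W: (4H, X) input weights, U: (4H, H) recurrent weights, b: (4H,) bias.
    """
    z = W @ x + U @ h + b
    f = sigmoid(z[0:H])          # forget gate: how much old cell state to keep
    i = sigmoid(z[H:2*H])        # input gate: how much new content to write
    o = sigmoid(z[2*H:3*H])      # output gate: how much cell state to expose
    g = np.tanh(z[3*H:4*H])      # candidate cell content
    c_new = f * c + i * g        # additive path lets gradients flow over time
    h_new = o * np.tanh(c_new)
    return h_new, c_new

rng = np.random.default_rng(1)
X_DIM, H = 3, 5
W = rng.normal(size=(4 * H, X_DIM))
U = rng.normal(size=(4 * H, H))
b = np.zeros(4 * H)
h, c = np.zeros(H), np.zeros(H)
for x in rng.normal(size=(6, X_DIM)):   # run over 6 time steps
    h, c = lstm_step(x, h, c, W, U, b, H)
```

The GRU in item 6 merges the forget and input gates into a single update gate and drops the separate cell state.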

Back Propagation

  1. Aim of back propagation.
  2. How back propagation works.
  3. Difference between the backward and forward passes.
  4. Advantages of back propagation.

👉 Note
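Item 2 can be illustrated on a tiny two-parameter network: apply the chain rule layer by layer from the loss back to the weights, then check the analytic gradient against a numerical one. A sketch with made-up scalar values:

```python
import numpy as np

def forward(w1, w2, x, y):
    h = np.tanh(w1 * x)              # hidden activation
    y_hat = w2 * h                   # output
    loss = 0.5 * (y_hat - y) ** 2    # squared-error loss
    return h, y_hat, loss

def backward(w1, w2, x, y):
    """Chain rule applied layer by layer, from the loss back to w1."""
    h, y_hat, _ = forward(w1, w2, x, y)
    d_yhat = y_hat - y                 # dL/dy_hat
    d_w2 = d_yhat * h                  # dL/dw2
    d_h = d_yhat * w2                  # back through the output layer
    d_w1 = d_h * (1 - h ** 2) * x      # tanh'(z) = 1 - tanh(z)^2
    return d_w1, d_w2

# Verify the analytic gradient with a central finite difference.
w1, w2, x, y = 0.5, -1.2, 0.8, 1.0
g1, g2 = backward(w1, w2, x, y)
eps = 1e-6
num_g1 = (forward(w1 + eps, w2, x, y)[2]
          - forward(w1 - eps, w2, x, y)[2]) / (2 * eps)
```

The forward pass computes activations and the loss; the backward pass reuses those cached activations to compute gradients, which is what makes it efficient.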

CFG -> CNF

  1. How to transform a context-free grammar (CFG) into Chomsky normal form (CNF).

👉 Note
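One step of the CFG-to-CNF transformation, the binarisation of long right-hand sides, can be sketched in a few lines. The fresh-nonterminal naming scheme below is made up for illustration; a full conversion also removes epsilon rules, unit rules, and terminals mixed into long rules:

```python
def binarize(rules):
    """Split any rule A -> B C D ... into A -> B A_1, A_1 -> C A_2, ...
    so that every right-hand side has at most two symbols."""
    out = []
    fresh = 0
    for lhs, rhs in rules:
        while len(rhs) > 2:
            fresh += 1
            new = f"{lhs}_{fresh}"   # fresh nonterminal (hypothetical naming)
            out.append((lhs, (rhs[0], new)))
            lhs, rhs = new, rhs[1:]
        out.append((lhs, tuple(rhs)))
    return out

# S -> NP VP PP becomes S -> NP S_1 and S_1 -> VP PP.
result = binarize([("S", ("NP", "VP", "PP"))])
```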
