wesfggfd committed · verified · Commit 7062271 · Parent(s): a6a1053

Update README.md

Files changed (1): README.md (+68 −1)
V2 NLP to monitor Twitter for natural disasters Natural Language Processing (
v3 Intro to 🤗 Transformers performance:0.83481 V3
```
<a name='1'></a>
## 1 - Forward Propagation for the Basic Recurrent Neural Network

Later this week, you'll get a chance to generate music using an RNN! The basic RNN that you'll implement has the following structure:

In this example, $T_x = T_y$.

<img src="Images/RNN.png" style="width:500px;height:300px;">
<caption><center><font color='purple'><b>Figure 1</b>: Basic RNN model</font></center></caption>
<a name='1-1'></a>
### 1.1 - RNN Cell

You can think of the recurrent neural network as the repeated use of a single cell. First, you'll implement the computations for a single time step. The following figure describes the operations for a single time step of an RNN cell:

<img src="Images/rnn_step_forward_figure2_v3a.png" style="width:700px;height:300px;">
<caption><center><font color='purple'><b>Figure 2</b>: Basic RNN cell</font></center></caption>
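The single-step computation in Figure 2 can be sketched in NumPy as below. This is a minimal illustration, assuming the conventional parameter names (`Wax`, `Waa`, `Wya`, `ba`, `by`) and column-per-example shapes; it is not the repository's actual code.

```python
import numpy as np

def softmax(z):
    # numerically stable column-wise softmax
    e = np.exp(z - z.max(axis=0, keepdims=True))
    return e / e.sum(axis=0, keepdims=True)

def rnn_cell_forward(xt, a_prev, parameters):
    """One time step of the basic RNN cell (Figure 2).
    xt: (n_x, m) input slice; a_prev: (n_a, m) previous hidden state."""
    Wax, Waa, Wya = parameters["Wax"], parameters["Waa"], parameters["Wya"]
    ba, by = parameters["ba"], parameters["by"]
    a_next = np.tanh(Waa @ a_prev + Wax @ xt + ba)  # new hidden state
    yt_pred = softmax(Wya @ a_next + by)            # per-step prediction
    return a_next, yt_pred
```

The hidden state and the prediction are computed once per call; the full sequence is handled by calling this repeatedly, as the next section shows.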
<a name='1-2'></a>
### 1.2 - RNN Forward Pass

<img src="Images/rnn_forward_sequence_figure3_v3a.png" style="width:800px;height:180px;">
<caption><center><font color='purple'><b>Figure 3</b>: Basic RNN</font></center></caption>
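A hedged sketch of the forward pass over a whole sequence, with the single-cell update of Figure 2 inlined inside the loop (same assumed parameter names as above; input `x` has shape `(n_x, m, T_x)`):

```python
import numpy as np

def rnn_forward(x, a0, parameters):
    """Run the basic RNN of Figure 3 over all T_x time steps.
    x: (n_x, m, T_x) inputs; a0: (n_a, m) initial hidden state."""
    n_x, m, T_x = x.shape
    Wax, Waa, Wya = parameters["Wax"], parameters["Waa"], parameters["Wya"]
    ba, by = parameters["ba"], parameters["by"]
    n_a, n_y = Waa.shape[0], Wya.shape[0]
    a = np.zeros((n_a, m, T_x))       # hidden state at every step
    y_pred = np.zeros((n_y, m, T_x))  # prediction at every step
    a_next = a0
    for t in range(T_x):
        # one rnn_cell_forward step, inlined
        a_next = np.tanh(Waa @ a_next + Wax @ x[:, :, t] + ba)
        z = Wya @ a_next + by
        e = np.exp(z - z.max(axis=0, keepdims=True))
        y_pred[:, :, t] = e / e.sum(axis=0, keepdims=True)
        a[:, :, t] = a_next
    return a, y_pred
```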
<a name='2'></a>
## 2 - Long Short-Term Memory (LSTM) Network

The following figure shows the operations of an LSTM cell:

<img src="Images/LSTM_figure4_v3a.png" style="width:500px;height:400px;">
<caption><center><font color='purple'><b>Figure 4</b>: LSTM cell. Note that the softmax block includes a dense layer followed by a softmax.</font></center></caption>
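The gate computations of Figure 4 can be sketched as follows. The parameter names (`Wf`/`bf` forget gate, `Wi`/`bi` update gate, `Wc`/`bc` candidate, `Wo`/`bo` output gate, `Wy`/`by` prediction) are an assumed convention for illustration, with each gate weight acting on the concatenation of `a_prev` and `xt`:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_cell_forward(xt, a_prev, c_prev, parameters):
    """One LSTM time step (Figure 4), excluding nothing: the dense+softmax
    prediction head is included, as in the forward figure."""
    concat = np.concatenate([a_prev, xt], axis=0)               # stack state and input
    ft = sigmoid(parameters["Wf"] @ concat + parameters["bf"])  # forget gate
    it = sigmoid(parameters["Wi"] @ concat + parameters["bi"])  # update gate
    cct = np.tanh(parameters["Wc"] @ concat + parameters["bc"]) # candidate cell state
    c_next = ft * c_prev + it * cct                             # new cell state
    ot = sigmoid(parameters["Wo"] @ concat + parameters["bo"])  # output gate
    a_next = ot * np.tanh(c_next)                               # new hidden state
    z = parameters["Wy"] @ a_next + parameters["by"]
    e = np.exp(z - z.max(axis=0, keepdims=True))
    yt_pred = e / e.sum(axis=0, keepdims=True)                  # softmax prediction
    return a_next, c_next, yt_pred
```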
<a name='2-2'></a>
### 2.2 - Forward Pass for LSTM

Now that you have implemented one step of an LSTM, you can iterate over it with a for loop to process a sequence of $T_x$ inputs.

<img src="Images/LSTM_rnn.png" style="width:500px;height:300px;">
<caption><center><font color='purple'><b>Figure 5</b>: LSTM over multiple time steps</font></center></caption>
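That loop can be sketched like this, with the single-cell step inlined so the snippet is self-contained (same assumed parameter names as the cell sketch; the initial cell state is taken to be zeros, an assumption for illustration):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_forward(x, a0, parameters):
    """Iterate the LSTM cell over T_x inputs (Figure 5).
    x: (n_x, m, T_x); a0: (n_a, m) initial hidden state."""
    n_x, m, T_x = x.shape
    n_a = a0.shape[0]
    n_y = parameters["Wy"].shape[0]
    a = np.zeros((n_a, m, T_x))   # hidden states
    c = np.zeros((n_a, m, T_x))   # cell states
    y = np.zeros((n_y, m, T_x))   # predictions
    a_next, c_next = a0, np.zeros((n_a, m))
    for t in range(T_x):
        concat = np.concatenate([a_next, x[:, :, t]], axis=0)
        ft = sigmoid(parameters["Wf"] @ concat + parameters["bf"])   # forget gate
        it = sigmoid(parameters["Wi"] @ concat + parameters["bi"])   # update gate
        cct = np.tanh(parameters["Wc"] @ concat + parameters["bc"])  # candidate
        c_next = ft * c_next + it * cct
        ot = sigmoid(parameters["Wo"] @ concat + parameters["bo"])   # output gate
        a_next = ot * np.tanh(c_next)
        z = parameters["Wy"] @ a_next + parameters["by"]
        e = np.exp(z - z.max(axis=0, keepdims=True))
        y[:, :, t] = e / e.sum(axis=0, keepdims=True)
        a[:, :, t], c[:, :, t] = a_next, c_next
    return a, y, c
```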
<a name='3-1'></a>
### 3.1 - Basic RNN Backward Pass

Begin by computing the backward pass for the basic RNN cell. Then, in the following sections, iterate through the cells.

<img src="Images/rnn_backward_overview_3a_1.png" style="width:500px;height:300px;"> <br>
<caption><center><font color='purple'><b>Figure 6</b>: The RNN cell's backward pass</font></center></caption>
<img src="Images/rnn_cell_backward_3a_c.png" style="width:800px;height:500px;"> <br>
<caption><center><font color='purple'><b>Figure 7</b>: This implementation of `rnn_cell_backward` does **not** include the output dense layer and softmax, which are included in `rnn_cell_forward`.</font></center></caption>
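Since `a_next = tanh(Waa·a_prev + Wax·xt + ba)`, backpropagating through the cell amounts to multiplying the incoming gradient by the tanh derivative `1 − a_next²` and then distributing it to each input. A minimal sketch, assuming the cache layout `(a_next, a_prev, xt, parameters)` for illustration (not the repo's actual signature):

```python
import numpy as np

def rnn_cell_backward(da_next, cache):
    """Backward pass for one basic RNN cell; excludes the dense/softmax
    output head, matching Figure 7."""
    a_next, a_prev, xt, parameters = cache
    Wax, Waa = parameters["Wax"], parameters["Waa"]
    # backprop through tanh: d/dz tanh(z) = 1 - tanh(z)^2, and a_next = tanh(z)
    dz = da_next * (1 - a_next ** 2)
    return {
        "dxt": Wax.T @ dz,                       # gradient w.r.t. the input
        "da_prev": Waa.T @ dz,                   # gradient w.r.t. previous state
        "dWax": dz @ xt.T,
        "dWaa": dz @ a_prev.T,
        "dba": dz.sum(axis=1, keepdims=True),    # sum over the batch
    }
```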
#### 1. One Step Backward

The LSTM backward pass is slightly more complicated than the forward pass.

<img src="Images/LSTM_cell_backward_rev3a_c2.png" style="width:500;height:400px;"> <br>
<caption><center><font color='purple'><b>Figure 8</b>: LSTM cell backward. Note that the output functions, while part of `lstm_cell_forward`, are not included in `lstm_cell_backward`.</font></center></caption>
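As a sketch of that step, the standard gate derivatives follow from the forward equations $c_{next} = \Gamma_f \odot c_{prev} + \Gamma_u \odot \tilde c$ and $a_{next} = \Gamma_o \odot \tanh(c_{next})$, where $\Gamma_f, \Gamma_u, \Gamma_o$ denote the forget, update, and output gates, $\tilde c$ the candidate, and $\odot$ elementwise product (notation assumed here for illustration):

$$
\begin{aligned}
d\Gamma_o &= da_{next} \odot \tanh(c_{next}) \odot \Gamma_o \odot (1-\Gamma_o) \\
d\tilde c &= \left(dc_{next} + \Gamma_o \odot (1-\tanh^2(c_{next})) \odot da_{next}\right) \odot \Gamma_u \odot (1-\tilde c^{\,2}) \\
d\Gamma_u &= \left(dc_{next} + \Gamma_o \odot (1-\tanh^2(c_{next})) \odot da_{next}\right) \odot \tilde c \odot \Gamma_u \odot (1-\Gamma_u) \\
d\Gamma_f &= \left(dc_{next} + \Gamma_o \odot (1-\tanh^2(c_{next})) \odot da_{next}\right) \odot c_{prev} \odot \Gamma_f \odot (1-\Gamma_f)
\end{aligned}
$$

Each line is the incoming gradient routed through the relevant nonlinearity: $\sigma'(z) = \sigma(z)(1-\sigma(z))$ for the sigmoid gates and $1-\tanh^2$ for the tanh factors.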
<h1 align="center" style="color:green;font-size: 3em;">Implementing Transformers From Scratch Using PyTorch</h1>
  * [1. Introduction](#section1)