Instead of the `bottleneck` block of ResNet50, which consists of 1x1, 3x3, and 1x1 convolutions in succession, this simplest version of QLNet applies a 1x1 convolution, splits the result into two equal halves and **multiplies** them elementwise, then applies a 3x3 (depthwise) convolution and a 1x1 convolution, *all without activation functions* except at the end of the block, where a "radial" activation function that we call `hardball` is applied.
```python
import torch.nn as nn

class QLNet(nn.Module):
    ...

    def forward(self, x):
        x0 = self.skip(x)                    # shortcut branch
        x = self.conv1(x)                    # 1x1
        C = x.size(1) // 2
        x = x[:, :C, :, :] * x[:, C:, :, :]  # multiply the two halves
        x = self.conv2(x)                    # 3x3 (depthwise)
        x = self.conv3(x)                    # 1x1
        x += x0                              # residual connection
        if self.act3 is not None:
            x = self.act3(x)                 # `hardball` at the end of the block
        return x
```
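As a standalone illustration, the split-and-multiply step can be reproduced on a plain tensor (the shapes below are arbitrary, chosen only for the demo):

```python
import torch

# Toy feature map: batch 2, 8 channels, 4x4 spatial, random values
x = torch.randn(2, 8, 4, 4)

# Split the channels into two equal halves and multiply them elementwise,
# as in the QLNet block above; the channel count is halved.
C = x.size(1) // 2
y = x[:, :C, :, :] * x[:, C:, :, :]

print(y.shape)  # torch.Size([2, 4, 4, 4])
```

This quadratic interaction between channel halves is what replaces the pointwise nonlinearity inside the block.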
- **Developed by:** Yao Liu 刘杳
- **Model type:** Convolutional Neural Network (ConvNet)