Instead of the `bottleneck` block of ResNet50, which consists of 1x1, 3x3, and 1x1 convolutions in succession, this simplest version of QLNet applies a 1x1 convolution, splits the result into two equal halves and **multiplies** them, then applies a 3x3 (depthwise) convolution and a 1x1 convolution, *all without activation functions* except at the end of the block, where a "radial" activation function that we call `hardball` is applied.

```python
class QLNet(nn.Module):
    ...

    def forward(self, x):
        x0 = x                               # save the input for the residual connection
        x = self.conv1(x)                    # 1x1
        C = x.size(1) // 2
        x = x[:, :C, :, :] * x[:, C:, :, :]  # multiply the two halves
        x = self.conv2(x)                    # 3x3 depthwise
        x = self.conv3(x)                    # 1x1
        x += x0
        if self.act3 is not None:
            ...
```
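The partial snippet above can be fleshed out into a self-contained sketch. The class name `QLBlock`, the channel widths, the expansion factor, and the bias-free convolutions are all assumptions made for illustration; the `hardball` activation is not defined in this excerpt, so the sketch takes the final activation as an injectable callable.

```python
import torch
import torch.nn as nn

class QLBlock(nn.Module):
    """Hypothetical sketch of the block described above (widths and names assumed):
    1x1 conv -> split channels in half and multiply -> depthwise 3x3 -> 1x1,
    with a residual connection and an optional activation at the end."""

    def __init__(self, channels: int, expand: int = 2, act=None):
        super().__init__()
        mid = channels * expand   # width after the 1x1 (assumed expansion)
        half = mid // 2           # width after the multiplicative split
        self.conv1 = nn.Conv2d(channels, mid, 1, bias=False)
        # groups=half makes this a depthwise 3x3
        self.conv2 = nn.Conv2d(half, half, 3, padding=1, groups=half, bias=False)
        self.conv3 = nn.Conv2d(half, channels, 1, bias=False)
        self.act3 = act           # e.g. "hardball"; its definition is not shown here

    def forward(self, x):
        x0 = x
        x = self.conv1(x)                    # 1x1, no activation
        C = x.size(1) // 2
        x = x[:, :C] * x[:, C:]              # multiply the two halves
        x = self.conv2(x)                    # 3x3 depthwise, no activation
        x = self.conv3(x)                    # 1x1, no activation
        x = x + x0                           # residual connection
        if self.act3 is not None:
            x = self.act3(x)
        return x

# The block preserves the input shape, so it can be stacked like a ResNet block.
y = QLBlock(8)(torch.randn(1, 8, 16, 16))
```

Note the multiplicative split halves the channel count, which is why `conv1` must at least double the width for the residual addition to type-check.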