Suspicious unnecessary reshape procedure in network.py #1231
Franklalalala started this conversation in General
I was reading deepmd-kit/deepmd/utils/network.py and was puzzled by this part:
deepmd-kit/deepmd/utils/network.py, lines 61 to 76 at 159e45d
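For readers without the permalink handy, here is a rough, hypothetical sketch of the pattern being questioned (my own illustration, not the actual deepmd-kit source): an elementwise activation whose output is then explicitly reshaped.

```python
import tensorflow as tf

def one_layer_sketch(xx, outputs_size, activation_fn=tf.tanh):
    # Hypothetical illustration only, not the real network.py code.
    w = tf.Variable(tf.random.normal([int(xx.shape[-1]), outputs_size]))
    b = tf.Variable(tf.zeros([outputs_size]))
    hidden = activation_fn(tf.matmul(xx, w) + b)
    # The questioned step: hidden already has shape [-1, outputs_size],
    # so this reshape looks like a no-op.
    return tf.reshape(hidden, [-1, outputs_size])
```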
Since the activation functions do not change the dimensions of the variable hidden, the reshape seems unnecessary.
I tested "relu", "relu6", "softplus", "sigmoid", and "tanh"; none of them changes the dimension. ("gelu" was not supported by tensorflow 1.x.)
Or is there an exception I haven't found?
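As a quick check of this claim, a minimal sketch (my own, using the TF 2.x eager API rather than the TF 1.x graph API the post refers to) confirms that these elementwise activations preserve the input shape:

```python
import tensorflow as tf

x = tf.random.normal([4, 10])  # arbitrary [batch, features] input
for act in (tf.nn.relu, tf.nn.relu6, tf.nn.softplus, tf.sigmoid, tf.tanh):
    y = act(x)
    # Elementwise ops keep the shape, so no reshape should be needed.
    assert y.shape == x.shape, f"{act.__name__} changed the shape"
print("all tested activations preserve shape")
```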
Replies: 1 comment
- I tested gelu in tensorflow 2.x; it didn't change the dimension either.
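For completeness, the same check for gelu (a sketch assuming TF 2.4+, where tf.nn.gelu is available):

```python
import tensorflow as tf

x = tf.random.normal([4, 10])
y = tf.nn.gelu(x)          # elementwise, available in TF 2.4+
assert y.shape == x.shape  # gelu does not change the dimension either
```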