I was checking the implementation of DenseNet and noticed that the class layers, defined as in the sketch below, have no softmax on the output. I find this odd; am I missing something?
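The snippet quoted in the original question did not survive extraction. As a stand-in, here is a paraphrased sketch of what the DenseNet classification head in MONAI roughly looks like; the layer names and the channel/class counts are assumptions for illustration, not the verbatim source:

```python
from collections import OrderedDict

import torch
import torch.nn as nn

# Paraphrased sketch (not the exact MONAI source): pooling, flatten,
# and a final Linear layer. Note it ends in raw logits, with no softmax.
class_layers = nn.Sequential(OrderedDict([
    ("relu", nn.ReLU(inplace=True)),
    ("pool", nn.AdaptiveAvgPool2d(1)),  # MONAI picks the pool type per spatial_dims
    ("flatten", nn.Flatten(1)),
    ("out", nn.Linear(1024, 10)),       # raw class scores (logits)
]))

logits = class_layers(torch.randn(2, 1024, 7, 7))  # shape: (2, 10)
```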
Replies: 2 comments

- Hi @grudloff, I think you can add it by using […]. Thanks!
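The concrete suggestion in this reply was lost in extraction. One plausible reading, offered here only as an assumption, is to append a softmax to the network yourself; the `DenseNet121` constructor arguments below are illustrative:

```python
import torch
import torch.nn as nn
from monai.networks.nets import DenseNet121

# Hypothetical sketch: wrap the MONAI network and apply softmax yourself.
net = DenseNet121(spatial_dims=2, in_channels=1, out_channels=10)
model = nn.Sequential(net, nn.Softmax(dim=1))  # dim=1 is the class dimension

x = torch.randn(2, 1, 64, 64)
probs = model(x)            # each row now sums to 1
print(probs.sum(dim=1))     # ~tensor([1., 1.])
```

Note that for training with `torch.nn.CrossEntropyLoss` you would skip the softmax and feed the raw logits, since that loss applies log-softmax internally.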
- We have typically separated network outputs from activations so that you have the flexibility to do what you like with those raw values. Activation is done with postprocessing transforms like this: […]
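The example referenced after "like this:" was lost in extraction. Below is a minimal sketch of the postprocessing approach using MONAI's `Activations` transform, assuming a recent MONAI version where post-transforms act on single, channel-first samples (i.e. after decollating the batch):

```python
import torch
from monai.transforms import Activations

# Logits for a single sample: channel-first, no batch dimension,
# which is what MONAI post-transforms expect after decollating a batch.
logits = torch.randn(10)

to_probs = Activations(softmax=True)  # applies softmax as a postprocessing step
probs = to_probs(logits)
print(probs.sum())  # ~1.0
```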