Hi @ewrfcas ,
Thanks for sharing your Keras implementation of the ArcFace loss. I recently used this loss to train my model but found something that really confuses me.
In the code below, y_mask =+ K.epsilon() makes y_mask always equal to K.epsilon(), which defaults to 1e-7, because =+ is parsed as a plain assignment of +epsilon rather than +=. This makes the whole loss effectively equivalent to softmax, since the term cos_tm_temp * y_mask is almost eliminated.
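Here is a minimal standalone sketch of what I mean (not your actual loss code, just isolating the =+ behaviour; y_true below is a toy one-hot label):

```python
import numpy as np
import tensorflow.keras.backend as K

y_true = K.constant(np.array([[0., 1., 0.]]))  # toy one-hot label

# What the quoted line actually does: "=+" is a plain assignment of +epsilon,
# so whatever was in y_mask before is thrown away.
y_mask = + K.epsilon()
print(y_mask)  # 1e-07, a scalar -- the one-hot information is gone

# What I assume was intended: keep the one-hot mask and add epsilon
# for numerical stability.
y_mask = y_true + K.epsilon()
print(K.eval(y_mask))  # approximately [[1e-07, 1.0000001, 1e-07]]
```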
On the other hand, I tried deleting this line so that y_mask becomes the one-hot true label, but then the loss becomes a constant and the weights & biases no longer update.
So I wonder if you have any advice on this. BTW, do you know how to print intermediate values inside a custom loss layer? I debugged the loss function line by line and it seems right, yet the training results are not. :(
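For what it's worth, the only approach I have found so far is K.print_tensor / tf.print inside the loss function itself; a rough sketch (my_loss and the inspected tensor are placeholders, not your implementation):

```python
import tensorflow as tf
import tensorflow.keras.backend as K

def my_loss(y_true, y_pred):  # placeholder loss, not the ArcFace code
    inspected = y_pred * y_true  # stand-in for cos_tm_temp * y_mask etc.

    # K.print_tensor returns the same tensor; you must use the returned value,
    # otherwise the print op can be pruned from the graph.
    inspected = K.print_tensor(inspected, message='inspected =')

    # tf.print also works in both eager and graph mode (TF 2.x).
    tf.print('inspected min/max:', tf.reduce_min(inspected),
             tf.reduce_max(inspected))

    return K.mean(K.categorical_crossentropy(y_true, y_pred))
```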
Machine-Learning-Toolbox/loss_function/ArcFace_loss.py, lines 55 to 58 in 127d6e5