Thank you for the great work with MotionAGFormer!
I have my own dataset with a set of keypoints that differs from H36M, including additional keypoints on the toes. Note that this dataset is small compared to H36M. The goal is to take the model pre-trained on H36M and fine-tune it on my dataset via transfer learning so that it outputs my custom set of keypoints. How would one go about freezing the pre-trained backbone layers and replacing the final fully connected / head layer with a new, randomly initialized one that is then trained on my data? I assume this is a good way to approach the problem; a rough sketch of the kind of setup I have in mind follows below.
Best regards,
Jonathan
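
For concreteness, here is a minimal PyTorch sketch of what I mean. The import path, constructor arguments, checkpoint layout, and the `head` attribute name are guesses on my part and would need to be checked against the actual repo:

```python
import torch
import torch.nn as nn

# Repo-specific import -- the module path and class name are guesses,
# adjust them to the actual MotionAGFormer code.
from model.MotionAGFormer import MotionAGFormer

# 1) Rebuild the model with the same configuration that produced the H36M
#    checkpoint (placeholder args here -- copy the full argument list from
#    the repo's H36M config) and load the pre-trained weights.
h36m_cfg = dict(num_joints=17)  # placeholder config
model = MotionAGFormer(**h36m_cfg)

ckpt = torch.load('motionagformer-h36m.pth.tr', map_location='cpu')  # hypothetical path
state_dict = ckpt.get('model', ckpt)  # some checkpoints wrap the weights in a dict
model.load_state_dict(state_dict, strict=True)

# 2) Freeze the entire pre-trained backbone.
for p in model.parameters():
    p.requires_grad = False

# 3) Replace the final regression head with a freshly initialized layer.
#    `head` and its dimensions are assumed names -- print(model) to see the
#    real name and shape of the last layer.
in_features = model.head.in_features
model.head = nn.Linear(in_features, 3)  # new random weights, 3D output per joint

# 4) Pass only the trainable (new) parameters to the optimizer.
optimizer = torch.optim.AdamW(
    (p for p in model.parameters() if p.requires_grad), lr=1e-4
)
```

One thing I am unsure about: since the toe keypoints change the number of joints flowing through the network, the input embedding and spatial (graph) modules would probably have to be adapted and left trainable as well, not only the final head. Is that correct, or is there a cleaner way to handle the different joint set?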