ArcFace ONNX input and output have a fixed batch size of 1 #139
Comments
@SthPhoenix, hello! Same question here.
Hi! I can't recall the exact reason, something to do with backward compatibility, I guess, but the models have a fixed batch size.
Thanks for the link. P.S. Do you know if it's even possible to make ArcFace produce N 512-element output vectors in ONNX? Currently I get 15 FPS for a single face and 7 FPS for 6 faces on my GeForce 1660, because I'm forced to run the ArcFace model sequentially, once per face.
I don't fully understand your question, but it sounds like you are talking about batch inference, which is totally possible.
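For reference, here is a minimal sketch of what batch inference could look like with onnxruntime, assuming the model's first input dimension is dynamic; the file name arcface_dynamic.onnx, the input shape (N, 3, 112, 112), and the random "face crops" are placeholders, not the project's actual artifacts:

```python
import numpy as np
import onnxruntime as ort

# Hypothetical model path; point this at an ArcFace export with a dynamic batch dim.
session = ort.InferenceSession(
    "arcface_dynamic.onnx",
    providers=["CUDAExecutionProvider", "CPUExecutionProvider"],
)
input_name = session.get_inputs()[0].name  # e.g. "data", shape (N, 3, 112, 112)

# Stack N aligned face crops into one tensor instead of making N sequential calls.
faces = np.random.rand(6, 3, 112, 112).astype(np.float32)
embeddings = session.run(None, {input_name: faces})[0]

print(embeddings.shape)  # expected (6, 512): one 512-d vector per face
```

With a dynamic batch dimension, the 6-face case above becomes a single forward pass rather than six.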
Hello,
First of all, thank you for your valuable contribution.
I have an issue: I am trying to find an ArcFace model that supports dynamic batching.
I was excited when I saw your ONNX file, which is described as supporting batch inference, as you can see here:
https://github.com/SthPhoenix/InsightFace-REST?tab=readme-ov-file#recognition
However, when I download it and open it in Netron, I see that the batch size is 1.
Could you help me with this? I have been trying for some time to find an ArcFace ONNX model that supports dynamic batching.
Thank you.
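One common workaround, sketched below with hedging: if the fixed shape comes only from the graph's input/output metadata, the batch dimension can be rewritten to a symbolic name with the onnx Python package. The file names are placeholders, and this will not be enough if the graph also hard-codes the batch size inside nodes such as Reshape.

```python
import onnx

# Hypothetical file name; point this at the fixed-batch ArcFace export.
model = onnx.load("arcface_batch1.onnx")

# Rewrite the first (batch) dimension of every graph input and output
# to a symbolic name so runtimes treat it as dynamic.
for tensor in list(model.graph.input) + list(model.graph.output):
    dim0 = tensor.type.tensor_type.shape.dim[0]
    dim0.dim_param = "batch"

onnx.checker.check_model(model)
onnx.save(model, "arcface_dynamic.onnx")
```

If the checker passes but inference still fails for batch sizes other than 1, the fixed shape is likely baked into the graph itself, and the model would need to be re-exported with a dynamic batch axis instead.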