Hi.
When running `infer_llm_base.py` in `InternLM-XComposer/InternLM-XComposer-2.5-OmniLive/example`, my 40GB A100 ran out of memory.
I'm using the model locally (downloaded from Hugging Face).
Is this a problem with the model or the code?
PS: Please provide a more detailed guide for running inference with InternLM-XComposer-2.5-OmniLive. Thanks!
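For context on why a 40GB card might run out of memory, here is a rough back-of-the-envelope estimate of weight memory per dtype. It assumes the base LLM is roughly 7B parameters (an assumption; check the actual checkpoint size) and ignores activations, the KV cache, and the OmniLive audio/vision modules, which all add on top:

```python
# Rough GPU-memory estimate for model weights alone.
# Excludes activations, KV cache, and the extra OmniLive
# audio/vision modules, which consume additional memory.
def weight_gb(n_params: float, bytes_per_param: int) -> float:
    """Return weight memory in GiB for a given parameter count and dtype size."""
    return n_params * bytes_per_param / 1024**3

N = 7e9  # assumed ~7B-parameter base LLM (hypothetical figure for illustration)
print(f"fp32: {weight_gb(N, 4):.1f} GiB")  # roughly 26 GiB
print(f"bf16: {weight_gb(N, 2):.1f} GiB")  # roughly 13 GiB
```

If the script loads the weights in fp32, the weights alone take roughly twice the bf16 footprint, and the remaining headroom can easily be exhausted by activations and the extra modules. One thing worth checking is whether the model is loaded in half precision, e.g. by passing `torch_dtype=torch.bfloat16` to `from_pretrained` in the Transformers API.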