
TypeError: 'NoneType' object is not callable in __del__() when exiting #2002

Open
4 tasks done
gbutiri opened this issue Apr 12, 2025 · 0 comments

gbutiri commented Apr 12, 2025

Prerequisites

  • I am running the latest code.
  • I followed the README.md.
  • I searched open/closed issues.
  • This is a reproducible error, and a fix is proposed below.

Expected Behavior

When the model is closed or the Python interpreter exits, resources should be cleaned up without raising any exceptions.

Current Behavior

When using Llama(...), the following error is raised as the script exits:

TypeError: 'NoneType' object is not callable

The error is raised inside __del__() when it tries to close internal objects (e.g., the ExitStack or model context) that have already been set to None during interpreter shutdown.
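
For illustration, here is a minimal, hypothetical sketch of the same failure mode (this is not llama-cpp-python's actual code): a cleanup callable on the object is already None by the time __del__ runs, so calling it raises the TypeError.

class FakeLlama:
    def __init__(self):
        # Stand-in for an ExitStack / ctypes free function that has been
        # cleared before the destructor fires.
        self._free_fn = None

    def close(self):
        self._free_fn()  # TypeError: 'NoneType' object is not callable

    def __del__(self):
        self.close()

obj = FakeLlama()
del obj  # CPython prints "Exception ignored in ... TypeError: 'NoneType' object is not callable"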

Environment and Context

  • Windows 11
  • Python 3.11.3
  • llama-cpp-python version: 0.3.8
  • Using quantized GGUF model via Llama(...)

Steps to Reproduce

  1. Load model via Llama(model_path=..., ...)
  2. Let script exit
  3. Observe the destructor crash in the logs (see the minimal repro sketch below)
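
A minimal repro sketch (the model path is a placeholder, not taken from the original report):

from llama_cpp import Llama

llm = Llama(model_path="models/model.gguf")  # placeholder GGUF path
# No explicit llm.close(); the script simply ends here. When the interpreter
# exits and Llama.__del__ runs, the TypeError above is printed.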

Recommended Fix

Wrap the close() call in __del__() so that exceptions raised during interpreter shutdown are suppressed:

def __del__(self):
    try:
        self.close()
    except Exception:
        # Ignore errors raised during interpreter shutdown, when internal
        # cleanup objects (ExitStack, ctypes functions) may already be None.
        pass
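
Until a fix along these lines lands, a user-side workaround is to close the model explicitly before exit and suppress the same error. A sketch, assuming Llama.close() is available in this version (the model path is a placeholder):

import contextlib
from llama_cpp import Llama

llm = Llama(model_path="models/model.gguf")  # placeholder path
# ... use the model ...
with contextlib.suppress(Exception):
    llm.close()  # close explicitly before interpreter shutdown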