New CholeskyCorr transform #7700

Draft
jessegrabowski wants to merge 3 commits into main
Conversation

@jessegrabowski (Member) commented Feb 28, 2025

Description

Adds a CholeskyCorr transform for drawing from an LKJCholeskyCorr distribution. Sampling with it looks promising, but it still needs a forward transform.

Marked as maintenance because our current transformation breaks for n > about 5.
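
As a quick illustration of the backward direction (unconstrained vector to the Cholesky factor of a correlation matrix), here is a minimal sketch. It is not part of the PR and assumes the private `_compute_L_and_logdet_scan` helper exercised in the Jacobian check further down; the public forward/backward API may end up looking different.

# Minimal sketch (not part of the PR): map an unconstrained vector to the
# Cholesky factor of a correlation matrix and check the unit-diagonal property.
import numpy as np
import pytensor.tensor as pt
from pymc.distributions.transforms import CholeskyCorr

n = 4
k = n * (n - 1) // 2  # number of unconstrained parameters

values = pt.tensor("values", shape=(k,))
transform = CholeskyCorr(n)

# Assumed helper from this branch; returns the lower-triangular factor L
# and the log-det-Jacobian of the transform.
L, _ = transform._compute_L_and_logdet_scan(values)

C = L @ L.T  # implied correlation matrix
rng = np.random.default_rng(0)
C_val = C.eval({values: rng.normal(size=(k,))})

print(np.diag(C_val))               # should be all ones
print(np.allclose(C_val, C_val.T))  # symmetric by construction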

Related Issue

Checklist

Type of change

  • New feature / enhancement
  • Bug fix
  • Documentation
  • Maintenance
  • Other (please specify):

📚 Documentation preview 📚: https://pymc--7700.org.readthedocs.build/en/7700/

@ricardoV94 (Member)

CC @spinkney we confirmed that the log(det(jacobian)) term works out to be the same as the one defined in your algorithm. We tested by computing the log(sqrt(det(jacobian.T @ jacobian))) mentioned in the PDF; perhaps @jessegrabowski can share that code for reproducibility.
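
As a small self-contained sanity check of that expression (illustrative only, not from the PR): for a square, invertible Jacobian, log(sqrt(det(J.T @ J))) reduces to the usual log|det(J)| correction, so the Gram-matrix form is the natural generalization when the Jacobian is rectangular.

# Illustrative only: for square J, sqrt(det(J.T @ J)) equals |det(J)|,
# so log(sqrt(det(J.T @ J))) is the usual log-abs-det-Jacobian correction.
import numpy as np

rng = np.random.default_rng(0)
J = rng.normal(size=(5, 5))

lhs = np.log(np.sqrt(np.linalg.det(J.T @ J)))
rhs = np.log(np.abs(np.linalg.det(J)))
print(np.isclose(lhs, rhs))  # True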

First results with sampling looked promising, but still very early :)

By the way, do you happen to also have worked out the forward transform (from the constrained matrix to the unconstrained vector)?

And thanks in advance! We must not forget to give the right attribution in the code, @jessegrabowski.

@spinkney

> CC @spinkney we confirmed that the log(det(jacobian)) term works out to be the same as the one defined in your algorithm. We tested by computing the log(sqrt(det(jacobian.T @ jacobian))) mentioned in the PDF; perhaps @jessegrabowski can share that code for reproducibility.
>
> First results with sampling looked promising, but still very early :)
>
> By the way, do you happen to also have worked out the forward transform (from the constrained matrix to the unconstrained vector)?
>
> And thanks in advance! We must not forget to give the right attribution in the code, @jessegrabowski.

That is awesome!! I have not worked out the forward transform (we call this, confusingly, in Stan the unconstraining transform). It should be relatively straightforward, since it's a stereographic-like transform. I need to get the derivatives as well for both the forward and backward versions. I noticed that the sampling really improved for 100+ dimensions vs. the onion method, which is surprising. If you guys have all the code for the sampling tests, we should just throw up a paper with everything.

@jessegrabowski (Member, Author)

Jacobian check code:

from pymc.distributions.transforms import CholeskyCorr
import numpy as np
from pytensor.gradient import jacobian
import pytensor.tensor as pt


rng = np.random.default_rng(12345)

# Set up symbolic graph
values = pt.tensor('values', shape=(None, ))

# Some acrobatics so that everything can be done purely symbolically 
k = values.shape[0]
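# Invert k = n * (n - 1) / 2 to recover n = (1 + sqrt(1 + 8k)) / 2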
n = ((1 + pt.sqrt((1 + 4 * 2 * k))) / 2).astype(int)

transform = CholeskyCorr(n)
L, log_det_jac = transform._compute_L_and_logdet_scan(values)
J = jacobian(L.ravel(), values)

# Gram matrix
G = J.T @ J
log_det_jac_2 = pt.log(pt.sqrt(pt.linalg.det(G)))

# Numerical evaluation
n = 4
k = n * (n - 1) // 2

value_vals = rng.normal(size=(k,))

# This one comes from the loop method used in the Stan model
log_det_jac.eval({values: value_vals})  # array(-3.11431652)

# This one is analytically computed from L (which also comes from the loop)
log_det_jac_2.eval({values: value_vals}) # array(-3.11431653)
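
A hypothetical extension of the check above (not in the original comment): since the symbolic graph infers n from the length of `values`, the same two expressions can be compared for several matrix sizes, including sizes where the current transform reportedly breaks.

# Hypothetical sweep over matrix sizes, reusing the symbolic graph above
for n_test in (3, 5, 8, 12):
    k_test = n_test * (n_test - 1) // 2
    vals = rng.normal(size=(k_test,))
    a = log_det_jac.eval({values: vals})
    b = log_det_jac_2.eval({values: vals})
    print(n_test, a, b, np.isclose(a, b))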

> we call this, confusingly, in Stan the unconstraining transform

Internally I'm advocating for us to change to this terminology :) "Forward" and "backward" are extremely unclear -- forward from what?
