
Make sparse tensor types extend the existing tensor types #303

Closed

Conversation

@brandonwillard (Member) commented Feb 14, 2021

This PR refactors the sparse tensor type structure so that sparse types extend the existing tensor types. A new meta type is used to construct a DenseTensorType, which distinguishes dense TensorTypes from sparse ones.

Closes #142.

  • Create a DenseTensorVariable (using the same approach as DenseTensorType)
  • Update more optimizations so that they exclude SparseTensors
  • Convert sparse to dense when used with some Ops (e.g. Elemwise, CAReduce)
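
A minimal sketch of the meta-type idea described above, using hypothetical stand-in classes rather than the actual Aesara implementation:

```python
class TensorType:
    """Base type shared by dense and sparse tensor types (stand-in)."""


class SparseTensorType(TensorType):
    """A TensorType backed by a sparse storage format (stand-in)."""


class DenseTypeMeta(type):
    """Meta type that makes DenseTensorType match any non-sparse TensorType."""

    def __instancecheck__(cls, obj):
        return isinstance(obj, TensorType) and not isinstance(obj, SparseTensorType)

    def __subclasscheck__(cls, other):
        return issubclass(other, TensorType) and not issubclass(other, SparseTensorType)


class DenseTensorType(TensorType, metaclass=DenseTypeMeta):
    """Virtual type used only for isinstance/issubclass checks."""


# Dense instances match DenseTensorType; sparse instances do not.
assert isinstance(TensorType(), DenseTensorType)
assert not isinstance(SparseTensorType(), DenseTensorType)
assert issubclass(SparseTensorType, TensorType)
assert not issubclass(SparseTensorType, DenseTensorType)
```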

@brandonwillard brandonwillard added important refactor This issue involves refactoring labels Feb 14, 2021
@brandonwillard brandonwillard self-assigned this Feb 14, 2021
@brandonwillard brandonwillard marked this pull request as draft February 14, 2021 07:11
@brandonwillard brandonwillard force-pushed the generalize-sparse-api branch 6 times, most recently from f593fbc to e76a322 on February 15, 2021 03:40
@codecov (bot) commented Feb 15, 2021

Codecov Report

Merging #303 (00e3581) into master (13b08f0) will increase coverage by 0.01%.
The diff coverage is 86.36%.


@@            Coverage Diff             @@
##           master     #303      +/-   ##
==========================================
+ Coverage   71.89%   71.91%   +0.01%     
==========================================
  Files         166      166              
  Lines       54567    54632      +65     
==========================================
+ Hits        39232    39288      +56     
- Misses      15335    15344       +9     
Impacted Files            Coverage Δ
aesara/gradient.py        87.20% <ø> (ø)
aesara/graph/optdb.py     84.36% <ø> (ø)
aesara/gpuarray/type.py   31.55% <33.33%> (ø)
aesara/sparse/type.py     74.44% <60.00%> (-5.81%) ⬇️
aesara/tensor/blas.py     86.39% <63.15%> (-0.49%) ⬇️
aesara/graph/opt.py       68.14% <86.66%> (-0.01%) ⬇️
aesara/sparse/basic.py    89.06% <92.85%> (+0.10%) ⬆️
aesara/tensor/basic.py    88.38% <98.11%> (+0.19%) ⬆️
aesara/graph/fg.py        92.58% <100.00%> (ø)
aesara/graph/toolbox.py   70.77% <100.00%> (+0.09%) ⬆️
... and 9 more

@brandonwillard brandonwillard marked this pull request as ready for review February 15, 2021 04:55
twiecki previously approved these changes Feb 15, 2021
@@ -355,8 +360,7 @@ def __getitem__(self, args):
         return ret


-class SparseVariable(_sparse_py_operators, Variable):
-    dtype = property(lambda self: self.type.dtype)
+class SparseVariable(_sparse_py_operators, TensorVariable):
Contributor:
Why are the supertypes in a different order compared to the SparseConstant below?

brandonwillard (Member, Author):
Not sure.
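
For context on why the base-class order matters: it determines Python's method resolution order, so an attribute defined on more than one base can resolve differently between SparseVariable and SparseConstant if their orders diverge. A generic illustration with hypothetical classes, not the Aesara ones:

```python
class OperatorsMixin:
    def kind(self):
        return "operators"


class VariableBase:
    def kind(self):
        return "variable"


# Base-class order sets the MRO, which decides whose `kind` wins.
class LikeSparseVariable(OperatorsMixin, VariableBase):
    pass


class LikeSparseConstant(VariableBase, OperatorsMixin):
    pass


print(LikeSparseVariable().kind())  # -> "operators"
print(LikeSparseConstant().kind())  # -> "variable"
```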

@@ -442,7 +451,7 @@ def bsr_matrix(name=None, dtype=None):
 csr_fmatrix = SparseType(format="csr", dtype="float32")
 bsr_fmatrix = SparseType(format="bsr", dtype="float32")

-all_dtypes = SparseType.dtype_set
+all_dtypes = list(SparseType.dtype_specs_map.keys())
Contributor:

Did you make it a list so it can be appended at runtime?

     else:
         raise NotImplementedError(
-            f'unsupported dtype "{dtype}" not in list', list(self.dtype_set)
+            "You can't make SparseType object when SciPy is not available."
Contributor:

Is it even a supported use case to NOT have scipy?

brandonwillard (Member, Author):

That's how this project was originally designed, I believe, but there's no point in supporting that optionality now.

@brandonwillard (Member, Author) commented:

This sparse implementation only supports matrices, but we could cover more ground with the sparse package. It's an additional dependency, but it seems like it could be worth it.
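
A rough illustration of that gap, assuming the pydata sparse package (not something this PR depends on): scipy.sparse only provides 2-D matrices, whereas sparse.COO supports N-dimensional arrays.

```python
import numpy as np
import scipy.sparse
import sparse  # pydata/sparse, the hypothetical extra dependency

# scipy.sparse formats are restricted to 2-D matrices.
mat = scipy.sparse.random(100, 100, density=0.01, format="csr")
assert mat.shape == (100, 100)

# pydata/sparse COO arrays can be N-dimensional.
arr = sparse.random((100, 100, 100), density=0.01)
assert arr.ndim == 3
assert isinstance(arr.todense(), np.ndarray)
```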

Labels: enhancement, important, refactor, sparse tensors
Successfully merging this pull request may close these issues: Unify sparse and dense tensor class design