Add support for sparse tensors #27

@thomasahle

Description

The best way to handle convolutions, and other sparse tensors, might be to use an actual numerical sparse tensor class.
This would let us contract several of them together into larger, stranger sparse types.

The best option I've seen so far is the following combination of libraries:

>>> import numpy as np
>>> import sparse
>>> import opt_einsum as oe
>>>
>>> # Create a dense NumPy array
>>> dense_array = np.arange(9).reshape(3,3)
>>>
>>> # Create a sparse COO array
>>> coords = np.array([[0, 1], [2, 0], [1, 2]])
>>> data = np.array([10, 20, 30])
>>> sparse_array = sparse.COO(coords.T, data, shape=(3, 3))
>>>
>>> # Perform Einstein summations using opt_einsum
>>> # Each contraction sums over the shared index j (i.e. matrix multiplication)
>>> result_dd = oe.contract('ij,jk->ik', dense_array, dense_array)
>>> result_ds = oe.contract('ij,jk->ik', dense_array, sparse_array)
>>> result_ss = oe.contract('ij,jk->ik', sparse_array, sparse_array)
>>>
>>> print("Result:\n", result_dd)
>>> print("Result:\n", result_ds)
>>> print("Result:\n", result_ss)
Result:
 [[ 15  18  21]
 [ 42  54  66]
 [ 69  90 111]]
Result:
 [[ 40   0  30]
 [100  30 120]
 [160  60 210]]
Result:
 <COO: shape=(3, 3), dtype=int64, nnz=3, fill_value=0>
