pygmtools.utils.dense_to_sparse

pygmtools.utils.dense_to_sparse(dense_adj, backend=None)[source]

Convert a dense connectivity/adjacency matrix to a sparse connectivity/adjacency matrix and an edge weight tensor.

Parameters
  • dense_adj – \((b\times n\times n)\) the dense adjacency matrix. This function also supports non-batched input, where the batch dimension b is omitted

  • backend – (default: pygmtools.BACKEND variable) the backend for computation.

Returns

if batched input: \((b\times ne\times 2)\) sparse connectivity matrix, \((b\times ne\times 1)\) edge weight tensor, \((b)\) number of edges

if non-batched input: \((ne\times 2)\) sparse connectivity matrix, \((ne\times 1)\) edge weight tensor
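The return convention can be illustrated with a minimal pure-NumPy sketch (an illustration of the convention only, not pygmtools' actual implementation): every nonzero entry of each adjacency slice becomes one (i, j) row of the connectivity matrix, with its value kept as the edge weight.

```python
import numpy as np

def dense_to_sparse_sketch(dense_adj):
    """Illustrative-only sketch of the dense -> sparse convention."""
    conns, edges = [], []
    for A in dense_adj:                          # loop over the batch dimension
        i, j = np.nonzero(A)                     # indices of nonzero (edge) entries
        conns.append(np.stack([i, j], axis=-1))  # (ne x 2) connectivity rows
        edges.append(A[i, j][:, None])           # (ne x 1) edge weight column
    ne = np.array([c.shape[0] for c in conns])   # number of edges per graph
    return conns, edges, ne
```

The sketch keeps per-graph lists for clarity; pygmtools instead stacks the batched outputs into single tensors, which is why the per-graph edge count ne is returned alongside them.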

Numpy Example
>>> import numpy as np
>>> import pygmtools as pygm
>>> pygm.set_backend('numpy')
>>> np.random.seed(0)

>>> batch_size = 10
>>> A = np.random.rand(batch_size, 4, 4)
>>> A[:, np.arange(4), np.arange(4)] = 0 # remove the diagonal elements
>>> A.shape
(10, 4, 4)

>>> conn, edge, ne = pygm.utils.dense_to_sparse(A)
>>> conn.shape # connectivity: (batch x num_edge x 2)
(10, 12, 2)

>>> edge.shape # edge feature (batch x num_edge x feature_dim)
(10, 12, 1)

>>> ne
[12, 12, 12, 12, 12, 12, 12, 12, 12, 12]
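As a sanity check (plain NumPy, assuming only the (i, j) index convention shown above, not any pygmtools internals), scattering the edge weights back at the connectivity pairs reproduces the dense matrix:

```python
import numpy as np

def sparse_to_dense_sketch(conn, edge, n):
    """Scatter (batch x ne x 1) edge weights back into (batch x n x n) matrices."""
    dense = np.zeros((conn.shape[0], n, n))
    for b in range(conn.shape[0]):
        # place each edge weight at its (i, j) position
        dense[b, conn[b, :, 0], conn[b, :, 1]] = edge[b, :, 0]
    return dense
```

For the example above, `np.allclose(sparse_to_dense_sketch(conn, edge, 4), A)` should hold, since every graph in the batch has the same number of edges.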
Pytorch Example
>>> import torch
>>> import pygmtools as pygm
>>> pygm.set_backend('pytorch')
>>> _ = torch.manual_seed(0)

>>> batch_size = 10
>>> A = torch.rand(batch_size, 4, 4)
>>> torch.diagonal(A, dim1=1, dim2=2)[:] = 0 # remove the diagonal elements
>>> A.shape
torch.Size([10, 4, 4])

>>> conn, edge, ne = pygm.utils.dense_to_sparse(A)
>>> conn.shape # connectivity: (batch x num_edge x 2)
torch.Size([10, 12, 2])

>>> edge.shape # edge feature (batch x num_edge x feature_dim)
torch.Size([10, 12, 1])

>>> ne
tensor([12, 12, 12, 12, 12, 12, 12, 12, 12, 12])
Paddle Example
>>> import paddle
>>> import pygmtools as pygm
>>> pygm.set_backend('paddle')
>>> _ = paddle.seed(0)

>>> batch_size = 10
>>> A = paddle.rand((batch_size, 4, 4))
>>> A = A * (1 - paddle.eye(4)) # remove the diagonal elements (paddle.diagonal returns a copy, so in-place assignment would not modify A)
>>> A.shape
[10, 4, 4]

>>> conn, edge, ne = pygm.utils.dense_to_sparse(A)
>>> conn.shape # connectivity: (batch x num_edge x 2)
[10, 12, 2]

>>> edge.shape # edge feature (batch x num_edge x feature_dim)
[10, 12, 1]

>>> ne
Tensor(shape=[10], dtype=int64, place=Place(cpu), stop_gradient=True,
        [12, 12, 12, 12, 12, 12, 12, 12, 12, 12])
MindSpore Example
>>> import mindspore
>>> import pygmtools as pygm
>>> pygm.set_backend('mindspore')
>>> _ = mindspore.set_seed(0)

>>> batch_size = 10
>>> A = mindspore.numpy.rand((batch_size, 4, 4))
>>> A = A * (1 - mindspore.numpy.eye(4)) # remove the diagonal elements (mindspore.numpy.diagonal returns a copy, so in-place assignment would not modify A)
>>> A.shape
(10, 4, 4)

>>> conn, edge, ne = pygm.utils.dense_to_sparse(A)
>>> conn.shape # connectivity: (batch x num_edge x 2)
(10, 12, 2)

>>> edge.shape # edge feature (batch x num_edge x feature_dim)
(10, 12, 1)

>>> ne
[12 12 12 12 12 12 12 12 12 12]