pygmtools.utils.build_batch

pygmtools.utils.build_batch(input, return_ori_dim=False, backend=None)[source]

Build a batched tensor from a list of tensors. If the tensors in the list have different sizes along a dimension, they are padded to the largest size along that dimension.

The batched tensor and, if return_ori_dim=True, the original dimensions of the inputs are returned.

Parameters
  • input – list of input tensors

  • return_ori_dim – (default: False) whether to also return the original dimensions of the input tensors

  • backend – (default: pygmtools.BACKEND variable) the backend for computation.

Returns

batched tensor; if return_ori_dim=True, also a tuple of the original sizes of the inputs along each non-batch dimension
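The padding behaviour can be pictured with the following NumPy sketch (an illustrative approximation, assuming zero-padding into the leading indices; build_batch_sketch is a hypothetical helper, not part of the pygmtools API):

import numpy as np

def build_batch_sketch(tensors):
    # largest size along every dimension (all inputs must share the same number of dimensions)
    max_shape = np.max([t.shape for t in tensors], axis=0)
    # allocate the batch and copy each tensor into the leading indices, leaving zeros as padding
    batch = np.zeros((len(tensors), *max_shape), dtype=tensors[0].dtype)
    for i, t in enumerate(tensors):
        batch[i][tuple(slice(s) for s in t.shape)] = t
    # per-dimension lists of the original sizes, matching the return_ori_dim=True outputs below
    ori_dims = tuple([t.shape[d] for t in tensors] for d in range(len(max_shape)))
    return batch, ori_dims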

Numpy Example
>>> import numpy as np
>>> import pygmtools as pygm
>>> pygm.set_backend('numpy')

# batched adjacency matrices
>>> A1 = np.random.rand(4, 4)
>>> A2 = np.random.rand(5, 5)
>>> A3 = np.random.rand(3, 3)
>>> batched_A, ori_shape = pygm.utils.build_batch([A1, A2, A3], return_ori_dim=True)
>>> batched_A.shape
(3, 5, 5)
>>> ori_shape
([4, 5, 3], [4, 5, 3])

# batched node features (feature dimension=10)
>>> F1 = np.random.rand(4, 10)
>>> F2 = np.random.rand(5, 10)
>>> F3 = np.random.rand(3, 10)
>>> batched_F = pygm.utils.build_batch([F1, F2, F3])
>>> batched_F.shape
(3, 5, 10)
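Continuing the example above, the per-dimension sizes in ori_shape can be used to slice the original, unpadded matrices back out of the batch (a usage sketch, assuming the padding follows the original entries along each dimension):

>>> n1, n2 = ori_shape
>>> np.allclose(batched_A[0, :n1[0], :n2[0]], A1)
True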

Pytorch Example
>>> import torch
>>> import pygmtools as pygm
>>> pygm.set_backend('pytorch')

# batched adjacency matrices
>>> A1 = torch.rand(4, 4)
>>> A2 = torch.rand(5, 5)
>>> A3 = torch.rand(3, 3)
>>> batched_A, ori_shape = pygm.utils.build_batch([A1, A2, A3], return_ori_dim=True)
>>> batched_A.shape
torch.Size([3, 5, 5])
>>> ori_shape
(tensor([4, 5, 3]), tensor([4, 5, 3]))

# batched node features (feature dimension=10)
>>> F1 = torch.rand(4, 10)
>>> F2 = torch.rand(5, 10)
>>> F3 = torch.rand(3, 10)
>>> batched_F = pygm.utils.build_batch([F1, F2, F3])
>>> batched_F.shape
torch.Size([3, 5, 10])
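If downstream code needs to distinguish real rows from padding, the per-graph sizes returned above can be turned into a boolean node mask (an illustrative sketch, not part of the pygmtools API):

>>> num_nodes = ori_shape[0]  # tensor([4, 5, 3]) from the adjacency example above
>>> mask = torch.arange(batched_F.shape[1]).unsqueeze(0) < num_nodes.unsqueeze(1)
>>> mask.shape
torch.Size([3, 5])
>>> mask.sum(dim=1)
tensor([4, 5, 3])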

Paddle Example
>>> import paddle
>>> import pygmtools as pygm
>>> pygm.set_backend('paddle')

# batched adjacency matrices
>>> A1 = paddle.rand((4, 4))
>>> A2 = paddle.rand((5, 5))
>>> A3 = paddle.rand((3, 3))
>>> batched_A, ori_shape = pygm.utils.build_batch([A1, A2, A3], return_ori_dim=True)
>>> batched_A.shape
[3, 5, 5]
>>> ori_shape
(Tensor(shape=[3], dtype=int64, place=Place(cpu), stop_gradient=True, [4, 5, 3]),
 Tensor(shape=[3], dtype=int64, place=Place(cpu), stop_gradient=True, [4, 5, 3]))

# batched node features (feature dimension=10)
>>> F1 = paddle.rand((4, 10))
>>> F2 = paddle.rand((5, 10))
>>> F3 = paddle.rand((3, 10))
>>> batched_F = pygm.utils.build_batch([F1, F2, F3])
>>> batched_F.shape
[3, 5, 10]

Mindspore Example
>>> import mindspore
>>> import pygmtools as pygm
>>> pygm.set_backend('mindspore')

# batched adjacency matrices
>>> A1 = mindspore.numpy.rand((4, 4))
>>> A2 = mindspore.numpy.rand((5, 5))
>>> A3 = mindspore.numpy.rand((3, 3))
>>> batched_A, ori_shape = pygm.utils.build_batch([A1, A2, A3], return_ori_dim=True)
>>> batched_A.shape
(3, 5, 5)
>>> ori_shape
(Tensor(shape=[3], dtype=Int64, value= [4, 5, 3]),
 Tensor(shape=[3], dtype=Int64, value= [4, 5, 3]))

# batched node features (feature dimension=10)
>>> F1 = mindspore.numpy.rand((4, 10))
>>> F2 = mindspore.numpy.rand((5, 10))
>>> F3 = mindspore.numpy.rand((3, 10))
>>> batched_F = pygm.utils.build_batch([F1, F2, F3])
>>> batched_F.shape
(3, 5, 10)