matmul

torch_to_nnef.op.aten.matmul

baddbmm

baddbmm(g, node, name_to_tensor, inference_target, **kwargs)

Map PyTorch: 'aten:baddbmm', 'aten:addmm' to NNEF.
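For reference, the op being mapped computes beta * input + alpha * (batch1 @ batch2), per the PyTorch docs. A quick sketch of that identity (tensor names and shapes are illustrative):

```python
import torch

inp = torch.randn(4, 2, 5)   # broadcastable to the bmm result
b1 = torch.randn(4, 2, 3)
b2 = torch.randn(4, 3, 5)
beta, alpha = 0.5, 2.0

ref = torch.baddbmm(inp, b1, b2, beta=beta, alpha=alpha)
# baddbmm is beta * input + alpha * bmm(batch1, batch2)
alt = beta * inp + alpha * torch.bmm(b1, b2)
assert torch.allclose(ref, alt, atol=1e-5)
```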

conv_tbc

conv_tbc(g, node, name_to_tensor, null_ref, op_helper, inference_target, **kwargs)

Map PyTorch: 'aten:conv_tbc' to NNEF.

conv_tbc(input, weight, bias, pad) is a 1-D convolution over a (T, B, C) input -- time-batch-channel layout -- with weight of shape (kernel, C_in, C_out). It is semantically equivalent to conv1d(input.permute(1, 2, 0), weight.permute(2, 1, 0), bias, padding=pad).permute(2, 0, 1), which is exactly the permute -> conv -> permute chain we emit.
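The stated equivalence can be checked directly in plain PyTorch (shapes below are illustrative):

```python
import torch

T, B, C_in, C_out, K = 7, 2, 3, 5, 3
x = torch.randn(T, B, C_in)       # (time, batch, channel) layout
w = torch.randn(K, C_in, C_out)   # conv_tbc weight layout
b = torch.randn(C_out)

ref = torch.conv_tbc(x, w, b, 1)  # pad=1
# same result via the permute -> conv1d -> permute chain
alt = torch.nn.functional.conv1d(
    x.permute(1, 2, 0),           # -> (B, C_in, T)
    w.permute(2, 1, 0),           # -> (C_out, C_in, K)
    b,
    padding=1,
).permute(2, 0, 1)                # back to (T, B, C_out)
assert torch.allclose(ref, alt, atol=1e-5)
```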

conv_transpose_nd

conv_transpose_nd(g, node, name_to_tensor, null_ref, inference_target, **kwargs)

Map PyTorch: 'aten:conv_transpose{1,2,3}d' to NNEF.

Marked CompositeImplicitAutograd upstream, so PyTorch usually decomposes these to aten::_convolution(transposed=True) before the trace reaches t2n. Registering them anyway keeps the support page accurate and gives a working path if PyTorch ever stops decomposing for some platform.

Signature: (input, weight, bias?, stride, padding, output_padding, groups, dilation) -- 8 positional args.
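The 8-argument aten signature can be exercised directly through torch.ops.aten; a sketch with illustrative shapes (groups=1, so the transposed-conv weight is (C_in, C_out, K)):

```python
import torch

x = torch.randn(2, 3, 8)   # (batch, C_in, length)
w = torch.randn(3, 4, 3)   # (C_in, C_out // groups, kernel)
b = torch.randn(4)

# positional order: input, weight, bias, stride, padding,
# output_padding, groups, dilation
y = torch.ops.aten.conv_transpose1d(x, w, b, [1], [0], [0], 1, [1])
ref = torch.nn.functional.conv_transpose1d(x, w, b)
assert torch.allclose(y, ref)
```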

dot

dot(g, node, name_to_tensor, **kwargs)

Map PyTorch: 'aten:dot' to NNEF.

torch.dot(a, b) is the 1-D x 1-D inner product, returning a scalar. NNEF's matmul requires rank >= 2, so we unsqueeze the inputs to (1, K) and (K, 1), matmul, then squeeze the (1, 1) back to a scalar.
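The unsqueeze -> matmul -> squeeze path described above, sketched in plain PyTorch:

```python
import torch

a = torch.tensor([1.0, 2.0, 3.0])
b = torch.tensor([4.0, 5.0, 6.0])

ref = torch.dot(a, b)  # scalar: 1*4 + 2*5 + 3*6 = 32
# rank >= 2 path: (1, K) @ (K, 1) -> (1, 1), then squeeze to a scalar
alt = (a.unsqueeze(0) @ b.unsqueeze(1)).squeeze()
assert torch.equal(ref, alt)
```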

einsum

einsum(g, node, name_to_tensor, inference_target, **kwargs)

Map PyTorch: 'aten:einsum' to NNEF.
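For context, a minimal einsum of the kind this mapping has to handle -- a plain matrix product expressed as a subscript equation (shapes are illustrative):

```python
import torch

a = torch.randn(2, 3)
b = torch.randn(3, 4)

# "ij,jk->ik" contracts over j: an ordinary matmul
ref = torch.einsum("ij,jk->ik", a, b)
assert torch.allclose(ref, a @ b, atol=1e-5)
```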

linear

linear(g, node, name_to_tensor, null_ref, inference_target, **kwargs)

Map PyTorch: 'aten:linear' to NNEF.
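For reference, aten:linear computes x @ weight.T + bias per the PyTorch docs; a quick sketch of that identity (shapes are illustrative):

```python
import torch

x = torch.randn(2, 3)
W = torch.randn(4, 3)   # (out_features, in_features)
b = torch.randn(4)

ref = torch.nn.functional.linear(x, W, b)
alt = x @ W.t() + b
assert torch.allclose(ref, alt, atol=1e-5)
```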

matmul

matmul(g, node, name_to_tensor, **kwargs)

Map PyTorch: 'aten:matmul', 'aten:bmm', 'aten:mm' to NNEF.
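These three ops agree on the ranks where they overlap, which is why one mapping covers them; a sketch (shapes are illustrative):

```python
import torch

# rank-3 inputs: matmul coincides with bmm
a = torch.randn(2, 3, 4)
b = torch.randn(2, 4, 5)
assert torch.allclose(torch.matmul(a, b), torch.bmm(a, b))

# rank-2 inputs: matmul coincides with mm
m = torch.randn(3, 4)
n = torch.randn(4, 5)
assert torch.allclose(torch.matmul(m, n), torch.mm(m, n))
```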

mv

mv(g, node, name_to_tensor, **kwargs)

Map PyTorch: 'aten:mv' to NNEF.

torch.mv(M, v) is a matrix-vector product with M rank-2 of shape (N, K) and v rank-1, returning a rank-1 result of length N. NNEF matmul needs rank-2 on both sides, so unsqueeze v to (K, 1), matmul to get (N, 1), squeeze back.
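The same unsqueeze -> matmul -> squeeze pattern as for dot, sketched in plain PyTorch (shapes are illustrative):

```python
import torch

M = torch.arange(6, dtype=torch.float32).reshape(2, 3)  # (N, K)
v = torch.tensor([1.0, 0.0, 2.0])                       # (K,)

ref = torch.mv(M, v)
# rank-2 path: (N, K) @ (K, 1) -> (N, 1), then squeeze the last dim
alt = (M @ v.unsqueeze(1)).squeeze(1)
assert torch.equal(ref, alt)
```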