
activation

torch_to_nnef.op.aten.activation

clamp

clamp(g, node, name_to_tensor, **kwargs)

Map PyTorch: 'aten:clamp' to NNEF.

clamp_max

clamp_max(g, node, name_to_tensor, **kwargs)

Map PyTorch: 'aten:clamp_max' to NNEF.

clamp_min

clamp_min(g, node, name_to_tensor, **kwargs)

Map PyTorch: 'aten:clamp_min' to NNEF.
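For reference, the PyTorch-side semantics these three translators cover are plain element-wise clipping; a minimal sketch (PyTorch only, not this module's API):

```python
import torch

x = torch.tensor([-2.0, 0.5, 3.0])
# aten:clamp clips on both sides, aten:clamp_min / aten:clamp_max on one side only
assert torch.equal(torch.clamp(x, min=-1.0, max=1.0), torch.tensor([-1.0, 0.5, 1.0]))
assert torch.equal(x.clamp_min(0.0), torch.tensor([0.0, 0.5, 3.0]))
assert torch.equal(x.clamp_max(0.0), torch.tensor([-2.0, 0.0, 0.0]))
```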

elu

elu(**kwargs)

Map PyTorch: 'aten:elu' to NNEF.
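The aten:elu reference formula, checked here PyTorch-side (a sketch, not this function's signature):

```python
import torch
import torch.nn.functional as F

x = torch.randn(8)
alpha = 1.0
# elu(x) = x if x > 0 else alpha * (exp(x) - 1)
ref = torch.where(x > 0, x, alpha * (torch.exp(x) - 1))
assert torch.allclose(F.elu(x, alpha=alpha), ref)
```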

erf

erf(g, node, name_to_tensor, null_ref, inference_target, **kwargs)

Map PyTorch: 'aten:erf' to NNEF. Note: this op should be added to tract-nnef eventually.
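For reference, aten:erf is the element-wise Gauss error function; a minimal PyTorch-side check against math.erf:

```python
import math
import torch

x = torch.linspace(-2.0, 2.0, 9)
ref = torch.tensor([math.erf(v) for v in x.tolist()])
assert torch.allclose(torch.erf(x), ref, atol=1e-6)
```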

gelu

gelu(g, node, name_to_tensor, null_ref, inference_target, **kwargs)

Map PyTorch: 'aten:gelu' to NNEF.
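aten:gelu comes in an exact (erf-based) and a tanh-approximated flavour; the PyTorch-side definitions, for reference (the `approximate` keyword needs a reasonably recent PyTorch):

```python
import math
import torch
import torch.nn.functional as F

x = torch.randn(8)
# exact gelu: x * 0.5 * (1 + erf(x / sqrt(2)))
exact = x * 0.5 * (1.0 + torch.erf(x / math.sqrt(2.0)))
assert torch.allclose(F.gelu(x), exact, atol=1e-6)

# tanh approximation
tanh_approx = 0.5 * x * (1.0 + torch.tanh(math.sqrt(2.0 / math.pi) * (x + 0.044715 * x**3)))
assert torch.allclose(F.gelu(x, approximate="tanh"), tanh_approx, atol=1e-6)
```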

glu

glu(g, node, name_to_tensor, **kwargs)

Map PyTorch: 'aten:glu' to NNEF.
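aten:glu splits its input into two halves along a dimension and gates one half with the sigmoid of the other; PyTorch-side reference:

```python
import torch
import torch.nn.functional as F

x = torch.randn(3, 6)
a, b = x.chunk(2, dim=-1)
assert torch.allclose(F.glu(x, dim=-1), a * torch.sigmoid(b))
```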

hardswish

hardswish(inference_target, **kwargs)

Map PyTorch: 'aten:hardswish' to NNEF.
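aten:hardswish is a piecewise-linear approximation of silu; its PyTorch-side reference formula:

```python
import torch
import torch.nn.functional as F

x = torch.randn(8)
# hardswish(x) = x * relu6(x + 3) / 6
ref = x * torch.clamp(x + 3.0, 0.0, 6.0) / 6.0
assert torch.allclose(F.hardswish(x), ref)
```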

hardtanh

hardtanh(**kwargs)

Map PyTorch: 'aten:hardtanh' to NNEF.
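aten:hardtanh is an element-wise clip between min_val and max_val (defaults -1 and 1); PyTorch-side reference:

```python
import torch
import torch.nn.functional as F

x = torch.randn(8)
assert torch.allclose(F.hardtanh(x, min_val=-1.0, max_val=1.0), torch.clamp(x, -1.0, 1.0))
```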

leaky_relu

leaky_relu(**kwargs)

Map PyTorch: 'aten:leaky_relu' to NNEF.
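aten:leaky_relu keeps positive values and scales negatives by a fixed slope; PyTorch-side reference:

```python
import torch
import torch.nn.functional as F

x = torch.randn(8)
slope = 0.01
ref = torch.where(x >= 0, x, slope * x)
assert torch.allclose(F.leaky_relu(x, negative_slope=slope), ref)
```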

log_softmax

log_softmax(inference_target, **kwargs)

Map PyTorch: 'aten:log_softmax' to NNEF.
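aten:log_softmax is mathematically log(softmax(x)) along a dimension, computed in a numerically safer way; PyTorch-side reference:

```python
import torch
import torch.nn.functional as F

x = torch.randn(4, 5)
ref = torch.log(F.softmax(x, dim=-1))
assert torch.allclose(F.log_softmax(x, dim=-1), ref, atol=1e-6)
```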

prelu

prelu(**kwargs)

Map PyTorch: 'aten:prelu' to NNEF.
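aten:prelu behaves like leaky_relu with a learned (possibly per-channel) negative slope; PyTorch-side reference with a single shared weight:

```python
import torch
import torch.nn.functional as F

x = torch.randn(8)
w = torch.tensor([0.25])
ref = torch.where(x >= 0, x, w * x)
assert torch.allclose(F.prelu(x, w), ref)
```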

relu6

relu6(**kwargs)

Map PyTorch: 'aten:relu6' to NNEF.
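aten:relu6 clips the ReLU output at 6, i.e. min(max(x, 0), 6); PyTorch-side reference:

```python
import torch
import torch.nn.functional as F

x = torch.tensor([-1.0, 3.0, 9.0])
assert torch.equal(F.relu6(x), torch.clamp(x, 0.0, 6.0))
```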

selu

selu(**kwargs)

Map PyTorch: 'aten:selu' to NNEF.
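aten:selu is a scaled elu with fixed constants; PyTorch-side reference:

```python
import torch
import torch.nn.functional as F

x = torch.randn(8)
alpha = 1.6732632423543772
scale = 1.0507009873554805
ref = scale * torch.where(x > 0, x, alpha * (torch.exp(x) - 1))
assert torch.allclose(F.selu(x), ref, atol=1e-6)
```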

silu

silu(**kwargs)

Map PyTorch: 'aten:silu' to NNEF.
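aten:silu (a.k.a. swish) multiplies the input by its sigmoid; PyTorch-side reference:

```python
import torch
import torch.nn.functional as F

x = torch.randn(8)
assert torch.allclose(F.silu(x), x * torch.sigmoid(x))
```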

softmax

softmax(**kwargs)

Map PyTorch: 'aten:softmax', 'aten:_softmax' to NNEF.
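aten:softmax / aten:_softmax normalize exponentials along one dimension; PyTorch-side reference (the max subtraction is only for numerical stability and does not change the result):

```python
import torch
import torch.nn.functional as F

x = torch.randn(4, 5)
shifted = x - x.max(dim=-1, keepdim=True).values
ref = torch.exp(shifted) / torch.exp(shifted).sum(dim=-1, keepdim=True)
assert torch.allclose(F.softmax(x, dim=-1), ref)
```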

softplus

softplus(**kwargs)

Map PyTorch: 'aten:softplus' to NNEF.

Note: the numerical-stability threshold applied in PyTorch is not reproduced by the vanilla NNEF implementation, nor is the case beta != 1 handled.

PyTorch ref:

y = (1/beta) * log(exp(beta * x) + 1) if ((beta * x) < thresh) else x

NNEF ref:

y = log(exp(x) + 1.0)
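A small PyTorch-side illustration of the note above: with beta == 1 and moderate inputs the two formulas agree, but the vanilla NNEF expression has no threshold short-circuit and overflows for large activations (a sketch, assuming float32):

```python
import torch
import torch.nn.functional as F

x = torch.tensor([-5.0, 0.0, 5.0])
nnef_like = torch.log(torch.exp(x) + 1.0)      # vanilla NNEF formula
assert torch.allclose(F.softplus(x), nnef_like, atol=1e-6)

x_big = torch.tensor([100.0])
print(F.softplus(x_big))                        # tensor([100.]) -- threshold short-circuit
print(torch.log(torch.exp(x_big) + 1.0))        # tensor([inf])  -- exp overflows in float32
```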