speechbrain.nnet.loss.transducer_loss module

Transducer loss implementation (depends on numba)

Authors
  • Abdelwahab Heba 2020

class speechbrain.nnet.loss.transducer_loss.Transducer(*args, **kwargs)[source]

Bases: torch.autograd.function.Function

This class implements the Transducer loss computation with the forward-backward algorithm (Sequence Transduction, naive implementation): https://arxiv.org/pdf/1211.3711.pdf

This class uses torch.autograd.Function. Because the loss is computed with the forward-backward algorithm, the gradient must be computed manually rather than by autograd.
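As a rough, hypothetical illustration of this pattern (not the actual Transducer kernel), a custom torch.autograd.Function defines static forward and backward methods, and backward returns the manually computed gradient:

>>> import torch
>>> class _ExpSum(torch.autograd.Function):
...     @staticmethod
...     def forward(ctx, x):
...         # Save what backward needs to compute the gradient manually
...         ctx.save_for_backward(x)
...         return x.exp().sum()
...     @staticmethod
...     def backward(ctx, grad_output):
...         # d/dx sum(exp(x)) = exp(x)
...         (x,) = ctx.saved_tensors
...         return grad_output * x.exp()
>>> x = torch.randn(3, requires_grad=True)
>>> _ExpSum.apply(x).backward()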

This class cannot be instantiated; please refer to the TransducerLoss class.

It is also possible to use this class directly by calling Transducer.apply.
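A minimal sketch of such a direct call, assuming CUDA tensors like those in the TransducerLoss example below; the argument order follows the forward signature documented here, and the trailing blank and reduction values (0 and 'mean') simply mirror the TransducerLoss defaults:

>>> import torch
>>> from speechbrain.nnet.loss.transducer_loss import Transducer
>>> log_probs = torch.randn((1,2,3,5)).cuda().log_softmax(dim=-1).requires_grad_()
>>> labels = torch.Tensor([[1,2]]).cuda().int()
>>> T = torch.Tensor([2]).cuda().int()
>>> U = torch.Tensor([2]).cuda().int()
>>> loss = Transducer.apply(log_probs, labels, T, U, 0, 'mean')
>>> loss.backward()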

static forward(ctx, log_probs, labels, T, U, blank, reduction)[source]
static backward(ctx, grad_output)[source]
class speechbrain.nnet.loss.transducer_loss.TransducerLoss(blank=0, reduction='mean')[source]

Bases: torch.nn.modules.module.Module

This class implements the Transducer loss computation with the forward-backward algorithm (Sequence Transduction, naive implementation): https://arxiv.org/pdf/1211.3711.pdf

TransducerLoss (nn.Module) uses Transducer (autograd.Function) to compute the forward-backward loss and gradients.

Example

>>> import torch
>>> loss = TransducerLoss(blank=0)
>>> acts = torch.randn((1,2,3,5)).cuda().log_softmax(dim=-1).requires_grad_()
>>> labels = torch.Tensor([[1,2]]).cuda().int()
>>> act_length = torch.Tensor([2]).cuda().int()
>>> # U = label_length+1
>>> label_length = torch.Tensor([2]).cuda().int()
>>> l = loss(acts, labels, act_length, label_length)
>>> l.backward()
forward(log_probs, labels, T, U)[source]
training: bool