theano.gpuarray.ctc – Connectionist Temporal Classification (CTC) loss

Warning

This is not the recommended user interface. Use the CPU interface. It will get moved automatically to the GPU.

Note

Usage of the connectionist temporal classification (CTC) loss Op requires that the warp-ctc library is available. In case the warp-ctc library is not in your compiler's library path, the config.ctc.root configuration option must be appropriately set to the directory containing the warp-ctc library files.

Note

Unfortunately, Windows platforms are not yet supported by the underlying library.

  • theano.gpuarray.ctc.gpuctc(activations, labels, input_lengths)[source]
  • Compute CTC loss function on the GPU.

Parameters:

  • activations – Three-dimensional tensor, which has a shape of (t, m, p), where t is the time index, m is the minibatch index, and p is the index over the probabilities of each symbol in the alphabet. The memory layout is assumed to be in C-order, which goes from the slowest to the fastest changing dimension, from left to right. In this case, p is the fastest changing dimension.
  • labels – A 2-D tensor of all the labels for the minibatch. In each row, there is a sequence of target labels. Negative values are assumed to be padding, and thus are ignored. The blank symbol is assumed to have index 0 in the alphabet.
  • input_lengths – A 1-D tensor with the number of time steps for each sequence in the minibatch. A usage sketch follows this parameter list.

Returns: Cost of each example in the minibatch.

Return type: 1-D array
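The following is a minimal sketch of how gpuctc could be wired into a Theano graph, assuming the warp-ctc library is reachable (via the compiler's library path or config.ctc.root) and that Theano is configured to use the gpuarray backend (e.g. device=cuda). The variable names and toy shapes are illustrative only; taking the gradient of the mean cost with theano.grad follows the usual Theano workflow.

```python
import numpy as np
import theano
import theano.tensor as T
from theano.gpuarray.ctc import gpuctc

# Symbolic inputs matching the documented shapes:
#   activations:   (t, m, p) float32, C-ordered
#   labels:        (m, max_label_length) int32, negative values are padding
#   input_lengths: (m,) int32
activations = T.ftensor3("activations")
labels = T.imatrix("labels")
input_lengths = T.ivector("input_lengths")

# One CTC cost per minibatch example (1-D vector of length m).
costs = gpuctc(activations, labels, input_lengths)

# Gradients w.r.t. the activations flow through the Op as usual.
loss = costs.mean()
grad = theano.grad(loss, wrt=activations)

f = theano.function([activations, labels, input_lengths], [costs, grad])

# Toy data: t=10 time steps, m=2 examples, p=5 symbols (index 0 is the blank).
acts = np.random.randn(10, 2, 5).astype(np.float32)
labs = np.array([[1, 2, -1], [3, 4, 2]], dtype=np.int32)  # -1 pads the first row
lens = np.array([10, 10], dtype=np.int32)
example_costs, example_grad = f(acts, labs, lens)
```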
  • class theano.gpuarray.ctc.GpuConnectionistTemporalClassification(compute_grad=True)[source]
  • GPU wrapper for Baidu CTC loss function.

Parameters: compute_grad – If set to True, enables the computation of gradients of the CTC loss function.
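When only the loss values are needed, the Op can in principle be instantiated directly with compute_grad=False. The sketch below is a hedged illustration, assuming the Op instance is applied to the same three inputs as gpuctc; the recommended path remains the CPU interface or the gpuctc helper above.

```python
import theano
import theano.tensor as T
from theano.gpuarray.ctc import GpuConnectionistTemporalClassification

activations = T.ftensor3("activations")
labels = T.imatrix("labels")
input_lengths = T.ivector("input_lengths")

# Assumption: the Op is applied to the same three inputs as gpuctc.
# With compute_grad=False only the per-example costs are produced.
ctc_op = GpuConnectionistTemporalClassification(compute_grad=False)
costs = ctc_op(activations, labels, input_lengths)

evaluate_costs = theano.function([activations, labels, input_lengths], costs)
```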