API Documentation
This documentation covers Theano module by module. It is suited to finding the Types and Ops that you can use to build and compile expression graphs.
compile – Transforming Expression Graphs to Functions
config – Theano Configuration
d3viz – d3viz: Interactive visualization of Theano compute graphs
gof – Theano Internals [doc TODO]
gradient – Symbolic Differentiation
misc.pkl_utils – Tools for serialization
printing – Graph Printing and Symbolic Print Statement
sandbox – Experimental Code
scalar – Symbolic Scalar Types, Ops [doc TODO]
scan – Looping in Theano
sparse – Symbolic Sparse Matrices
sparse – Sparse Op
sparse.sandbox – Sparse Op Sandbox
tensor – Types and Ops for Symbolic numpy
typed_list – Typed List
There are also some top-level imports that you might find more convenient:
theano.function(…)[source] – Alias for function.function()
theano.function_dump(…)[source] – Alias for theano.compile.function.function_dump()
theano.shared(…)[source] – Alias for theano.compile.sharedvalue.shared()
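A shared variable holds state that persists across calls to a compiled function; a small sketch of the common accumulator pattern:

```python
import theano
import theano.tensor as T

# A shared variable carries state between function calls.
state = theano.shared(0.0, name='state')
inc = T.dscalar('inc')

# Each call returns the current value of `state`, then applies the update.
accumulate = theano.function([inc], state, updates=[(state, state + inc)])

accumulate(5.0)           # returns the old value, 0.0
print(state.get_value())  # 5.0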
class theano.In[source] – Alias for function.In
theano.dot(x, y)[source] – Works like tensor.dot() for both sparse and dense matrix products.
theano.clone(output, replace=None, strict=True, share_inputs=True, copy_inputs=…)[source] – Clones the graph for output, optionally substituting the subgraphs given in replace.
theano.sparse_grad(var)[source] – This function returns a new variable whose gradient will be stored in a sparse format instead of dense.
Currently only variables created by AdvancedSubtensor1 are supported, i.e. a_tensor_var[an_int_vector].
New in version 0.6rc4.