functional-tf-nn
Commonly used libraries for implementing, training, and evaluating learning algorithms often improve usability at the expense of composability and research flexibility. However, this trade-off is not required: a higher level of abstraction can preserve both.
The higher-order representation of function composition in neural networks presented here, implemented within the constraints of a commonly used library (TensorFlow), segments model construction into functions that close over their parameters. This keeps the code simple while retaining research flexibility for modifications such as experimenting with dropout and pruning schemes or accessing gradients directly. Generalizing this approach could provide extensibility comparable to a language such as LISP.
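The core idea can be sketched in a few lines. This is plain Python/NumPy rather than the TensorFlow implementation, and `compose` and `with_dropout` are illustrative names, not library API: layers are ordinary functions, so composing them and wrapping them (e.g. for a dropout experiment) are local, first-class operations.

```python
import numpy as np

def compose(fs):
    # left-to-right composition: compose([f, g])(x) == g(f(x))
    def composed(x):
        for f in fs:
            x = f(x)
        return x
    return composed

def with_dropout(layer, rate, rng):
    # illustrative wrapper: because a layer is a plain function,
    # adding dropout to its output is a local, composable change
    def wrapped(x):
        y = layer(x)
        mask = (rng.random(y.shape) >= rate).astype(y.dtype)
        return y * mask / (1.0 - rate)  # inverted dropout scaling
    return wrapped

inc = lambda x: x + 1
double = lambda x: 2 * x
model = compose([inc, double])
print(model(3))  # 8
```

Swapping a pruning scheme in or out, or instrumenting gradients, follows the same pattern: wrap or replace one function without touching the rest of the pipeline.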
functional-tf-nn/feedforward
Build a fully connected feedforward neural network from a topology list:
- build layer functions
- compose layer functions into a model function
- evaluate the model function
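The steps above can be sketched as follows. This is a minimal NumPy stand-in for the TensorFlow version; `build_layer` and `compose` are hypothetical helper names. Each layer function closes over its own weights, and the model is just their composition.

```python
import numpy as np

def build_layer(shape, activation, rng):
    # layer function closing over its weights (hypothetical helper)
    W = rng.standard_normal(shape) * 0.1
    b = np.zeros(shape[1])
    def layer(x):
        return activation(x @ W + b)
    return layer

def compose(fs):
    def model(x):
        for f in fs:
            x = f(x)
        return x
    return model

relu = lambda x: np.maximum(x, 0.0)

# topology list: input dim, hidden dim, output dim
topology = [4, 8, 3]
rng = np.random.default_rng(0)
layers = [build_layer((m, n), relu, rng)
          for m, n in zip(topology, topology[1:])]
model = compose(layers)

x = np.ones((2, 4))       # batch of 2 inputs
y = model(x)
print(y.shape)            # (2, 3)
```

Adjacent pairs of the topology list determine each weight matrix's shape, so changing the architecture is a one-line edit to the list.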
functional-tf-nn/lstm
Build a multilayer LSTM from a topology list:
- build a cell function for each layer
- build layer functions from cell functions
- compose layer functions into a model function
- evaluate the model function
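A NumPy sketch of the same pipeline, under the same caveats (not the TensorFlow implementation; `build_cell`, `build_layer`, and `compose` are illustrative names). Each cell function closes over one layer's weights, each layer function scans its cell over the time axis, and the model composes the layer functions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def build_cell(in_dim, hid_dim, rng):
    # cell function closing over this layer's weights (hypothetical helper)
    W = rng.standard_normal((in_dim + hid_dim, 4 * hid_dim)) * 0.1
    b = np.zeros(4 * hid_dim)
    def cell(x, h, c):
        z = np.concatenate([x, h], axis=-1) @ W + b
        i, f, o, g = np.split(z, 4, axis=-1)   # input, forget, output, candidate
        c_new = sigmoid(f) * c + sigmoid(i) * np.tanh(g)
        h_new = sigmoid(o) * np.tanh(c_new)
        return h_new, c_new
    return cell

def build_layer(cell, hid_dim):
    # layer function: scan the cell over the time axis
    def layer(xs):                              # xs: (time, batch, in_dim)
        batch = xs.shape[1]
        h = np.zeros((batch, hid_dim))
        c = np.zeros((batch, hid_dim))
        hs = []
        for x in xs:
            h, c = cell(x, h, c)
            hs.append(h)
        return np.stack(hs)                     # (time, batch, hid_dim)
    return layer

def compose(fs):
    def model(x):
        for f in fs:
            x = f(x)
        return x
    return model

# topology list: input dim, then hidden dim per layer
topology = [5, 8, 8]
rng = np.random.default_rng(0)
cells = [build_cell(m, n, rng) for m, n in zip(topology, topology[1:])]
layers = [build_layer(cell, n) for cell, n in zip(cells, topology[1:])]
model = compose(layers)

xs = np.ones((7, 2, 5))    # (time, batch, features)
print(model(xs).shape)     # (7, 2, 8)
```

Because a layer's output sequence has the same (time, batch, hidden) layout as its input, the layer functions compose exactly like the feedforward case.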