A higher-order representation of function composition in neural networks, demonstrating a segmentation of model construction with closure, and enabling code simplicity and research flexibility.

functional-tf-nn

The commonly used libraries for implementing, training, and evaluating learning algorithms often improve usability at the expense of composability and research flexibility. This trade-off, however, is not required to reach a higher level of abstraction.

The higher-order representation of function composition provided here, implemented within the constraints of a commonly used library (TensorFlow), demonstrates a segmentation of model construction with closure. It enables code simplicity and research flexibility for modifications such as experimenting with dropout and pruning schemes or accessing gradients. Generalizing this approach could provide extensibility comparable to a language such as LISP.
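The core idea — a model as a composition of layer functions, each carrying its parameters in a closure — can be sketched in plain Python. The names below (`compose`, the toy functions) are illustrative, not the repository's API:

```python
from functools import reduce

def compose(*fns):
    # Right-to-left composition: compose(f, g)(x) == f(g(x)).
    return reduce(lambda f, g: lambda x: f(g(x)), fns)

inc = lambda x: x + 1
dbl = lambda x: 2 * x

model = compose(dbl, inc)  # model(x) == dbl(inc(x))
print(model(3))            # → 8
```

Because each composed function is opaque to its neighbors, a layer can be swapped, wrapped (e.g. with dropout), or instrumented without touching the rest of the model.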

functional-tf-nn/feedforward

Build a fully connected feedforward neural network from a topology list:

  1. build layer functions
  2. compose layer functions into a model function
  3. evaluate the model function
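The three steps above can be sketched with NumPy standing in for TensorFlow; `build_layer` and `build_model` are hypothetical names for illustration, not the repository's functions:

```python
import numpy as np

def build_layer(n_in, n_out, activation=np.tanh):
    # The weights are captured by closure; the returned function is the layer.
    rng = np.random.default_rng(0)
    W = rng.standard_normal((n_in, n_out)) * 0.1
    b = np.zeros(n_out)
    return lambda x: activation(x @ W + b)

def build_model(topology):
    # 1. build layer functions from consecutive pairs in the topology list
    layers = [build_layer(i, o) for i, o in zip(topology, topology[1:])]
    # 2. compose layer functions into a model function
    def model(x):
        for layer in layers:
            x = layer(x)
        return x
    return model

model = build_model([4, 8, 2])
y = model(np.ones((3, 4)))   # 3. evaluate the model function
print(y.shape)               # → (3, 2)
```

In the TensorFlow version, each layer closure would capture `tf.Variable` parameters instead of NumPy arrays, but the construction pattern is the same.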

functional-tf-nn/lstm

Build a multilayer LSTM from a topology list:

  1. build a cell function for each layer
  2. build layer functions from cell functions
  3. compose layer functions into a model function
  4. evaluate the model function
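The four steps above can likewise be sketched with NumPy in place of TensorFlow. All names here are illustrative, and the gate arithmetic is the standard LSTM formulation rather than the repository's exact code:

```python
import numpy as np

def build_cell(n_in, n_hidden, rng):
    # 1. build a cell function: one weight matrix for the four gates
    # (input, forget, candidate, output), captured by closure.
    W = rng.standard_normal((n_in + n_hidden, 4 * n_hidden)) * 0.1
    b = np.zeros(4 * n_hidden)
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

    def cell(x_t, state):
        h, c = state
        z = np.concatenate([x_t, h]) @ W + b
        i, f, g, o = np.split(z, 4)
        c = sigmoid(f) * c + sigmoid(i) * np.tanh(g)
        h = sigmoid(o) * np.tanh(c)
        return h, (h, c)
    return cell

def build_layer(cell, n_hidden):
    # 2. build a layer function from a cell function: map an input
    # sequence to the sequence of hidden states.
    def layer(xs):
        state = (np.zeros(n_hidden), np.zeros(n_hidden))
        outputs = []
        for x_t in xs:
            h, state = cell(x_t, state)
            outputs.append(h)
        return outputs
    return layer

def build_model(topology):
    # 3. compose layer functions into a model function
    rng = np.random.default_rng(0)
    layers = [build_layer(build_cell(i, o, rng), o)
              for i, o in zip(topology, topology[1:])]
    def model(xs):
        for layer in layers:
            xs = layer(xs)
        return xs
    return model

model = build_model([4, 8, 3])              # two stacked LSTM layers
ys = model([np.ones(4) for _ in range(5)])  # 4. evaluate on a length-5 sequence
print(len(ys), ys[0].shape)                 # → 5 (3,)
```

As in the feedforward case, each layer's parameters live in a closure, so stacking layers is just function composition over sequences.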
