
I'm using a graph theory framework to model an information flow problem.

I'm looking for an algorithm that will adjust a graph's edge weights to produce a specific behavior. As I'm working in Matlab, if there exists a dedicated toolbox that would assist in visualizing/modelling/writing the algorithm, that would be great.

The problem domain is as follows:

  • The graph's edges are weighted and directed.
  • The information flow in the graph is time dependent. That is, there is a global time variable that ticks every delta t, at which point some calculations are made for each node.
  • Nodes are assigned a numerical value that may change at each time step - let's call it the degree of activation of the node.

The model

Consists of 3 layers, each represented by a matrix where, if node i and node j are connected by an edge of weight w, then the value of the cell in the i'th column and the j'th row is w.

Graph sketch

For convenience of explanation, we'll name the layers: l1 - input, l2 - calculation, l3 - output. The layers are fully connected, that is, each node of l1 is connected to every node of l2, and likewise each node of l2 is connected to every node of l3 (the edges are directed).

At time step t, the degree of activation of each node is a function of the activation at time t-1 of all the nodes connected to it, each multiplied by the weight of the connecting edge. For example, if we consider 2 nodes of the input layer, i1 and i2, that connect to a single node of the calculation layer, c1, with weights w1 and w2 respectively, and say that at time t=0 the activations of i1 and i2 are a1 and a2 respectively, then the activation level of c1 at time t=1 equals f(w1*a1, w2*a2).
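To make the update concrete, here is a minimal Python/NumPy sketch of one propagation through the layers (the question is about Matlab, but the matrix form is identical there; the layer sizes and the choice of a sigmoid for f are illustrative assumptions, not part of the question):

```python
import numpy as np

# Hypothetical layer sizes: 4 input nodes (l1), 3 calculation nodes (l2),
# 2 output nodes (l3).
rng = np.random.default_rng(0)
W12 = rng.standard_normal((3, 4))  # W12[j, i] = weight of the edge l1_i -> l2_j
W23 = rng.standard_normal((2, 3))  # W23[k, j] = weight of the edge l2_j -> l3_k

def f(x):
    # One common choice for the combination function f: a sigmoid
    # applied to the weighted sum of incoming activations.
    return 1.0 / (1.0 + np.exp(-x))

a1 = np.array([1.0, 0.0, 1.0, 0.0])  # 0/1 activation pattern on l1 at t=0
a2 = f(W12 @ a1)                     # l2 activations at t=1
a3 = f(W23 @ a2)                     # l3 activations at t=2
```

The matrix-vector product W12 @ a1 computes, for each l2 node, exactly the weighted sum of its incoming activations described above.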

Activation flow

Prior knowledge of the model gives accurate predictions of which activation pattern (0/1 arrangement) of the input layer l1 should result in which output at the output layer l3. Now, I would like to somehow train the network to adjust the weights of the edges such that, after enough time steps (delta t), the activation dynamics carry inputs presented at the input layer to the desired output at the output layer.

So my 2 questions are:

What's a good algorithm for adjusting the weights in this system (a simple algorithm name or a link to Wikipedia would be a great start)? And second, if you know of an existing Matlab tool that could make the implementation easier, that would be even nicer.

Thanks!


1 Answer


Don't know about the algorithm, but it sounds like the Neural Network toolbox is what you need.

Arnaud
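For the algorithm half of the question: the model described (fully connected layers, weighted sums passed through a function f, known target outputs) is a standard feedforward network, and the usual training algorithm for it is backpropagation with gradient descent. A minimal Python/NumPy sketch of one training step, assuming a sigmoid for f and a squared-error loss (the helper name train_step, the learning rate, and the layer sizes are all illustrative):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_step(W12, W23, a1, target, lr=0.5):
    # Forward pass: l1 -> l2 -> l3, as in the question's update rule.
    a2 = sigmoid(W12 @ a1)
    a3 = sigmoid(W23 @ a2)
    # Output-layer error term (squared-error loss; sigmoid' = a*(1-a)).
    d3 = (a3 - target) * a3 * (1.0 - a3)
    # Propagate the error back to the calculation layer.
    d2 = (W23.T @ d3) * a2 * (1.0 - a2)
    # Gradient-descent updates on both weight matrices.
    W23 -= lr * np.outer(d3, a2)
    W12 -= lr * np.outer(d2, a1)
    return W12, W23, a3
```

Repeating train_step over the known input/output pairs drives a3 toward the target pattern; the Neural Network toolbox suggested above implements this (and faster variants such as trainlm) out of the box.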

