
I'm trying to train a basic autoencoder in MATLAB. My data consists of 430 ten-dimensional points, and my autoencoder code looks like this:

n_features = 25;
autoenc = trainAutoencoder(data, n_features, ...
    'SparsityRegularization', 1, ...
    'SparsityProportion', 0.1, ...
    'L2WeightRegularization', 0.001, ...
    'MaxEpochs', 1000, ...
    'DecoderTransferFunction','purelin');

I'm using a linear decoder, as you can see. When I run this on my dataset, autoenc ends up just learning a constant function. The input weights all end up being different, but running predict(autoenc, data) gives the same vector for every data point. What could be going on?
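One thing worth ruling out first (this is an assumption about how data is stored, since the post doesn't say): trainAutoencoder treats each column of a matrix as one training sample, so 430 ten-dimensional points should be passed as a 10-by-430 matrix, not 430-by-10. Poorly scaled features can also stall the optimizer. A minimal sanity check along those lines:

```matlab
% Assumption: 'data' might be stored as 430-by-10 (one row per point).
% trainAutoencoder expects one COLUMN per sample, i.e. 10-by-430 here.
if size(data, 1) > size(data, 2)
    data = data';   % transpose so each column is one ten-dimensional point
end

% Rescaling features to comparable ranges can help the optimizer;
% mapminmax normalizes each row (feature) to [-1, 1] by default.
data = mapminmax(data);
```

If the matrix was transposed, the network was effectively being trained on ten 430-dimensional samples, which would easily produce degenerate output.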

Edit: I looked around some more, and it seems to be a common phenomenon that autoencoders sometimes learn simply the average value of the data and call it a day. Apparently this is an optimization failure: the constant output is a local minimum. Relevant link; somewhat relevant link. Neither provides a satisfactory answer. MATLAB's autoencoder only provides scaled conjugate gradient, as far as I can tell.
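Since the training algorithm is fixed, one hedged workaround for a bad local minimum is random restarts: train several autoencoders from different random initializations and keep the one with the lowest reconstruction error. A sketch under that assumption (the 5-restart count and variable names are mine, not from the docs):

```matlab
% Random restarts: retrain from different seeds, keep the best reconstructor.
best_err = Inf;
for trial = 1:5
    rng(trial);   % different random initialization each run
    candidate = trainAutoencoder(data, n_features, ...
        'SparsityRegularization', 1, ...
        'SparsityProportion', 0.1, ...
        'L2WeightRegularization', 0.001, ...
        'MaxEpochs', 1000, ...
        'DecoderTransferFunction', 'purelin');
    recon = predict(candidate, data);
    err = mean((data(:) - recon(:)).^2);   % reconstruction MSE
    if err < best_err
        best_err = err;
        autoenc = candidate;
    end
end
```

If every restart converges to the same constant output, that points to a data or scaling problem rather than an unlucky initialization.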

These settings look pretty typical, so I wonder if there is something weird about the training data. Could you provide some more info on the data? – user1877862 Feb 2 at 3:09

It's a (nonlinear) low-dimensional embedding of cell data, originally ~6500-dimensional. Ten is sufficiently many dimensions to capture at least some structure in the data; I validated this using other methods. – jclancy Feb 2 at 17:14

How does the activation of the hidden layer look? That is, does encode(autoenc, data) also give the same vector for each data point? – user1877862 Feb 2 at 19:15
