Cogprints

Why feed-forward networks are in a bad shape

Smagt, P. van der and Hirzinger, G. (1998) Why feed-forward networks are in a bad shape. [Conference Paper] (In Press)

Full text available as: PostScript (197 Kb)

Abstract

It has often been noted that the learning problem in feed-forward neural networks is very badly conditioned. Although the special form of the transfer function is usually taken to be the cause of this ill-conditioning, we show that it is caused by the manner in which the neurons are connected. By analyzing the expected values of the Hessian in a feed-forward network, we show that even when all the learning samples are well chosen and the transfer function is not in its saturated state, the system remains poorly conditioned. We subsequently propose a change in the feed-forward network structure which alleviates this problem, and finally demonstrate the positive influence of this approach.
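The ill-conditioning the abstract refers to can be observed directly by computing the eigenvalue spread of the Hessian of the training error. The following sketch is not from the paper; it numerically estimates the Hessian of the mean-squared error of a hypothetical one-hidden-layer sigmoid network (small random weights, so the sigmoids are far from saturation) and reports its condition number:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(20, 2))   # 20 well-spread input samples
t = np.sin(X[:, 0] + X[:, 1])          # smooth target function

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def loss(w):
    # Unpack a flat weight vector: 2 inputs -> 3 hidden sigmoid units
    # -> 1 linear output (13 parameters in total).
    W1 = w[:6].reshape(3, 2)
    b1 = w[6:9]
    W2 = w[9:12]
    b2 = w[12]
    h = sigmoid(X @ W1.T + b1)
    y = h @ W2 + b2
    return 0.5 * np.mean((y - t) ** 2)

def hessian(f, w, eps=1e-4):
    # Central finite-difference estimate of the Hessian of f at w.
    n = w.size
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            e_i = np.zeros(n); e_i[i] = eps
            e_j = np.zeros(n); e_j[j] = eps
            H[i, j] = (f(w + e_i + e_j) - f(w + e_i - e_j)
                       - f(w - e_i + e_j) + f(w - e_i - e_j)) / (4 * eps**2)
    return H

# Small initial weights keep the transfer functions unsaturated, yet the
# eigenvalue spread of the Hessian is still large.
w0 = rng.normal(0.0, 0.5, size=13)
H = hessian(loss, w0)
eigs = np.linalg.eigvalsh(H)
cond = np.abs(eigs).max() / np.abs(eigs).min()
print(f"Hessian eigenvalue spread (|max|/|min|): {cond:.3e}")
```

The network size, data, and finite-difference scheme here are illustrative choices, not the analysis used in the paper; the point is only that the eigenvalue spread is large even away from sigmoid saturation.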

Item Type: Conference Paper
Subjects: Computer Science > Neural Nets; Computer Science > Statistical Models
ID Code: 495
Deposited By: van der Smagt, Patrick
Deposited On: 03 Jul 1998
Last Modified: 11 Mar 2011 08:54
