Neural Network Architecture: 4-5-3 Structure Selection

Resource Overview

The neural network adopts a 4-5-3 architecture with a learning rate of 0.28, a momentum coefficient of 0.04, and initial weights randomized between -0.5 and 0.5. This is a feedforward design: the input layer accepts 4 features, the hidden layer contains 5 neurons with nonlinear activation functions, and the output layer produces scores for 3-class classification.
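The forward pass of this 4-5-3 design can be sketched in a few lines of NumPy. Note that the activation function is an assumption here (the text only says the hidden neurons use activation functions); sigmoid is used for illustration, and the bias vectors are added for a complete, runnable example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Weight matrices sized per the 4-5-3 architecture, initialized
# uniformly in [-0.5, 0.5] as the configuration specifies.
W1 = rng.uniform(-0.5, 0.5, size=(4, 5))  # input -> hidden
b1 = np.zeros(5)
W2 = rng.uniform(-0.5, 0.5, size=(5, 3))  # hidden -> output
b2 = np.zeros(3)

def sigmoid(z):
    # Assumed activation; the source does not name one.
    return 1.0 / (1.0 + np.exp(-z))

def forward(x):
    """Propagate a batch of 4-feature inputs to 3 class scores."""
    h = sigmoid(x @ W1 + b1)     # hidden layer: 5 neurons
    return sigmoid(h @ W2 + b2)  # output layer: 3 classes

x = rng.uniform(size=(8, 4))     # batch of 8 samples, 4 features each
y = forward(x)
print(y.shape)                   # (8, 3): one 3-way score vector per sample
```

The `(8, 4) @ (4, 5)` and `(8, 5) @ (5, 3)` matrix products make the layer sizes explicit: each sample is reduced from 4 features to 5 hidden activations to 3 class scores.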

Detailed Documentation

The paper specifies a neural network configuration with 4-5-3 architecture, where the learning rate is set to 0.28 and momentum coefficient to 0.04. The initial weights are randomly initialized within the range of -0.5 to 0.5 using uniform distribution. In implementation, this typically involves creating weight matrices between layers: a 4×5 matrix connecting input to hidden layer, and a 5×3 matrix connecting hidden to output layer. These parameter selections significantly influence the network's convergence behavior and final performance during backpropagation training. The momentum term helps accelerate gradient descent in consistent directions while preventing oscillation in narrow valleys of the loss landscape.
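The initialization and the momentum update described above can be sketched as follows. This is a minimal illustration of classical momentum with the stated hyperparameters, not the paper's actual training code; the gradients here are random placeholders standing in for values that backpropagation would supply.

```python
import numpy as np

rng = np.random.default_rng(42)

LEARNING_RATE = 0.28  # eta, as specified
MOMENTUM = 0.04       # mu, as specified

# Uniformly distributed initial weights in [-0.5, 0.5].
W1 = rng.uniform(-0.5, 0.5, size=(4, 5))  # input -> hidden
W2 = rng.uniform(-0.5, 0.5, size=(5, 3))  # hidden -> output

# Velocity buffers for the momentum term, one per weight matrix.
v1 = np.zeros_like(W1)
v2 = np.zeros_like(W2)

def momentum_step(W, v, grad):
    """Classical momentum update: v <- mu*v - eta*grad; W <- W + v.

    Gradients that point the same way step after step accumulate in v
    (acceleration); sign-flipping gradients partially cancel (damping).
    """
    v[:] = MOMENTUM * v - LEARNING_RATE * grad
    W += v

# Placeholder gradients; in training these come from backpropagation.
g1 = rng.normal(size=W1.shape)
g2 = rng.normal(size=W2.shape)
momentum_step(W1, v1, g1)
momentum_step(W2, v2, g2)
```

With a momentum coefficient as small as 0.04, only 4% of the previous step carries over, so this configuration behaves close to plain gradient descent with a mild smoothing effect.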