Models & Architecture

Neural Network

A computational model loosely inspired by the brain, made of interconnected nodes (neurons) that process information in layers.

A neural network is a system of interconnected computational units called neurons, organized in layers. An input layer receives raw data, one or more hidden layers transform it, and an output layer produces a prediction or result. The connections between neurons carry weights that are adjusted during training.

Modern neural networks bear only superficial resemblance to biological brains. The math is fundamentally a series of matrix multiplications followed by activation functions. But the layered, distributed representation of information is genuinely powerful and has proven capable of learning highly complex functions.

A single neuron computes: output = activation(Σ(weight × input) + bias). Stacked millions of times across layers, this becomes a powerful function approximator.
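The formula above can be sketched directly in a few lines of Python. This is a minimal illustration, not any particular library's API; the weights, inputs, and sigmoid activation are chosen here purely for demonstration.

```python
import math

def neuron(inputs, weights, bias):
    # Weighted sum of inputs plus bias: z = Σ(weight × input) + bias
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    # Sigmoid activation squashes z into the range (0, 1)
    return 1.0 / (1.0 + math.exp(-z))

out = neuron(inputs=[0.5, -1.0], weights=[0.8, 0.2], bias=0.1)
# z = 0.4 - 0.2 + 0.1 = 0.3, so out = sigmoid(0.3) ≈ 0.574
```

Swapping in a different activation (tanh, ReLU) changes only the last line; the weighted-sum structure stays the same.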

Types of Neural Networks

  • Feedforward (MLP) — simplest form; data flows one direction
  • Convolutional (CNN) — specialized for spatial data like images
  • Recurrent (RNN) — handles sequential data with memory
  • Transformer — uses attention mechanisms; dominates modern AI
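The simplest of these, the feedforward network, is just neurons stacked into dense layers, with each layer's outputs feeding the next. A toy sketch, with hand-picked illustrative weights (real networks learn these values during training):

```python
import math

def dense_layer(inputs, weights, biases):
    # One fully connected layer: each row of `weights` drives one output neuron.
    return [math.tanh(sum(w * x for w, x in zip(row, inputs)) + b)
            for row, b in zip(weights, biases)]

# A tiny network: 2 inputs -> 3 hidden neurons -> 1 output
hidden = dense_layer([1.0, -0.5],
                     [[0.2, 0.4], [-0.3, 0.1], [0.5, -0.2]],
                     [0.0, 0.1, -0.1])
output = dense_layer(hidden, [[0.6, -0.4, 0.3]], [0.05])
```

Data flows strictly left to right, which is what "feedforward" means; CNNs, RNNs, and Transformers add structure (shared filters, recurrence, attention) on top of this basic layered computation.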

The number of parameters in a neural network determines its capacity. GPT-4 is estimated to have over 1 trillion parameters. Training these massive networks requires specialized hardware and weeks of compute time across thousands of GPUs.
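For a fully connected network, the parameter count follows directly from the layer sizes: each layer contributes one weight per input-output pair plus one bias per output. A small sketch, using an MNIST-sized architecture (784-256-10) as an illustrative example:

```python
def mlp_param_count(layer_sizes):
    # Each consecutive pair of layers contributes (n_in * n_out) weights
    # plus n_out biases.
    return sum(n_in * n_out + n_out
               for n_in, n_out in zip(layer_sizes, layer_sizes[1:]))

count = mlp_param_count([784, 256, 10])
# (784*256 + 256) + (256*10 + 10) = 203,530 parameters
```

Even this small network has over 200,000 parameters; trillion-parameter models are roughly five million times larger, which is why training them demands thousands of GPUs.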
