What’s a parameter in an LLM?

The size of current large language models (LLMs) is typically measured by the number of parameters. GPT-3 reportedly has 175 billion parameters. Phi-1.5 has just 1.3 billion, while Llama comes in versions ranging from 7 billion to 70 billion parameters.

[...]

LLMs are neural network models. The building block of a neural network is very similar to our current model for estimating house prices*. Neural networks are built by arranging many of these simple models into layers of nodes, as sketched below.
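
To make the idea concrete, here is a minimal sketch (not from the article, with made-up layer sizes): each node is a little linear model like the house-price estimator, its parameters are its weights plus a bias, and a network's parameter count is just the sum over all nodes in all layers.

```python
# A single "node" is a linear model like the house-price estimator:
# price = w1 * area + w2 * rooms + b
# Its parameters are the weights (one per input) plus one bias.

def node_parameter_count(num_inputs: int) -> int:
    """One weight per input, plus one bias term."""
    return num_inputs + 1

# A tiny hypothetical network: 3 inputs -> a layer of 4 nodes -> a layer of 2 nodes.
# Every node in a layer receives all outputs of the previous layer as its inputs.
layer_sizes = [3, 4, 2]

total_params = 0
for inputs, nodes in zip(layer_sizes[:-1], layer_sizes[1:]):
    total_params += nodes * node_parameter_count(inputs)

print(total_params)  # (3+1)*4 + (4+1)*2 = 26 parameters in this toy network
```

Scale that same bookkeeping up to thousands of nodes per layer and many layers, and the totals quickly reach the billions quoted above for GPT-3, Phi-1.5, and Llama.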