When we talk about the implementation details of AI systems, we usually think of mathematical topics: equations, probability, regression, and error minimization.
But the moment the idea of weight enters the picture, the discussion shifts toward physics: force, balance, inertia, shift, bias, and so on become relevant. Weights influence the behavior of these systems in ways that resemble physical phenomena, making the intersection of AI and physics more apparent than it might initially seem.
This year's (2024) Nobel Laureates in Physics used tools from their discipline to develop methods that helped lay the groundwork for modern machine learning. John Hopfield pioneered a structure that can store and reconstruct information, while Geoffrey Hinton invented a method that autonomously uncovers properties in data; both are foundational to the large-scale artificial neural networks in use today.
https://www.nobelprize.org/prizes/physics/2024/press-release/
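To make the "store and reconstruct" idea concrete, here is a minimal sketch of a Hopfield-style associative memory in Python. The function names and the toy pattern are illustrative, not taken from the press release: Hebbian learning writes a pattern into a weight matrix, and repeated threshold updates pull a corrupted input back toward the stored memory.

```python
# Minimal sketch of a Hopfield-style associative memory:
# Hebbian weights store a pattern, and iterative updates reconstruct
# it from a corrupted input.
import numpy as np

def train(patterns):
    """Build a weight matrix from bipolar (+1/-1) patterns via the Hebbian rule."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0)  # no self-connections
    return W / patterns.shape[0]

def recall(W, state, steps=10):
    """Repeatedly update each unit toward a lower-energy state until it settles."""
    for _ in range(steps):
        for i in range(len(state)):
            state[i] = 1 if W[i] @ state >= 0 else -1
    return state

stored = np.array([[1, -1, 1, -1, 1, -1, 1, -1]])
W = train(stored)
noisy = stored[0].copy()
noisy[:2] *= -1                  # flip two bits to corrupt the memory
print(recall(W, noisy))          # converges back to the stored pattern
```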
When working with multivariate, multidimensional vectors, an essential component is the assignment of weights (see the sketch after this list). As soon as weights enter the equation, a host of physics-related phenomena come into play, such as:
- Direction
- Inclination
- Balance
- Gravitational pull
- Attraction
- Groups forming clusters
- Clusters forming thought processes
- Inertia
- Paradigm shifts
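To ground the role of weights, here is a minimal sketch of a single artificial neuron (the variable names and values are illustrative): each weight scales one dimension of the input vector, pulling the output in a particular direction, while the bias shifts the balance point.

```python
# Minimal sketch of weights acting on a multidimensional input vector:
# each weight scales ("pulls on") one dimension, and the bias shifts the
# balance point before the activation decides the output.
import numpy as np

def neuron(x, weights, bias):
    """Weighted sum of inputs plus bias, squashed by a sigmoid activation."""
    z = np.dot(weights, x) + bias
    return 1.0 / (1.0 + np.exp(-z))

x = np.array([0.2, -1.5, 3.0])        # a 3-dimensional input vector
weights = np.array([0.8, -0.1, 0.4])  # illustrative weights, one per dimension
bias = -0.5                           # shifts where the neuron "tips over"
print(neuron(x, weights, bias))       # output between 0 and 1
```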
Physics, in many ways, shapes the core concepts behind how neural networks operate!