[Question] Matrix Multiplication

What does “matrix multiplication” usually mean? (I know, context.)

I used to assume that it means a matrix-to-matrix operation, but I now suspect it rarely does; in practice it often seems to mean matrix-to-vector multiplication.

In scientific computing it has to do with solving systems of linear equations.
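To pin down what I mean by the linear-systems case, here is a tiny sketch in plain Python (the numbers are made up by me): checking whether a candidate solution x satisfies A·x = b is itself a matrix-to-vector multiplication, one dot product per row.

```python
# Toy 2x2 system (my own example values):
#   1*x0 + 2*x1 = 5
#   3*x0 + 4*x1 = 11
A = [[1.0, 2.0],
     [3.0, 4.0]]
b = [5.0, 11.0]
x = [1.0, 2.0]  # candidate solution

# Matrix-vector product A @ x: each entry is the dot product
# of one row of A with the vector x.
Ax = [sum(A[i][j] * x[j] for j in range(2)) for i in range(2)]
print(Ax)  # → [5.0, 11.0], so x solves the system
```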

In AI inference it has to do with multiplying a weight matrix by an activation vector.
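Again just to make my question concrete: this is what I picture a single linear layer doing during inference, sketched in plain Python with invented toy numbers (2 outputs, 3 inputs).

```python
# Hypothetical toy layer (values are mine, not from any real model):
W = [[0.1, 0.2, 0.3],
     [0.4, 0.5, 0.6]]   # weight matrix, one row per output neuron
x = [1.0, 2.0, 3.0]     # activation vector from the previous layer

# y = W @ x: each output activation is the dot product of
# one weight row with the input activation vector.
y = [sum(w * a for w, a in zip(row, x)) for row in W]
print(y)  # roughly [1.4, 3.2], up to float rounding
```

So as far as I can tell, one inference step through one layer is matrix-to-vector, not matrix-to-matrix (batching several inputs would turn x into a matrix, which may be where the confusion comes from).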

When I read journalistic texts about specialized hardware for graphics, AI, simulation, or HPC, they usually just write that the hardware does “matrix multiplications”.

Then the term “tensor” is used too. As an EE I know vector fields, where a 4D input vector (position and time) is mapped to a 3D output vector of complex numbers (a phasor, e.g. the Poynting vector). I suppose that is a tensor. But I do not understand what a tensor is in AI.

So, which meaning applies in which context? And does the hardware actually have to deal with two matrices, or just one, or is it really just loads of dot products?
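By “loads of dot products” I mean this: the textbook matrix-matrix product already reduces to one dot product per output element. A plain-Python sketch of my mental model:

```python
def matmul(A, B):
    """C = A @ B, where C[i][j] is the dot product of
    row i of A with column j of B."""
    n, k, m = len(A), len(B), len(B[0])
    return [[sum(A[i][p] * B[p][j] for p in range(k)) for j in range(m)]
            for i in range(n)]

A = [[1, 2],
     [3, 4]]
B = [[5, 6],
     [7, 8]]
print(matmul(A, B))  # → [[19, 22], [43, 50]]
```

If that is right, then whether the hardware sees “two matrices” or “a matrix and a vector” may just be a question of how many of these dot products it schedules at once.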