Reading:
Q: But isn't linear algebra mainly just vector and matrix operations?
(There are actual higher-order versions of linear algebra, referred to as tensor operations, such as Tucker decompositions, but this is a small research niche and not related to the math behind TensorFlow etc.)
Behind the scenes, it's just another list of numbers with some extra info about the dimensions.
https://docs.scipy.org/doc/numpy-1.13.0/reference/arrays.ndarray.html
A $d$-dimensional data structure, containing $n_1 \times n_2 \times \cdots \times n_d$ numbers
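As a quick illustration of the "list of numbers plus shape info" idea (the array here is just an example, built with np.arange), the same six numbers can be viewed flat or as a 3x2 structure; only the shape metadata changes:

import numpy as np
A = np.arange(6)       # flat list of 6 numbers: array([0, 1, 2, 3, 4, 5])
B = A.reshape(3, 2)    # same 6 numbers, now viewed as a 3x2 structure
A.shape                # (6,)
B.shape                # (3, 2)
B.ravel()              # back to the flat view: array([0, 1, 2, 3, 4, 5])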
np.random.rand(3)
array([0.57100166, 0.07586713, 0.90062779])
np.random.rand(3,2) # note: the dimensions are passed as separate arguments, not as a tuple (unlike many other numpy functions)
array([[0.49662458, 0.23158612],
       [0.94269648, 0.84962285],
       [0.68847847, 0.73942405]])
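For comparison (the particular functions below are just illustrative), most other numpy constructors expect the shape as a single tuple:

np.zeros((3, 2))                  # shape passed as a tuple
np.random.random_sample((3, 2))   # tuple-based equivalent of np.random.rand(3, 2)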
np.random.rand(3,2,4)
array([[[0.09840227, 0.26310991, 0.58589047, 0.33265107],
        [0.89329091, 0.99117784, 0.49475921, 0.08047502]],

       [[0.09570909, 0.28074458, 0.15119848, 0.99682941],
        [0.46506878, 0.0333583 , 0.08488412, 0.26735574]],

       [[0.65350538, 0.43238726, 0.79292367, 0.58734204],
        [0.3019322 , 0.21985196, 0.41835067, 0.44614492]]])
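As a small aside (values are random, so outputs will differ), indexing with one integer per dimension picks out a single number, while fixing only some indices returns a lower-dimensional slice:

A = np.random.rand(3, 2, 4)
A[0, 1, 3]     # a single number
A[0].shape     # (2, 4): fixing the first index leaves a 2x4 slice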
T = np.random.rand(3,2,4,3,2)
T.shape
(3, 2, 4, 3, 2)
T.ndim
5
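Consistent with the $n_1 \times n_2 \times \cdots \times n_d$ count above, the total number of stored values is just the product of the dimensions (here 3*2*4*3*2 = 144):

T.size    # 144
T.dtype   # dtype('float64'): the type of every stored number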