Tensors
We've dealt with 1-D concepts in algebra and 2-D functions in vector calculus, and now here we are, diving into the domain of high-dimensional computation. This is how we start, from scalar beginnings, and the best is yet to come.
To a physicist, what we will be building can be considered function approximators. From an engineering perspective, they're more analogous to controllers and adaptive filters. To mathematicians, they're universal approximators, grounded in the universal approximation theorem.
Generally, we will consider the tools as likelihood estimators.
In the illustrations above, a vector (not explicitly visualized) corresponds to a single row or column of the grayscale image: a tensor of dimension 1.
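The scalar-to-image progression can be sketched in code. This is a minimal illustration using NumPy's `ndarray` as a stand-in for a tensor (the array shapes, such as the 28×28 image size, are assumptions for the example); `ndim` reports the number of dimensions:

```python
import numpy as np

scalar = np.array(5.0)          # 0-D tensor: a single number
vector = np.array([1.0, 2.0])   # 1-D tensor: e.g. one row of a grayscale image
matrix = np.zeros((28, 28))     # 2-D tensor: e.g. a full grayscale image
batch = np.zeros((8, 28, 28))   # 3-D tensor: e.g. a stack of images

print(scalar.ndim, vector.ndim, matrix.ndim, batch.ndim)  # 0 1 2 3

# Slicing one row out of the 2-D image yields a 1-D tensor:
row = matrix[0]
print(row.shape)  # (28,)
```

The same shape-and-rank conventions carry over directly to tensor libraries used for machine learning.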
Tensor networks are a natural way to parameterize interesting and powerful machine learning models.
It's now time to play with gradients. In the next section we delve into differentiation, how computations persist in graphs, and what goes on in the backward call.