What is a Tensor in Deep Learning?
The term “tensor” is often misunderstood. Let’s work out what tensors are through familiar physical examples such as velocity, angular momentum, the stress tensor, and the electromagnetic tensor.
A tensor is a generalization of vectors and matrices and is easily understood as a multidimensional array. It is a foundational data structure in machine learning: the training and operation of deep learning models can be described largely in terms of tensors and operations on them.
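To make that concrete, here is a minimal sketch using NumPy; the library choice and the specific shapes (a batch of images feeding a fully connected layer) are illustrative assumptions, not details from this post:

```python
import numpy as np

# A mini-batch of 32 RGB images, each 64x64 pixels, is naturally a
# 4-dimensional tensor with shape (batch, height, width, channels).
images = np.zeros((32, 64, 64, 3), dtype=np.float32)
print(images.ndim, images.shape)   # 4 (32, 64, 64, 3)

# The weights of a fully connected layer form a 2-dimensional tensor
# (a matrix), and its biases form a 1-dimensional tensor (a vector).
weights = np.random.randn(64 * 64 * 3, 128).astype(np.float32)
biases = np.zeros(128, dtype=np.float32)

# A forward pass through the layer is just tensor operations:
# flatten each image, then matrix-multiply and add the bias.
activations = images.reshape(32, -1) @ weights + biases
print(activations.shape)           # (32, 128)
```

Everything here, from the input data to the layer parameters to the intermediate activations, is a tensor of some dimension, which is exactly the sense in which deep learning is “described in terms of tensors.”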
TensorFlow, Tensorlab, Deep Tensorized Networks, Tensorized LSTMs… it’s no surprise that the word “tensor” is embedded in the names of many machine learning technologies. But what are tensors? And how do they relate to machine learning? In part one of Quick ML Concepts, I aim to provide a brief but clear summary of what tensors are.
Don’t let the word “tensor” scare you. It is nothing more than a simple mathematical concept. Tensors are mathematical objects that generalize scalars, vectors, and matrices to higher dimensions. If you are familiar with basic linear algebra, you should have no trouble understanding what tensors are. In short, a zero-dimensional tensor is just a scalar, a one-dimensional tensor can be represented as a vector, and a two-dimensional tensor, as you may have guessed, can be represented as a matrix.
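The correspondence is easy to see in code. Below is a short sketch, again assuming NumPy as the array library (any tensor library such as TensorFlow or PyTorch would behave the same way), showing how tensors of increasing dimension reduce to the familiar scalar, vector, and matrix:

```python
import numpy as np

scalar = np.array(3.14)              # 0-dimensional tensor: a scalar
vector = np.array([1.0, 2.0, 3.0])   # 1-dimensional tensor: a vector
matrix = np.array([[1.0, 2.0],
                   [3.0, 4.0]])      # 2-dimensional tensor: a matrix
cube = np.zeros((2, 3, 4))           # 3-dimensional tensor: a "stack" of matrices

for t in (scalar, vector, matrix, cube):
    print(t.ndim, t.shape)
# 0 ()
# 1 (3,)
# 2 (2, 2)
# 3 (2, 3, 4)
```

The number of dimensions (often called the rank or order) is what distinguishes one tensor from another; beyond two dimensions there is no special name, just a higher-dimensional array.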