Before we dive into the linear regression model, let us consider what statistical models do for us. We are surrounded by enormous amounts of data: web logs, search engine queries, location data from smartphones, and so on. We cannot understand what such data means just by looking at it, because there is simply too much of it. So what should we do to understand it and make better business decisions? With massive data we easily lose sight of the big picture, because it carries too much information at once. How can we reduce the dimensions of the data so that we can understand what it means?
Here I would like to introduce the inner product, which is sometimes called the dot product. Let me quote the definition of the inner product from Wikipedia.
In mathematics, the dot product, or scalar product (or sometimes inner product in the context of Euclidean space), is an algebraic operation that takes two equal-length sequences of numbers (usually coordinate vectors) and returns a single number.
This “single number” is very important for us, because we can understand what one number means. By using the inner product, we can collapse a lot of data into a single number. Whether we have 2 or 3 values or a million or a billion, we can convert them all into one number. Isn't that wonderful? We can understand data if it is only a single number!
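To make the definition above concrete, here is a minimal sketch of the inner product in plain Python. The function name and the example numbers are my own illustration, not from any particular library:

```python
def inner_product(x, y):
    """Collapse two equal-length sequences of numbers into a single number."""
    assert len(x) == len(y), "inner product needs equal-length inputs"
    return sum(a * b for a, b in zip(x, y))

# Two length-3 sequences go in; one number comes out.
print(inner_product([1, 2, 3], [4, 5, 6]))  # 1*4 + 2*5 + 3*6 = 32
```

Note that the same function works unchanged whether the sequences have 3 elements or a million: the output is always a single number.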
The idea is simple, but we can apply it to many statistical models: for example, the linear regression model, the logistic regression model, the support vector machine, and so on.
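To hint at why these models all rest on the inner product, here is a sketch showing that a linear regression prediction is just an inner product of features and weights (plus a bias), and a logistic regression prediction is the same inner product squashed into (0, 1). The weights and the observation below are made-up numbers for illustration; in practice they would come from training:

```python
import math

def inner_product(x, w):
    return sum(a * b for a, b in zip(x, w))

w = [0.4, -0.2, 1.5]   # hypothetical fitted weights
bias = 0.1             # hypothetical fitted bias
x = [1.0, 2.0, 0.5]    # one observation (three features)

# Linear regression: the prediction IS an inner product plus a bias.
y_linear = inner_product(x, w) + bias

# Logistic regression: the same inner product, passed through the
# sigmoid function so the result lands between 0 and 1.
y_logistic = 1.0 / (1.0 + math.exp(-(inner_product(x, w) + bias)))

print(y_linear)    # a single number: the predicted value
print(y_logistic)  # a single number between 0 and 1: a probability
```

In both cases, many features are reduced to one understandable number, exactly the point made above.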
The inner product captures the essence of what statistical models do: it converts a lot of data into a single number, which we can understand. This is exactly what we want, because we are surrounded by so much data now!
So going forward, I would like to focus on the inner product whenever a new statistical model is introduced. It enables us to understand how statistical models work! For beginners in data analysis especially, I strongly recommend getting familiar with the inner product. Then we can move on to the next phase and introduce the linear regression model next week!