For a given nonlinear mapping Φ, the input data space ℝⁿ can be mapped into the feature space H:

Φ : ℝⁿ → H
x ↦ Φ(x)
The feature map Φ is an arbitrary map into the feature space H.
For example, if H is finite-dimensional (say, of dimension m), you can pick an orthonormal basis for it and regard each component of Φ(x) in that basis as one feature. These components are the scalar features.
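As a concrete illustration (a minimal sketch, not something fixed by the text: the degree-2 polynomial map, NumPy, and the sample vectors are my own choices), the snippet below maps ℝ² into H = ℝ³ via Φ(x₁, x₂) = (x₁², √2·x₁x₂, x₂²). Each component of Φ in the standard basis of ℝ³ is one scalar feature, and the inner product taken in H reproduces the polynomial kernel (x·y)² computed in the input space.

```python
import numpy as np

def phi(x):
    """Degree-2 polynomial feature map: R^2 -> H = R^3.
    Each component in the standard basis of R^3 is one scalar feature."""
    x1, x2 = x
    return np.array([x1**2, np.sqrt(2.0) * x1 * x2, x2**2])

x = np.array([1.0, 2.0])
y = np.array([3.0, 0.5])

# Inner product taken in the feature space H ...
feature_inner = phi(x) @ phi(y)

# ... matches the polynomial kernel (x . y)^2 evaluated in the input space R^2.
kernel_value = (x @ y) ** 2

print(feature_inner, kernel_value)  # 16.0 16.0
```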
Inner product space
orthonormal basis
In mathematics, particularly linear algebra, an orthonormal basis for an inner product space V with finite dimension is a basis for V whose vectors are orthonormal, that is, they are all unit vectors and orthogonal to each other.
For example, the standard basis for a Euclidean space ℝⁿ is an orthonormal basis, where the relevant inner product is the dot product of vectors. The image of the standard basis under a rotation or reflection (or any orthogonal transformation) is also orthonormal, i.e., applying an orthogonal transformation such as a rotation to the standard basis yields vectors that are again orthonormal (see the numerical check after the glossary below), and every orthonormal basis for ℝⁿ arises in this fashion.
* standard basis: the set of unit vectors pointing along the axes of a Cartesian coordinate system in Euclidean space
Every vector a in three dimensions is a linear combination of the standard basis vectors i, j and k.
* basis: a set of linearly independent vectors that spans the vector space; in other words, vectors that give every vector in the space a unique representation as a linear combination
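A quick numerical check of the claim above that the image of the standard basis under an orthogonal transformation stays orthonormal (a sketch assuming NumPy; the rotation angle is arbitrary):

```python
import numpy as np

theta = np.pi / 6  # arbitrary rotation angle
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])  # an orthogonal transformation: Q @ Q.T = I

e1, e2 = np.eye(2)        # standard basis of R^2
v1, v2 = Q @ e1, Q @ e2   # image of the standard basis under the rotation

print(np.isclose(v1 @ v1, 1.0), np.isclose(v2 @ v2, 1.0))  # True True  (unit length)
print(np.isclose(v1 @ v2, 0.0))                            # True       (mutually orthogonal)
```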
For a general inner product space V, an orthonormal basis can be used to define normalized orthogonal coordinates on V. Under these coordinates, the inner product becomes a dot product of vectors. (The inner product is the generalization of the dot product to abstract vector spaces over a field of scalars, either the field of real numbers or the field of complex numbers.)
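The sketch below illustrates that last point under an assumed setup of my own: a non-standard inner product ⟨u, v⟩ = uᵀMv on ℝ² with a hand-picked positive-definite M, an orthonormal basis built from its Cholesky factor, and a check that the inner product equals the plain dot product of the coordinate vectors. The matrix M and the Cholesky construction are illustrative choices, not part of the original text.

```python
import numpy as np

# A non-standard inner product on R^2: <u, v> = u.T @ M @ v, with M symmetric positive definite.
M = np.array([[2.0, 0.5],
              [0.5, 1.0]])

# M = L @ L.T (Cholesky); the columns of B = inv(L.T) are orthonormal w.r.t. <.,.>_M.
L = np.linalg.cholesky(M)
B = np.linalg.inv(L.T)
print(np.allclose(B.T @ M @ B, np.eye(2)))  # True: B is an orthonormal basis of (R^2, <.,.>_M)

u = np.array([1.0, 3.0])
v = np.array([-2.0, 0.5])

# Coordinates of u and v in the basis B (solve B @ c = u).
cu, cv = np.linalg.solve(B, u), np.linalg.solve(B, v)

# In orthonormal coordinates the inner product reduces to the ordinary dot product.
print(np.isclose(u @ M @ v, cu @ cv))       # True
```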
| orthogonal | orthonormal |
|---|---|
| Two vectors are orthogonal if their dot product is zero; orthogonal vectors are perpendicular to each other. | 1) orthogonal and 2) of unit length |
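A tiny check of the distinction in the table (assuming NumPy; the vectors are arbitrary examples):

```python
import numpy as np

a = np.array([2.0, 0.0])
b = np.array([0.0, 3.0])

# Orthogonal: the dot product is zero (the vectors are perpendicular) ...
print(np.isclose(a @ b, 0.0))                # True
# ... but not orthonormal, because they are not unit vectors.
print(np.linalg.norm(a), np.linalg.norm(b))  # 2.0 3.0

# Normalizing makes the pair orthonormal: orthogonal AND unit length.
a_hat, b_hat = a / np.linalg.norm(a), b / np.linalg.norm(b)
print(np.isclose(a_hat @ b_hat, 0.0),
      np.isclose(a_hat @ a_hat, 1.0),
      np.isclose(b_hat @ b_hat, 1.0))        # True True True
```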
Reference
https://en.wikipedia.org/wiki/Standard_basis
https://ko.wikipedia.org/wiki/%EA%B8%B0%EC%A0%80_(%EC%84%A0%ED%98%95%EB%8C%80%EC%88%98%ED%95%99)
https://www.collimator.ai/reference-guides/what-is-orthogonal-vs-orthonormal
https://en.wikipedia.org/wiki/Inner_product_space