Cauchy-Schwarz Inequality

This note briefly states and proves one of the most famous inequalities in geometry and analysis. Theorem statement: $$ \left( \sum_{i=1}^{n} x_{i}y_{i} \right)^{2} \leq \left( \sum_{i=1}^{n} x_{i}^{2} \right) \left( \sum_{i=1}^{n} y_{i}^{2} \right) $$ or, equivalently, $$ \lvert \langle u, v \rangle \rvert^{2} \leq \langle u, u \rangle \cdot \langle v, v \rangle $$ where $u = \left( x_{1}, \dots, x_{n} \right)$, $v = \left( y_{1}, \dots, y_{n} \right)$, and $\langle \cdot, \cdot \rangle$ is the inner product. Equality holds if and only if $u$ and $v$ are linearly dependent (this is easy to prove from the vector point of view). ...
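For reference, here is a minimal sketch of the classical discriminant proof; the full note may argue differently.

```latex
% Sketch of the classical discriminant proof; the note may use another argument.
% For every $t \in \mathbb{R}$, positive definiteness of the inner product gives
0 \leq \langle u + tv,\, u + tv \rangle
  = \langle u, u \rangle + 2t\langle u, v \rangle + t^{2}\langle v, v \rangle.
% A quadratic in $t$ that is nonnegative everywhere has discriminant $\leq 0$:
4\langle u, v \rangle^{2} - 4\langle u, u \rangle\langle v, v \rangle \leq 0
\iff \lvert \langle u, v \rangle \rvert^{2} \leq \langle u, u \rangle \cdot \langle v, v \rangle.
```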

3 min · Xuanqiang 'Angelo' Huang

Inner product spaces

This set of notes tries to fill in what I didn't learn in the 2021 algebra course. It's about inner product spaces. A good online reference on the topic is wilkinson. Definitions: Inner product space. We define the vector space $V$ to be an inner product space if we define an inner product operator ($\langle \cdot, \cdot \rangle : V \times V \to \mathbb{R}$) such that the following hold. It is linear in both arguments (by symmetry it suffices to state it for the first): $$ \langle \alpha x_{1} + \beta x_{2}, y \rangle = \alpha \langle x_{1}, y \rangle + \beta \langle x_{2}, y \rangle $$ It is a symmetric operator: $\langle x, y \rangle = \langle y, x \rangle$. It is positive definite, that is, $\forall x \in V: \langle x, x \rangle \geq 0$, with equality only if $x = \boldsymbol{0}$. A standard example of such an operator is the Euclidean dot product, which induces the Euclidean norm and, once normalized, the cosine of the angle between two vectors. Note that not every norm comes from an inner product: among the $p$-norms, only the $2$-norm does. ...
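As a small numeric illustration, here is a sketch of my own (the function name `inner` is illustrative, not from the note) checking the three axioms for the standard dot product on $\mathbb{R}^n$:

```python
import numpy as np

def inner(x: np.ndarray, y: np.ndarray) -> float:
    # The standard dot product on R^n; the name "inner" is mine.
    return float(np.dot(x, y))

rng = np.random.default_rng(0)
x, y, z = rng.standard_normal((3, 4))
a, b = 2.0, -3.0

# Linearity in the first argument: <a x + b y, z> = a <x, z> + b <y, z>
assert np.isclose(inner(a * x + b * y, z), a * inner(x, z) + b * inner(y, z))
# Symmetry: <x, y> = <y, x>
assert np.isclose(inner(x, y), inner(y, x))
# Positive definiteness: <x, x> >= 0, with equality only at the zero vector
assert inner(x, x) >= 0 and inner(np.zeros(4), np.zeros(4)) == 0.0

# The induced norm is the 2-norm; the cosine of the angle falls out directly:
cos_angle = inner(x, y) / np.sqrt(inner(x, x) * inner(y, y))
```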

4 min · Xuanqiang 'Angelo' Huang

Introduzione algebra

This whole part is done formally in Sistemi Lineari e determinanti, so you could skip it entirely. Linear equations: the goal of linear algebra is to solve $n$ first-degree equations in $n$ unknowns, something it accomplishes with great success! Let us now define more precisely what a linear equation is. Definition: a linear equation is an equation whose coefficients belong to some field (which can be $\mathbb{R}$) and whose unknowns have degree 1 and are independent: ...
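To make the definition concrete, here is a quick example of my own (not from the note):

```latex
% Example of mine, not from the note: over the field $\mathbb{R}$,
% the following equation is linear (every unknown has degree 1):
2x_{1} - 3x_{2} + x_{3} = 5
% while these are not linear:
x_{1}x_{2} = 1, \qquad x_{1}^{2} + x_{2} = 0
```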

5 min · Xuanqiang 'Angelo' Huang

Multi-Variable Derivatives

Multi-variable derivative. For people who are not used to matrix derivatives (like me), it can be useful to see why $$ \frac{ \partial u^{T}Su }{ \partial u } = 2Su $$ First, note that if you differentiate with respect to a vector (or matrix), the output has the same shape as that vector (or matrix). The notation just means differentiating with respect to every single component independently and then stacking the results, so it is better understood as $$ \frac{ \partial u^{T}Su }{ \partial u } = \begin{bmatrix} \frac{ \partial u^{T}Su }{ \partial u_{1} } \\ \vdots \\ \frac{ \partial u^{T}Su }{ \partial u_{M} } \end{bmatrix} $$ So we can compute each derivative independently; it's just a lot of manual work! We see that $u^{T}Su$ is a quadratic form, studied in [Massimi minimi multi-variabile#Forme quadratiche](/notes/massimi-minimi-multi-variabile#forme-quadratiche), so it amounts to computing $$ u^{T}Su = \sum_{i, j = 1}^{M} u_{i}u_{j}S_{ij} \implies \frac{ \partial u^{T}Su }{ \partial u_{1} } = 2u_{1}S_{11} + \sum_{j \neq 1}^{M}(u_{j}S_{1j} + u_{j}S_{j1}) = 2\left( u_{1}S_{11} + \sum_{j \neq 1}u_{j}S_{1j} \right) = 2(Su)_{1} $$ The last equality holds because $S$ is a symmetric matrix, and the result is exactly the first component of the vector $Su$, multiplied by 2. ...
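A quick numeric sanity check of the identity, as a sketch of my own (the note derives it by hand, component by component), comparing finite differences against $2Su$ for a symmetric $S$:

```python
import numpy as np

rng = np.random.default_rng(0)
M = 5
A = rng.standard_normal((M, M))
S = (A + A.T) / 2          # symmetrize so the 2Su formula applies
u = rng.standard_normal(M)

f = lambda v: v @ S @ v    # the quadratic form u^T S u

# Central finite differences, one component at a time
eps = 1e-6
grad_fd = np.array([
    (f(u + eps * e) - f(u - eps * e)) / (2 * eps)
    for e in np.eye(M)
])

assert np.allclose(grad_fd, 2 * S @ u, atol=1e-6)
```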

5 min · Xuanqiang 'Angelo' Huang

Sistemi Lineari e determinanti

4.1 Linear systems. The good thing is that we can analyze a linear system using all the theorems we have developed so far, so we are much better equipped to attack this problem. We define a linear system as $Ax = b$, with $A$ the associated matrix. 4.1.1 Preimage. Given a linear map $F: V \to W$, the preimage of a point is the set of vectors of $V$ that end up at that point; in mathematical notation: ...
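A small sketch of my own (not from the note) of solving $Ax = b$ numerically; when $A$ is square and invertible, the preimage of $b$ is a single point:

```python
import numpy as np

# Solving Ax = b when A is square and invertible, in which case the
# preimage of b under the map x -> Ax is exactly one point.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([3.0, 5.0])

x = np.linalg.solve(A, b)      # preferred over computing inv(A) @ b
assert np.allclose(A @ x, b)   # x is indeed in the preimage of b

# If A were singular, the preimage would be empty or an affine subspace:
# a particular solution plus the kernel (null space) of A.
```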

6 min · Xuanqiang 'Angelo' Huang