Math401 Topic 2: Finite-dimensional Hilbert spaces
Recall that a complex number $z = a + bi$ is a tuple of two real numbers $(a, b)$, with addition and multiplication defined by
$$(a + bi) + (c + di) = (a + c) + (b + d)i, \qquad (a + bi)(c + di) = (ac - bd) + (ad + bc)i,$$
or in polar form,
$$z = re^{i\theta} = r(\cos\theta + i\sin\theta),$$
where $r = |z| = \sqrt{a^2 + b^2}$ and $\theta = \arg z$.
The complex conjugate of $z = a + bi$ is $\bar{z} = a - bi$.
Section 1: Finite-dimensional Complex Vector Spaces
Throughout, the scalar field is the field of complex numbers $\mathbb{C}$ (occasionally the field of real numbers $\mathbb{R}$).
Definition of vector space
A vector space $V$ over a field $F$ is a set equipped with an addition $+ : V \times V \to V$ and a scalar multiplication $\cdot : F \times V \to V$, satisfying the following axioms:
- Addition is associative and commutative. For all $u, v, w \in V$,
Associativity: $(u + v) + w = u + (v + w)$.
Commutativity: $u + v = v + u$.
-
Additive identity: There exists an element $0 \in V$ such that $v + 0 = v$ for all $v \in V$.
-
Additive inverse: For each $v \in V$, there exists an element $-v \in V$ such that $v + (-v) = 0$.
-
Multiplicative identity: The field identity $1 \in F$ satisfies $1 \cdot v = v$ for all $v \in V$.
-
Compatibility: For all $a, b \in F$ and $v \in V$, $a(bv) = (ab)v$.
-
Distributivity: For all $a, b \in F$ and $u, v \in V$,
$$a(u + v) = au + av, \qquad (a + b)v = av + bv.$$
A vector in $F^n$ is an ordered $n$-tuple of elements of the field $F$.
If we consider $F = \mathbb{C}$, $V = \mathbb{C}^n$, then $v = (v_1, \dots, v_n)$ with $v_i \in \mathbb{C}$ are vectors.
The addition and scalar multiplication are defined componentwise:
$$u + v = (u_1 + v_1, \dots, u_n + v_n), \qquad cv = (cv_1, \dots, cv_n).$$
The matrix transpose is defined by $(A^T)_{ij} = A_{ji}$.
The complex conjugate transpose is defined by $(A^\dagger)_{ij} = \overline{A_{ji}}$.
In physics, the complex conjugate is sometimes denoted by $z^*$ instead of $\bar{z}$. The complex conjugate transpose is sometimes denoted by $A^\dagger$ instead of $\overline{A}^T$.
Hermitian inner product and norms
On $\mathbb{C}^n$, the Hermitian inner product is defined by
$$\langle u, v \rangle = \sum_{i=1}^n \overline{u_i}\, v_i = u^\dagger v.$$
The norm is defined by
$$\|v\| = \sqrt{\langle v, v \rangle} = \Big(\sum_{i=1}^n |v_i|^2\Big)^{1/2}.$$
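As a quick numerical sketch of these two formulas (the vectors here are arbitrary examples; note `np.vdot` conjugates its first argument, matching our convention):

```python
import numpy as np

u = np.array([1 + 2j, 3 - 1j])
v = np.array([2 - 1j, 1j])

# Hermitian inner product <u, v> = sum_i conj(u_i) v_i, conjugate-linear
# in the first argument; np.vdot conjugates its first argument.
ip = np.vdot(u, v)
assert np.isclose(ip, np.sum(np.conj(u) * v))

# The norm ||v|| = sqrt(<v, v>) is always real and nonnegative.
norm_v = np.sqrt(np.vdot(v, v).real)
assert np.isclose(norm_v, np.linalg.norm(v))
```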
Definition of Inner product
Let $V$ be a complex vector space. An inner product on $V$ is a function $\langle \cdot, \cdot \rangle : V \times V \to \mathbb{C}$ satisfying the following axioms:
- For each $u \in V$, $v \mapsto \langle u, v \rangle$ is a linear map:
$\langle u, v + w \rangle = \langle u, v \rangle + \langle u, w \rangle$ and $\langle u, cv \rangle = c\,\langle u, v \rangle$ for all $v, w \in V$ and $c \in \mathbb{C}$.
- For all $u, v \in V$, $\langle u, v \rangle = \overline{\langle v, u \rangle}$.
Consequently $u \mapsto \langle u, v \rangle$ is a conjugate linear map.
- $\langle v, v \rangle \geq 0$, and $\langle v, v \rangle = 0$ if and only if $v = 0$.
Definition of norm
Let $V$ be a complex vector space. A norm on $V$ is a function $\|\cdot\| : V \to \mathbb{R}$ satisfying the following axioms:
-
Positivity: For all $v \in V$, $\|v\| \geq 0$, and $\|v\| = 0$ if and only if $v = 0$.
-
Homogeneity: For all $c \in \mathbb{C}$ and $v \in V$, $\|cv\| = |c|\,\|v\|$.
-
Triangle inequality: For all $u, v \in V$, $\|u + v\| \leq \|u\| + \|v\|$.
Definition of inner product space and Hilbert space
A complex vector space with an inner product is called an inner product space. A complete inner product space is called a Hilbert space; every finite-dimensional inner product space is automatically complete, hence a Hilbert space.
Cauchy-Schwarz inequality
For all $u, v \in V$,
$$|\langle u, v \rangle| \leq \|u\|\,\|v\|.$$
Parallelogram law
For all $u, v \in V$,
$$\|u + v\|^2 + \|u - v\|^2 = 2\|u\|^2 + 2\|v\|^2.$$
Polarization identity
For all $u, v \in V$,
$$\langle u, v \rangle = \tfrac{1}{4}\left(\|u + v\|^2 - \|u - v\|^2 - i\,\|u + iv\|^2 + i\,\|u - iv\|^2\right).$$
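These three identities can be verified numerically; the following NumPy sketch checks them on random vectors (the polarization identity is written for an inner product conjugate-linear in the first argument):

```python
import numpy as np

rng = np.random.default_rng(0)
u = rng.normal(size=2) + 1j * rng.normal(size=2)
v = rng.normal(size=2) + 1j * rng.normal(size=2)
n = np.linalg.norm

# Cauchy-Schwarz: |<u,v>| <= ||u|| ||v||
assert abs(np.vdot(u, v)) <= n(u) * n(v) + 1e-12

# Parallelogram law: ||u+v||^2 + ||u-v||^2 = 2||u||^2 + 2||v||^2
assert np.isclose(n(u + v)**2 + n(u - v)**2, 2 * n(u)**2 + 2 * n(v)**2)

# Polarization identity:
# <u,v> = 1/4 (||u+v||^2 - ||u-v||^2 - i||u+iv||^2 + i||u-iv||^2)
rhs = 0.25 * (n(u + v)**2 - n(u - v)**2
              - 1j * n(u + 1j * v)**2 + 1j * n(u - 1j * v)**2)
assert np.isclose(np.vdot(u, v), rhs)
```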
Additional definitions
Let $u, v \in H$.
$\|v\|$ is the length of $v$.
$v$ is a unit vector if $\|v\| = 1$.
$u$ and $v$ are orthogonal if $\langle u, v \rangle = 0$.
Definition of orthonormal basis
A set of vectors $\{e_1, \dots, e_n\}$ in a Hilbert space $H$ is called an orthonormal basis if
- $\langle e_i, e_j \rangle = \delta_{ij}$ for all $i, j$.
- $\operatorname{span}\{e_1, \dots, e_n\} = H$.
Subspaces and orthonormal bases
Definition of subspace
A nonempty subset $W$ of a vector space $V$ is a subspace if it is closed under addition and scalar multiplication.
Definition of orthogonal complement
Let $S$ be a subset of a Hilbert space $H$. The orthogonal complement of $S$ is the set of all vectors in $H$ that are orthogonal to every vector in $S$:
$$S^\perp = \{ v \in H : \langle v, s \rangle = 0 \text{ for all } s \in S \}.$$
Definition of orthogonal projection
Let $W$ be a $k$-dimensional subspace of a Hilbert space $H$, with orthonormal basis $\{e_1, \dots, e_k\}$ of $W$. The orthogonal projection of $H$ onto $W$ is the linear map
$$P_W : H \to H, \qquad P_W v = \sum_{i=1}^k \langle e_i, v \rangle\, e_i.$$
Definition of orthonormal direct sum
An inner product space $H$ is the orthogonal direct sum of subspaces $W_1$ and $W_2$, written $H = W_1 \oplus W_2$, if
$H = W_1 + W_2$ and $\langle w_1, w_2 \rangle = 0$ for all $w_1 \in W_1$, $w_2 \in W_2$.
That is, for every $v \in H$, there exists a unique pair $w_1 \in W_1$, $w_2 \in W_2$ such that $v = w_1 + w_2$.
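A small NumPy sketch of an orthogonal projection and the resulting direct sum decomposition (the subspace and vector are arbitrary examples):

```python
import numpy as np

# Orthogonal projection onto W = span{e1} in C^2, with e1 a unit vector.
e1 = np.array([1, 1j]) / np.sqrt(2)
P = np.outer(e1, np.conj(e1))   # P = |e1><e1|

v = np.array([2 + 1j, 3 - 2j])
w = P @ v                       # component in W
w_perp = v - w                  # component in W-perp

# v = w + w_perp, with w orthogonal to w_perp
assert np.allclose(w + w_perp, v)
assert np.isclose(np.vdot(w, w_perp), 0)
# P is idempotent and self-adjoint: P^2 = P = P†
assert np.allclose(P @ P, P)
assert np.allclose(P, P.conj().T)
```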
Definition of meet and join of subspaces
Let $W_1$ and $W_2$ be two subspaces of a Hilbert space $H$. The meet of $W_1$ and $W_2$ is the subspace
$$W_1 \wedge W_2 = W_1 \cap W_2.$$
The join of $W_1$ and $W_2$ is the subspace
$$W_1 \vee W_2 = \operatorname{span}(W_1 \cup W_2).$$
Null space and range
Definition of null space
Let $T : V \to W$ be a linear map from a vector space $V$ to a vector space $W$. The null space of $T$ is the set of all vectors in $V$ that are mapped to the zero vector in $W$:
$$N(T) = \{ v \in V : Tv = 0 \}.$$
Definition of range
Let $T : V \to W$ be a linear map from a vector space $V$ to a vector space $W$. The range of $T$ is the set of all vectors in $W$ of the form $Tv$ for some $v \in V$:
$$R(T) = \{ Tv : v \in V \}.$$
Dual spaces and adjoints of linear maps
Definition of linear map
A linear map $T : V \to W$ is a function that satisfies the following axioms:
- Additivity: For all $u, v \in V$, $T(u + v) = Tu + Tv$.
- Homogeneity: For all $c \in \mathbb{C}$ and $v \in V$, $T(cv) = c\,Tv$.
Definition of linear functionals
A linear functional is a linear map from $V$ to $\mathbb{C}$.
Here, $\mathbb{C}$ is the field of complex numbers.
Definition of dual space
Let $V$ be a vector space over a field $F$. The dual space $V^*$ of $V$ is the set of all linear functionals on $V$.
If $H$ is a finite-dimensional Hilbert space, then $H^*$ is isomorphic to $H$: every linear functional on $H$ has the form $v \mapsto \langle u, v \rangle$ for a unique $u \in H$.
Note the map $u \mapsto \langle u, \cdot \rangle$ is a conjugate linear isomorphism.
Definition of adjoint of a linear map
Let $T : V \to W$ be a linear map between inner product spaces. The adjoint of $T$ is the linear map $T^\dagger : W \to V$ such that
$$\langle w, Tv \rangle = \langle T^\dagger w, v \rangle$$
for all $v \in V$ and $w \in W$.
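In coordinates the adjoint is the conjugate transpose, and the defining identity can be checked numerically (a sketch on random vectors):

```python
import numpy as np

rng = np.random.default_rng(1)
# A linear map T : C^3 -> C^2 as a 2x3 complex matrix; its adjoint
# is the conjugate transpose, a 3x2 matrix.
T = rng.normal(size=(2, 3)) + 1j * rng.normal(size=(2, 3))
T_adj = T.conj().T

v = rng.normal(size=3) + 1j * rng.normal(size=3)
w = rng.normal(size=2) + 1j * rng.normal(size=2)

# Defining property of the adjoint: <w, T v> = <T† w, v>
assert np.isclose(np.vdot(w, T @ v), np.vdot(T_adj @ w, v))
```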
Definition of self-adjoint operators
A linear operator $A : H \to H$ is self-adjoint if $A = A^\dagger$.
Definition of unitary operators
A linear map $U : H \to H$ is unitary if $U^\dagger U = U U^\dagger = I$.
Dirac’s bra-ket notation
Definition of bra and ket
Let $H$ be a Hilbert space. The bra-ket notation is a notation for vectors in $H$: a ket $|v\rangle$ denotes a vector in $H$, and a bra $\langle u|$ denotes a linear functional on $H$.
$\langle u | v \rangle$ is the inner product of $u$ and $v$. That is, $\langle u|$ is the linear functional $|v\rangle \mapsto \langle u, v \rangle$, satisfying the properties of the inner product.
$|u\rangle\langle v|$ is the vector (or linear map) $|w\rangle \mapsto \langle v | w \rangle\, |u\rangle$.
$|u\rangle\langle v|$ is a linear map from $H$ to $H$.
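In coordinates, $|u\rangle\langle v|$ is the outer product $u\,v^\dagger$; a short NumPy sketch (arbitrary example vectors):

```python
import numpy as np

u = np.array([1, 2j])
v = np.array([3, 1 - 1j])
w = np.array([1j, 4])

# |u><v| as a matrix: the outer product u v† (np.outer with v conjugated).
outer = np.outer(u, np.conj(v))

# Acting on |w>, it returns <v, w> |u>.
assert np.allclose(outer @ w, np.vdot(v, w) * u)
```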
The spectral theorem for self-adjoint operators
Spectral theorem for self-adjoint operators
Let $H$ be a finite-dimensional Hilbert space and let $A : H \to H$ be a self-adjoint operator, i.e. a linear operator that is equal to its adjoint.
Then all the eigenvalues of $A$ are real and there exists an orthonormal basis of $H$ consisting of eigenvectors of $A$.
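Both conclusions can be observed numerically; the sketch below builds a random self-adjoint matrix and diagonalizes it with `np.linalg.eigh`, which is specialized to Hermitian matrices:

```python
import numpy as np

rng = np.random.default_rng(2)
M = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
A = M + M.conj().T              # A is self-adjoint: A = A†

# eigh returns real eigenvalues and an orthonormal eigenbasis
# (the columns of U).
eigvals, U = np.linalg.eigh(A)

assert np.all(np.isreal(eigvals))              # real spectrum
assert np.allclose(U.conj().T @ U, np.eye(3))  # orthonormal eigenvectors
# Spectral decomposition: A = U diag(λ) U†
assert np.allclose(A, U @ np.diag(eigvals) @ U.conj().T)
```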
Definition of spectrum
The spectrum $\sigma(A)$ of a linear operator $A$ on a finite-dimensional Hilbert space is the set of all distinct eigenvalues of $A$.
Definition of Eigenspace
If $\lambda$ is an eigenvalue of $A$, the eigenspace of $A$ corresponding to $\lambda$ is the set of all eigenvectors of $A$ corresponding to $\lambda$, together with the zero vector:
$$V_\lambda = \{ v \in H : Av = \lambda v \}.$$
We denote by $P_\lambda$ the orthogonal projection onto $V_\lambda$.
Definition of Operator norm
The operator norm of a linear operator $A$ is
$$\|A\| = \max_{v \neq 0} \frac{\|Av\|}{\|v\|},$$
which equals the largest eigenvalue of $\sqrt{A^\dagger A}$ (the largest singular value of $A$).
We say $A$ is bounded if $\|A\| < \infty$.
We denote by $B(H)$ the set of all bounded linear operators on $H$.
Partial trace
Definition of trace
Let $A$ be a linear operator on $H$, let $\{e_i\}$ be a basis of $H$ and $\{e_i^*\}$ the corresponding basis of the dual space $H^*$. Then the trace of $A$ is defined by
$$\operatorname{tr} A = \sum_i e_i^*(A e_i).$$
For an orthonormal basis this reads $\operatorname{tr} A = \sum_i \langle e_i, A e_i \rangle$, which is equivalent to the sum of the diagonal elements of the matrix of $A$.
Note: the order of the definitions for the trace has been changed to pack similar concepts together. For the rest of this section, which defines the partial trace, read the tensor product section first, and return here after reading the tensor product of linear operators.
Definition of partial trace
Let $A$ be a linear operator on $H_A \otimes H_B$, where $H_A$ and $H_B$ are finite-dimensional Hilbert spaces.
An operator on $H_A \otimes H_B$ can be written as (by the definition of tensor product of linear operators)
$$A = \sum_i S_i \otimes T_i,$$
where each $S_i$ is a linear operator on $H_A$ and each $T_i$ is a linear operator on $H_B$.
The $B$-partial trace of $A$ (the trace over $H_B$) is the linear operator on $H_A$ defined by
$$\operatorname{tr}_B A = \sum_i (\operatorname{tr} T_i)\, S_i.$$
Or we can define the map $\operatorname{tr}_B : B(H_A \otimes H_B) \to B(H_A)$ on product operators by
$$\operatorname{tr}_B(S \otimes T) = (\operatorname{tr} T)\, S,$$
extended by linearity.
Note that $\operatorname{tr}(S \otimes T) = (\operatorname{tr} S)(\operatorname{tr} T)$.
Therefore, $\operatorname{tr}(\operatorname{tr}_B A) = \operatorname{tr} A$.
Then the partial trace of $A$ can also be defined by
$$\operatorname{tr}_B A = \sum_j (I \otimes \langle b_j|)\, A\, (I \otimes |b_j\rangle),$$
where $\{|b_j\rangle\}$ is an orthonormal basis of $H_B$.
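In coordinates the partial trace is a contraction of two of the four tensor indices; the NumPy sketch below (with assumed dimensions $\dim H_A = 2$, $\dim H_B = 3$) checks both $\operatorname{tr}(\operatorname{tr}_B A) = \operatorname{tr} A$ and $\operatorname{tr}_B(S \otimes T) = (\operatorname{tr} T)\, S$:

```python
import numpy as np

dA, dB = 2, 3
rng = np.random.default_rng(3)
d = dA * dB
A = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))

# Partial trace over H_B: reshape A to a 4-index tensor A[i, j, k, l]
# (i, k index H_A and j, l index H_B), then contract the H_B indices.
trB = np.einsum('ijkj->ik', A.reshape(dA, dB, dA, dB))

# Consistency check: tr(tr_B A) = tr(A).
assert np.isclose(np.trace(trB), np.trace(A))

# On a product operator S ⊗ T, the partial trace gives (tr T) S.
S = rng.normal(size=(dA, dA)) + 1j * rng.normal(size=(dA, dA))
T = rng.normal(size=(dB, dB)) + 1j * rng.normal(size=(dB, dB))
ST = np.kron(S, T)
assert np.allclose(np.einsum('ijkj->ik', ST.reshape(dA, dB, dA, dB)),
                   np.trace(T) * S)
```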
Definition of partial trace with respect to a state
Let $A$ be a linear operator on $H_A \otimes H_B$, where $H_A$ and $H_B$ are finite-dimensional Hilbert spaces.
Let $\rho = \sum_j p_j\, |b_j\rangle\langle b_j|$ be a state on $H_B$ with orthonormal eigenbasis $\{|b_j\rangle\}$ and eigenvalues $p_j$.
The partial trace of $A$ with respect to $\rho$ is the linear operator on $H_A$ defined by
$$\operatorname{tr}_\rho A = \sum_j p_j\, (I \otimes \langle b_j|)\, A\, (I \otimes |b_j\rangle).$$
Space of Bounded Linear Operators
Recall the trace of a matrix is the sum of its diagonal elements.
Hilbert-Schmidt inner product
Let $A, B \in B(H)$. The Hilbert-Schmidt inner product of $A$ and $B$ is defined by
$$\langle A, B \rangle = \operatorname{tr}(A^\dagger B).$$
Note here, $A^\dagger$ is the complex conjugate transpose of $A$.
If we introduce a basis in $H$, then we can write the space of bounded linear operators as the complex-valued matrices $M_n(\mathbb{C})$.
For $A = (a_{ij})$, $B = (b_{ij})$, we have
$$\langle A, B \rangle = \operatorname{tr}(A^\dagger B) = \sum_{i,j} \overline{a_{ij}}\, b_{ij}.$$
The inner product is the standard Hermitian inner product in $\mathbb{C}^{n^2}$.
Definition of Hilbert-Schmidt norm (also called Frobenius norm)
The Hilbert-Schmidt norm of a linear operator $A$ is defined by
$$\|A\|_{HS} = \sqrt{\operatorname{tr}(A^\dagger A)} = \Big(\sum_{i,j} |a_{ij}|^2\Big)^{1/2}.$$
The trace of an operator does not depend on the basis.
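A NumPy sketch checking these statements on random matrices: the Hilbert-Schmidt inner product equals the Hermitian inner product of flattened matrices, the norm matches the Frobenius norm, and the trace is invariant under a unitary change of basis:

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
B = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))

# Hilbert-Schmidt inner product <A, B> = tr(A† B) ...
hs = np.trace(A.conj().T @ B)
# ... equals the standard Hermitian inner product of flattened matrices.
assert np.isclose(hs, np.vdot(A.flatten(), B.flatten()))

# Hilbert-Schmidt (Frobenius) norm ||A|| = sqrt(tr(A† A))
assert np.isclose(np.sqrt(np.trace(A.conj().T @ A).real),
                  np.linalg.norm(A, 'fro'))

# Basis independence of the trace: tr(U† A U) = tr(A) for unitary U.
U, _ = np.linalg.qr(rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3)))
assert np.isclose(np.trace(U.conj().T @ A @ U), np.trace(A))
```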
Tensor products of finite-dimensional Hilbert spaces
Let $X = X_1 \times X_2$ be a Cartesian product of finite sets.
Let $x$ be an element of $X$; $x = (x_1, x_2)$ with $x_i \in X_i$ for $i = 1, 2$.
Let $\epsilon_y(x) = 1$ if $x = y$ and $\epsilon_y(x) = 0$ otherwise, for $y$ in the relevant set.
Let's denote the space of all functions from $X_i$ to $\mathbb{C}$ by $F(X_i)$ and the space of all functions from $X$ to $\mathbb{C}$ by $F(X)$.
Then we can define a basis of $F(X_i)$ by $\{\epsilon_y : y \in X_i\}$.
Any function $f \in F(X_i)$ can be written as a linear combination of the basis vectors.
Proof
Note that a function is determined by its values on all elements of the domain.
For each $y \in X_i$, $\epsilon_y(x) = 1$ if $x = y$ and $0$ otherwise. So
$$f = \sum_{y \in X_i} f(y)\, \epsilon_y.$$
Now, let $f$ be a vector in $F(X_1)$, and $g$ be a vector in $F(X_2)$. Note that $f(x_1), g(x_2) \in \mathbb{C}$ for $x_i \in X_i$.
Define
$$(f \cdot g)(x_1, x_2) = f(x_1)\, g(x_2).$$
Then we can define a basis of $F(X)$ by $\{\epsilon_{(y_1, y_2)} : (y_1, y_2) \in X\}$.
Any function $h \in F(X)$ can be written as a linear combination of the basis vectors.
Proof
This basically follows the same rationale as the previous proof. This time, the epsilon function $\epsilon_{(y_1, y_2)}$ only returns $1$ when $x_i = y_i$ for all $i$.
Definition of tensor product of basis elements
The tensor product of basis elements is defined by
$$\epsilon_{y_1} \otimes \epsilon_{y_2} = \epsilon_{(y_1, y_2)}.$$
This is a basis of $F(X_1 \times X_2)$; here $F(X_1 \times X_2)$ is the set of all functions from $X_1 \times X_2$ to $\mathbb{C}$.
Definition of tensor product of two finite-dimensional Hilbert spaces
The tensor product of two finite-dimensional Hilbert spaces is defined as follows.
Let $H$ and $K$ be two finite-dimensional Hilbert spaces. Let $u \in H$ and $v \in K$.
The form $u \otimes v$, defined by
$$(u \otimes v)(x, y) = \langle x, u \rangle\,\langle y, v \rangle,$$
is a bi-anti-linear map from $H \times K$ (the Cartesian product of $H$ and $K$: tuples of two elements where the first element is in $H$ and the second element is in $K$) to $\mathbb{C}$.
We call such forms decomposable. The tensor product of two finite-dimensional Hilbert spaces, denoted by $H \otimes K$, is the set of all linear combinations of decomposable forms. Represented by the following:
$$H \otimes K = \Big\{ \sum_i c_i\, u_i \otimes v_i : c_i \in \mathbb{C},\ u_i \in H,\ v_i \in K \Big\}.$$
Note that $(cu) \otimes v = u \otimes (cv) = c\,(u \otimes v)$ for complex vector spaces.
This is a linear space of dimension $\dim H \cdot \dim K$.
We define the inner product of two elements of $H \otimes K$ ($u_1 \otimes v_1$ and $u_2 \otimes v_2$) by
$$\langle u_1 \otimes v_1,\, u_2 \otimes v_2 \rangle = \langle u_1, u_2 \rangle\,\langle v_1, v_2 \rangle.$$
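In coordinates, the tensor product of vectors is the Kronecker product, and the inner product factorizes as above; a quick NumPy sketch (arbitrary example vectors in $\mathbb{C}^2$ and $\mathbb{C}^3$):

```python
import numpy as np

u1 = np.array([1, 2j]); u2 = np.array([1j, 1])
v1 = np.array([3, 0, 1j]); v2 = np.array([1, 1, 1])

# Tensor products of coordinate vectors as Kronecker products.
uv1 = np.kron(u1, v1)   # u1 ⊗ v1 in C^2 ⊗ C^3 ≅ C^6
uv2 = np.kron(u2, v2)

# <u1⊗v1, u2⊗v2> = <u1,u2> <v1,v2>
assert np.isclose(np.vdot(uv1, uv2), np.vdot(u1, u2) * np.vdot(v1, v2))
```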
Tensor products of linear operators
Let $S$ be a linear operator on $H$ and $T$ be a linear operator on $K$, where $H$ and $K$ are finite-dimensional Hilbert spaces. The tensor product of $S$ and $T$ (denoted by $S \otimes T$) is the linear operator on $H \otimes K$ such that on decomposable elements
$$(S \otimes T)(u \otimes v) = (Su) \otimes (Tv)$$
for all $u \in H$ and $v \in K$.
On a general element, the tensor product of two linear operators acts on a linear combination as follows:
$$(S \otimes T)\Big(\sum_i c_i\, u_i \otimes v_i\Big) = \sum_i c_i\, (Su_i) \otimes (Tv_i)$$
for all $c_i \in \mathbb{C}$, $u_i \in H$ and $v_i \in K$.
Such a tensor product of linear operators is well defined.
Proof
If $\sum_i c_i\, u_i \otimes v_i = 0$ as a form, then $\sum_i c_i \langle x, u_i \rangle \langle y, v_i \rangle = 0$ for all $x \in H$ and $y \in K$.
Then $\sum_i c_i\, \big((Su_i) \otimes (Tv_i)\big)(x, y) = \sum_i c_i \langle x, Su_i \rangle \langle y, Tv_i \rangle = \sum_i c_i \langle S^\dagger x, u_i \rangle \langle T^\dagger y, v_i \rangle = 0$, so the result does not depend on the chosen representation.
An example of $S \otimes T$ in matrix form: with bases fixed in $H$ and $K$, the matrix of $S \otimes T$ is the Kronecker product of the matrices of $S$ and $T$,
$$(S \otimes T)_{(i,j),(k,l)} = S_{ik}\, T_{jl}.$$
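The defining property on decomposable elements can be checked numerically with `np.kron`, which computes the Kronecker product (random operators on assumed spaces $\mathbb{C}^2$ and $\mathbb{C}^3$):

```python
import numpy as np

rng = np.random.default_rng(5)
S = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
T = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
u = rng.normal(size=2) + 1j * rng.normal(size=2)
v = rng.normal(size=3) + 1j * rng.normal(size=3)

# In coordinates, S ⊗ T is the Kronecker product of the matrices.
ST = np.kron(S, T)

# Defining property on decomposable elements: (S⊗T)(u⊗v) = (Su) ⊗ (Tv).
assert np.allclose(ST @ np.kron(u, v), np.kron(S @ u, T @ v))
```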
Extended Dirac notation
Suppose $H = \mathbb{C}^2$ with the standard basis $\{|0\rangle, |1\rangle\}$.
Then $H \otimes H$ has basis $\{|00\rangle, |01\rangle, |10\rangle, |11\rangle\}$, where $|ij\rangle$ abbreviates $|i\rangle \otimes |j\rangle$, and more generally $|i_1 i_2 \dots i_n\rangle = |i_1\rangle \otimes \dots \otimes |i_n\rangle$ in $H^{\otimes n}$.
The Hadamard Transform
Let $H = \mathbb{C}^2$ with the standard basis $\{|0\rangle, |1\rangle\}$.
The linear operator $\mathsf{H}$ is defined by
$$\mathsf{H}|0\rangle = \frac{|0\rangle + |1\rangle}{\sqrt{2}}, \qquad \mathsf{H}|1\rangle = \frac{|0\rangle - |1\rangle}{\sqrt{2}}.$$
The Hadamard transform is the linear operator $\mathsf{H}^{\otimes n} = \mathsf{H} \otimes \dots \otimes \mathsf{H}$ on $H^{\otimes n}$.
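A small NumPy sketch of the Hadamard operator and its two-qubit tensor power:

```python
import numpy as np

# Single-qubit Hadamard in the standard basis {|0>, |1>}.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

ket0 = np.array([1, 0])
ket1 = np.array([0, 1])

# H|0> = (|0> + |1>)/sqrt(2),  H|1> = (|0> - |1>)/sqrt(2)
assert np.allclose(H @ ket0, (ket0 + ket1) / np.sqrt(2))
assert np.allclose(H @ ket1, (ket0 - ket1) / np.sqrt(2))

# The n-qubit Hadamard transform is the n-fold Kronecker power.
H2 = np.kron(H, H)
# It is unitary, and H2|00> is the uniform superposition over the basis.
assert np.allclose(H2.conj().T @ H2, np.eye(4))
assert np.allclose(H2 @ np.kron(ket0, ket0), np.full(4, 0.5))
```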
Singular value and Schmidt decomposition
Definition of SVD (Singular Value Decomposition)
Let $T : H \to K$ be a linear operator between two finite-dimensional Hilbert spaces $H$ and $K$.
We denote the inner products of $H$ and $K$ by $\langle \cdot, \cdot \rangle_H$ and $\langle \cdot, \cdot \rangle_K$.
Then there exists a decomposition of $T$
$$T = \sum_{i=1}^r s_i\, |f_i\rangle\langle e_i|,$$
with orthonormal sets $\{e_1, \dots, e_r\} \subset H$ and $\{f_1, \dots, f_r\} \subset K$ such that:
- $s_1 \geq s_2 \geq \dots \geq s_r > 0$, for $r = \operatorname{rank} T$ ($r \leq \min(\dim H, \dim K)$).
- $T$ restricted to $\operatorname{span}\{e_1, \dots, e_r\}$ is an isomorphism onto the range of the operator, with inverse $\sum_{i=1}^r s_i^{-1}\, |e_i\rangle\langle f_i|$ on the range.
The $s_i$ are called the singular values of $T$.
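The decomposition can be computed with `np.linalg.svd`; the sketch below also checks that the singular values are the eigenvalues of $\sqrt{T^\dagger T}$ (a random $2 \times 3$ map, so $r = 2$):

```python
import numpy as np

rng = np.random.default_rng(6)
T = rng.normal(size=(2, 3)) + 1j * rng.normal(size=(2, 3))

# Thin SVD: T = U diag(s) V†, columns of U and V orthonormal,
# singular values sorted in decreasing order.
U, s, Vh = np.linalg.svd(T, full_matrices=False)

assert np.allclose(T, U @ np.diag(s) @ Vh)
assert np.all(s[:-1] >= s[1:]) and np.all(s >= 0)

# The singular values are the nonzero eigenvalues of sqrt(T† T)
# (eigvalsh returns eigenvalues in ascending order; clip guards
# against tiny negative round-off on the zero eigenvalue).
ev = np.sqrt(np.linalg.eigvalsh(T.conj().T @ T).clip(min=0))
assert np.allclose(np.sort(s), ev[1:])
```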
Basic Group Theory
Finite groups
Definition of group
A group is a set $G$ with a binary operation $\cdot$ that satisfies the following axioms:
- Closure: For all $g, h \in G$, $g \cdot h \in G$.
- Associativity: For all $g, h, k \in G$, $(g \cdot h) \cdot k = g \cdot (h \cdot k)$.
- Identity: There exists an element $e \in G$ such that for all $g \in G$, $e \cdot g = g \cdot e = g$.
- Inverses: For all $g \in G$, there exists an element $g^{-1} \in G$ such that $g \cdot g^{-1} = g^{-1} \cdot g = e$.
Symmetric group
The symmetric group $S_n$ is the group of all permutations of $n$ elements, under composition.
Unitary group
The unitary group $U(n)$ is the group of all $n \times n$ unitary matrices:
$$U(n) = \{ U \in M_n(\mathbb{C}) : U^\dagger U = U U^\dagger = I \},$$
where $U^\dagger$ is the complex conjugate transpose of $U$.
Cyclic group
The cyclic group $\mathbb{Z}_n$ is the group of all integers modulo $n$, under addition.
Definition of group homomorphism
A group homomorphism is a function $\varphi : G \to H$ between two groups $G$ and $H$ that satisfies the following axiom:
$$\varphi(g_1 g_2) = \varphi(g_1)\,\varphi(g_2) \quad \text{for all } g_1, g_2 \in G.$$
A bijective group homomorphism is called a group isomorphism.
Homomorphism sends identity to identity, inverses to inverses
Let $\varphi : G \to H$ be a group homomorphism. $e_G$ and $e_H$ are the identity elements of $G$ and $H$ respectively. Then
- $\varphi(e_G) = e_H$ and $\varphi(g^{-1}) = \varphi(g)^{-1}$ for all $g \in G$.
More on the symmetric group
General linear group over
The general linear group over $\mathbb{C}$, $GL(n, \mathbb{C})$, is the group of all invertible $n \times n$ complex matrices.
The map $S_n \to GL(n, \mathbb{C})$, $\sigma \mapsto P_\sigma$, sending a permutation to its permutation matrix ($P_\sigma e_i = e_{\sigma(i)}$), is a group homomorphism.
Definition of sign of a permutation
Let $\det : GL(n, \mathbb{C}) \to \mathbb{C}^\times$ be the determinant group homomorphism. The sign of a permutation $\sigma \in S_n$ is defined by
$$\operatorname{sgn}(\sigma) = \det(P_\sigma) \in \{\pm 1\}.$$
We say $\sigma$ is even if $\operatorname{sgn}(\sigma) = 1$ and odd if $\operatorname{sgn}(\sigma) = -1$.
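A sketch of the homomorphism $\sigma \mapsto P_\sigma$ and of computing the sign as a determinant (`perm_matrix` is a helper defined here, with permutations written 0-indexed):

```python
import numpy as np

def perm_matrix(sigma):
    """Permutation matrix P with P e_i = e_{sigma(i)} (0-indexed)."""
    n = len(sigma)
    P = np.zeros((n, n))
    for i, j in enumerate(sigma):
        P[j, i] = 1
    return P

# sigma -> P_sigma is a homomorphism: P_{sigma o tau} = P_sigma P_tau.
sigma = [1, 2, 0]          # the 3-cycle (0 1 2)
tau = [1, 0, 2]            # the transposition (0 1)
comp = [sigma[tau[i]] for i in range(3)]   # sigma o tau
assert np.allclose(perm_matrix(comp), perm_matrix(sigma) @ perm_matrix(tau))

# The sign is the determinant of the permutation matrix.
assert round(np.linalg.det(perm_matrix(sigma))) == 1    # 3-cycle: even
assert round(np.linalg.det(perm_matrix(tau))) == -1     # transposition: odd
```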
Fourier Transform in $L^2(\mathbb{Z}_n)$.
The vector space $L^2(\mathbb{Z}_n)$ is the set of all complex-valued functions on $\mathbb{Z}_n$ with the inner product
$$\langle f, g \rangle = \sum_{x \in \mathbb{Z}_n} \overline{f(x)}\, g(x).$$
An orthonormal basis of $L^2(\mathbb{Z}_n)$ is given by the delta functions $\{\delta_x : x \in \mathbb{Z}_n\}$, where $\delta_x(y) = \delta_{xy}$.
In Dirac notation, we have $|x\rangle = \delta_x$, so $f = \sum_{x} f(x)\, |x\rangle$.
Definition of Fourier transform
Define $\chi_k(x) = \frac{1}{\sqrt{n}}\, e^{2\pi i k x / n}$ for $k \in \mathbb{Z}_n$. Each $\chi_k$ is a function in $L^2(\mathbb{Z}_n)$.
The Fourier transform of a function $f \in L^2(\mathbb{Z}_n)$ is the function $\hat{f}$ such that $f = \sum_k \hat{f}(k)\, \chi_k$; it is defined by
$$\hat{f}(k) = \langle \chi_k, f \rangle = \frac{1}{\sqrt{n}} \sum_{x \in \mathbb{Z}_n} e^{-2\pi i k x / n}\, f(x).$$
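With this normalization the Fourier transform is a unitary matrix; the sketch below builds it for $n = 8$ and compares with NumPy's FFT, which uses the same exponent but no $1/\sqrt{n}$ factor (the test function `f` is an arbitrary example):

```python
import numpy as np

n = 8
x = np.arange(n)

# Unitary DFT matrix: F[k, x] = exp(-2*pi*i*k*x/n) / sqrt(n).
F = np.exp(-2j * np.pi * np.outer(x, x) / n) / np.sqrt(n)

# F is unitary, so the Fourier transform preserves the inner product.
assert np.allclose(F.conj().T @ F, np.eye(n))

# Agrees with NumPy's FFT up to the 1/sqrt(n) normalization.
f = np.cos(2 * np.pi * x / n) + 1j * np.sin(4 * np.pi * x / n)
assert np.allclose(F @ f, np.fft.fft(f) / np.sqrt(n))
```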
Symmetric and anti-symmetric tensors
Let $H^{\otimes n}$ be the $n$-fold tensor product of a Hilbert space $H$.
Let $\sigma \in S_n$ be a permutation.
We define the action of $\sigma$ on $H^{\otimes n}$ on decomposable elements by
$$P_\sigma (v_1 \otimes v_2 \otimes \dots \otimes v_n) = v_{\sigma^{-1}(1)} \otimes v_{\sigma^{-1}(2)} \otimes \dots \otimes v_{\sigma^{-1}(n)},$$
and extend $P_\sigma$ to $H^{\otimes n}$ by linearity.
This gives the property that $P_\sigma P_\tau = P_{\sigma\tau}$, so $\sigma \mapsto P_\sigma$ is a group action of $S_n$ on $H^{\otimes n}$.
Definition of symmetric and anti-symmetric tensors
Let $H$ be a finite-dimensional Hilbert space.
An element $\psi \in H^{\otimes n}$ is called symmetric if it is invariant under the action of $S_n$:
$$P_\sigma \psi = \psi \quad \text{for all } \sigma \in S_n.$$
It is called anti-symmetric if
$$P_\sigma \psi = \operatorname{sgn}(\sigma)\, \psi \quad \text{for all } \sigma \in S_n.$$
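As a numerical sketch: representing an element of $H^{\otimes 3}$ (with $H = \mathbb{C}^2$) as a rank-3 array, the $S_3$ action permutes the array axes, and averaging with (signed) permutations produces a symmetric (anti-symmetric) tensor:

```python
import numpy as np
from itertools import permutations

rng = np.random.default_rng(7)
# A generic element of H ⊗ H ⊗ H as a rank-3 array t[i, j, k].
t = rng.normal(size=(2, 2, 2)) + 1j * rng.normal(size=(2, 2, 2))

def sign(p):
    # Sign of a permutation tuple, computed by counting inversions.
    inv = sum(p[i] > p[j] for i in range(len(p)) for j in range(i + 1, len(p)))
    return (-1) ** inv

# Symmetrization and anti-symmetrization over S_3 (|S_3| = 6).
sym = sum(t.transpose(p) for p in permutations(range(3))) / 6
antisym = sum(sign(p) * t.transpose(p) for p in permutations(range(3))) / 6

# sym is invariant under every permutation of the factors;
# antisym picks up the sign of the permutation.
for p in permutations(range(3)):
    assert np.allclose(sym.transpose(p), sym)
    assert np.allclose(antisym.transpose(p), sign(p) * antisym)
```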