# Linear Vector Spaces and Operators

### From FSUPhysicsWiki

Quantum mechanics can be conveniently formulated in the language of abstract state vectors, from which the various representations (wave mechanics, matrix mechanics, Schrödinger, Heisenberg and interaction pictures, etc.) can be derived. A formulation of quantum mechanics in terms of linear vector spaces hinges on the fact that the Schrödinger equation is linear. An operator defines a mathematical operation performed on a vector belonging to a linear vector space, the result of which is another vector belonging to the same linear vector space.

## The Vector (Ket) Space

In quantum mechanics, a physical state is represented by a state vector in a complex linear vector space. Following Dirac, we call such a vector a "ket", denoted by $|\psi\rangle$. This state vector is postulated to contain complete information about the physical state (i.e., everything we are allowed to ask about the state is contained in the vector). The complex linear vector spaces that we work with in quantum mechanics are usually infinite dimensional. In this case, the vector space in question is known as a Hilbert space after D. Hilbert, who studied vector spaces in infinite dimensions.

One of the postulates of quantum mechanics is that $|\psi\rangle$ and $c|\psi\rangle$, with $c \neq 0$, represent the same physical state. In other words, only the "direction" in vector space is of significance.

Since we assume that these vectors belong to a linear vector space, we may, given any set of state vectors $|\psi_1\rangle, |\psi_2\rangle, \ldots, |\psi_n\rangle$, form a superposition of the states, given by a linear combination of the vectors:

$$|\psi\rangle = c_1|\psi_1\rangle + c_2|\psi_2\rangle + \cdots + c_n|\psi_n\rangle.$$

## The Dual (Bra) Space

The vector space we have been dealing with is a ket space. We now introduce the notion of a "bra" space, a vector space "dual to" the ket space. We postulate that corresponding to every ket $|\psi\rangle$ there exists a bra, denoted by $\langle\psi|$, in this dual or bra space. The dual space is spanned by a set of bra vectors $\{\langle a_i|\}$, which correspond to the set of kets $\{|a_i\rangle\}$. Mathematically, the dual space is a set of linear functions $f\colon V \to \mathbb{C}$ that act on the members of the corresponding vector space, where $V$ is the vector space and $\mathbb{C}$ is the set of complex numbers.

There is a one-to-one correspondence between the members of a ket space and those of the corresponding bra space,

$$|\psi\rangle \leftrightarrow \langle\psi|,$$

where $\leftrightarrow$ stands for dual correspondence. Roughly speaking, we can regard the bra space as some kind of "mirror image" of the ket space.

The bra dual to $c|\psi\rangle$ is postulated to be $c^*\langle\psi|$, not $c\langle\psi|$, which is a very important point to note. More generally, we have

$$c_1|\psi_1\rangle + c_2|\psi_2\rangle \leftrightarrow c_1^*\langle\psi_1| + c_2^*\langle\psi_2|.$$
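As a quick numerical sanity check of the dual correspondence, we can model kets as NumPy arrays and bras as their complex conjugates; the particular vectors below are arbitrary examples, not anything prescribed by the formalism.

```python
import numpy as np

# A ket as a complex vector; its bra is the complex conjugate (row) vector.
psi = np.array([1 + 2j, 3 - 1j])
bra_psi = psi.conj()

# The bra dual to c|psi> is c* <psi|, not c <psi|.
c = 2 - 3j
dual_of_c_psi = (c * psi).conj()
assert np.allclose(dual_of_c_psi, c.conjugate() * bra_psi)

# The inner product <phi|psi> is the complex conjugate of <psi|phi>.
# (np.vdot conjugates its first argument, matching the bra.)
phi = np.array([0.5j, 1.0])
assert np.isclose(np.vdot(phi, psi), np.vdot(psi, phi).conjugate())
```

Note that `np.vdot` conjugates its first argument, which makes it the natural NumPy analogue of the bra-ket pairing $\langle\phi|\psi\rangle$.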

## The Hilbert Space

A Hilbert space $\mathcal{H}$, consisting of a set of vectors $|\psi\rangle, |\phi\rangle, |\chi\rangle, \ldots$ and a set of scalars $a, b, c, \ldots$, obeys the following properties.

(a) **$\mathcal{H}$ is a linear vector space.** It obeys all the properties of a linear vector space as mentioned in the previous section.

(b) **The scalar product defined in $\mathcal{H}$ is strictly positive.** The scalar product of one element $|\phi\rangle$ with another element $|\psi\rangle$ is a complex number, denoted by $\langle\phi|\psi\rangle$. This scalar product satisfies the following properties.

**(1)** The scalar product of $|\phi\rangle$ with $|\psi\rangle$ is the same as the complex conjugate of the scalar product of $|\psi\rangle$ with $|\phi\rangle$:

$$\langle\phi|\psi\rangle = \langle\psi|\phi\rangle^*.$$

**(2)** The scalar product of $|\phi\rangle$ with $|\psi\rangle$ is linear with respect to $|\psi\rangle$:

$$\langle\phi|\left(c_1|\psi_1\rangle + c_2|\psi_2\rangle\right) = c_1\langle\phi|\psi_1\rangle + c_2\langle\phi|\psi_2\rangle.$$

**(3)** The scalar product of a state vector with itself is a positive real number:

$$\langle\psi|\psi\rangle \geq 0,$$

with equality holding if and only if $|\psi\rangle$ is the null vector.

In terms of this scalar product, we may express the normalization of a wave function as

$$\langle\psi|\psi\rangle = 1.$$

## Schwarz Inequality

For two states $|\psi\rangle$ and $|\phi\rangle$ belonging to a linear vector space, the following theorem, known as the Schwarz inequality, holds:

$$|\langle\psi|\phi\rangle|^2 \leq \langle\psi|\psi\rangle\langle\phi|\phi\rangle.$$

If the vectors $|\psi\rangle$ and $|\phi\rangle$ are linearly dependent, i.e. $|\psi\rangle = c|\phi\rangle$ for some scalar $c$, then the above relation becomes an equality.

*Proof:*
Let $|\psi\rangle$ and $|\phi\rangle$ be arbitrary vectors in the vector space $V$. The inequality is trivial in the case that at least one of $|\psi\rangle$ and $|\phi\rangle$ is the null vector, so we will consider the case that both are nonzero. Let $\lambda$ be a complex number. Then

$$\left(\langle\psi| + \lambda^*\langle\phi|\right)\left(|\psi\rangle + \lambda|\phi\rangle\right) = \langle\psi|\psi\rangle + \lambda\langle\psi|\phi\rangle + \lambda^*\langle\phi|\psi\rangle + |\lambda|^2\langle\phi|\phi\rangle \geq 0.$$

The above expression is valid for any value of $\lambda$; its left-hand side is minimized if we choose

$$\lambda = -\frac{\langle\phi|\psi\rangle}{\langle\phi|\phi\rangle}.$$

Using this value of $\lambda$, we obtain

$$\langle\psi|\psi\rangle - \frac{|\langle\phi|\psi\rangle|^2}{\langle\phi|\phi\rangle} \geq 0,$$

or

$$|\langle\psi|\phi\rangle|^2 \leq \langle\psi|\psi\rangle\langle\phi|\phi\rangle.$$

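The inequality, and the equality case for linearly dependent vectors, can be checked numerically with NumPy; the random vectors and the fixed seed below are arbitrary illustration choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Random complex vectors play the roles of |psi> and |phi>.
psi = rng.normal(size=4) + 1j * rng.normal(size=4)
phi = rng.normal(size=4) + 1j * rng.normal(size=4)

# Schwarz inequality: |<psi|phi>|^2 <= <psi|psi><phi|phi>.
lhs = abs(np.vdot(psi, phi)) ** 2
rhs = np.vdot(psi, psi).real * np.vdot(phi, phi).real
assert lhs <= rhs

# Equality holds when the vectors are linearly dependent: |psi> = c|phi>.
psi2 = (1 - 2j) * phi
lhs2 = abs(np.vdot(psi2, phi)) ** 2
rhs2 = np.vdot(psi2, psi2).real * np.vdot(phi, phi).real
assert np.isclose(lhs2, rhs2)
```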
## Linear Operators

Let *V* be a linear vector space. A linear operator is an operation, denoted by $\hat{A}$, that maps a given ket vector $|\psi\rangle$ in *V* to another vector $|\psi'\rangle$ in the same space, and has the property that

$$\hat{A}\left(c_1|\psi_1\rangle + c_2|\psi_2\rangle\right) = c_1\hat{A}|\psi_1\rangle + c_2\hat{A}|\psi_2\rangle.$$

In addition, linear operators obey the following properties.

(1) If $\hat{A}|\psi\rangle = \hat{B}|\psi\rangle$ for every $|\psi\rangle$ in *V*, then $\hat{A}$ is equal to $\hat{B}$.

(2) Commutative law of addition: $\hat{A} + \hat{B} = \hat{B} + \hat{A}$

(3) Associative law: $\hat{A} + \left(\hat{B} + \hat{C}\right) = \left(\hat{A} + \hat{B}\right) + \hat{C}$

(4) Multiplication of operators: $\left(\hat{A}\hat{B}\right)|\psi\rangle = \hat{A}\left(\hat{B}|\psi\rangle\right)$

(5) There exists an identity operator $\hat{1}$ such that $\hat{1}|\psi\rangle = |\psi\rangle$ for every $|\psi\rangle$ in *V*.

For some, but not all, operators $\hat{A}$ there exists an inverse operator $\hat{A}^{-1}$ such that

$$\hat{A}\hat{A}^{-1} = \hat{A}^{-1}\hat{A} = \hat{1}.$$

## The Hermitian Adjoint

The dual vector to $\hat{A}|\psi\rangle$, where $\hat{A}$ is a linear operator, is $\langle\psi|\hat{A}^\dagger$, where $\hat{A}^\dagger$ is known as the Hermitian adjoint of $\hat{A}$ and is itself a linear operator. The properties obeyed by Hermitian adjoints are as follows.

(1) $\left(\hat{A}^\dagger\right)^\dagger = \hat{A}$

(2) For any complex number $c$, $\left(c\hat{A}\right)^\dagger = c^*\hat{A}^\dagger$

(3) $\left(\hat{A} + \hat{B}\right)^\dagger = \hat{A}^\dagger + \hat{B}^\dagger$

(4) $\left(\hat{A}\hat{B}\right)^\dagger = \hat{B}^\dagger\hat{A}^\dagger$

(5) For any complex number $c$, $c^\dagger = c^*$

(6) $\left(|\psi\rangle\right)^\dagger = \langle\psi|$ and $\left(\langle\psi|\right)^\dagger = |\psi\rangle$

(7) $\langle\phi|\hat{A}|\psi\rangle = \langle\psi|\hat{A}^\dagger|\phi\rangle^*$
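In a finite-dimensional representation, the Hermitian adjoint is simply the transposed complex conjugate of a matrix, and the properties above can be verified directly; the random matrices below are arbitrary examples.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
B = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))


def dag(M):
    """Hermitian adjoint: transposed complex conjugate."""
    return M.conj().T


assert np.allclose(dag(dag(A)), A)                      # (A†)† = A
c = 2 + 1j
assert np.allclose(dag(c * A), c.conjugate() * dag(A))  # (cA)† = c* A†
assert np.allclose(dag(A + B), dag(A) + dag(B))         # (A+B)† = A† + B†
assert np.allclose(dag(A @ B), dag(B) @ dag(A))         # (AB)† = B† A†
```

Note the order reversal in the last line: the adjoint of a product is the product of the adjoints in reverse order.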

## Linear Independence and Bases

Consider $n$ vectors (ket states) $|\psi_1\rangle, |\psi_2\rangle, \ldots, |\psi_n\rangle$ belonging to a linear vector space *V*. They are linearly independent if the relation

$$\sum_{i=1}^n c_i|\psi_i\rangle = 0$$

necessarily implies $c_i = 0$ for $i = 1, 2, \ldots, n$. They can be used as a basis in a vector space, and the decomposition of any vector in terms of basis vectors is unique. Any such set of basis vectors must be complete; i.e., *any* vector can be written as a linear combination of vectors from the set.

While any set of linearly independent vectors can be used as a basis, it is usually easier to work in an orthonormal basis; i.e., one for which $\langle\psi_i|\psi_j\rangle = \delta_{ij}$. For such a case, we may write a completeness relation for the basis. Consider an arbitrary vector $|\phi\rangle$ expanded in terms of a given orthonormal basis:

$$|\phi\rangle = \sum_i c_i|\psi_i\rangle.$$

We may find the coefficients $c_i$ by simply taking the scalar product of $|\phi\rangle$ with each of the basis vectors, obtaining

$$c_i = \langle\psi_i|\phi\rangle.$$

Substituting back into the expansion for $|\phi\rangle$, we find that

$$|\phi\rangle = \sum_i |\psi_i\rangle\langle\psi_i|\phi\rangle.$$

Since $|\phi\rangle$ is arbitrary, we conclude that

$$\sum_i |\psi_i\rangle\langle\psi_i| = \hat{1},$$

where $\hat{1}$ is the identity operator. This is the completeness relation that we sought. It is possible to derive similar relations for more general bases through a similar line of reasoning.
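The completeness relation is easy to verify numerically: orthonormalizing a set of random complex vectors (here via a QR decomposition, an arbitrary construction chosen for convenience) yields a basis whose outer products sum to the identity, and whose scalar products recover the expansion coefficients of any vector.

```python
import numpy as np

# Build an orthonormal basis by QR-factoring a random complex matrix;
# the columns of Q are orthonormal.
rng = np.random.default_rng(2)
M = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
Q, _ = np.linalg.qr(M)
basis = [Q[:, i] for i in range(4)]

# Completeness: sum_i |psi_i><psi_i| = 1.
completeness = sum(np.outer(v, v.conj()) for v in basis)
assert np.allclose(completeness, np.eye(4))

# The coefficients c_i = <psi_i|phi> reconstruct an arbitrary vector.
phi = rng.normal(size=4) + 1j * rng.normal(size=4)
recon = sum(np.vdot(v, phi) * v for v in basis)
assert np.allclose(recon, phi)
```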

## Matrix Elements of a Linear Operator

The action of a linear operator is completely known once its action on each of the basis vectors of *V* is given. To see this, let us consider the action of such an operator $\hat{A}$ on an arbitrary vector:

$$\hat{A}|\phi\rangle = \sum_j c_j\hat{A}|\psi_j\rangle,$$

where $c_j = \langle\psi_j|\phi\rangle$. Substituting in the expression for the $c_j$ derived earlier, we obtain

$$\hat{A}|\phi\rangle = \sum_j \hat{A}|\psi_j\rangle\langle\psi_j|\phi\rangle.$$

Again, since $|\phi\rangle$ is arbitrary, we conclude that

$$\hat{A} = \sum_j \hat{A}|\psi_j\rangle\langle\psi_j| = \sum_{i,j} |\psi_i\rangle\langle\psi_i|\hat{A}|\psi_j\rangle\langle\psi_j| = \sum_{i,j} A_{ij}|\psi_i\rangle\langle\psi_j|,$$

where in the second step we have inserted the completeness relation once more. The coefficients $A_{ij} = \langle\psi_i|\hat{A}|\psi_j\rangle$ appearing in the above expression are known as the matrix elements of the operator $\hat{A}$. This is because the expansion coefficients for the "input" and "output" vectors are related by a matrix equation, with the $A_{ij}$ being the elements of the matrix appearing in said equation. Let us write the "output" as

$$|\phi'\rangle = \hat{A}|\phi\rangle = \sum_i c_i'|\psi_i\rangle.$$

We see, however, from our earlier derivation that

$$c_i' = \langle\psi_i|\hat{A}|\phi\rangle = \sum_j A_{ij}c_j,$$

or $\mathbf{c}' = A\mathbf{c}$, where $A$ is the matrix whose elements are given by the $A_{ij}$, and $\mathbf{c}$ and $\mathbf{c}'$ are column vectors of the expansion coefficients of $|\phi\rangle$ and $|\phi'\rangle$, respectively.

If we follow a similar line of reasoning for vectors in the dual vector space, we obtain $\langle\phi'| = \langle\phi|\hat{A}^\dagger$, or $\mathbf{c}'^\dagger = \mathbf{c}^\dagger A^\dagger$, where $A^\dagger$ is the transposed complex conjugate, or Hermitian adjoint, of $A$. We have thus justified our use of the term "Hermitian adjoint" in describing the action of an operator in the dual space.
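The relation $\mathbf{c}' = A\mathbf{c}$ can be demonstrated concretely: compute the matrix elements of a random operator in a random orthonormal basis and check that they map the coefficients of $|\phi\rangle$ to those of $\hat{A}|\phi\rangle$. All specific matrices here are arbitrary examples.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 3

# An orthonormal basis (columns of the unitary QR factor).
Q, _ = np.linalg.qr(rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n)))
basis = [Q[:, i] for i in range(n)]

# A generic linear operator on the space.
A_op = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))

# Matrix elements A_ij = <psi_i| A |psi_j> in the chosen basis.
A_mat = np.array([[np.vdot(basis[i], A_op @ basis[j]) for j in range(n)]
                  for i in range(n)])

# Expansion coefficients of |phi> and of |phi'> = A|phi>.
phi = rng.normal(size=n) + 1j * rng.normal(size=n)
c = np.array([np.vdot(v, phi) for v in basis])
c_prime = np.array([np.vdot(v, A_op @ phi) for v in basis])

# c' = A c: the operator acts as a matrix on the coefficient vector.
assert np.allclose(c_prime, A_mat @ c)
```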

## Special Linear Operators in Quantum Mechanics

**Hermitian Operator:** An operator $\hat{H}$ is called Hermitian if $\hat{H}^\dagger = \hat{H}$. All physical observables in quantum mechanics are represented by Hermitian operators.

**Anti-Hermitian operator:** An operator $\hat{A}$ is called anti-Hermitian or skew-Hermitian if $\hat{A}^\dagger = -\hat{A}$.

Every operator $\hat{C}$ can be decomposed uniquely in terms of a Hermitian and an anti-Hermitian part:

$$\hat{C} = \frac{1}{2}\left(\hat{C} + \hat{C}^\dagger\right) + \frac{1}{2}\left(\hat{C} - \hat{C}^\dagger\right).$$

**Unitary Operator:** An operator $\hat{U}$ is called unitary if there exists a unique inverse $\hat{U}^{-1}$ that is equal to the Hermitian adjoint $\hat{U}^\dagger$; i.e.,

$$\hat{U}\hat{U}^\dagger = \hat{U}^\dagger\hat{U} = \hat{1}.$$

An important property of unitary operators is that they preserve the norm of a vector; in quantum mechanics, this corresponds to the conservation of probability under physical operations. Most transformations of importance in quantum mechanics are given by unitary operators.

**Antilinear operator:** An operator $\hat{A}$ is called antilinear if, for any two vectors $|\psi\rangle$ and $|\phi\rangle$ and for any two complex numbers *c*_{ψ} and *c*_{φ},

$$\hat{A}\left(c_\psi|\psi\rangle + c_\phi|\phi\rangle\right) = c_\psi^*\hat{A}|\psi\rangle + c_\phi^*\hat{A}|\phi\rangle.$$

All operators of importance in quantum mechanics are linear, with one important exception: the time reversal operator is an antilinear operator.
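Two of these definitions are easy to exercise numerically: the Hermitian/anti-Hermitian decomposition of an arbitrary matrix, and norm preservation by a unitary matrix (obtained here, as an illustrative construction, from a QR factorization).

```python
import numpy as np

rng = np.random.default_rng(4)
C = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))


def dag(M):
    """Hermitian adjoint: transposed complex conjugate."""
    return M.conj().T


# Unique decomposition into Hermitian and anti-Hermitian parts.
H = (C + dag(C)) / 2
A = (C - dag(C)) / 2
assert np.allclose(H, dag(H))    # Hermitian part satisfies H† = H
assert np.allclose(A, -dag(A))   # anti-Hermitian part satisfies A† = -A
assert np.allclose(C, H + A)     # the parts sum back to C

# A unitary operator preserves the norm of any vector.
U, _ = np.linalg.qr(C)           # the QR factor Q is unitary
psi = rng.normal(size=3) + 1j * rng.normal(size=3)
assert np.isclose(np.linalg.norm(U @ psi), np.linalg.norm(psi))
```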

## Theorem on Eigenvalues and Eigenstates of Hermitian Operator

We will now prove that the eigenvalues of Hermitian operators are real and that two eigenvectors of a Hermitian operator that correspond to different eigenvalues are orthogonal.

*Proof:* Consider an eigenstate $|\psi\rangle$ of a Hermitian operator $\hat{H}$, corresponding to an eigenvalue $\lambda$; i.e.,

$$\hat{H}|\psi\rangle = \lambda|\psi\rangle.$$

Taking the Hermitian adjoint of both sides, and using the fact that $\hat{H}^\dagger = \hat{H}$, we get

$$\langle\psi|\hat{H} = \lambda^*\langle\psi|.$$

Taking the scalar product of the first equation with $\langle\psi|$ and of the second with $|\psi\rangle$, we get

$$\langle\psi|\hat{H}|\psi\rangle = \lambda\langle\psi|\psi\rangle \quad \text{and} \quad \langle\psi|\hat{H}|\psi\rangle = \lambda^*\langle\psi|\psi\rangle.$$

Because $|\psi\rangle$ is not a null vector, $\langle\psi|\psi\rangle \neq 0$, and we conclude that $\lambda = \lambda^*$; i.e., $\lambda$ is real.

To prove the second part of the theorem, consider another eigenstate $|\phi\rangle$ with a different eigenvalue $\mu \neq \lambda$; i.e.,

$$\hat{H}|\phi\rangle = \mu|\phi\rangle.$$

Taking the scalar product of the first equation with $\langle\phi|$ and of the second equation with $\langle\psi|$, we get

$$\langle\phi|\hat{H}|\psi\rangle = \lambda\langle\phi|\psi\rangle \quad \text{and} \quad \langle\psi|\hat{H}|\phi\rangle = \mu\langle\psi|\phi\rangle.$$

Taking the complex conjugate of the second equation, and using the facts that $\hat{H}$ is Hermitian and that $\mu$ is real, we find $\langle\phi|\hat{H}|\psi\rangle = \mu\langle\phi|\psi\rangle$, so that

$$(\lambda - \mu)\langle\phi|\psi\rangle = 0.$$

Because $\lambda \neq \mu$, we conclude that $\langle\phi|\psi\rangle = 0$; i.e., $|\psi\rangle$ and $|\phi\rangle$ are mutually orthogonal.

Since the eigenstates of a Hermitian operator are orthogonal (in fact, if they are normalized, then they are orthonormal), they often form a convenient basis in which to expand vectors; we will in fact often use the eigenstates of some observable as a basis.
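Both parts of the theorem can be observed numerically: build a Hermitian matrix from an arbitrary complex matrix and diagonalize it with NumPy's `eigh`, which returns real eigenvalues and an orthonormal set of eigenvectors.

```python
import numpy as np

rng = np.random.default_rng(5)
M = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
H = (M + M.conj().T) / 2              # a Hermitian operator: H† = H

eigvals, eigvecs = np.linalg.eigh(H)  # eigh exploits Hermiticity

# Eigenvalues of a Hermitian operator are real.
assert np.allclose(eigvals.imag, 0)

# Eigenvectors form an orthonormal set: V† V = 1.
assert np.allclose(eigvecs.conj().T @ eigvecs, np.eye(4))
```

Note that `eigh` guarantees orthonormal eigenvectors even for degenerate eigenvalues, where orthogonality is a choice rather than a consequence of the theorem.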

## Projection Operators

A projection operator is an operator that "projects" the vector that it acts on onto the direction of another given vector. The projection operator corresponding to the normalized vector $|\psi\rangle$ is given by

$$\hat{P}_\psi = |\psi\rangle\langle\psi|.$$

We may see that this is the case by acting with this operator on an arbitrary vector $|\phi\rangle$:

$$\hat{P}_\psi|\phi\rangle = |\psi\rangle\langle\psi|\phi\rangle.$$

We see that the result of the operation is a vector "parallel" to $|\psi\rangle$, as asserted.

Let us now state some properties of projection operators.

**(1) A projection operator is Hermitian and equal to its own square; i.e., it is idempotent.**

*Proof:* The fact that a projection operator is Hermitian follows immediately from the properties of the Hermitian adjoint stated above. The fact that it is also idempotent may be seen as follows:

$$\hat{P}_\psi^2 = |\psi\rangle\langle\psi|\psi\rangle\langle\psi| = |\psi\rangle\langle\psi| = \hat{P}_\psi,$$

since $\langle\psi|\psi\rangle = 1$ for a normalized vector.

**(2) The product of two commuting projection operators $\hat{P}_1$ and $\hat{P}_2$ is also a projection operator.**

*Proof:*

$$\left(\hat{P}_1\hat{P}_2\right)^\dagger = \hat{P}_2^\dagger\hat{P}_1^\dagger = \hat{P}_2\hat{P}_1 = \hat{P}_1\hat{P}_2$$

and

$$\left(\hat{P}_1\hat{P}_2\right)^2 = \hat{P}_1\hat{P}_2\hat{P}_1\hat{P}_2 = \hat{P}_1^2\hat{P}_2^2 = \hat{P}_1\hat{P}_2.$$

The sum of two projection operators is *not*, in general, a projection operator itself; it will only be a projection operator if the two original operators are orthogonal; i.e., if $\hat{P}_1\hat{P}_2 = \hat{P}_2\hat{P}_1 = 0$.
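These properties can be checked with explicit matrices; the particular vectors below are arbitrary examples chosen so that the two projectors are orthogonal.

```python
import numpy as np

# Projection operator P = |psi><psi| for a normalized |psi>.
psi = np.array([1.0, 1j, 0.0]) / np.sqrt(2)
P = np.outer(psi, psi.conj())

assert np.allclose(P, P.conj().T)     # Hermitian: P† = P
assert np.allclose(P @ P, P)          # idempotent: P^2 = P

# A projector onto a vector orthogonal to |psi>.
chi = np.array([0.0, 0.0, 1.0])
P2 = np.outer(chi, chi.conj())
assert np.allclose(P @ P2, np.zeros((3, 3)))  # orthogonal: P P2 = 0

# Because P and P2 are orthogonal, their sum is again a projector.
S = P + P2
assert np.allclose(S @ S, S)
```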