Linear Algebra: #22 Dual Spaces
Again let V be a vector space over a field F (and, although it's not really necessary here, we continue to take F = ℝ or ℂ).
Definition
The dual space to V is the set of all linear mappings f : V → F. We denote the dual space by V*.
Examples
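For instance, take V = ℂⁿ with F = ℂ. For any fixed a = (a1, . . . , an) ∈ ℂⁿ, the mapping f : ℂⁿ → ℂ given by f(x) = a1x1 + · · · + anxn is linear, so f ∈ V*. Similarly, on the vector space of continuous functions g : [0, 1] → ℝ (over F = ℝ), both the evaluation mapping g → g(0) and the integration mapping g → ∫₀¹ g(x) dx are linear, and hence are elements of the dual space.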
Theorem 56
Let V be a finite dimensional vector space (over ℂ) and let V* be the dual space. For each v ∈ V, let φ_v : V → ℂ be given by φ_v(u) = <v, u>. Then, given an orthonormal basis {v1, . . . , vn} of V, we have that {φ_v1, . . . , φ_vn} is a basis of V*. This is called the dual basis to {v1, . . . , vn}.
Proof
Let φ ∈ V* be an arbitrary linear mapping φ : V → ℂ. But, as always, we remember that φ is uniquely determined by the vectors φ(v1), . . . , φ(vn) (which in this case are simply complex numbers). Say φ(vj) = cj ∈ ℂ, for each j. Now take some arbitrary vector v ∈ V. There is the unique expression

v = <v1, v>v1 + · · · + <vn, v>vn.

Thus, using the linearity of φ,

φ(v) = <v1, v>φ(v1) + · · · + <vn, v>φ(vn) = c1 φ_v1(v) + · · · + cn φ_vn(v).

Therefore φ = c1 φ_v1 + · · · + cn φ_vn, and so {φ_v1, . . . , φ_vn} generates V*.
To show that {φ_v1, . . . , φ_vn} is linearly independent, let φ = c1 φ_v1 + · · · + cn φ_vn be some linear combination, where cj ≠ 0 for at least one j. But then φ(vj) = cj ≠ 0, and thus φ ≠ 0 in V*.
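To see the theorem concretely, here is a minimal numerical sketch (assuming V = ℂⁿ with the standard scalar product <v, u> = conj(v1)u1 + · · · + conj(vn)un, which matches the convention φ_v(u) = <v, u> above; the variable names are purely illustrative). It builds a random orthonormal basis, takes an arbitrary φ ∈ V*, and checks that φ agrees with c1 φ_v1 + · · · + cn φ_vn, where cj = φ(vj).

import numpy as np

n = 4
rng = np.random.default_rng(0)

# A random orthonormal basis {v_1, ..., v_n} of C^n: the columns of a unitary Q.
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
Q, _ = np.linalg.qr(A)
basis = [Q[:, j] for j in range(n)]

# An arbitrary linear functional phi in V*, represented here by phi(x) = w . x.
w = rng.standard_normal(n) + 1j * rng.standard_normal(n)
def phi(x):
    return np.dot(w, x)

# phi_v(u) = <v, u>; np.vdot conjugates its first argument, matching <v, u>.
def phi_v(v, u):
    return np.vdot(v, u)

# Coefficients c_j = phi(v_j); Theorem 56 says phi = c_1 phi_v1 + ... + c_n phi_vn.
c = [phi(v) for v in basis]

x = rng.standard_normal(n) + 1j * rng.standard_normal(n)
print(np.isclose(phi(x), sum(cj * phi_v(vj, x) for cj, vj in zip(c, basis))))  # True

The check succeeds for any test vector x, reflecting that the two sides agree as elements of V*.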
Corollary
dim(V*) = dim(V).
Corollary
More specifically, we have an isomorphism V → V*, such that v → φ_v for each v ∈ V.
But somehow, this isomorphism doesn't seem to be very "natural". It is defined in terms of some specific basis of V. What if V is not finite dimensional, so that we have no such basis to work with? For this reason, we do not think of V and V* as being "really" just the same vector space. [In case we have a scalar product, then there is a "natural" mapping V → V*, where v → φ_v, such that φ_v(u) = <v, u>, for all u ∈ V.]
On the other hand, let us look at the dual space of the dual space, (V*)*. (Perhaps this is a slightly mind-boggling concept at first sight!) We imagine that "really" we just have (V*)* = V. For let Φ ∈ (V*)*. That means, for each φ ∈ V*, we have Φ(φ) being some complex number. On the other hand, we also have φ(v) being some complex number, for each v ∈ V. Can we uniquely identify each v ∈ V with some Φ ∈ (V*)*, in the sense that both always give the same complex numbers, for all possible φ ∈ V*?
Let us say that there exists a v ∈ V such that Φ(φ) = φ(v), for all φ ∈ V*. In fact, given any v ∈ V, if we define Φ_v by Φ_v(φ) = φ(v), for each φ ∈ V*, then we certainly have a linear mapping Φ_v : V* → ℂ. On the other hand, given some arbitrary Φ ∈ (V*)*, do we have a unique v ∈ V such that Φ(φ) = φ(v), for all φ ∈ V*? At least in the case where V is finite dimensional, we can affirm that it is true by looking at the dual basis: writing aj = Φ(φ_vj) for each j, the vector v = a1 v1 + · · · + an vn satisfies φ_vj(v) = <vj, v> = aj = Φ(φ_vj) for each j; since both Φ and φ → φ(v) are linear in φ, and the φ_vj form a basis of V*, it follows that Φ(φ) = φ(v) for every φ ∈ V*.
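The same dual-basis argument can be sketched numerically (again assuming V = ℂⁿ with the standard scalar product, writing elements of V* in dual-basis coordinates; the names are illustrative): given Φ ∈ (V*)* with Φ(φ_vj) = aj, the vector v = a1 v1 + · · · + an vn reproduces Φ.

import numpy as np

n = 3
rng = np.random.default_rng(1)
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
Q, _ = np.linalg.qr(A)
basis = [Q[:, j] for j in range(n)]        # orthonormal basis v_1, ..., v_n

# phi = c_1 phi_v1 + ... + c_n phi_vn acts on x by sum_j c_j <v_j, x>.
def apply_phi(c, x):
    return sum(cj * np.vdot(vj, x) for cj, vj in zip(c, basis))

# An element Phi of (V*)* is a linear map on the coordinates c, with Phi(phi_vj) = a_j.
a = rng.standard_normal(n) + 1j * rng.standard_normal(n)
def Phi(c):
    return np.dot(a, c)

# The vector v = a_1 v_1 + ... + a_n v_n then satisfies Phi(phi) = phi(v) for all phi.
v = sum(aj * vj for aj, vj in zip(a, basis))
c = rng.standard_normal(n) + 1j * rng.standard_normal(n)   # an arbitrary phi
print(np.isclose(Phi(c), apply_phi(c, v)))  # True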
Dual mappings
Let V and W be two vector spaces (where we again assume that the field is ℂ). Assume that we have a linear mapping f : V → W. Then we can define a linear mapping f* : W* → V* in a natural way as follows. For each φ ∈ W*, let f*(φ) = φ ◦ f. So it is obvious that f*(φ) : V → ℂ is a linear mapping. Now assume that V and W have scalar products, giving us the mappings s : V → V* and t : W → W*. So we can draw a little "diagram" to describe the situation.
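That is, f runs along the top, f* along the bottom, and s and t connect each space to its dual:

         f
   V ─────────→ W
   │            │
   s            t
   ↓            ↓
   V* ←──────── W*
         f*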
The mappings s and t are isomorphisms, so we can go around the diagram, using the mapping f^adj = s⁻¹ ◦ f* ◦ t : W → V. This is the adjoint mapping to f. So we see that in the case V = W, we have that a self-adjoint mapping f : V → V is such that f^adj = f.
Does this correspond with our earlier definition, namely that <u, f(v)> = <f(u), v> for all u and v ∈ V? To answer this question, look at the diagram, which now has the form
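         f
   V ─────────→ V
   │            │
   s            s
   ↓            ↓
   V* ←──────── V*
         f*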
where s(v) ∈ V* is such that s(v)(u) = <v, u>, for all u ∈ V. Now f^adj = s⁻¹ ◦ f* ◦ s; that is, the condition f^adj = f becomes s⁻¹ ◦ f* ◦ s = f. Since s is an isomorphism, we can equally say that the condition is that f* ◦ s = s ◦ f. So let v be some arbitrary vector in V. We have s ◦ f(v) = f* ◦ s(v). However, remembering that this is an element of V*, we see that this means

(s ◦ f(v))(u) = (f* ◦ s)(v)(u),

for all u ∈ V. But (s ◦ f(v))(u) = <f(v), u> and (f* ◦ s)(v)(u) = s(v)(f(u)) = <v, f(u)>. Therefore we have

<f(v), u> = <v, f(u)>

for all v and u ∈ V, as expected.
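For a concrete check, assume V = W = ℂⁿ with the standard scalar product and write f(x) = Mx for a matrix M; under these assumptions f^adj is given by the conjugate transpose of M (a minimal sketch with illustrative names).

import numpy as np

n = 3
rng = np.random.default_rng(2)
M = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
v = rng.standard_normal(n) + 1j * rng.standard_normal(n)
u = rng.standard_normal(n) + 1j * rng.standard_normal(n)

# The adjoint of f(x) = M x is given by the conjugate transpose of M:
#   <f(v), u> = <v, f_adj(u)>   (np.vdot conjugates its first argument).
M_adj = M.conj().T
print(np.isclose(np.vdot(M @ v, u), np.vdot(v, M_adj @ u)))  # True

# A self-adjoint example: H = M + M_adj satisfies H = H_adj, and hence
#   <H(v), u> = <v, H(u)>, as derived above.
H = M + M_adj
print(np.isclose(np.vdot(H @ v, u), np.vdot(v, H @ u)))      # True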
This is the last section in this series on Linear Algebra. But that is not to say that there is nothing more that you have to know about the subject. For example, when studying the theory of relativity you will encounter tensors, which are combinations of linear mappings and dual mappings. One speaks of "covariant" and "contravariant" tensors. That is, linear mappings and dual mappings.
But then, proceeding to the general theory of relativity, these tensors are used to describe differential geometry. That is, we no longer have a linear (that is, a vector) space. Instead, we imagine that space is curved, and in order to describe this curvature, we define a thing called the tangent vector space, which you can think of as being a kind of linear approximation to the spatial structure near a given point. And so it goes on, leading to more and more complicated mathematical constructions, taking us away from the simple "linear" mathematics which we have seen in this semester.
After a few years of learning the mathematics of contemporary theoretical physics, perhaps you will begin to ask yourselves whether it really makes so much sense after all. Can it be that the physical world is best described by using all of the latest techniques which pure mathematicians happen to have been playing around with in the last few years — in algebraic topology, functional analysis, the theory of complex functions, and so on and so forth? Or, on the other hand, could it be that physics has been losing touch with reality, making constructions similar to the theory of epicycles of the medieval period, whose conclusions can never be verified using practical experiments in the real world?
IMPORTANT NOTE:
This series on Linear Algebra has been taken from the lecture notes prepared by Geoffrey Hemion. I used his notes when studying Linear Algebra for my physics course, and they were really helpful. So I thought that you could also benefit from his notes. The document can be found at his homepage.