Finite Groups and their Representations (Mathematics 4H 1998–9)

7/11/2002

Dr A. J. Baker
Department of Mathematics, University of Glasgow, Glasgow G12 8QW, Scotland.
E-mail address: [email protected]
URL: http://www.maths.gla.ac.uk/∼ajb

Contents

Chapter 1. Linear and multilinear algebra
  1. Basic linear algebra
  2. Class functions and the Cayley-Hamilton Theorem
  3. Separability
  4. Basic notions of multilinear algebra

Chapter 2. Recollections and reformulations on basic group theory
  1. The Isomorphism and Correspondence Theorems
  2. Some definitions and notation
  3. Group actions
  4. The Sylow theorems
  5. Solvable groups
  6. Product and semi-direct product groups
  7. Some useful groups
  8. Some useful Number Theory

Chapter 3. Representations of finite groups
  1. Linear representations
  2. G-homomorphisms and irreducible representations
  3. New representations from old
  4. Permutation representations
  5. Properties of permutation representations
  6. Calculating in permutation representations
  7. Generalized permutation representations

Chapter 4. Character theory
  1. Characters and class functions on a finite group
  2. Properties of characters
  3. Inner products of characters
  4. Character tables
  5. Examples of character tables
  6. Reciprocity formulæ
  7. Representations of semi-direct products

Chapter 5. Some applications to group theory
  1. Characters and the structure of groups
  2. A result on representations of simple groups
  3. A Theorem of Frobenius

Chapter 6. Automorphisms and extensions
  1. Automorphisms
  2. Extensions
  3. Classifying extensions [optional extra material]

Chapter 7. Some further applications
  1. Fourier series and the circle group

CHAPTER 1

Linear and multilinear algebra

In this chapter we will study the linear algebra required in representation theory. Some of this will be familiar but there will also be new material, especially that on ‘multilinear’ algebra.

1. Basic linear algebra

Throughout the remainder of these notes, k will denote a field, i.e., a commutative ring with unity 1 in which every non-zero element has an inverse. Most of the time in representation theory we will work over the field of complex numbers C and occasionally the field of real numbers R. However, a lot of what we discuss will work over more general fields, including those of finite characteristic such as Z/p for a prime p. Here, the characteristic of the field k is defined to be the smallest natural number p ∈ N such that

    p1 = 1 + ··· + 1 (p summands) = 0,

if such a number exists (in which case k is said to have finite characteristic); otherwise k has characteristic 0. In the finite characteristic case, the characteristic is always a prime.

1.1. Bases, linear transformations and matrices. Let V be a finite dimensional vector space over k, i.e., a k-vector space. Recall that a basis for V is a linearly independent spanning set for V. The dimension of V (over k) is the number of elements in any basis, and is denoted dim_k V. We will often view k itself as a 1-dimensional k-vector space with basis {1}, or indeed any set {x} with x ≠ 0.
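For example, C is a 1-dimensional C-vector space with basis {1}, but viewed as a vector space over R it has basis {1, i} and dim_R C = 2; the dimension of a vector space thus depends on the field of scalars.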
Given two k-vector spaces V, W, a linear transformation (or linear mapping) from V to W is a function ϕ: V −→ W such that

    ϕ(v_1 + v_2) = ϕ(v_1) + ϕ(v_2),
    ϕ(tv) = tϕ(v),

for v, v_1, v_2 ∈ V and t ∈ k. The set of all linear transformations V −→ W will be denoted Hom_k(V, W). This is a k-vector space with the operations of addition and scalar multiplication given by

    (ϕ + θ)(u) = ϕ(u) + θ(u),
    (tϕ)(u) = t(ϕ(u)) = ϕ(tu),

for ϕ, θ ∈ Hom_k(V, W) and t ∈ k.

An important property of a basis is the following extension property.

Proposition 1.1. Let V, W be k-vector spaces with V finite dimensional, and {v_1, ..., v_m} a basis for V where m = dim_k V. Given a function ϕ: {v_1, ..., v_m} −→ W, there is a unique linear transformation Φ: V −→ W such that

    Φ(v_j) = ϕ(v_j)   (1 ≤ j ≤ m).

We can express this with the aid of the commutative diagram

    {v_1, ..., v_m} --inclusion--> V
               \                   .
              ϕ \                  . ∃! Φ
                 \                 .
                  `--------------> W

in which the dotted arrow is supposed to indicate a (unique) solution to the problem of filling in the diagram

    {v_1, ..., v_m} --inclusion--> V
               \
              ϕ \
                 \
                  `--------------> W

with a linear transformation V −→ W so that composing the functions corresponding to the horizontal and right hand sides agrees with the function corresponding to the left hand side.

Proof. The definition of Φ is

    Φ( Σ_{j=1}^{m} λ_j v_j ) = Σ_{j=1}^{m} λ_j ϕ(v_j).  □

When using this result we will refer to Φ as the linear extension of ϕ, and often write it simply as ϕ.

Let V, W be finite dimensional k-vector spaces with bases {v_1, ..., v_m} and {w_1, ..., w_n}, where m = dim_k V and n = dim_k W. By Proposition 1.1, each function ϕ_{ij}: {v_1, ..., v_m} −→ W (1 ≤ i ≤ m, 1 ≤ j ≤ n) given by

    ϕ_{ij}(v_k) = δ_{ik} w_j   (1 ≤ k ≤ m)

extends uniquely to a linear transformation ϕ_{ij}: V −→ W.

Proposition 1.2. The set of functions ϕ_{ij}: V −→ W (1 ≤ i ≤ m, 1 ≤ j ≤ n) is a basis for Hom_k(V, W). Hence

    dim_k Hom_k(V, W) = dim_k V · dim_k W = mn.

A particular and very important case of this is the dual space of V,

    V* = Hom_k(V, k).

Notice that dim_k V* = dim_k V. Given a basis {v_1, ..., v_m} of V, V* has as a basis the set of functions {v_1^*, ..., v_m^*} which satisfy

    v_i^*(v_k) = δ_{ik},

where δ_{ij} is the Kronecker δ-symbol for which

    δ_{ij} = 1 if i = j, and δ_{ij} = 0 otherwise.

We can view this as giving rise to an isomorphism V −→ V* under which v_j ←→ v_j^*. If we set V** = (V*)*, then there is an isomorphism V* −→ V** under which v_j^* ←→ (v_j^*)^*. Here we use the fact that the v_j^* form a basis for V*. Composing these two isomorphisms we obtain a third, V −→ V**, given by v_j ←→ (v_j^*)^*. In fact, this composite does not depend on the basis of V used, although the two factors do! It is sometimes called the canonical isomorphism V −→ V**.
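For example, take V = k^2 with basis v_1 = (1, 0), v_2 = (0, 1). The dual basis {v_1^*, v_2^*} of V* consists of the coordinate functions v_1^*(x, y) = x and v_2^*(x, y) = y, and indeed v_i^*(v_k) = δ_{ik}. Similarly, by Proposition 1.2, Hom_k(k^2, k^3) has the six functions ϕ_{ij} (1 ≤ i ≤ 2, 1 ≤ j ≤ 3) as a basis, so dim_k Hom_k(k^2, k^3) = 2 · 3 = 6; in matrix terms (see below), ϕ_{ij} corresponds to the 3×2 matrix whose only non-zero entry is a 1 in row j and column i.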
The set of all endomorphisms of V is

    End_k(V) = Hom_k(V, V),

which is a ring (actually a k-algebra, and also non-commutative if dim_k V > 1) with addition as above, and multiplication given by composition of functions. There is a ring monomorphism

    k −→ End_k(V);   t ↦ t Id_V,

which embeds k into End_k(V) as the subring of scalars. We also have

    dim_k End_k(V) = (dim_k V)^2.

Let GL_k(V) denote the group of all invertible k-linear transformations V −→ V, i.e., the group of units in End_k(V). This is usually called the general linear group of V or the group of linear automorphisms of V, and is denoted GL_k(V) or Aut_k(V).

Now let v = {v_1, ..., v_m} and w = {w_1, ..., w_n} be bases for V and W. Then given a linear transformation ϕ: V −→ W we may define the matrix of ϕ with respect to the bases v and w to be the n×m matrix with coefficients in k,

    _w[ϕ]_v = [a_{ij}],

where

    ϕ(v_j) = Σ_{k=1}^{n} a_{kj} w_k.

Now suppose we have a second pair of bases for V and W, v' = {v'_1, ..., v'_m} and w' = {w'_1, ..., w'_n}. Then we can write

    v'_j = Σ_{r=1}^{m} p_{rj} v_r,    w'_j = Σ_{s=1}^{n} q_{sj} w_s,

for some p_{ij}, q_{ij} ∈ k. If we form the m×m and n×n matrices P = [p_{ij}] and Q = [q_{ij}], then we have the following standard result.

Proposition 1.3. The matrices _w[ϕ]_v and _{w'}[ϕ]_{v'} are related by the formulæ

    _{w'}[ϕ]_{v'} = Q^{-1} (_w[ϕ]_v) P = Q^{-1} [a_{ij}] P.

In particular, if W = V, w = v and w' = v', then

    _{v'}[ϕ]_{v'} = P^{-1} (_v[ϕ]_v) P = P^{-1} [a_{ij}] P.

1.2. Quotients and complements. Let W ⊆ V be a vector subspace. Then we define the quotient space V/W to be the set of equivalence classes under the equivalence relation ∼ on V defined by

    u ∼ v if and only if v − u ∈ W.

We denote the class of v by v + W. This set V/W becomes a vector space with operations

    (u + W) + (v + W) = (u + v) + W,
    λ(v + W) = (λv) + W,

and zero element 0 + W. There is a linear transformation, usually called the quotient map, q: V −→ V/W, defined by

    q(v) = v + W.

Then q is surjective, has kernel ker q = W, and has the following universal property.

Theorem 1.4. Let f: V −→ U be a linear transformation with W ⊆ ker f. Then there is a unique linear transformation f̄: V/W −→ U for which f = f̄ ∘ q. This can be expressed in the diagram

    V ------q------> V/W
     \               .
    f \              . ∃! f̄
       \             .
        `----------> U

in which all the sides represent linear transformations.

Proof. We define f̄ by

    f̄(v + W) = f(v),

which makes sense since if v' ∼ v, then v' − v ∈ W ⊆ ker f, hence

    f(v') = f((v' − v) + v) = f(v' − v) + f(v) = f(v).

The uniqueness follows from the fact that q is surjective.  □

Notice also that

(1.1)    dim_k V/W = dim_k V − dim_k W.

A linear complement (in V) of a subspace W ⊆ V is a subspace W' ⊆ V such that the restriction q|_{W'}: W' −→ V/W is a linear isomorphism. The next result sums up properties of linear complements and we leave the proofs as exercises.

Theorem 1.5. Let W ⊆ V and W' ⊆ V be vector subspaces of the k-vector space V with dim_k V = n. Then the following conditions are equivalent.

a) W' is a linear complement of W in V.
b) Let {w_1, ..., w_r} be a basis for W, and {w_{r+1}, ..., w_n} a basis for W'. Then

    {w_1, ..., w_n} = {w_1, ..., w_r} ∪ {w_{r+1}, ..., w_n}

is a basis for V.
c) Every v ∈ V has a unique expression of the form

    v = v_1 + v_2

for some elements v_1 ∈ W, v_2 ∈ W'. In particular, W ∩ W' = {0}.
d) Every linear transformation h: W' −→ U has a unique extension to a linear transformation H: V −→ U with W ⊆ ker H.
e) W is a linear complement of W' in V.
f) There is a linear isomorphism J: V −→ W × W' for which im J|_W = W × {0} and im J|_{W'} = {0} × W'.
g) There are unique linear transformations p: V −→ V and p': V −→ V for which

    p^2 = p ∘ p = p,   p'^2 = p' ∘ p' = p',   im p = W,   im p' = W',   Id_V = p + p'.

We often write V = W ⊕ W' whenever W' is a linear complement of W.
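For example, take V = R^2 and W = {(x, 0) : x ∈ R}. Both W' = {(0, y) : y ∈ R} and W'' = {(t, t) : t ∈ R} are linear complements of W, since

    (x, y) = (x, 0) + (0, y) = (x − y, 0) + (y, y)

gives the unique decompositions of Theorem 1.5(c) in the two cases. Thus V = W ⊕ W' = W ⊕ W'', and a linear complement is in general far from unique. For the complement W', the projections of Theorem 1.5(g) are p(x, y) = (x, 0) and p'(x, y) = (0, y).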
The maps p, p' of Theorem 1.5(g) are often called the (linear) projections onto W and W'. This can be extended to the situation where there are r subspaces V_1, ..., V_r ⊆ V for which

    V = V_1 + ··· + V_r = { Σ_{j=1}^{r} v_j : v_j ∈ V_j },

and we inductively have that V_k is a linear complement of (V_1 ⊕ ··· ⊕ V_{k−1}) in (V_1 + ··· + V_k).

A linear complement for a subspace W ⊆ V always exists, since we can extend a basis {w_1, ..., w_r} of W to a basis {w_1, ..., w_r, w_{r+1}, ..., w_n} for V and then take W' to be the subspace spanned by {w_{r+1}, ..., w_n}. Theorem 1.5(b) implies that W' is a linear complement.

2. Class functions and the Cayley-Hamilton Theorem

In this section k can be any field. Let A = [a_{ij}] be an n×n matrix over k.

Definition 1.6. The characteristic polynomial of A is the polynomial (in the variable X)

    char_A(X) = det(X I_n − [a_{ij}]) = Σ_{k=0}^{n} c_k(A) X^k ∈ k[X],

where I_n is the n×n identity matrix.

This polynomial is monic and of degree n in X. The coefficients c_k(A) ∈ k are functions of the entries a_{ij}. The following is an important result about this polynomial.

Theorem 1.7 (Cayley-Hamilton Theorem: matrix version). The matrix A satisfies the polynomial identity

    char_A(A) = Σ_{k=0}^{n} c_k(A) A^k = 0.

Example 1.8. Let A be the real 2×2 matrix

    A = [ 0  −1 ]
        [ 1   0 ].

Then

    char_A(X) = det [  X   1 ] = X^2 + 1 ∈ R[X].
                    [ −1   X ]

By calculation we find that A^2 + I_2 = O_2 as claimed.

Lemma 1.9. Let A = [a_{ij}] and P be n×n matrices with coefficients in k. Then if P is invertible,

    char_{PAP^{-1}}(X) = char_A(X).

Thus each of the coefficients c_k(A) (0 ≤ k ≤ n) satisfies

    c_k(PAP^{-1}) = c_k(A).

Proof. We have

    char_{PAP^{-1}}(X) = det(X I_n − PAP^{-1})
                       = det(P(X I_n)P^{-1} − PAP^{-1})
                       = det(P(X I_n − A)P^{-1})
                       = det P · det(X I_n − A) · det P^{-1}
                       = char_A(X).

Now comparing coefficients we obtain the result.  □

This result shows that, as functions of A (hence of the a_{ij}), the coefficients c_k(A) are invariant or class functions in the sense that they are invariant under conjugation,

    c_k(PAP^{-1}) = c_k(A).

Recall that for an n×n matrix A = [a_{ij}], the trace of A, Tr A ∈ k, is defined by

    Tr A = Σ_{j=1}^{n} a_{jj}.

Proposition 1.10. For any n×n matrix A over k we have

    c_{n−1}(A) = −Tr A   and   c_0(A) = (−1)^n det A.

Proof. Calculating the coefficient of X^{n−1} in det(X I_n − [a_{ij}]) we get

    − Σ_{r=1}^{n} a_{rr} = −Tr [a_{ij}].

Putting X = 0 gives

    c_0(A) = det([−a_{ij}]) = (−1)^n det [a_{ij}].  □

Now let ϕ: V −→ V be a linear transformation on a finite dimensional k-vector space V with a basis v = {v_1, ..., v_n}. Consider the matrix of ϕ relative to v,

    [ϕ]_v = [a_{ij}],

where

    ϕ(v_j) = Σ_{r=1}^{n} a_{rj} v_r.

Then the trace of ϕ with respect to the basis v is

    Tr_v ϕ = Tr [ϕ]_v.

If we change to a second basis w say, there is an invertible n×n matrix P = [p_{ij}] such that

    w_j = Σ_{r=1}^{n} p_{rj} v_r,

and then

    [ϕ]_w = P^{-1} [ϕ]_v P.

Hence, since the trace is a class function (by Lemma 1.9 and Proposition 1.10, or directly from the identity Tr(AB) = Tr(BA)),

    Tr_w ϕ = Tr( P^{-1} [ϕ]_v P ) = Tr_v ϕ.

Thus we see that the quantity

    Tr ϕ = Tr_v ϕ

only depends on ϕ, not on the basis v. We call this the trace of ϕ. We can similarly define det ϕ = det [ϕ]_v. More generally, we can consider the polynomial

    char_ϕ(X) = char_{[ϕ]_v}(X),

which by Lemma 1.9 is independent of the basis v. Thus all of the coefficients c_k([ϕ]_v) are functions of ϕ and do not depend on the basis used, so we may write c_k(ϕ) in place of c_k([ϕ]_v). In particular, an alternative way to define Tr ϕ and det ϕ is as

    Tr ϕ = −c_{n−1}(ϕ)   and   det ϕ = (−1)^n c_0(ϕ).
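For example, let ϕ: R^2 −→ R^2 be the linear transformation whose matrix relative to the standard basis is

    A = [ 1  2 ]
        [ 3  4 ].

Then

    char_ϕ(X) = char_A(X) = det [ X−1   −2  ] = (X − 1)(X − 4) − 6 = X^2 − 5X − 2,
                                [ −3   X−4 ]

so c_1(ϕ) = −5 and c_0(ϕ) = −2, giving Tr ϕ = −c_1(ϕ) = 5 = Tr A and det ϕ = (−1)^2 c_0(ϕ) = −2 = det A, in agreement with Proposition 1.10. The Cayley-Hamilton Theorem is also easily checked here: A^2 − 5A − 2I_2 = O_2.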
We also call char_ϕ(X) the characteristic polynomial of ϕ. The following is a formulation of the Cayley-Hamilton Theorem for a linear transformation.