
Spectral Projector-Based Graph Fourier Transforms
Joya A. Deri, Member, IEEE, and José M. F. Moura, Fellow, IEEE

Abstract—The paper presents the graph Fourier transform (GFT) of a signal in terms of its spectral decomposition over the Jordan subspaces of the graph adjacency matrix A. This representation is unique and coordinate free, and it leads to an unambiguous definition of the spectral components ("harmonics") of a graph signal. This is particularly meaningful when A has repeated eigenvalues, and it is very useful when A is defective or not diagonalizable (as may be the case with directed graphs). Many real world large sparse graphs have defective adjacency matrices. We present properties of the GFT and show it to satisfy a generalized Parseval inequality and to admit a total variation ordering of the spectral components. We express the GFT in terms of spectral projectors and present an illustrative example for a real world large urban traffic dataset.

Index Terms—Signal processing on graphs, graph signal processing, graph Fourier transform, spectral projection, graph spectral components, Jordan decomposition, generalized eigenspaces, directed graphs, sparse matrices, large networks.

(This work was partially supported by NSF grants CCF-1011903 and CCF-1513936 and an SYS-CMU grant. The authors are with the Department of Electrical and Computer Engineering, Carnegie Mellon University, Pittsburgh, PA 15213 USA; email: jderi,[email protected].)

I. INTRODUCTION

Graph signal processing (GSP) extends traditional signal processing to data indexed by nodes of graphs. Such data arises in many domains, from genomics to business to social networks, to name a few. In GSP, the graph Fourier transform (GFT) has been defined through the eigendecomposition of the adjacency matrix A of the graph, taken as the graph shift operator [1]–[3], or of the graph Laplacian L [4]. In the GSP approach in [1]–[3], and according to the algebraic signal processing in [5]–[7], the eigenvectors of the shift are the graph frequency or graph spectral components and the eigenvalues are the graph frequencies.

Contributions. There are several issues that need further study: 1) Unicity: the matrix form of the GFT in [1]–[4] is not unique, depending on the (implicit or explicit) choice of bases for the underlying spaces. This is true even if the matrix of interest is diagonalizable; 2) Spectral components: if A or L have repeated eigenvalues, there may be several eigenvectors corresponding to this repeated eigenvalue (frequency)—defining the spectral or frequency components ("harmonics") becomes an issue; and 3) Nondiagonalizability: if the shift is not diagonalizable, as happens in many real world applications with large sparse graphs, the matrix A is defective, introducing additional degrees of freedom in the coordinate form definition of the GFT. These topics are particularly relevant when applying GSP to datasets arising in real world problems. In many of these, the graphs are large and sparse and their adjacency matrix is defective.
This paper addresses these issues. We present the coordinate free GFT that leads to a unique spectral decomposition of graph signals and to an unambiguous definition of spectral components, regardless of whether there are repeated eigenvalues or the shift is defective. Spectral components [5] are signals that are left invariant by graph filters. For repeated eigenvalues or defective matrices, it makes sense to consider in this context irreducible invariant subspaces—signal subspaces that are invariant to filtering and are irreducible. This is achieved by decomposing the signal space into a direct sum of irreducible filter-invariant (spectral) subspaces. If the dimension of the filter-invariant subspaces is larger than one, the choice of basis for these subspaces is not unique, and neither is the coordinate form of the GFT or the identification of spectral components with basis vectors. The coordinate free GFT and the spectral decomposition we present successfully address these challenges. We present a spectral oblique projector-based GFT that allows for a unique and unambiguous spectral representation of a signal over defective adjacency matrices. Invariance to filtering follows from invariance to the shift operator (adjacency matrix A) since, by GSP [1]–[3], shift invariant filters are polynomials in the shift A. The spectral components are the Jordan subspaces of the adjacency matrix. We show that the GFT allows characterization of the signal projection energies via a generalized Parseval's identity. Total variation ordering of the spectral components with respect to the Jordan subspaces is also discussed.

Synopsis of approach. Before we formally introduce the concepts, and as a way of introduction and motivation, we explain very concisely our approach. From algebraic signal processing (ASP) [5], [6], we know that the basic component is the signal processing model Ω = (𝒜, ℳ, Φ). For a vector space V of complex-valued signals, we can then generalize linear filtering theory for this signal model Ω, where the algebra 𝒜 corresponds to a filter space, the module ℳ corresponds to a signal space, and the bijective linear mapping Φ : V → ℳ generalizes the z-transform [5]. One way to create a signal model is to specify a generator (or generators) for 𝒜, the shift filter or shift operator. The Fourier transform is the map from the signal module ℳ to an irreducible decomposition of ℳ where the irreducible components are invariant to the shift (and to the filters). We are then interested in studying the invariant irreducible components of ℳ. These are the Jordan subspaces, as explained below. In GSP, we choose as shift the adjacency matrix A of the underlying graph. Similarly, then, the Jordan subspaces play an important role in the graph Fourier transform defined in Section IV and, in this context, the irreducible, 𝒜-invariant submodules ℳ′ ⊆ ℳ are the spectral components of (signal space) ℳ. The Jordan subspaces are invariant, irreducible subspaces of C^N with respect to the adjacency matrix A; they represent the spectral components. This motivates the definition of a spectral projector-based graph Fourier transform in Section IV.

Section II describes related spectral analysis methods and graph signal processing frameworks. Section III provides the graph signal processing and linear algebra background for the graph Fourier transform defined in Section IV. Section V presents the generalized Parseval's identity as a method for ranking spectral components. Total variation-based orderings of the Jordan subspaces are discussed in detail in Section VI. Section VII shows an application on a real world dataset. Limitations of the method are briefly discussed in Section VIII.
II. PREVIOUS WORK

This section presents a brief review of the literature and some background material.

A. Spectral methods

Principal component analysis (the Karhunen-Loève transform) is an early signal decomposition method and remains a fundamental tool today. This approach orthogonally transforms data points, often via eigendecomposition or singular value decomposition (SVD) of an empirical covariance matrix, into linearly uncorrelated variables called principal components [8]–[10]. The first principal components capture the most variance in the data; this allows analysis to be restricted to these first few principal components, thus increasing the efficiency of the data representation.

Other methods determine low-dimensional representations of high-dimensional data by projecting the data onto low-dimensional subspaces generated by subsets of an eigenbasis [11]–[14]. References [11], [12] embed high-dimensional vectors onto low-dimensional manifolds determined by a weight matrix with entries corresponding to nearest-neighbor distances. In [13], embedding data in a low-dimensional space is described in terms of the graph Laplacian, where the graph Laplacian is an approximation to the Laplace-Beltrami operator on manifolds. Reference [15] also proves that the algorithm of [11] approximates eigenfunctions of a Laplacian-based matrix.

These methods [11]–[14] focus on discovering low-dimensional representations for high-dimensional data, capturing relationships between data variables in a matrix for their analysis. In contrast, our problem treats the data as a signal that is an input to a graph-based filter. Our approach emphasizes node-based weights (the signal) instead of edge-based weights that capture data dependencies. Related node-based methods in the graph signal processing framework are discussed next.

Data indexed by graphs and Laplacian-based GFTs. The graph signal processing framework developed in this paper assumes that data is indexed by graphs. Studies that analyze data indexed by nodes of a graph include [16]–[18], which use wavelet transforms to study data on distributed sensor networks. Other approaches, such as those in [4], [19]–[24], use the graph Laplacian and its eigenbasis for localized data processing. In particular, [4], [20] define a graph Fourier transform (GFT) as signal projections onto the Laplacian eigenvectors. These eigenvectors form an orthonormal basis since the graph Laplacian is symmetric and positive semidefinite. Graph-based filter banks are constructed with respect to this GFT in [21].

Analyses based on the graph Laplacian do not take into account first-order structure of the network, that is, any asymmetry such as in a digraph or directed edges in a graph. These asymmetries affect network flows, random walks, and other graph properties, as studied, for example, in [25], [26]. The approach we take here preserves the influence of directed edges in graph signal processing by projecting onto the eigenbasis of the adjacency matrix.

Adjacency matrix-based GFTs. References [1]–[3] develop GSP, including filtering, convolution, and the graph Fourier transform, from the graph adjacency matrix A ∈ C^{N×N}, taken to play the role of the shift operator z^{-1} in digital signal processing. According to the algebraic signal processing theory of [5]–[7], [27], the shift generates all linear shift-invariant filters for a class of signals (under certain shift invariance assumptions). In the context of GSP [1], shift invariant filters are polynomials in the shift A. The graph Fourier transform is also defined in terms of the adjacency matrix. GSP as presented in [1]–[3] preserves the directed network structure, in contrast to second order methods like those based on the graph Laplacian.

The graph Fourier transform of [1] is defined as follows. For a graph 𝒢 = 𝒢(A) with adjacency matrix A ∈ C^{N×N} and Jordan decomposition A = V J V^{-1}, the graph Fourier transform of a signal s ∈ C^N over 𝒢 is defined as

    s̃ = V^{-1} s,   (1)

where V^{-1} is the Fourier transform matrix of 𝒢. This is essentially a projection of the signal onto the eigenvectors of A. It is an orthogonal projection when A is normal (A^H A = A A^H) and the eigenvectors form a unitary basis (i.e., V^{-1} = V^H). This is of course guaranteed with the graph Laplacian. Left unanswered in these approaches are the lack of unicity¹ of V^{-1}, the appropriate definition of spectral components when there are repeated eigenvalues, and finally how to define the GFT uniquely when the shift is defective.

¹Eigenvectors are defined up to a constant. Different choices lead to scaled polynomial transforms. The discrete Fourier transform corresponds to a very specific choice of basis [5].
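To make (1) concrete, the following NumPy sketch computes the eigendecomposition-based GFT for a small weighted directed cycle; the graph, its weights, and the signal values are illustrative choices rather than an example from the paper.

    # A minimal NumPy sketch of the eigenvector-based GFT in (1); the small
    # weighted digraph below is an illustrative choice, not data from the paper.
    import numpy as np

    # Weighted directed 3-cycle: edge j -> i carries weight A[i, j].
    A = np.array([[0.0, 1.0, 0.0],
                  [0.0, 0.0, 2.0],
                  [3.0, 0.0, 0.0]])
    s = np.array([1.0, -2.0, 0.5])        # graph signal, one sample per node

    # A has three distinct eigenvalues, so the Jordan decomposition reduces
    # to the eigendecomposition A = V diag(lambda) V^{-1}.
    lam, V = np.linalg.eig(A)

    s_tilde = np.linalg.solve(V, s)       # GFT of (1): s_tilde = V^{-1} s
    assert np.allclose(V @ s_tilde, s)    # inverse GFT recovers the signal

    print("graph frequencies:", np.round(lam, 3))
    print("GFT coefficients :", np.round(s_tilde, 3))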
This paper addresses these topics and, in particular, focuses on graph signal processing over defective, or non-diagonalizable, adjacency matrices. These matrices have at least one eigenvalue with algebraic multiplicity (the exponent in the characteristic polynomial of A) greater than the geometric multiplicity (the kernel dimension of A − λI), which results in an eigenvector matrix that does not span C^N.

The basis can be completed by computing Jordan chains of generalized eigenvectors [28], [29], but the computation introduces degrees of freedom that render these generalized eigenvectors non-unique; in other words, the transform (1) may vary greatly depending on the particular generalized eigenvectors that are chosen. Our approach defines the GFT in terms of spectral projections onto the Jordan subspaces (i.e., the spans of the Jordan chains) of the adjacency matrix².

²We recognize that computing the Jordan decomposition is numerically unstable. This paper is focused on the concepts of a spectral projection, coordinate free definition of the GFT and spectral components. Section VIII addresses these computational issues that, for lack of space, are fully discussed in [30], [31].

III. BACKGROUND

This section reviews the concepts of graph signal processing and provides a reference for the underlying mathematics. Section III-A defines the graph Fourier transform and graph filters; see also [1]–[3]. Section III-B defines the generalized eigenspaces and Jordan subspaces of a matrix [28], [29], [32].

A. Graph Signal Processing

1) Graph Signals: Let 𝒢 = 𝒢(A) = (𝒱, ℰ) be the graph corresponding to matrix A ∈ C^{N×N}, where 𝒱 is the set of N nodes and a nonzero entry [A]_ij denotes a directed edge e_ij ∈ ℰ from node j to node i. In real-world applications, such nodes can be represented by geo-locations of a road network, and the edges can be specified by one-way or two-way streets. Define the graph signal s : 𝒱 → 𝒮 on 𝒢, where 𝒮 represents the signal space over the nodes of 𝒢. We take 𝒮 = C^N such that s = (s_1, ..., s_N) ∈ C^N and s_i represents a measure at node v_i ∈ 𝒱. In real-world applications, such signals can be specified by sensor measurements or datasets.

2) Graph Shift: As in [1], [3], the graph shift is the graph signal processing counterpart to the shift operator z^{-1} in digital signal processing. The graph shift is defined as the operator that replaces the element s_i of the graph signal s = (s_1, ..., s_N) corresponding to node v_i ∈ 𝒱 with the linear combination of the signal elements at its in-neighbors (nodes v_k ∈ 𝒱 that participate in an edge e_ik ∈ ℰ), denoted by the set 𝒩_i; i.e., the shifted signal has elements s̃_i = ∑_{v_j ∈ 𝒩_i} [A]_ij s_j, or

    s̃ = A s.   (2)

Consistency with discrete signal processing can be seen by considering the directed cycle graph in Figure 1, which represents a finite, periodic time-series signal. (Fig. 1: Directed cycle graph.) The adjacency matrix of the graph is the circulant matrix (elements not shown are zero)

    C = \begin{bmatrix} & & & 1 \\ 1 & & & \\ & \ddots & & \\ & & 1 & \end{bmatrix}.   (3)

The shift s̃ = C s yields the time delay s̃_i = s_{i−1 mod N}. Reference [1] shows that the graph shift motivates defining the graph Fourier transform as the signal projection onto the eigenvectors of A. Our transform in Section IV builds on this concept to develop a framework to handle defective adjacency matrices.
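The delay property above can be checked directly; this short NumPy sketch builds the circulant shift of (3) and verifies that the graph shift (2) reproduces the circular time delay. The graph size and the test signal are arbitrary choices.

    # Check that the graph shift (2) on the directed cycle graph of Fig. 1
    # reproduces the DSP time delay s_i -> s_{i-1 mod N}.
    import numpy as np

    N = 6
    C = np.roll(np.eye(N), 1, axis=0)     # circulant matrix of (3): C[i, (i-1) % N] = 1

    s = np.arange(N, dtype=float)         # an arbitrary length-N "time series"
    s_shifted = C @ s                     # graph shift (2): s_tilde = C s

    # Same result as a circular delay by one sample.
    assert np.allclose(s_shifted, np.roll(s, 1))
    print(s, "->", s_shifted)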
3) Graph Filter: The graph shift is a simple graph filter, where a graph filter H ∈ C^{N×N} represents a (linear) system with output H s for any graph signal s ∈ 𝒮. As shown in Theorem 1 of [1], the graph filter H is shift-invariant, i.e.,

    A (H s) = H (A s),   (4)

if and only if a polynomial h(x) = ∑_{i=0}^{L} h_i x^i exists for constants h_0, h_1, ..., h_L ∈ C such that H = h(A) = ∑_{i=0}^{L} h_i A^i. This condition holds whenever the characteristic and minimal polynomials of A are equal [1]. For defective A with unequal characteristic and minimal polynomials, such as the examples seen in this paper, shift-invariance cannot be claimed; however, an equivalent graph filter can be designed in terms of a matrix that is the image of a polynomial of A [1]. The properties of such graph filters are established in [1].
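As a sanity check of (4), the following sketch builds a polynomial filter H = h(A) on an illustrative shift (a directed cycle, with arbitrary filter taps) and verifies that filtering commutes with the shift; it is a toy verification, not code from the paper.

    # Shift-invariance (4) for a polynomial graph filter H = h(A).
    import numpy as np

    A = np.roll(np.eye(5), 1, axis=0)          # directed 5-cycle as the shift
    h = [0.5, -1.0, 0.25]                      # filter taps h_0, h_1, h_2 (illustrative)

    # H = h_0 I + h_1 A + h_2 A^2, a polynomial in the shift.
    H = sum(c * np.linalg.matrix_power(A, i) for i, c in enumerate(h))

    s = np.random.default_rng(0).standard_normal(5)

    # Filtering then shifting equals shifting then filtering.
    assert np.allclose(A @ (H @ s), H @ (A @ s))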
B. Eigendecomposition

This section and Appendix A provide a review of Jordan decompositions. The reader is directed to [28], [29], [32], [33] for additional background. Jordan subspaces and the Jordan decomposition are defined in this section.

The generalized eigenspaces G_i = Ker(A − λ_i I)^{m_i} of A ∈ C^{N×N} corresponding to its k distinct eigenvalues λ_i decompose C^N in terms of the direct sum

    C^N = ⊕_{i=1}^{k} G_i   (5)

as depicted in Figure 2d; see also Appendix A.

Let v_1 ∈ Ker(A − λ_i I), v_1 ≠ 0, be one of the g_i proper eigenvectors of A corresponding to the eigenvalue λ_i. It generates by recursion the generalized eigenvectors

    A v_p = λ_i v_p + v_{p−1},  p = 2, ..., r,   (6)

where r is the minimal positive integer such that (A − λ_i I)^r v_r = 0 and (A − λ_i I)^{r−1} v_r ≠ 0. Such a sequence of vectors (v_1, ..., v_r) that satisfies (6) is a Jordan chain of length r. The Jordan chain vectors are linearly independent and span a Jordan subspace, or

    J = span(v_1, v_2, ..., v_r).   (7)

Order the Jordan subspaces of λ_i by decreasing dimension and denote by J_ij the jth Jordan subspace of λ_i with dimension r_ij ≤ m_i, where {r_ij}_{j=1}^{g_i} are called the partial multiplicities of λ_i. Then the generalized eigenspace G_i = Ker(A − λ_i I)^{m_i} of λ_i can be decomposed as

    G_i = ⊕_{j=1}^{g_i} J_ij   (8)

as depicted in Figures 2a-c. Combining (8) and (5), the Jordan subspaces uniquely decompose C^N as

    C^N = ⊕_{i=1}^{k} ⊕_{j=1}^{g_i} J_ij.   (9)

Furthermore, the cyclic Jordan subspaces cannot be represented as direct sums of smaller invariant subspaces; that is, the Jordan subspaces are irreducible components of C^N (see, e.g., p. 318 of [32]).

Figure 2 illustrates possible Jordan subspace structures of A, with the top row showing the tessellation of the vector space C^N by the generalized or root eigenspaces G_i = Ker(A − λ_i I)^{m_i} and by the Jordan spaces J_ij, and the bottom row illustrating the telescoping of C^N by the generalized eigenspaces of order p. Figure 2a illustrates C^N for a matrix with a single Jordan chain, represented by connected points in C^N. The case of a matrix with two Jordan blocks corresponding to one eigenvalue is shown in Figure 2b. Figure 2c shows C^N for a matrix with a single eigenvalue and multiple Jordan blocks, and Figure 2d depicts the tessellation of the space in terms of the generalized eigenspaces for the case of multiple distinct eigenvalues.

(Fig. 2: Illustration of generalized eigenspace partitions and Jordan chains of adjacency matrix A ∈ C^{N×N} for (a) a single Jordan block, (b) one eigenvalue and two Jordan blocks, (c) one eigenvalue and multiple Jordan blocks, and (d) multiple eigenvalues. In (a)-(c) (bottom), each point represents a vector in a Jordan chain of A; points connected by lines illustrate a single Jordan chain. The partial multiplicities depicted for λ_1 are (a) N, (b) r_11 = N−2 and 2, and (c) r_11 = N−6, 2, 2, 1, and 1. Each generalized eigenspace G_i in (d) can be visualized by (a)-(c).)

Jordan decomposition. Appendix A defines the eigenvector matrix V and the Jordan normal form J such that A = V J V^{-1}. An important property of this decomposition is that Jordan chains are not unique and not necessarily orthogonal. For example, the 3×3 matrix

    A = \begin{bmatrix} 0 & 1 & 1 \\ 0 & 0 & 1 \\ 0 & 0 & 0 \end{bmatrix}   (10)

can have distinct eigenvector matrices

    V_1 = \begin{bmatrix} 1 & -1 & 1 \\ 0 & 1 & -2 \\ 0 & 0 & 1 \end{bmatrix},  V_2 = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & -1 \\ 0 & 0 & 1 \end{bmatrix},   (11)

where the Jordan chain vectors are the columns of V_1 and V_2 and so satisfy (6). Since Jordan chains are not unique, the Jordan subspace is used in Section IV to characterize the possible generalized eigenvectors.
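The non-uniqueness of Jordan chains in (10)-(11) can be verified numerically; the sketch below checks that both V_1 and V_2 are Jordan bases of the same A, while the coordinate transforms V_1^{-1} and V_2^{-1} they induce differ.

    # NumPy check of the example in (10)-(11): two different Jordan bases for A.
    import numpy as np

    A = np.array([[0., 1., 1.],
                  [0., 0., 1.],
                  [0., 0., 0.]])
    J = np.array([[0., 1., 0.],          # single 3x3 Jordan block at eigenvalue 0
                  [0., 0., 1.],
                  [0., 0., 0.]])

    V1 = np.array([[1., -1.,  1.],
                   [0.,  1., -2.],
                   [0.,  0.,  1.]])
    V2 = np.array([[1., 0.,  0.],
                   [0., 1., -1.],
                   [0., 0.,  1.]])

    for V in (V1, V2):
        # Both satisfy A = V J V^{-1}, i.e., both column sets are Jordan chains of A.
        assert np.allclose(V @ J @ np.linalg.inv(V), A)

    # Yet the resulting coordinate transforms V^{-1} differ:
    s = np.array([1., 2., 3.])
    print(np.linalg.solve(V1, s), "vs", np.linalg.solve(V2, s))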
IV. SPECTRAL PROJECTOR-BASED GRAPH SIGNAL PROCESSING

This section presents a basis-invariant, coordinate free graph Fourier transform with respect to a set of known proper eigenvectors. For graphs with diagonalizable adjacency matrices, the coordinate form of this transform resolves, with an appropriate choice of basis vectors, to that of [1], [3]. The interpretation of the spectral components is settled for the cases of repeated eigenvalues and non-diagonalizable, or defective, adjacency matrices.

A. Spectral Components

Consider a matrix A with distinct eigenvalues λ_1, ..., λ_k, k ≤ N, that has Jordan decomposition A = V J V^{-1}. Denote by J_ij the jth Jordan subspace of dimension r_ij corresponding to eigenvalue λ_i, i = 1, ..., k, j = 1, ..., g_i. Each J_ij is A-invariant and irreducible (see Section III-B). Then the Jordan subspaces are the spectral components of the signal space 𝒮 = C^N, and the graph Fourier transform of a graph signal s ∈ 𝒮 is defined as the mapping

    ℱ : 𝒮 → ⊕_{i=1}^{k} ⊕_{j=1}^{g_i} J_ij
        s ↦ (ŝ_11, ..., ŝ_{1g_1}, ..., ŝ_{k1}, ..., ŝ_{kg_k}).   (12)

That is, the Fourier transform of s is the unique decomposition

    s = ∑_{i=1}^{k} ∑_{j=1}^{g_i} ŝ_ij,  ŝ_ij ∈ J_ij.   (13)

The distinct eigenvalues λ_1, ..., λ_k are the graph frequencies of graph 𝒢(A). The frequency or spectral components of graph frequency λ_i are the Jordan subspaces J_ij. The total number of frequency components corresponding to λ_i is its geometric multiplicity g_i. In this way, when g_i > 1, frequency λ_i corresponds to more than one frequency component.

To highlight the significance of (12) and (13), consider the signal expansion of a graph signal s with respect to graph 𝒢(A):

    s = s̃_1 v_1 + ··· + s̃_N v_N = V s̃,   (14)

where v_i is the ith basis vector in a Jordan basis of A, V is the corresponding eigenvector matrix, and s̃_i is the ith expansion coefficient. As discussed in Section III-B, the choice of Jordan basis has degrees of freedom when the dimension of a cyclic Jordan subspace is greater than one. Therefore, if dim J_ij ≥ 2, there exists an eigenvector submatrix X_ij ≠ V_ij such that span{X_ij} = span{V_ij} = J_ij. Thus, the signal expansion (14) is not unique when A has Jordan subspaces of dimension r > 1.

In contrast, the Fourier transform given by (12) and (13) yields a unique signal expansion that is independent of the choice of Jordan basis. Given any Jordan basis v_1, ..., v_N with respect to A, the jth spectral component of eigenvalue λ_i is, by (13), ŝ_ij = ∑_{k=p}^{p+r_ij−1} s̃_k v_k, where v_p, ..., v_{p+r_ij−1} are a basis of J_ij. Under this definition, there is no ambiguity in the interpretation of frequency components even when Jordan subspaces have dimension r > 1 or there are repeated eigenvalues. The properties of the spectral components are discussed in more detail below.

The spectral components of the Fourier transform (12) are expressed in terms of the basis v_1, ..., v_N and its dual basis w_1, ..., w_N, since the chosen Jordan basis may not be orthogonal. Denote the basis and dual basis matrices by V = [v_1 ··· v_N] and W = [w_1 ··· w_N]. By definition, ⟨w_i, v_j⟩ = w_i^H v_j = δ_ij, where δ_ij is the Kronecker delta [33], [34]. Then W^H V = V^H W = I, so the dual basis is the inverse Hermitian W = V^{-H} and corresponds to the left eigenbasis.

Partition the Jordan basis matrix V as in (58) so that each V_ij ∈ C^{N×r_ij} spans the Jordan subspace J_ij. Similarly, partition the dual basis matrix by rows as W^H = [··· W_ij^H ···]^T, with each W_ij^H ∈ C^{r_ij×N}. Suppose V_ij = [v_1 ··· v_{r_ij}] with corresponding coefficients s̃_1, ..., s̃_{r_ij} in the Jordan basis expansion (14). Define the N×N matrix V_ij^0 = [0 ··· V_ij ··· 0] that is zero except for the columns corresponding to V_ij. Then each spectral component corresponding to Jordan subspace J_ij can be written as (below, diag(0, ..., I_{r_ij}, ..., 0) denotes the block-diagonal matrix whose only nonzero block is I_{r_ij}, placed at the position of the columns of V_ij)

    ŝ_ij = s̃_1 v_1 + ··· + s̃_{r_ij} v_{r_ij}   (15)
         = V_ij^0 s̃   (16)
         = V_ij^0 V^{-1} s   (17)
         = V diag(0, ..., I_{r_ij}, ..., 0) V^{-1} s   (18)
         = V diag(0, ..., I_{r_ij}, ..., 0) W^H s   (19)
         = V_ij W_ij^H s,   (20)

for i = 1, ..., k and j = 1, ..., g_i. Denote

    P_ij = V_ij W_ij^H,   (21)

which is the projection matrix onto 𝒮_ij parallel to the complementary subspace 𝒮 ∖ 𝒮_ij. Note that P_ij² = V_ij W_ij^H V_ij W_ij^H = V_ij W_ij^H = P_ij.

The projection matrices {P_ij}_{j=1}^{g_i} are related to the first component matrix Z_{i0} of eigenvalue λ_i. The component matrix is defined as [28, Section 9.5]

    Z_{i0} = V diag(0, ..., I_{a_i}, ..., 0) V^{-1},   (22)

where a_i = ∑_{j=1}^{g_i} r_ij is the algebraic multiplicity of λ_i. This matrix acts as a projection matrix onto the generalized eigenspace.
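A minimal numerical illustration of the projectors (20)-(21) follows; the 4×4 matrix with one length-2 Jordan chain is an assumed toy construction (not an example from the paper), built from a random Jordan basis so that the projectors are oblique.

    # Sketch of the oblique spectral projectors (21) and the GFT (12)-(13).
    import numpy as np

    rng = np.random.default_rng(1)

    # Jordan form: eigenvalue 1 with a length-2 chain, eigenvalues 2 and 3 simple.
    J = np.array([[1., 1., 0., 0.],
                  [0., 1., 0., 0.],
                  [0., 0., 2., 0.],
                  [0., 0., 0., 3.]])
    V = rng.standard_normal((4, 4))           # random Jordan basis (columns = chains)
    A = V @ J @ np.linalg.inv(V)              # shift with that Jordan structure

    W = np.linalg.inv(V).conj().T             # dual basis, W = V^{-H}
    blocks = [slice(0, 2), slice(2, 3), slice(3, 4)]   # columns of each Jordan subspace

    # Oblique projectors P_ij = V_ij W_ij^H of (21).
    P = [V[:, b] @ W[:, b].conj().T for b in blocks]

    s = rng.standard_normal(4)
    s_hat = [Pb @ s for Pb in P]              # spectral components, one per Jordan subspace

    # (13): the components sum back to the signal; each projector is idempotent.
    assert np.allclose(sum(s_hat), s)
    assert all(np.allclose(Pb @ Pb, Pb) for Pb in P)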
Theorem 1 provides additional properties of the projection matrices P_ij.

Theorem 1. For a matrix A ∈ C^{N×N} with eigenvalues λ_1, ..., λ_k, the projection matrices P_ij onto the jth Jordan subspace J_ij corresponding to eigenvalue λ_i, i = 1, ..., k, j = 1, ..., g_i, satisfy the following properties:
(a) P_ij P_kl = δ_ik δ_jl P_ij, where δ is the Kronecker delta function;
(b) ∑_{j=1}^{g_i} P_ij = Z_{i0}, where Z_{i0} is the component matrix of eigenvalue λ_i.

Proof: (a) Since W^H V = I, the partition of W^H and V that yields (20) satisfies W_ij^H V_kl = δ_ik δ_jl I_{r_ij × r_kl}, where r_ij is the dimension of the Jordan subspace corresponding to P_ij, r_kl is the dimension of the Jordan subspace corresponding to P_kl, and the matrix I_{r_ij × r_kl} consists of the first r_kl canonical vectors e_i = (0, ..., 1, ..., 0), where the 1 is at the ith index. Then it follows that

    P_ij P_kl = V_ij W_ij^H V_kl W_kl^H   (23)
              = V_ij (δ_ik δ_jl I_{r_ij × r_kl}) W_kl^H.   (24)

If i = k and j = l, then P_ij P_kl = V_ij I_{r_ij × r_ij} W_ij^H = P_ij; otherwise, P_ij P_kl = 0.

(b) Write

    ∑_{j=1}^{g_i} P_ij = ∑_{j=1}^{g_i} V diag(0, ..., I_{r_ij}, ..., 0) V^{-1}   (25)
                       = V ( ∑_{j=1}^{g_i} diag(0, ..., I_{r_ij}, ..., 0) ) V^{-1}   (26)
                       = V diag(0, ..., I_{∑_{j=1}^{g_i} r_ij}, ..., 0) V^{-1}   (27)
                       = Z_{i0},   (28)

or the first component matrix of A for eigenvalue λ_i. ∎

Theorem 1(a) shows that each projection matrix P_ij projects only onto the Jordan subspace J_ij. Theorem 1(b) shows that the sum of the projection matrices for a given eigenvalue equals the component matrix of that eigenvalue.

This section provides the mathematical foundation for a graph Fourier transform based on projections onto the Jordan subspaces of an adjacency matrix. The next section motivates ranking the signal projections on Jordan subspaces by the energy of the signal projections.
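Theorem 1 can be checked numerically on a small example; in the sketch below, the matrix is an assumed construction whose eigenvalue 5 has two Jordan blocks, so that property (b) is non-trivial.

    # Numerical check of Theorem 1 on an assumed 4x4 example:
    # eigenvalue 5 with Jordan blocks of sizes 2 and 1, eigenvalue 7 simple.
    import numpy as np

    rng = np.random.default_rng(2)

    J = np.diag([5., 5., 5., 7.])
    J[0, 1] = 1.                                   # one length-2 chain for eigenvalue 5
    V = rng.standard_normal((4, 4))
    Vinv = np.linalg.inv(V)
    A = V @ J @ Vinv

    cols = {"P_11": slice(0, 2), "P_12": slice(2, 3), "P_21": slice(3, 4)}
    P = {k: V[:, b] @ Vinv[b, :] for k, b in cols.items()}

    # Theorem 1(a): distinct projectors annihilate each other; each is idempotent.
    assert np.allclose(P["P_11"] @ P["P_12"], 0)
    assert np.allclose(P["P_11"] @ P["P_11"], P["P_11"])

    # Theorem 1(b): the projectors of eigenvalue 5 sum to its component matrix Z_10,
    # the projector onto the generalized eigenspace Ker(A - 5I)^2.
    Z_10 = V @ np.diag([1., 1., 1., 0.]) @ Vinv
    assert np.allclose(P["P_11"] + P["P_12"], Z_10)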
V. GENERALIZED PARSEVAL'S IDENTITY

As discussed above, a chosen Jordan basis for a matrix A ∈ C^{N×N}, represented by the eigenvector matrix V, may not be orthogonal. Therefore, Parseval's identity may not hold. Nevertheless, a generalized Parseval's identity does exist in terms of the Jordan basis and its dual; see also [34]. For a dual basis matrix W = V^{-H}, the following property holds:

Property 2 (Generalized Parseval's Identity). Consider graph signals s_1, s_2 ∈ C^N over graph 𝒢(A), A ∈ C^{N×N}. Let V = [v_1 ··· v_N] be a Jordan basis for A with dual basis W = V^{-H} partitioned as [w_1 ··· w_N]. Let s = ∑_{i=1}^{N} ⟨s, v_i⟩ v_i = V s̃_V be the representation of s in basis V and s = ∑_{i=1}^{N} ⟨s, w_i⟩ w_i = W s̃_W be the representation of s in basis W. Then

    ⟨s_1, s_2⟩ = ⟨s̃_{1,V}, s̃_{2,W}⟩.   (29)

By extension,

    ‖s‖² = ⟨s, s⟩ = ⟨s̃_V, s̃_W⟩.   (30)

Equations (29) and (30) hold regardless of the choice of eigenvector basis.

Energy of spectral components. The energy of a discrete signal s ∈ C^N is defined as [35], [36]

    E_s = ⟨s, s⟩ = ‖s‖² = ∑_{i=1}^{N} |s_i|².   (31)

Equation (30) thus illustrates conservation of signal energy in terms of both a Jordan basis and its dual. The energy of the signal projections onto the spectral components of 𝒢(A) for the GFT (12) is next defined. Write ŝ_ij in terms of the columns of V as ŝ_ij = α_1 v_{i,j,1} + ··· + α_{r_ij} v_{i,j,r_ij} and in terms of the columns of W as ŝ_ij = β_1 w_{i,j,1} + ··· + β_{r_ij} w_{i,j,r_ij}. Then the energy of ŝ_ij can be defined as

    ‖ŝ_ij‖² = ⟨α, β⟩,   (32)

using the notation α = (α_1, ..., α_{r_ij}) and β = (β_1, ..., β_{r_ij}).

The generalized Parseval's identity expresses the energy of the signal in terms of the signal expansion coefficients s̃, which highlights the importance of choosing a Jordan basis. This emphasizes that both the GFT {ŝ_ij} and the signal expansion coefficients s̃ are necessary to fully characterize the graph Fourier domain.

Normal A. When A is normal (i.e., when A A^H = A^H A), V can be chosen to have unitary columns. Then V = W, so

    ⟨s_1, s_2⟩ = ⟨s̃_1, s̃_2⟩   (33)

and

    ‖s‖² = ⟨s, s⟩ = ‖s̃‖².   (34)

Note that (33) and (34) do not hold in general for diagonalizable A.

While the Jordan basis, or choice of eigenvectors, is not unique, the image of a signal s under the projection matrix P_ij (21) is invariant to the choice of Jordan basis. This section shows that these projections can be ranked in terms of the percentage of recovered signal energy. The next section demonstrates a total variation-based ranking of the spectral components.
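The identities (29)-(30) are easy to verify numerically; the following sketch uses a random (hence generally non-orthogonal) complex basis V and its dual W = V^{-H}, standing in for a Jordan basis, with ⟨x, y⟩ = y^H x.

    # Numerical check of the generalized Parseval identity (29)-(30).
    import numpy as np

    rng = np.random.default_rng(3)
    N = 5
    V = rng.standard_normal((N, N)) + 1j * rng.standard_normal((N, N))  # oblique basis
    W = np.linalg.inv(V).conj().T                                       # dual basis, W = V^{-H}

    s1 = rng.standard_normal(N) + 1j * rng.standard_normal(N)
    s2 = rng.standard_normal(N) + 1j * rng.standard_normal(N)

    s1_V = np.linalg.solve(V, s1)          # coefficients of s1 in basis V  (s1 = V s1_V)
    s2_W = np.linalg.solve(W, s2)          # coefficients of s2 in basis W  (s2 = W s2_W)

    # (29): <s1, s2> = <s1_V, s2_W>, with <x, y> = y^H x (np.vdot(y, x)).
    assert np.allclose(np.vdot(s2, s1), np.vdot(s2_W, s1_V))

    # (30): signal energy recovered from the two coefficient vectors.
    s_V, s_W = np.linalg.solve(V, s1), np.linalg.solve(W, s1)
    assert np.allclose(np.linalg.norm(s1) ** 2, np.vdot(s_W, s_V))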
VI. TOTAL VARIATION ORDERING

This section defines a mapping of spectral components to the real line to achieve an ordering of the spectral components. This ordering can be used to distinguish generalized low and high frequencies as in [3]. An upper bound for a total variation-based mapping of a spectral component (Jordan subspace) is derived.

As in [3], [37], the total variation for a finite, discrete-valued (periodic) time series s is defined as

    TV(s) = ∑_{n=0}^{N−1} |s_n − s_{n−1 mod N}| = ‖s − C s‖_1,   (35)

where C is the circulant matrix (3) that represents the DSP shift operator. As in [3], (35) is generalized to the graph shift A to define the graph total variation

    TV_G(s) = ‖s − A s‖_1.   (36)

Matrix A can be replaced by A_norm = (1/|λ_max|) A when the maximum eigenvalue satisfies |λ_max| > 0.

Equation (36) can be applied to define the total variation of a spectral component. These components are the cyclic Jordan subspaces of the graph shift A as described in Section III-B. Choose a Jordan basis of A so that V is the eigenvector matrix of A, i.e., A = V J V^{-1}, where J is the Jordan form of A. Partition V into N×r_ij submatrices V_ij whose columns are a Jordan chain of (and thus span) the jth Jordan subspace J_ij of eigenvalue λ_i, i = 1, ..., k ≤ N, j = 1, ..., g_i. Define the (graph) total variation of V_ij as

    TV_G(V_ij) = ‖V_ij − A V_ij‖_1,   (37)

where ‖·‖_1 represents the induced L1 matrix norm (equal to the maximum absolute column sum).

The next theorem shows equivalent formulations of the graph total variation (37).

Theorem 3. The graph total variation with respect to the GFT (12) can be written as

    TV_G(V_ij) = ‖V_ij (I_{r_ij} − J_ij)‖_1   (38)
               = max_{i=2,...,r_ij} { |1 − λ| ‖v_1‖_1, ‖(1 − λ) v_i − v_{i−1}‖_1 }.   (39)

Proof: Simplify (37) to obtain (with [0; I_{r_ij}; 0] the N×r_ij block column that selects the columns of V_ij)

    TV_G(V_ij) = ‖V_ij − V J V^{-1} V_ij‖_1   (40)
               = ‖V_ij − V J [0; I_{r_ij}; 0]‖_1   (41)
               = ‖V_ij − V [0; J_ij; 0]‖_1   (42)
               = ‖V_ij − V_ij J_ij‖_1   (43)
               = ‖V_ij (I_{r_ij} − J_ij)‖_1.   (44)

Let λ denote the ith eigenvalue and let the columns v_1, ..., v_{r_ij} of V_ij comprise its jth Jordan chain. Then (38) can be expressed in terms of the Jordan chain:

    TV_G(V_ij)   (45)
      = ‖ [v_1 ··· v_{r_ij}] \begin{bmatrix} 1−λ & −1 & & \\ & 1−λ & \ddots & \\ & & \ddots & −1 \\ & & & 1−λ \end{bmatrix} ‖_1   (46)
      = ‖ [ (1−λ) v_1   (1−λ) v_2 − v_1   ··· ] ‖_1   (47)
      = max_{i=2,...,r_ij} { |1 − λ| ‖v_1‖_1, ‖(1 − λ) v_i − v_{i−1}‖_1 }.   (48) ∎

Theorem 4 shows that V can be chosen such that ‖V_ij‖_1 = 1 without loss of generality.

Theorem 4. The eigenvector matrix V of an adjacency matrix A ∈ C^{N×N} can be chosen so that each Jordan chain represented by the eigenvector submatrix V_ij ∈ C^{N×r_ij} satisfies ‖V_ij‖_1 = 1; i.e., ‖V‖_1 = 1 without loss of generality.

Proof: Let V represent an eigenvector matrix of A with partitions V_ij as described above, and let J_ij represent the corresponding Jordan block. Let D be a block diagonal matrix with r_ij × r_ij diagonal blocks D_ij = (1/‖V_ij‖_1) I_{r_ij}. Since D_ij commutes with J_ij, D commutes with J. Note that D is a special case of the upper triangular Toeplitz matrices discussed in [28, Example 6.5.4, Theorem 12.4.1].

Let X = V D and B = X J X^{-1}. Then

    B = X J X^{-1}   (49)
      = V D J D^{-1} V^{-1}   (50)
      = V D D^{-1} J V^{-1}   (51)
      = V J V^{-1}   (52)
      = A.   (53)

Therefore, both V and X are eigenvector matrices of A. ∎

In the following, it is assumed that V satisfies Theorem 4. Theorem 5 presents an upper bound of (38).

Theorem 5. Consider a matrix A with k distinct eigenvalues and N×r_ij matrices V_ij with columns comprising the jth Jordan chain of λ_i, i = 1, ..., k, j = 1, ..., g_i. Then the graph total variation satisfies TV_G(V_ij) ≤ |1 − λ_i| + 1.

Proof: Let ‖V_ij‖_1 = 1 and rewrite (38):

    TV_G(V_ij) ≤ ‖V_ij‖_1 ‖I_{r_ij} − J_ij‖_1   (54)
               = ‖I_{r_ij} − J_ij‖_1   (55)
               = |1 − λ_i| + 1.   (56) ∎

Equations (38), (39), and (56) characterize the (graph) total variation of a Jordan chain by quantifying the change in a set of vectors that spans the Jordan subspace J_ij when they are transformed by the graph shift A. While this total variation bound may not capture the true total variation of a spectral component, it can be generalized as an upper bound for all spectral components associated with a Jordan equivalence class. This concept is explored further in [31].
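The quantities in (37) and Theorem 5 can be computed directly for the 3×3 example of (10)-(11); the sketch below rescales the chain V_1 as in Theorem 4 and checks the bound |1 − λ_i| + 1.

    # Graph total variation (37) of a Jordan chain and the bound of Theorem 5,
    # using the matrix of (10) with the chain V1 of (11), rescaled per Theorem 4.
    import numpy as np

    A = np.array([[0., 1., 1.],
                  [0., 0., 1.],
                  [0., 0., 0.]])          # single Jordan block, eigenvalue 0
    V_ij = np.array([[1., -1.,  1.],
                     [0.,  1., -2.],
                     [0.,  0.,  1.]])
    V_ij = V_ij / np.linalg.norm(V_ij, 1)  # induced L1 norm = max absolute column sum

    tv = np.linalg.norm(V_ij - A @ V_ij, 1)   # TV_G(V_ij) of (37)
    bound = abs(1 - 0.) + 1                   # Theorem 5: |1 - lambda_i| + 1

    print(f"TV_G = {tv:.3f} <= {bound:.3f}")
    assert tv <= bound + 1e-12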
VII. APPLICATION

We apply the graph Fourier transform (12) to a signal based on 2010–2013 New York City taxi data [40] over the Manhattan road network. The signal represents the four-year average number of trips that pass through each node of the road network, as determined by Dijkstra path estimates from the start and end coordinates (latitude and longitude) provided by the raw data; the computation behind these estimates is described in [41]. The road network consists of 6,408 nodes that represent latitude and longitude coordinates from [42], connected by 14,418 directed edges that represent one-way directions as verified by Google Maps [43]. The adjacency matrix of the road network is defective, with 253 Jordan chains of length 2 and 193 eigenvectors without Jordan chain vectors corresponding to eigenvalue zero (446 Jordan chains in total). Details of the eigendecomposition of this matrix are described in [30], [44].

Figure 3a shows the signal, consisting of the four-year June–August average number of trips at each node of the Manhattan road network for Fridays 9pm–10pm. Applying the GFT (12) and computing the energies of the signal projections as described in Section V yields a highly expressed eigenvector, shown in Figure 3b. This eigenvector shows that most of the signal energy is concentrated at Gramercy Park (shown in black), north of Gramercy Park, and in Hell's Kitchen on the west side of Manhattan.

(Fig. 3: Graph signal based on NYC taxi data (a) and the maximum expressed eigenvector (b). (a) Average number of June–August trips, Fridays 9pm–10pm; colors denote log10 bins of the four-year average number of trips (699 log bins; white: 0–20, beige to yellow: 20–230, orange: 230–400, red: 400–550, blue: 550–610, purple to black: 610–2,700). (b) Maximum expressed eigenvector; all colors except white indicate locations with a concentrated number of taxi trips, and black indicates the locations of maximum expression in the eigenvector. The plots were generated with ggmap [38] and OpenStreetMap [39].)

VIII. LIMITATIONS

The graph Fourier transform presented in this paper solves the problem of uniqueness for defective adjacency matrices by projecting a signal onto Jordan subspaces instead of onto eigenvectors and generalized eigenvectors. This method relies on Jordan chain computations, however, which are sensitive to numerical errors and can be expensive when the number of chains to compute and the graph dimension are large and the computing infrastructure is memory-bound. The computation can be accelerated by using equivalence classes over graph topologies, as we discuss in [31]. These classes allow matching a given graph to graphs of simpler topologies such that the graph Fourier transform of a signal with respect to these graphs is preserved. Since the equivalence classes may not be applicable to arbitrary graph structures, we explore inexact eigendecomposition methods in [30] to reduce the execution time and bypass the numerically unstable Jordan chain computation.

IX. CONCLUSION

The graph Fourier transform proposed here provides a unique and non-ambiguous spectral decomposition for signals over graphs with defective, or non-diagonalizable, adjacency matrices. The transform is based on spectral projections of signals onto the Jordan subspaces of the graph adjacency matrix. This coordinate free graph Fourier transform is unique and leads to a unique spectral decomposition of graph signals. This paper shows that the signal projections onto the Jordan subspaces can be ranked by energy via a generalized Parseval's identity. Lastly, a total variation-based ordering of the Jordan subspaces is proposed. This allows ordering frequencies and defining low-, high-, and band-pass graph signals.

APPENDIX A
BACKGROUND ON JORDAN DECOMPOSITIONS

Direct sum. Let X_1, ..., X_k be subspaces of a vector space X such that X = X_1 + ··· + X_k. If X_i ∩ X_j = {0} for all i ≠ j, then X is the direct sum of the subspaces {X_i}_{i=1}^{k}, denoted as X = ⊕_{i=1}^{k} X_i. Any x ∈ X can be written uniquely as x = x_1 + ··· + x_k, where x_i ∈ X_i, i = 1, ..., k.

Eigenvalues and multiplicities. Consider a matrix A ∈ C^{N×N} with k distinct eigenvalues λ_1, ..., λ_k, k ≤ N. The eigenvalues of A are the roots of the characteristic polynomial φ_A(λ) = det(A − λI) = ∏_{i=1}^{k} (λ − λ_i)^{a_i}, where I is the identity matrix and the exponent a_i represents the algebraic multiplicity of eigenvalue λ_i, i = 1, ..., k. Denote by Ker(A) the kernel or null space of matrix A, i.e., the span of vectors v satisfying A v = 0. The geometric multiplicity g_i of eigenvalue λ_i equals the dimension of the null space Ker(A − λ_i I). The minimal polynomial m_A(λ) of A has the form m_A(λ) = ∏_{i=1}^{k} (λ − λ_i)^{m_i}, where m_i is the index of eigenvalue λ_i. The index m_i represents the maximum Jordan chain length or Jordan subspace dimension, which is discussed in more detail below.

Generalized eigenspaces. The eigenvectors and generalized eigenvectors of a matrix A ∈ C^{N×N} partition C^N into subspaces, some of which are spans of eigenvectors, eigenspaces, or generalized eigenspaces. The subspace G_i = Ker(A − λ_i I)^{m_i} is the generalized eigenspace or root subspace of λ_i. The generalized eigenspaces are A-invariant; that is, for all x ∈ G_i, A x ∈ G_i. The subspace S_ip = Ker(A − λ_i I)^p, p = 0, 1, ..., N, is the generalized eigenspace of order p for λ_i. For p ≥ m_i, S_ip = G_i. The proper eigenvectors v of λ_i, or simply eigenvectors of λ_i, are linearly independent vectors in S_i1 = Ker(A − λ_i I), the eigenspace of λ_i. There are g_i = dim S_i1 = dim Ker(A − λ_i I) eigenvectors of λ_i. The subspaces S_ip form a (maximal) chain of G_i, as depicted in Figure 2; that is,

    {0} = S_i0 ⊂ S_i1 ⊂ ··· ⊂ S_{i,m_i} = S_{i,m_i+1} = ··· ⊂ C^N,

where m_i is the index of λ_i. Vectors v ∈ S_ip with v ∉ S_{i,p−1} are generalized eigenvectors of order p for λ_i.

Properties of Jordan subspaces. A Jordan subspace (7) is A-invariant; that is, for all x ∈ J, A x ∈ J. The Jordan subspace J is also cyclic since it can be written by (6) as

    J = span(v_r, (A − λI) v_r, ..., (A − λI)^{r−1} v_r)   (57)

for v_r ∈ Ker(A − λI)^r, v_r ≠ 0. The number of Jordan subspaces corresponding to λ_i equals the geometric multiplicity g_i = dim Ker(A − λ_i I), since there are g_i eigenvectors of λ_i. It can be shown that the Jordan spaces {J_ij}, j = 1, ..., g_i and i = 1, ..., k, are disjoint.

Jordan decomposition. Let V_ij denote the N×r_ij matrix whose columns form a Jordan chain of eigenvalue λ_i of A that spans the Jordan subspace J_ij. Then the generalized eigenvector matrix V of A is

    V = [ V_11 ··· V_{1g_1} ··· V_{k1} ··· V_{kg_k} ],   (58)

where k is the number of distinct eigenvalues. The columns of V are a Jordan basis of C^N. Then A has a block-diagonal Jordan normal form J consisting of Jordan blocks

    J(λ) = \begin{bmatrix} λ & 1 & & \\ & λ & \ddots & \\ & & \ddots & 1 \\ & & & λ \end{bmatrix}   (59)

of size r_ij; see, for example, [28] or [45, p. 125]. The Jordan normal form J of A is unique up to a permutation of the Jordan blocks. The Jordan decomposition of A is A = V J V^{-1}.
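For completeness, a small helper in the spirit of (58)-(59) is sketched below; the (eigenvalue, block size) pairs are illustrative assumptions, and scipy.linalg.block_diag is used to assemble the block-diagonal Jordan normal form.

    # Assemble the block-diagonal Jordan normal form of (59) from given
    # (eigenvalue, partial multiplicity) pairs; the pairs are illustrative.
    import numpy as np
    from scipy.linalg import block_diag

    def jordan_block(lam: complex, r: int) -> np.ndarray:
        """r x r Jordan block J(lam): lam on the diagonal, 1 on the superdiagonal."""
        return lam * np.eye(r) + np.diag(np.ones(r - 1), k=1)

    # Example: eigenvalue 0 with blocks of sizes 2 and 1, eigenvalue 4 with size 3.
    blocks = [(0.0, 2), (0.0, 1), (4.0, 3)]
    J = block_diag(*[jordan_block(lam, r) for lam, r in blocks])
    print(J)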
REFERENCES

[1] A. Sandryhaila and J. M. F. Moura, "Discrete signal processing on graphs," IEEE Transactions on Signal Processing, vol. 61, no. 7, pp. 1644–1656, Apr. 2013.
[2] A. Sandryhaila and J. M. F. Moura, "Big data analysis with signal processing on graphs: Representation and processing of massive data sets with irregular structure," IEEE Signal Processing Magazine, vol. 31, no. 5, pp. 80–90, Aug. 2014.
[3] A. Sandryhaila and J. M. F. Moura, "Discrete signal processing on graphs: Frequency analysis," IEEE Transactions on Signal Processing, vol. 62, no. 12, pp. 3042–3054, Jun. 2014.
[4] D. Shuman, S. K. Narang, P. Frossard, A. Ortega, and P. Vandergheynst, "The emerging field of signal processing on graphs: Extending high-dimensional data analysis to networks and other irregular domains," IEEE Signal Processing Magazine, vol. 30, no. 3, pp. 83–98, Apr. 2013.
[5] M. Püschel and J. M. F. Moura, "Algebraic signal processing theory: Foundation and 1-D time," IEEE Transactions on Signal Processing, vol. 56, no. 8, pp. 3572–3585, Aug. 2008.
[6] M. Püschel and J. M. F. Moura, "Algebraic signal processing theory: 1-D space," IEEE Transactions on Signal Processing, vol. 56, no. 8, pp. 3586–3599, Aug. 2008.
[7] M. Püschel and J. M. F. Moura, "Algebraic signal processing theory: Cooley-Tukey type algorithms for DCTs and DSTs," IEEE Transactions on Signal Processing, vol. 56, no. 4, pp. 1502–1521, Apr. 2008.
[8] K. Pearson, "On lines and planes of closest fit to systems of points in space," Philosophical Magazine Series 6, vol. 2, no. 11, pp. 559–572, 1901.
[9] H. Hotelling, "Analysis of a complex of statistical variables into principal components," Journal of Educational Psychology, vol. 24, no. 6, pp. 417–441, 498–520, 1933.
[10] H. Hotelling, "Relations between two sets of variates," Biometrika, vol. 28, no. 3/4, pp. 321–377, 1936.
[11] J. B. Tenenbaum, V. de Silva, and J. C. Langford, "A global geometric framework for nonlinear dimensionality reduction," Science, vol. 290, pp. 2319–2323, 2000.
[12] S. Roweis and L. Saul, "Nonlinear dimensionality reduction by locally linear embedding," Science, vol. 290, pp. 2323–2326, 2000.
[13] M. Belkin and P. Niyogi, "Laplacian eigenmaps for dimensionality reduction and data representation," Neural Computation, vol. 15, no. 6, pp. 1373–1396, 2003.
[14] D. L. Donoho and C. Grimes, "Hessian eigenmaps: Locally linear embedding techniques for high-dimensional data," Proceedings of the National Academy of Sciences, vol. 100, no. 10, pp. 5591–5596, 2003.
[15] M. Belkin and P. Niyogi, "Towards a theoretical foundation for Laplacian-based manifold methods," Journal of Computer and System Sciences, vol. 74, no. 8, pp. 1289–1308, 2008.
[16] D. Ganesan, B. Greenstein, D. Estrin, J. Heidemann, and R. Govindan, "Multiresolution storage and search in sensor networks," ACM Transactions on Storage, vol. 1, pp. 277–315, 2005.
[17] R. Wagner, H. Choi, R. Baraniuk, and V. Delouille, "Distributed wavelet transform for irregular sensor network grids," in Proceedings of the 13th IEEE Workshop on Statistical Signal Processing, 2005, pp. 1196–1201.
[18] R. Wagner, R. Baraniuk, S. Du, D. Johnson, and A. Cohen, "An architecture for distributed wavelet analysis and processing in sensor networks," in Proceedings of the 5th ACM International Conference on Information Processing in Sensor Networks, 2006, pp. 243–250.
[19] R. R. Coifman and M. Maggioni, "Diffusion wavelets," Applied and Computational Harmonic Analysis, vol. 21, no. 1, pp. 53–94, 2006.
[20] D. K. Hammond, P. Vandergheynst, and R. Gribonval, "Wavelets on graphs via spectral graph theory," Applied and Computational Harmonic Analysis, vol. 30, no. 2, pp. 129–150, 2011.
[21] S. K. Narang and A. Ortega, "Perfect reconstruction two-channel wavelet filter banks for graph structured data," IEEE Transactions on Signal Processing, vol. 60, no. 6, pp. 2786–2799, Jun. 2012.
[22] A. Agaskar and Y. Lu, "A spectral graph uncertainty principle," IEEE Transactions on Information Theory, vol. 59, no. 7, pp. 4338–4356, 2013.
[23] V. Ekambaram, G. Fanti, B. Ayazifar, and K. Ramchandran, "Multiresolution graph signal processing via circulant structures," in Proceedings of the IEEE DSP/SPE Workshop, 2013, pp. 112–117.
[24] X. Zhu and M. Rabbat, "Approximating signals supported on graphs," in Proceedings of the 37th IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP), Mar. 2012, pp. 3921–3924.
[25] S. Aksoy, F. Chung, and X. Peng, "Extreme values of the stationary distribution of random walks on directed graphs," Advances in Applied Mathematics, vol. 81, pp. 128–155, 2016.
[26] F. Chung, P. Horn, and L. Lu, "Diameter of random spanning trees in a given graph," Journal of Graph Theory, vol. 69, pp. 223–240, 2012.
