A NOISE SENSITIVITY THEOREM FOR SCHREIER GRAPHS

MALIN PALÖ FORSSTRÖM

Abstract. During the last 15 years, several extensions of the concept of noise sensitivity, first coined by Benjamini, Kalai and Schramm in [4], have been studied. One such extension was studied in [11], where the definition of noise sensitivity was extended to include noise consisting of sequences of irreducible and reversible Markov chains. In this paper we focus on the case where the Markov chain is a random walk on a Schreier graph, and show that a version of the Benjamini-Kalai-Schramm noise sensitivity theorem, connecting influences to noise sensitivity, holds in this setting. We then apply this result to give an alternative proof of one of the main results from a recent paper on exclusion sensitivity by Broman, Garban and Steif [6].

Date: June 26, 2015.

1. Introduction

In [4], Benjamini, Kalai and Schramm coined the term noise sensitivity, looking at how likely the output of a Boolean function is to differ between the starting point and the ending point of a continuous time random walk on a Hamming cube. In the same paper they gave a result, now often called the Benjamini-Kalai-Schramm noise sensitivity theorem, connecting influences to noise sensitivity, which they then used to show that percolation crossings of boxes in $\mathbb{Z}^2$ are very sensitive to resampling some of the edges in their domain. Since this paper was published, several analogues of this theorem have been proved in slightly different settings, such as noise modelled as biased random walks on Hamming cubes [2], Brownian motion [15], exclusion processes [6] and random walks on Cayley graphs [5]. Also, a quantitative version of the original result has been given by Keller and Kindler [14]. Our main result is a version of the Benjamini-Kalai-Schramm noise sensitivity theorem for sequences of random walks on Schreier graphs.

To define what we mean by a Schreier graph, let $\mathcal{G}$ be a finite group acting transitively on a finite set $S$. If $g \in \mathcal{G}$ and $x \in S$ we will write $x_g$ to denote the action of $g$ on $x$. Further, let $U$ be a symmetric generating set of $\mathcal{G}$, meaning that $U$ generates $\mathcal{G}$ and $U = U^{-1}$. Then $S$, $\mathcal{G}$ and $U$ generate the Schreier graph $G(S, \mathcal{G}, U)$, having vertex set $S$ and an edge $(x, y)$ whenever there is $u \in U$ with $y = x_u$.

By a random walk on a Schreier graph $G(S, \mathcal{G}, U)$ we mean the following. Let $(t_k)_{k \geq 0}$ be a sequence of times with $t_0 = 0$ and $t_k - t_{k-1} \sim \exp(1)$ independently for all $k \geq 1$. For each $k \geq 1$, pick an element $u_k \in U$ independently of everything else and uniformly at random. Pick $X_0 \in S$ uniformly at random and set $X_t = X_0$ for all $t \in [t_0, t_1)$. For any $k \geq 1$ and $t \in [t_k, t_{k+1})$, set $X_t = (X_{t_{k-1}})_{u_k}$. If $(X_t)_{t \geq 0}$ is chosen according to this rule, we say that $(X_t)_{t \geq 0}$ is a random walk on the Schreier graph $G(S, \mathcal{G}, U)$, and whenever we talk about a random walk in the setting of Schreier graphs we will think of the random walk as having been generated in this way. As all Schreier graphs are regular, $X_0$ will always be chosen according to the stationary measure $\pi$ of this random walk.
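To make this construction concrete, here is a minimal simulation sketch of the walk just described, specialised to the Hamming cube $G(\mathbb{Z}_2^n, (\mathbb{Z}_2^n, \oplus), \{e_1, \ldots, e_n\})$ of Example 1.2 below. The function and parameter names are illustrative and not taken from the paper.

```python
import random

def schreier_hypercube_walk(n, t_max, seed=0):
    """Simulate the walk described above on G(Z_2^n, (Z_2^n, xor), {e_1, ..., e_n}):
    exponential(1) waiting times, and at each ring a uniformly chosen generator
    acts on the current state. Returns the list of (jump time, state) pairs."""
    rng = random.Random(seed)
    # X_0 is drawn from the stationary (uniform) measure pi.
    x = [rng.randrange(2) for _ in range(n)]
    t, path = 0.0, [(0.0, tuple(x))]
    while True:
        t += rng.expovariate(1.0)        # t_k - t_{k-1} ~ exp(1)
        if t > t_max:
            return path
        u = rng.randrange(n)             # pick u_k uniformly from U
        x[u] ^= 1                        # the action x -> x_u flips coordinate u
        path.append((t, tuple(x)))

print(schreier_hypercube_walk(n=5, t_max=3.0)[-1])
```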
Whenever we have a Schreier graph $G(S, \mathcal{G}, U)$, we get a natural notion of the influence $I_u(f)$ of an element $u \in U$ on a function $f$, by defining

$$I_u(f) = \mathbb{P}\big(f(X_0) \neq f((X_0)_u)\big).$$

In the special case when the Schreier graph $G(S, \mathcal{G}, U)$ is an $n$-dimensional Hamming cube ($S = \mathbb{Z}_2^n$, $\mathcal{G} = (\mathbb{Z}_2^n, \oplus)$, where $\oplus$ is coordinatewise addition modulo 2, and $U = \{(1,0,0,\ldots), (0,1,0,\ldots), (0,0,1,0,\ldots), \ldots\}$), this definition coincides with the definition of influences used in e.g. [4] and [1]. We are now ready to present our main result.

Theorem 1.1. Let $G = G(S, \mathcal{G}, U)$ be a Schreier graph and let $f \colon S \to \{0,1\}$. Let $X = (X_t)_{t \geq 0}$ be a random walk on $G$, let $\lambda_1$ be the spectral gap of $X$ and let $\rho$ be the log-Sobolev constant of $X$. Then for any $r \in (0,1)$, any $\Lambda > 0$ and any $T > 0$,

$$(1) \qquad \mathrm{Cov}\big(f(X_0), f(X_T)\big) \leq \frac{e^{-\Lambda \log(r)/\rho}}{2\lambda_1} \cdot \left( \frac{\sum_{u \in U} I_u(f)^2}{|U|} \right)^{1/(1+r)} + \mathrm{Var}(f) \cdot e^{-\Lambda T}.$$

To make the main result more concrete, we now present two examples of families of Schreier graphs, and note what the spectral gaps and the log-Sobolev constants are in both cases. The first of these two examples has the $n$-dimensional Hamming cube as a special case.

Example 1.2. Let $\Omega_{m,n}$ denote the Schreier graph $G(\mathbb{Z}_m^n, (\mathbb{Z}_m^n, \oplus), \{e_k\}_{k=1}^n)$, where $\oplus$ is coordinatewise addition modulo $m$, and $e_k$ is the unique element in $\{0,1,\ldots,m-1\}^n$ which is zero everywhere except at the $k$th coordinate, where it is one. Note that when $m = 2$, $\Omega_{m,n}$ is an $n$-dimensional Hamming cube. Then $|U_n| = n$, and it is well known that for a fixed $m \in \mathbb{N}$, the random walk $X^{(n)}$ on $\Omega_{m,n}$ has spectral gap $\lambda_1^{(n)} = \frac{1 - \cos(2\pi/m)}{n} \sim 2\pi^2/(m^2 n)$ and log-Sobolev constant $\rho^{(n)} \geq 4\pi^2/(5 m^2 n)$ (see e.g. [7] and [9]).

Fix $\varepsilon > 0$ and $k > 0$. If we let $T = \varepsilon n$ and $\Lambda = k/(m^2 n)$, it follows from (1) that for any function $f_n \colon \mathbb{Z}_m^n \to \{0,1\}$ and any $r^{(n)} \in (0,1)$,

$$\mathrm{Cov}\big(f_n(X_0^{(n)}), f_n(X_{\varepsilon n}^{(n)})\big) \leq \frac{m^2 n\, e^{-5k \log(r^{(n)})/4\pi^2}}{4\pi^2} \cdot \left( \frac{\sum_{u \in U_n} I_u(f_n)^2}{n} \right)^{1/(1+r^{(n)})} + \mathrm{Var}(f_n) \cdot e^{-\varepsilon k/m^2}.$$

As this holds for all $k > 0$, it follows that for any sequence $(f_n)_{n \geq 1}$, $f_n \colon \mathbb{Z}_m^n \to \{0,1\}$,

$$\lim_{n \to \infty} \mathrm{Cov}\big(f_n(X_0^{(n)}), f_n(X_{\varepsilon n}^{(n)})\big) = 0$$

for any $\varepsilon > 0$ whenever there is $\delta > 0$ such that for all large enough $n$,

$$(2) \qquad \sum_{u \in U_n} I_u(f_n)^2 \leq n^{-\delta}.$$

When $m = 2$, this is the weaker version of the Benjamini-Kalai-Schramm noise sensitivity theorem presented in e.g. [12]. A commonly used example of a sequence of functions $(f_n)_{n \geq 1}$ for which

$$\sum_{u \in U_n} I_u(f_n)^2 \leq n^{-\delta}$$

is the so called Tribes function, first studied by Ben-Or and Linial in [3]. To define this function, consider the following scenario. In an area lives a total of $\ell_n$ tribes, each having exactly $k_n$ members, which are in constant conflict with each other. To decide whether or not to start a war in the area, each tribe, independently of all other tribes, lets its members vote. A member votes 1 if she wants her tribe to start a war, and 0 otherwise. A tribe starts a war if all its members vote for starting a war. The Tribes function is the function $f_n \colon \{0,1\}^{\ell_n k_n} \to \{0,1\}$ whose entries represent the individual votes, and whose output is 1 if a war is started by any tribe and 0 otherwise.

To simplify the notation for influences, we will write $I_i(f_n)$ instead of $I_{e_i}(f_n)$. Then for each $i \in [n]$, one can check that

$$I_i(\mathrm{Tribes}_{\ell_n, k_n}) = \left(1 - \left(\tfrac{1}{2}\right)^{k_n}\right)^{\ell_n - 1} \left(\tfrac{1}{2}\right)^{k_n - 1},$$

implying that

$$\sum_i I_i(\mathrm{Tribes}_{\ell_n, k_n})^2 = \ell_n k_n \left(1 - \left(\tfrac{1}{2}\right)^{k_n}\right)^{2(\ell_n - 1)} \left(\tfrac{1}{2}\right)^{2(k_n - 1)}.$$

In particular, if we set $k_n = \lfloor \log_2 n - \log_2 \log_2 n \rfloor$, $\ell_n = n/k_n$ and assume that $\ell_n$ is an integer, we obtain

$$(3) \qquad \sum_i I_i(\mathrm{Tribes}_{\ell_n, k_n})^2 \asymp \left( \frac{\log_2 n}{\sqrt{n}} \right)^2.$$

This shows that $(\mathrm{Tribes}_{\ell_n, k_n})_n$ is a sequence of Boolean functions such that the sum of its squared influences is small enough for the inequality in (2) to hold.
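As a sanity check of the influence formula above, here is a small brute-force sketch that enumerates the cube for toy values of $\ell$ and $k$ and compares the exact influence with the closed-form expression. The parameters and helper names are chosen for illustration only.

```python
from itertools import product

def tribes(bits, k):
    """Tribes_{l,k}: 1 if some tribe (block of k consecutive bits) is unanimously 1."""
    return int(any(all(bits[j:j + k]) for j in range(0, len(bits), k)))

def influence(f, n, i):
    """I_i(f) = P(f(X) != f(X with coordinate i flipped)), X uniform on {0,1}^n."""
    changed = sum(f(x) != f(x[:i] + (1 - x[i],) + x[i + 1:])
                  for x in product((0, 1), repeat=n))
    return changed / 2 ** n

ell, k = 3, 2                          # toy parameters, small enough to enumerate
n = ell * k
f = lambda x: tribes(x, k)
closed_form = (1 - 0.5 ** k) ** (ell - 1) * 0.5 ** (k - 1)
assert all(abs(influence(f, n, i) - closed_form) < 1e-12 for i in range(n))
print("sum of squared influences:", n * closed_form ** 2)
```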
We now consider a second example of a sequence of Schreier graphs, which is covered neither by the results in [4] nor by the results in [5].

Example 1.3. Let $(m_n)_{n \geq 1}$ be a sequence of integers with $m_n \asymp n$ and let $S_n$ be the symmetric group on $n$ elements. Further, let $J_{n, m_n}$ denote the Schreier graph

$$G\big(\{w \in \{0,1\}^n \colon \|w\| = m_n\},\; S_n,\; \{\text{transpositions of } \{1,2,\ldots,n\}\}\big),$$

where $\|w\|$ denotes the number of ones in $w$ and $S_n$ acts on a sequence $w \in \{0,1\}^n$ by permuting its elements. The graph $J_{n, m_n}$ is the so called Johnson graph with parameters $n$ and $m_n$, and is an example of a graph that is a Schreier graph but not a Cayley graph. For any $n$ and $m_n$, $|U_n| = \binom{n}{2}$, and the random walk $X^{(n)}$ on $J_{n, m_n}$ has spectral gap $\lambda_1^{(n)} = 2/(n-1)$ [10] and log-Sobolev constant $\rho^{(n)} = \Theta\Big(1 \big/ \big(n \log \tfrac{n(n-1)}{2 m_n(n - m_n)}\big)\Big)$ [16]. Set $T_n = n$ and $\Lambda^{(n)} = k/n$ for some $k > 0$. Then using (1), for any $r^{(n)} \in (0,1)$, any $f_n \colon \{w \in \{0,1\}^n \colon \|w\| = m_n\} \to \{0,1\}$ and some absolute constant $C$ we have

$$\mathrm{Cov}\big(f_n(X_0^{(n)}), f_n(X_n^{(n)})\big) \leq \frac{e^{-Ck \log(r^{(n)}) \log \frac{n(n-1)}{2 m_n(n-m_n)}}}{2 \cdot \frac{2}{n-1}} \cdot \left( \frac{\sum_{u \in U_n} I_u(f_n)^2}{\binom{n}{2}} \right)^{1/(1+r^{(n)})} + \mathrm{Var}(f_n) \cdot e^{-k}$$
$$= \frac{(n-1)\, e^{-Ck \log(r^{(n)}) \log \frac{n(n-1)}{2 m_n(n-m_n)}}}{4} \cdot \left( \frac{2 \sum_{u \in U_n} I_u(f_n)^2}{n(n-1)} \right)^{1/(1+r^{(n)})} + \mathrm{Var}(f_n) \cdot e^{-k}.$$

As $m_n \asymp n$, if we pick $r^{(n)}$ constant, the term

$$e^{-Ck \log(r^{(n)}) \log \frac{n(n-1)}{2 m_n(n-m_n)}}$$

will be bounded from above as $n \to \infty$. As $r^{(n)}$ can be chosen to be arbitrarily close to zero, it follows that for any sequence $f_n \colon \{w \in \{0,1\}^n \colon \|w\| = m_n\} \to \{0,1\}$,

$$\lim_{n \to \infty} \mathrm{Cov}\big(f_n(X_0^{(n)}), f_n(X_n^{(n)})\big) = 0$$

if

$$(4) \qquad \sum_{u \in U_n} I_u(f_n)^2 < n^{1 - 2\delta} \asymp |U_n|^{1/2 - \delta}$$

for some $\delta > 0$. As in the previous example, one can show that one sequence of functions that satisfies (4) is the Tribes function defined in the previous example.

The main motivation for trying to find a version of the Benjamini-Kalai-Schramm noise sensitivity theorem in the setting of random walks on Schreier graphs was that quite recently, in [17], O'Donnell and Wimmer generalized another theorem involving influences, the so-called KKL theorem, to Schreier graphs. Although our proof is closer to the proof of the original noise sensitivity theorem as presented in e.g. [12], we will use the terminology and notation from [17].

In the next section, we will present the main tools and techniques that are used in the proof. The proof of Theorem 1.1 is then presented in Section 3. Finally, in Section 4, we will show that Theorem 1.1 provides an alternative proof of a weaker version of a noise sensitivity theorem for exclusion processes that was given in [6].

2. Notation and tools

Recall from the introduction that given a finite group $\mathcal{G}$ which acts transitively on a finite set $S$ and a symmetric generating set $U$ of $\mathcal{G}$, we say that the graph with vertex set $S$ and an edge between two vertices $x$ and $y$ whenever there is $u \in U$ with $y = x_u$ is the Schreier graph $G(S, \mathcal{G}, U)$. Note that all Schreier graphs are connected, regular and undirected.

Throughout this paper, we will be concerned with reversible and irreducible continuous time Markov chains $X$ which are random walks on Schreier graphs. For any such Markov chain $X$, we will let $S$ denote the state space, $Q = (q_{ij})_{i,j \in S}$ denote its generator matrix and $\pi$ be its stationary distribution. As all Schreier graphs are regular, the measure $\pi$ will be the uniform measure on the state space. We write $X_t$ to denote the position of $X$ at time $t \in \mathbb{R}_+$, and will always assume that $X_0$ has been chosen according to $\pi$.

Next, for all $t \geq 0$, let $H_t$ denote the continuous time Markov semigroup of the Markov chain, given by

$$H_t = \exp(tQ).$$

In other words, $H_t$ operates on a function $f$ with domain $S$ by

$$H_t f(\cdot) = \mathbb{E}[f(X_t) \mid X_0 = \cdot\,].$$

For real valued functions $f$ and $g$ with domain $S$, we will use the inner product

$$\langle f, g \rangle = \langle f, g \rangle_\pi = \mathbb{E}[f(X_0) g(X_0)].$$
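For readers who prefer something executable, the following sketch builds the generator $Q$ of such a walk as a matrix for a toy instance (the small hypercube again; all names are illustrative) and checks that $H_t = \exp(tQ)$ is a stochastic matrix with the uniform measure as stationary distribution.

```python
import numpy as np
from itertools import product
from scipy.linalg import expm

# A toy instance: the walk on the Hamming cube G(Z_2^3, (Z_2^3, xor), {e_1, e_2, e_3}).
n = 3
states = list(product((0, 1), repeat=n))
index = {s: i for i, s in enumerate(states)}

def act(s, u):                        # the action x -> x_u: flip coordinate u
    return s[:u] + ((s[u] + 1) % 2,) + s[u + 1:]

# Generator: off-diagonal rate 1/|U| towards each neighbour x_u, matching the
# exp(1) clock + uniform generator construction from the introduction.
N = len(states)
Q = np.zeros((N, N))
for i, s in enumerate(states):
    for u in range(n):
        Q[i, index[act(s, u)]] += 1.0 / n
        Q[i, i] -= 1.0 / n

H1 = expm(Q)                          # Markov semigroup H_t = exp(tQ) at t = 1
pi = np.full(N, 1.0 / N)              # uniform measure on the state space
assert np.allclose(H1.sum(axis=1), 1.0)   # H_t is a stochastic matrix
assert np.allclose(pi @ H1, pi)           # pi is stationary
assert np.allclose(Q, Q.T)                # reversibility w.r.t. the uniform measure
```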
As $X$ is assumed to be reversible and irreducible, we can find a set $\{\psi_j\}_{j=0}^{|S|-1}$ of eigenvectors of $-Q$, with corresponding eigenvalues

$$(5) \qquad 0 = \lambda_0 < \lambda_1 \leq \lambda_2 \leq \ldots \leq \lambda_{|S|-1},$$

such that $\{\psi_j\}$ is an orthonormal basis with respect to $\langle \cdot, \cdot \rangle$ for the space of real valued functions on $S$ (see e.g. [8]). We will always let $\psi_0 \equiv 1$ be the eigenvector corresponding to $\lambda_0 = 0$. The smallest nonzero eigenvalue, $\lambda_1$, is called the spectral gap of the Markov chain $X$, and its inverse, $t_{\mathrm{rel}} := 1/\lambda_1$, is called the relaxation time. The eigenvectors $\{\psi_j\}$ of $-Q$ will be eigenvectors of $H_t$ as well, with corresponding eigenvalues $e^{-\lambda_j t}$.

Since the set $\{\psi_j\}$ is an orthonormal basis for the set of real valued functions with domain $S$, for any $f \colon S \to \mathbb{R}$ we can write

$$f = \sum_{j=0}^{|S|-1} \langle f, \psi_j \rangle \psi_j.$$

To simplify notation, we will write $\hat{f}(j)$ instead of $\langle f, \psi_j \rangle$. For $j = 0, 1, \ldots, |S|-1$, the terms $\hat{f}(j)$ are called the Fourier coefficients of $f$ with respect to the basis $\{\psi_j\}$, and are useful for expressing several probabilistic quantities which will be of interest to us. One of the simplest such quantities is the expected value of $f(X_0)$ for a function $f \colon S \to \mathbb{R}$. Recalling that $\psi_0 \equiv 1$, this can be expressed as

$$\mathbb{E}[f(X_0)] = \mathbb{E}[f(X_0) \cdot 1] = \langle f, 1 \rangle = \langle f, \psi_0 \rangle = \hat{f}(0).$$

Although a little bit more complicated, using the orthonormality of the eigenvectors $\{\psi_i\}$, for any $t \geq 0$ we get that

$$\mathbb{E}[f(X_0) f(X_t)] = \mathbb{E}[f(X_0) H_t f(X_0)] = \langle f, H_t f \rangle = \Big\langle \sum_{i=0}^{|S|-1} \hat{f}(i) \psi_i,\; H_t \sum_{j=0}^{|S|-1} \hat{f}(j) \psi_j \Big\rangle$$
$$= \Big\langle \sum_{i=0}^{|S|-1} \hat{f}(i) \psi_i,\; \sum_{j=0}^{|S|-1} \hat{f}(j) H_t \psi_j \Big\rangle = \Big\langle \sum_{i=0}^{|S|-1} \hat{f}(i) \psi_i,\; \sum_{j=0}^{|S|-1} \hat{f}(j) e^{-\lambda_j t} \psi_j \Big\rangle$$
$$= \sum_{i=0}^{|S|-1} \sum_{j=0}^{|S|-1} \hat{f}(i) \hat{f}(j) e^{-\lambda_j t} \langle \psi_i, \psi_j \rangle = \sum_{j=0}^{|S|-1} e^{-\lambda_j t} \hat{f}(j)^2,$$

and consequently, that

$$\mathrm{Cov}(f(X_0), f(X_t)) = \mathbb{E}[f(X_0) f(X_t)] - \mathbb{E}[f(X_0)]\mathbb{E}[f(X_t)] = \mathbb{E}[f(X_0) f(X_t)] - \mathbb{E}[f(X_0)]^2$$
$$(6) \qquad = \sum_{j=1}^{|S|-1} e^{-\lambda_j t} \hat{f}(j)^2.$$

Now recall that for any Schreier graph $G(S, \mathcal{G}, U)$, any function $f \colon S \to \{0,1\}$ and any $u \in U$, we defined the influence of $u$ on $f$ by

$$I_u(f) := \mathbb{P}\big(f(X_0) \neq f((X_0)_u)\big).$$

To connect this with the generator $Q$ and the random walk $X$ on $G(S, \mathcal{G}, U)$, for any $w \in S$ we define $L_u f(w) := f(w) - f(w_u)$. With this notation $Q$ can be written as

$$Q = -\frac{1}{|U|} \sum_{u \in U} L_u$$

and

$$I_u(f) = \mathbb{P}\big(f(X_0) \neq f((X_0)_u)\big) = \mathbb{E}\big[(f(X_0) - f((X_0)_u))^2\big] = \langle L_u f, L_u f \rangle.$$

Finally, for any $u \in U$ and any $f \colon S \to \mathbb{R}$, note that

$$2\langle L_u f, f \rangle = \mathbb{E}[(f(X_0) - f((X_0)_u)) f(X_0)] + \mathbb{E}[(f(X_0) - f((X_0)_u)) f(X_0)]$$
$$= \mathbb{E}[(f(X_0) - f((X_0)_u)) f(X_0)] + \mathbb{E}[(f((X_0)_u) - f(((X_0)_u)_u)) f((X_0)_u)]$$
$$= \mathbb{E}[(f(X_0) - f((X_0)_u)) f(X_0)] + \mathbb{E}[(f((X_0)_u) - f(X_0)) f((X_0)_u)]$$
$$= \mathbb{E}[(f(X_0) - f((X_0)_u)) f(X_0)] - \mathbb{E}[(f(X_0) - f((X_0)_u)) f((X_0)_u)]$$
$$= \mathbb{E}[(f(X_0) - f((X_0)_u))(f(X_0) - f((X_0)_u))]$$
$$= \langle L_u f, L_u f \rangle.$$

In the proof of Theorem 1.1, we will use the following so called hypercontractive property of the operator $H_t$ (see e.g. [17], [9] and [13]).

Theorem 2.1. Let $\rho$ be the log-Sobolev constant for the random walk $(X_t)_{t \geq 0}$ on the Schreier graph $G(S, \mathcal{G}, U)$. Then for all $p$ and $q$ satisfying $1 \leq p \leq q \leq \infty$ and $\frac{q-1}{p-1} \leq \exp(2\rho t)$, and all $f \in L^2(X)$, we have

$$\|H_t f\|_q \leq \|f\|_p,$$

where $\|\cdot\|_p = \mathbb{E}[|\cdot|^p]^{1/p}$, $\|\cdot\|_q = \mathbb{E}[|\cdot|^q]^{1/q}$ and $\|\cdot\|_2 = \mathbb{E}[|\cdot|^2]^{1/2}$.

The log-Sobolev constant of a Schreier graph $G(S, \mathcal{G}, U)$ is the largest constant $\rho$ such that for all nonconstant functions $f \colon S \to \mathbb{R}$,

$$\frac{1}{|U|} \sum_{u \in U} I_u(f) \geq \frac{\rho}{2} \cdot \mathrm{Cov}\Big(f(X_0)^2,\; \log\big(f(X_0)^2\big)\Big).$$

We will only use Theorem 2.1 for $q = 2$, in which case the only condition on $p$ is that $p \geq 1 + \exp(-2\rho t)$.
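The identity (6) is easy to test numerically. The sketch below (a toy check, again using the small hypercube as a stand-in; variable names are illustrative) diagonalises $-Q$ and compares the spectral expression with a direct computation of the covariance.

```python
import numpy as np
from itertools import product

# Numerical check of (6) on the 3-dimensional Hamming cube.
n = 3
states = list(product((0, 1), repeat=n))
N = len(states)
Q = np.zeros((N, N))
for i, s in enumerate(states):
    for u in range(n):
        j = states.index(s[:u] + ((s[u] + 1) % 2,) + s[u + 1:])
        Q[i, j] += 1.0 / n
        Q[i, i] -= 1.0 / n

lam, vecs = np.linalg.eigh(-Q)            # eigenvalues of -Q, orthonormal columns
f = np.array([float(sum(s) >= 2) for s in states])    # a Boolean function (majority)
fhat = vecs.T @ f / np.sqrt(N)            # Fourier coefficients <f, psi_j> w.r.t. pi

t = 0.7
spectral = sum(np.exp(-lam[j] * t) * fhat[j] ** 2 for j in range(1, N))
Ht_f = vecs @ (np.exp(-lam * t) * (vecs.T @ f))       # H_t f via the spectral theorem
direct = np.mean(f * Ht_f) - np.mean(f) ** 2          # E[f(X_0) H_t f(X_0)] - E[f(X_0)]^2
assert np.isclose(spectral, direct)
```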
In the rest of this section, we will introduce the definition of noise sensitivity used in [4] and give an analogue of this definition in the setting of Schreier graphs. The main reason for introducing this concept is to give some terminology for the behaviour of the left hand side of (1) for sequences $(f_n)_{n \geq 1}$, and also to give some perspective on what kind of convergence of this term we are interested in.

Definition 2.2. Let $(X^{(n)})_{n \geq 1}$ be a sequence of reversible and irreducible Markov chains, with state spaces $(S^{(n)})_{n \geq 1}$ and stationary distributions $(\pi_n)_{n \geq 1}$, and let $(T_n)_{n \geq 1}$ be a sequence of positive real numbers. A sequence $(f_n)_{n \geq 1}$ of Boolean functions with $f_n \colon S^{(n)} \to \{0,1\}$ is said to be noise sensitive with respect to $(X^{(n)}, T_n)_{n \geq 1}$ if for all $\varepsilon > 0$

$$(7) \qquad \lim_{n \to \infty} \mathrm{Cov}\big(f_n(X_0^{(n)}), f_n(X_{\varepsilon T_n}^{(n)})\big) = 0.$$

It is easy to show that if $(X^{(n)})_{n \geq 1}$ is a sequence of reversible and irreducible continuous time Markov chains and $(f_n)_{n \geq 1}$ is a sequence of Boolean functions with domains $(S^{(n)})_{n \geq 1}$ such that $\lim_{n \to \infty} \mathrm{Var}(f_n) = 0$, then $(f_n)_{n \geq 1}$ will automatically be noise sensitive with respect to $(X^{(n)})_{n \geq 1}$. For this reason, a common restriction is to consider only sequences of Boolean functions which satisfy

$$\lim_{n \to \infty} \mathrm{Var}(f_n) > 0.$$

If this holds for a sequence $(f_n)_{n \geq 1}$ of Boolean functions, we say that $(f_n)_{n \geq 1}$ is nondegenerate. This property holds in both the previously given examples.

Using the eigenvalues of the generator $-Q_n$, we will now give another characterization of noise sensitivity. Both this lemma and its proof are completely analogous to the first part of Theorem 1.9 in [4].

Lemma 2.3. A sequence of Boolean functions $(f_n)_{n \geq 1}$, $f_n \colon S^{(n)} \to \{0,1\}$, is noise sensitive with respect to $(X^{(n)}, T_n)_{n \geq 1}$ if and only if for all $\varepsilon > 0$,

$$(8) \qquad \lim_{n \to \infty} \sum_{i=1}^{|S^{(n)}|-1} e^{-\varepsilon T_n \lambda_i^{(n)}} \hat{f}_n(i)^2 = 0.$$

Consequently, $(f_n)_{n \geq 1}$ is noise sensitive with respect to $(X^{(n)}, T_n)_{n \geq 1}$ if and only if for all $k > 0$,

$$(9) \qquad \lim_{n \to \infty} \sum_{i \colon \lambda_i^{(n)} < k/T_n} \hat{f}_n(i)^2 = 0.$$

Proof of Lemma 2.3. Fix $\varepsilon > 0$ and let $t = \varepsilon T_n$. Then from (6) it follows that

$$(10) \qquad \mathrm{Cov}\big(f_n(X_0^{(n)}), f_n(X_{\varepsilon T_n}^{(n)})\big) = \mathbb{E}\big[f_n(X_0^{(n)}) f_n(X_{\varepsilon T_n}^{(n)})\big] - \mathbb{E}\big[f_n(X_0^{(n)})\big]^2 = \sum_{j=1}^{|S^{(n)}|-1} e^{-\varepsilon \lambda_j^{(n)} T_n} \hat{f}_n(j)^2.$$

For any $\varepsilon > 0$, it is easy to see that the left hand side of (10) tends to zero as $n \to \infty$ if and only if (9) holds. From this the desired conclusion follows. □

Remark 2.4. By the last lines of the proof above it follows that if a sequence of functions satisfies (7) for one $\varepsilon > 0$, then it does so for all $\varepsilon > 0$, i.e. the proof of Lemma 2.3 in fact shows that a sequence of Boolean functions $(f_n)_{n \geq 1}$ is noise sensitive with respect to $(X^{(n)}, T_n)_{n \geq 1}$ if and only if

$$\lim_{n \to \infty} \mathrm{Cov}\big(f_n(X_0^{(n)}), f_n(X_{T_n}^{(n)})\big) = 0.$$

The concept of noise sensitivity relates to our main result in that it follows from Theorem 1.1 that a sequence of Boolean functions $f_n \colon S^{(n)} \to \{0,1\}$ is noise sensitive if

$$\lim_{n \to \infty} \sum_{u \in U_n} I_u(f_n)^2 = 0.$$

In some cases, this makes it simpler to show that a sequence of Boolean functions is noise sensitive, as it reduces a question about a process to a question about the geometry of a function, since the influences do not depend on time. In [4], this was the idea that enabled Benjamini, Kalai and Schramm to show that the sequence of indicator functions of percolation crossings of a sequence of rectangles of increasing size is sensitive to noise.
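To connect Definition 2.2 to something computable, here is a small Monte Carlo sketch (hypercube dynamics again, with illustrative names and parameters) that estimates the covariance in (7) by simulation; for the parity function the estimate should come out close to zero already for moderate $n$.

```python
import random

def endpoint(x, t, rng):
    """Run the hypercube walk of Example 1.2 (m = 2) for time t, started at x."""
    x = list(x)
    s = rng.expovariate(1.0)
    while s <= t:
        x[rng.randrange(len(x))] ^= 1
        s += rng.expovariate(1.0)
    return tuple(x)

def estimated_noise_cov(f, n, t, samples=20000, seed=0):
    """Monte Carlo estimate of Cov(f(X_0), f(X_t)) with X_0 ~ pi (uniform)."""
    rng = random.Random(seed)
    a = b = ab = 0.0
    for _ in range(samples):
        x0 = tuple(rng.randrange(2) for _ in range(n))
        f0, f1 = f(x0), f(endpoint(x0, t, rng))
        a += f0
        b += f1
        ab += f0 * f1
    return ab / samples - (a / samples) * (b / samples)

parity = lambda x: sum(x) % 2    # a classical example of a very noise sensitive function
print(estimated_noise_cov(parity, n=8, t=0.5 * 8))   # T_n = n, epsilon = 0.5
```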
3. Proof of main result

The main purpose of this section is to give a proof of Theorem 1.1. To this end, we will first state and prove a lemma that will be needed in the proof. This result provides an analogue of a result stating that for the Hamming cube $\Omega_{2,n}$, for any $f \colon \{0,1\}^n \to \{0,1\}$ and any $i \in \{1,2,\ldots,n\}$,

$$(11) \qquad 2\lambda_i^{(n)} |U_n| \hat{f}(i)^2 = \sum_{u \in U_n} \langle L_u f, \psi_i^{(n)} \rangle^2$$

(see e.g. the proof of Proposition V.7 in [12]).

Whenever $\Psi = \Psi_\lambda$ is the span of all eigenvectors corresponding to some eigenvalue $\lambda$, we say that $\Psi$ is the eigenspace corresponding to $\lambda$. Using this notation, we can now state the revised form of (11).

Lemma 3.1. Let $G(S, \mathcal{G}, U)$ be a Schreier graph and let $f \colon S \to \{0,1\}$ be a Boolean function with domain $S$. Let $Q$ be the generator of the corresponding random walk. Further, let $\Psi_\lambda$ be an eigenspace of $-Q$ and let $\psi_1, \ldots, \psi_m$ be an orthonormal basis of $\Psi_\lambda$. Then

$$(12) \qquad 2\lambda |U| \sum_{i \in \{1,\ldots,m\}} \langle f, \psi_i \rangle^2 = \sum_{i \in \{1,\ldots,m\}} \sum_{u \in U} \langle L_u f, \psi_i \rangle^2.$$

Remark 3.2. (11) is clearly stronger than Lemma 3.1, as the latter contains an additional sum over an eigenspace. The reason why we cannot apply the same proof as for the Hamming cube is that the proof in that case relies heavily on knowing the eigenvectors of the random walk, whereas in a more general setting these are not known, requiring an altogether different proof. In fact, (11) does not hold for all Schreier graphs. To see this, consider the Schreier graph $G(S_4, S_4, \{(12),(13),(14),(23),(24),(34)\})$, i.e. the graph generated by the transpositions of the symmetric group of order four. Then it is straightforward to check that the function

$$\psi(\sigma) = -1 + 2 \cdot 1_{\sigma(1) \in \{1,2\}}$$

is an eigenvector with eigenvalue $\lambda = 2/3$ of $-Q$. Set $f(\sigma) = 1_{\sigma(1)=1}$. Then $\langle f, \psi \rangle = \frac{1}{4}$, implying that the summand in the left hand side of (11) corresponding to this eigenvector is

$$2\lambda |U| \langle f, \psi \rangle^2 = 2 \cdot \frac{2}{3} \cdot 6 \cdot \left(\frac{1}{4}\right)^2 = \frac{1}{2}.$$

For each term in the inner sum on the right hand side of the same equation we have $\langle L_{(ij)} f, \psi \rangle = \frac{1}{4}$ if $i = 1$ (we assume $i < j$) and $0$ else, implying that the sum on the right hand side of (11) equals

$$\sum_{(ij) \in U} \langle L_{(ij)} f, \psi \rangle^2 = 3 \cdot \left(\frac{1}{4}\right)^2 = \frac{3}{16},$$

which clearly does not equal $\frac{1}{2}$.

To be able to give a proof of Lemma 3.1, we will need the following lemma.

Lemma 3.3. For any eigenspace $\Psi_\lambda$ of $-Q$, any orthonormal basis $\psi_1, \ldots, \psi_m$ of $\Psi_\lambda$ and any $u \in U$, the set $\{\psi_{i,u}\}_{i \in \{1,\ldots,m\}}$, $\psi_{i,u}(w) := \psi_i(w_u)$, is also an orthonormal basis for $\Psi_\lambda$.

Proof. As

$$\langle \psi_{i,u}, \psi_{j,u} \rangle = \langle \psi_i, \psi_j \rangle,$$

it is immediately clear that $\{\psi_{i,u}\}$ is an orthonormal set, so it remains to show that $\psi_{j,u} \in \Psi_\lambda = \mathrm{Span}\{\psi_i\}_i$ for any $j \in \{1,\ldots,m\}$. To obtain this, it is enough to show that $\psi_{j,u}$ is an eigenvector of $-Q$ with eigenvalue $\lambda$. However this is immediate, as

$$-Q\psi_{j,u} = \sum_{k=0}^{|S|-1} \langle -Q\psi_{j,u}, \psi_{k,u} \rangle \psi_{k,u} = \sum_{k=0}^{|S|-1} \langle -Q\psi_j, \psi_k \rangle \psi_{k,u} = \sum_{k=0}^{|S|-1} \langle \lambda \psi_j, \psi_k \rangle \psi_{k,u} = \lambda \sum_{k=0}^{|S|-1} \langle \psi_j, \psi_k \rangle \psi_{k,u} = \lambda \psi_{j,u}. \qquad \square$$
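On the Hamming cube, where (11) does hold, it can be confirmed by direct enumeration. The sketch below is a toy check under the conventions of Section 2 (the parity functions play the role of the eigenvectors of $-Q$; all names are illustrative); per Remark 3.2, no such single-eigenvector identity is available for general Schreier graphs, which is why Lemma 3.1 instead sums over a whole eigenspace.

```python
from itertools import product

# Verify identity (11) on the Hamming cube Omega_{2,n}: the eigenvectors of -Q
# are the parity functions chi_S, with eigenvalue 2|S|/n for Q = -(1/|U|) sum_u L_u.
n = 3
states = list(product((0, 1), repeat=n))

def inner(g, h):                              # <g, h>_pi with pi uniform on {0,1}^n
    return sum(g(x) * h(x) for x in states) / len(states)

def L(u, g):                                  # L_u g(w) = g(w) - g(w_u)
    return lambda x: g(x) - g(x[:u] + (1 - x[u],) + x[u + 1:])

f = lambda x: int(sum(x) >= 2)                # an arbitrary Boolean function (majority)
for S in [(0,), (1,), (0, 1), (0, 1, 2)]:     # a few eigenvectors chi_S
    chi = lambda x, S=S: (-1) ** sum(x[i] for i in S)
    lam = 2 * len(S) / n
    lhs = 2 * lam * n * inner(f, chi) ** 2    # 2 * lambda_i * |U| * fhat(i)^2
    rhs = sum(inner(L(u, f), chi) ** 2 for u in range(n))
    assert abs(lhs - rhs) < 1e-12
```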
Proof of Lemma 3.1. For any $f \colon S \to \{0,1\}$ and any $u \in U$, define the function $f_u \colon S \to \{0,1\}$ by $f_u(w) := f(w_u)$. Then, as $L_u f = f - f_u$, for any $i \in \{0,1,\ldots,|S|-1\}$ we have

$$\sum_{u \in U} \langle L_u f, \psi_i \rangle^2 = \sum_{u \in U} \big\{ \langle f, \psi_i \rangle^2 - 2\langle f, \psi_i \rangle \langle f_u, \psi_i \rangle + \langle f_u, \psi_i \rangle^2 \big\} = \sum_{u \in U} \big\{ 2\langle f, \psi_i \rangle \langle L_u f, \psi_i \rangle + \langle f_u, \psi_i \rangle^2 - \langle f, \psi_i \rangle^2 \big\}.$$

Moreover, using that

$$\sum_{u \in U} \langle L_u f, \psi_i \rangle = \Big\langle \sum_{u \in U} L_u f, \psi_i \Big\rangle = |U| \langle -Qf, \psi_i \rangle = |U| \Big\langle -Q \sum_j \hat{f}(j) \psi_j, \psi_i \Big\rangle = |U| \sum_j \hat{f}(j) \langle -Q\psi_j, \psi_i \rangle = |U| \sum_j \hat{f}(j) \lambda_j \langle \psi_j, \psi_i \rangle = \lambda_i |U| \hat{f}(i),$$

it follows that

$$\sum_{u \in U} \langle L_u f, \psi_i \rangle^2 = 2\lambda_i |U| \hat{f}(i)^2 + \sum_{u \in U} \big\{ \langle f_u, \psi_i \rangle^2 - \langle f, \psi_i \rangle^2 \big\} = 2\lambda_i |U| \hat{f}(i)^2 + \sum_{u \in U} \langle f_u, \psi_i \rangle^2 - |U| \langle f, \psi_i \rangle^2.$$

Now recall that what we want to prove is that

$$2\lambda |U| \sum_{i \in \{1,2,\ldots,m\}} \langle f, \psi_i \rangle^2 = \sum_{i \in \{1,2,\ldots,m\}} \sum_{u \in U} \langle L_u f, \psi_i \rangle^2.$$

As $\lambda_i = \lambda$ for every $\psi_i \in \Psi_\lambda$, this follows if we can show that

$$(13) \qquad \sum_{i \in \{1,\ldots,m\}} \sum_{u \in U} \langle f_u, \psi_i \rangle^2 = |U| \sum_{i \in \{1,\ldots,m\}} \langle f, \psi_i \rangle^2.$$

As all sums in (13) are finite, for the left hand side of this equation we have

$$\sum_{i \in \{1,\ldots,m\}} \sum_{u \in U} \langle f_u, \psi_i \rangle^2 = \sum_{i \in \{1,\ldots,m\}} \sum_{u \in U} \langle f(w_u), \psi_i(w) \rangle^2 = \sum_{i \in \{1,\ldots,m\}} \sum_{u \in U} \langle f(w), \psi_i(w_{u^{-1}}) \rangle^2.$$

As $U = U^{-1}$ by definition, it follows that

$$\sum_{i \in \{1,\ldots,m\}} \sum_{u \in U} \langle f_u, \psi_i \rangle^2 = \sum_{i \in \{1,\ldots,m\}} \sum_{u \in U} \langle f(w), \psi_i(w_u) \rangle^2 = \sum_{i \in \{1,\ldots,m\}} \sum_{u \in U} \langle f, \psi_{i,u} \rangle^2 = \sum_{u \in U} \sum_{i \in \{1,\ldots,m\}} \langle f, \psi_{i,u} \rangle^2.$$

By Lemma 3.3, $\{\psi_{i,u}\}_{i \in \{1,\ldots,m\}}$ is an orthonormal basis for $\Psi_\lambda$ for any $u \in U$. By combining this fact with Parseval's identity, we obtain

$$\sum_{i \in \{1,\ldots,m\}} \langle f, \psi_{i,u} \rangle^2 = \sum_{i \in \{1,\ldots,m\}} \langle f, \psi_i \rangle^2,$$

from which (13), and thereby the lemma, readily follows. □

We are now ready to give a proof of Theorem 1.1.

Proof of Theorem 1.1. Note first that by Lemma 3.1,

$$\sum_{i \geq 1 \colon \lambda_i < \Lambda} \hat{f}(i)^2 \leq \frac{1}{2\lambda_1 |U|} \sum_{i \geq 1 \colon \lambda_i < \Lambda} \sum_{u \in U} \langle L_u f, \psi_i \rangle^2.$$

As by definition, for any $t > 0$ we have

$$\|H_t(L_u f)\|_2^2 = \sum_{j=1}^{|S|-1} e^{-2\lambda_j t} \langle L_u f, \psi_j \rangle^2,$$

we get

$$\sum_{i \geq 1 \colon \lambda_i < \Lambda} \hat{f}(i)^2 \leq \frac{e^{2\Lambda t}}{2\lambda_1 |U|} \sum_{u \in U} \|H_t(L_u f)\|_2^2.$$

By the hypercontractivity principle,

$$\|H_t(L_u f)\|_2^2 \leq \|L_u f\|_p^2$$

whenever $p \in [1,2]$ and $p \geq 1 + \exp(-2\rho t)$, where $\rho$ is the log-Sobolev constant for the random walk on $G$. Now as $f \in \{0,1\}$, we have that $\|L_u f\|_p^2 = \|L_u f\|_2^{4/p}$. Also, $\|L_u f\|_2^2 = I_u(f)$. Putting these two observations together yields

$$\|H_t(L_u f)\|_2^2 \leq \|L_u f\|_p^2 = \|L_u f\|_2^{4/p} = I_u(f)^{2/p},$$

which in turn implies that

$$\sum_{i \geq 1 \colon \lambda_i < \Lambda} \hat{f}(i)^2 \leq \frac{e^{2\Lambda t}}{2\lambda_1 |U|} \sum_{u \in U} I_u(f)^{2/p}.$$

Multiplying by one and then applying Hölder's inequality, we obtain

$$\sum_{i \geq 1 \colon \lambda_i < \Lambda} \hat{f}(i)^2 \leq \frac{e^{2\Lambda t}}{2\lambda_1 |U|} \sum_{u \in U} I_u(f)^{2/p} \cdot 1 \leq \frac{e^{2\Lambda t}}{2\lambda_1 |U|} \left( \sum_{u \in U} I_u(f)^2 \right)^{1/p} \left( \sum_{u \in U} 1 \right)^{\frac{p-1}{p}}$$
$$= \frac{e^{2\Lambda t}}{2\lambda_1 |U|} \left( \sum_{u \in U} I_u(f)^2 \right)^{1/p} |U|^{\frac{p-1}{p}} = \frac{e^{2\Lambda t}}{2\lambda_1} \cdot \left( \frac{\sum_{u \in U} I_u(f)^2}{|U|} \right)^{1/p}.$$

Since the only restriction on $p$ is that $p \geq 1 + \exp(-2\rho t)$, we can set $p = 1 + \exp(-2\rho t)$. Using this, the previous inequality simplifies to

$$\sum_{i \geq 1 \colon \lambda_i < \Lambda} \hat{f}(i)^2 \leq \frac{e^{2\Lambda t}}{2\lambda_1} \cdot \left( \frac{\sum_{u \in U} I_u(f)^2}{|U|} \right)^{1/(1+\exp(-2\rho t))}.$$

Setting $r := \exp(-2\rho t)$ we obtain

$$\sum_{i \geq 1 \colon \lambda_i < \Lambda} \hat{f}(i)^2 \leq \frac{e^{-\Lambda \log(r)/\rho}}{2\lambda_1} \cdot \left( \frac{\sum_{u \in U} I_u(f)^2}{|U|} \right)^{1/(1+r)}.$$

If we now use (6), the desired inequality readily follows. □
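For completeness, one way to spell out this final step, which the source leaves to the reader: combining (6) with the bound just obtained, and using that $\sum_{j \geq 1} \hat{f}(j)^2 = \mathrm{Var}(f)$ by Parseval's identity,

\begin{align*}
\mathrm{Cov}\big(f(X_0), f(X_T)\big)
  &= \sum_{j \geq 1} e^{-\lambda_j T} \hat{f}(j)^2
   = \sum_{j \geq 1 \colon \lambda_j < \Lambda} e^{-\lambda_j T} \hat{f}(j)^2
     + \sum_{j \geq 1 \colon \lambda_j \geq \Lambda} e^{-\lambda_j T} \hat{f}(j)^2 \\
  &\leq \sum_{j \geq 1 \colon \lambda_j < \Lambda} \hat{f}(j)^2
     + e^{-\Lambda T} \sum_{j \geq 1} \hat{f}(j)^2
   \leq \frac{e^{-\Lambda \log(r)/\rho}}{2\lambda_1}
        \left( \frac{\sum_{u \in U} I_u(f)^2}{|U|} \right)^{1/(1+r)}
     + \mathrm{Var}(f)\, e^{-\Lambda T},
\end{align*}

which is exactly (1).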
4. Applying the noise sensitivity theorem to exclusion sensitivity

The purpose of this section is to use the main result in the previous section, Theorem 1.1, to give a proof of a weaker version of Theorem 1.14 in [6], which connects influences on $\Omega_{2,n}$ to a concept which the authors call exclusion sensitivity. To this end, we first define what we mean by exclusion sensitivity. Let $(f_n)_{n \geq 1}$ be a sequence of Boolean functions with domain $\mathbb{Z}_2^n$. Pick $X_0^{(n)}$ according to $\pi_n$, where $\pi_n$ is the uniform probability measure on $\mathbb{Z}_2^n$, and let $X^{(n)}$ be a random walk on $J_{n, \|X_0^{(n)}\|}$, where $J_{n, \|X_0^{(n)}\|}$ is the graph defined in Example 1.3. Equivalently, $X^{(n)}$ is a random walk on $\bigsqcup_{m=0}^n J_{n,m}$ with $X_0^{(n)} \sim \pi_n$. We say that $(f_n)_{n \geq 1}$ is exclusion sensitive if for all $\alpha > 0$,

$$\lim_{n \to \infty} \mathrm{Cov}\big(f_n(X_0^{(n)}), f_n(X_{\alpha n}^{(n)})\big) = 0.$$

In [6], Broman, Garban and Steif proved the following result.