ADAPTIVE LINK SELECTION STRATEGIES FOR DISTRIBUTED ESTIMATION IN WIRELESS SENSOR NETWORKS

Songcen Xu, Rodrigo C. de Lamare and H. Vincent Poor

Communications Research Group, Department of Electronics, University of York, U.K.
Princeton University
Emails: [email protected], [email protected], [email protected]

ABSTRACT

In this work, we propose adaptive link selection strategies for distributed estimation in diffusion-type wireless networks. We develop an exhaustive search-based link selection algorithm and a sparsity-inspired link selection algorithm that can exploit the topology of networks with poor-quality links. In the exhaustive search-based algorithm, we choose the set of neighbors that results in the smallest excess mean-square error (EMSE) for a specific node. In the sparsity-inspired link selection algorithm, a convex regularization term is introduced to perform the link selection. The proposed algorithms can be incorporated into diffusion-type wireless networks and significantly improve their performance. Simulation results illustrate that the proposed algorithms achieve lower EMSE values and a better convergence rate, and significantly improve the network performance when compared with existing methods.

Index Terms— Adaptive link selection, diffusion networks, wireless sensor networks, distributed processing.

1. INTRODUCTION

Distributed strategies have become very popular for parameter estimation in wireless networks and in applications such as sensor networks and smart grids [1, 2, 3, 4]. In this context, for each specific node a set of neighbor nodes collects local information and transmits the resulting estimates back to that node. Many works in the literature have proposed strategies for distributed processing: incremental [1], diffusion [2], sparsity-aware [3] and consensus-based strategies [4]. With diffusion strategies [2] and a least-mean-square (LMS) algorithm, the neighbors of each node are fixed and the combination coefficients are pre-calculated after the network is generated. This approach may not provide an optimized estimation performance for each specified node because some links are more severely affected by noise or fading. Moreover, when the number of neighbor nodes is large, each node requires a large network bandwidth and transmit power. In order to reduce this bandwidth requirement, a single-link diffusion strategy has been reported recently in [5], where each node selects its best neighbor node. A key problem with the strategies reported so far in the literature is that they do not exploit the topology of wireless networks and the knowledge about the poor links to improve the performance of estimation algorithms.

In order to optimize the performance of distributed estimation techniques in wireless networks and minimize the excess mean-square error (EMSE) associated with the estimates, we propose two adaptive link selection algorithms. The proposed techniques exploit the knowledge about the poor links and the topology of the network in order to select a subset of links that results in an improved estimation performance. The first approach consists of an exhaustive search-based link selection (ESLS) algorithm, whereas the second technique is based on a sparsity-inspired link selection (SILS) algorithm. For the ESLS algorithm, we consider all possible combinations of each node with its neighbors and choose the combination associated with the smallest EMSE value. For the SILS algorithm, we introduce the reweighted zero-attracting (RZA) strategy into the adaptive link selection algorithm. The RZA strategy is usually employed in applications dealing with sparse systems in such a way that it shrinks the small values in the weight vector to zero, which results in a better convergence rate and steady-state performance. Unlike prior work with sparsity-aware algorithms [3, 6, 7, 8, 9, 10], the proposed SILS algorithm exploits the possible sparsity of the EMSE associated with each of the links in the opposite way. We introduce a convex penalty, i.e., an ℓ1-norm term, to adjust the combination coefficients of each node with its neighbors in order to select the neighbor nodes that yield the smallest EMSE values. For a specified node, we calculate the EMSE of all its neighbor nodes, including the specified node itself, from the previous estimates. For the node with the maximum EMSE we impose a penalty, and we give the same amount of reward to the node with the minimum EMSE. The proposed SILS algorithm performs this process automatically. By using the SILS algorithm, nodes with bad performance are shrunk and are taken into account again when their performance improves, which means that the network topology changes automatically as well.
Simulation results for an application to distributed system identification illustrate that the proposed ESLS and SILS algorithms achieve a better convergence rate and lower EMSE values than the existing diffusion LMS strategies [2].

This paper is organized as follows. Section 2 describes the distributed processing model and the problem statement. In Section 3, the proposed link selection algorithms are introduced. The numerical simulation results are provided in Section 4. Finally, we conclude the paper in Section 5.

Notation: We use boldface uppercase letters to denote matrices and boldface lowercase letters to denote vectors. We use (\cdot)^T and (\cdot)^{-1} to denote the transpose and inverse operators, respectively, and (\cdot)^* for conjugate transposition.

2. DISTRIBUTED PROCESSING IN WIRELESS NETWORKS AND PROBLEM STATEMENT

We consider a diffusion-type wireless network with N nodes which have limited processing capabilities. At every time instant i, each node k takes a scalar measurement d_k(i) according to

    d_k(i) = \omega_0^H x_k(i) + n_k(i),   i = 1, 2, ..., N,    (1)

where x_k(i) is the M x 1 input signal vector and n_k(i) is the noise sample at each node, with zero mean and variance \sigma^2_{v,k}. Through (1), we can see that the measurements of all nodes are related to an unknown vector \omega_0. Fig. 1 shows an example of a diffusion-type wireless network with 20 nodes.

[Fig. 1. Network topology with 20 nodes.]

The aim of the diffusion-type network is to compute an estimate of \omega_0 in a distributed fashion by minimizing the cost function

    J_\omega(\omega) = E|d_k(i) - \omega^H x_k(i)|^2,    (2)

where E denotes the expectation operator. To solve this problem, one possible basic diffusion strategy is the adapt-then-combine (ATC) diffusion strategy [2]:

    \psi_k(i) = \omega_k(i-1) + \mu_k x_k(i) [d_k(i) - \omega_k(i-1)^* x_k(i)]^*,
    \omega_k(i) = \sum_{l \in \mathcal{N}_k} c_{kl} \psi_l(i),    (3)

where c_{kl} is the combination coefficient, which is calculated according to the Metropolis rule

    c_{kl} = 1 / \max(n_k, n_l),   if k \neq l are linked,
    c_{kl} = 0,                    for k and l not linked,
    c_{kk} = 1 - \sum_{l \in \mathcal{N}_k \setminus \{k\}} c_{kl},   for k = l,    (4)

where n_k is the total number of nodes linked to node k, including node k itself. The combination coefficients should satisfy

    \sum_{l \in \mathcal{N}_k} c_{kl} = 1,   \forall k.    (5)
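To make the baseline in (1)-(5) concrete, the following sketch shows one possible NumPy implementation of the ATC diffusion LMS recursion with Metropolis combining weights. It is not code from the paper; the function names, the data layout (x[k, :, i] as the regressor of node k at time i) and the default step size are assumptions for illustration.

```python
import numpy as np

def metropolis_weights(adjacency):
    """Combination coefficients c_kl from the Metropolis rule in (4).

    adjacency is an N x N 0/1 matrix with ones on the diagonal, so that each
    row sum equals n_k (the neighbors of node k including node k itself)."""
    N = adjacency.shape[0]
    n = adjacency.sum(axis=1)
    C = np.zeros((N, N))
    for k in range(N):
        for l in range(N):
            if k != l and adjacency[k, l]:
                C[k, l] = 1.0 / max(n[k], n[l])
        C[k, k] = 1.0 - C[k].sum()          # rows sum to one, as required by (5)
    return C

def atc_diffusion_lms(x, d, adjacency, mu=0.045):
    """ATC diffusion LMS of (3); x[k, :, i] is the regressor of node k at time i."""
    N, M, num_iter = x.shape
    C = metropolis_weights(adjacency)
    w = np.zeros((N, M), dtype=complex)     # local estimates omega_k
    for i in range(num_iter):
        psi = np.empty_like(w)
        for k in range(N):                  # adaptation step
            err = d[k, i] - np.vdot(w[k], x[k, :, i])
            psi[k] = w[k] + mu * x[k, :, i] * np.conj(err)
        for k in range(N):                  # combination step over the neighborhood N_k
            w[k] = sum(C[k, l] * psi[l] for l in range(N) if C[k, l] != 0.0)
    return w
```

Note that the combination step only uses the nonzero entries of row k of C, i.e., the neighborhood N_k, so each node exchanges intermediate estimates only with its linked neighbors.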
For this kind of strategy, the choice of the neighbors of each node is fixed. This causes a problem when some of the neighbor nodes have a poor performance, since a node has no opportunity to discard badly performing neighbors and must continue to use their information. To optimize the distributed processing and improve the performance, we need to provide each node with some refinements and the ability to select its links.

3. PROPOSED ADAPTIVE LINK SELECTION ALGORITHMS

In order to optimize the distributed processing and improve the performance of the network, we propose the ESLS and the SILS algorithms. These two algorithmic strategies give the nodes the ability to choose their neighbors based on their EMSE performance.

3.1. Exhaustive search-based link selection (ESLS)

The ESLS algorithm employs an exhaustive search to select the links that yield the best performance in terms of EMSE. For our proposed ESLS algorithm, we employ the adaptation step given by

    \psi_k(i) = \omega_k(i-1) + \mu_k x_k(i) [d_k(i) - \omega_k(i-1)^* x_k(i)]^*

and redefine the diffusion step. First, we introduce a tentative set \Omega_s using a combinatorial approach described by

    \Omega_s \triangleq C_{n_k}^{j},   j = 1, 2, ..., n_k,    (6)

where n_k is the total number of nodes linked to node k, including node k itself. This combinatorial strategy covers all combination choices of each node k with its neighbors.

After the tentative set \Omega_s is defined, we redefine the cost function as

    J_\psi(\psi) \triangleq E|d_k(i) - x_k(i)^* \psi|^2,    (7)

where

    \psi \triangleq \sum_{l \in \Omega_s} c_{kl} \psi_l(i),    (8)

and the error pattern is introduced as

    e_{\Omega_s}(i) \triangleq d_k(i) - x_k(i)^* \sum_{l \in \Omega_s} c_{kl} \psi_l(i).    (9)

For each node k, the strategy that finds the best set \hat{\Omega}_s should solve the following optimization:

    \hat{\Omega}_s = \arg \min_{\Omega_s} J_\psi(\psi),    (10)

which is equivalent to minimizing the error e_{\Omega_s}(i). After all steps have been completed, the diffusion step in (3) can be modified as

    \omega_k(i) = \sum_{l \in \hat{\Omega}_s} c_{kl} \psi_l(i).    (11)

At this point, the main steps of the ESLS algorithm have been completed. The proposed ESLS algorithm finds the best set \hat{\Omega}_s that minimizes the cost function in (7) and then uses this set of nodes to obtain \omega_k(i) through (11). The ESLS algorithm is summarized in Table 1.

Table 1. The ESLS Algorithm
Initialize: \omega_k(-1) = 0
For each time instant i = 1, 2, ..., n
  For each node k = 1, 2, ..., N
    \psi_k(i) = \omega_k(i-1) + \mu_k x_k(i) [d_k(i) - \omega_k(i-1)^* x_k(i)]^*
  end
  For each node k = 1, 2, ..., N
    find all possible sets \Omega_s
    e_{\Omega_s}(i) = d_k(i) - x_k(i)^* \sum_{l \in \Omega_s} c_{kl} \psi_l(i)
    \hat{\Omega}_s = \arg \min_{\Omega_s} e_{\Omega_s}(i)
    \omega_k(i) = \sum_{l \in \hat{\Omega}_s} c_{kl} \psi_l(i)
  end
end

When the ESLS algorithm is implemented in networks with small and low-power sensors, its cost may become expensive, as the search in (6) is exhaustive and requires additional communication resources to examine all the possible sets \Omega_s.
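As an illustration of the combination stage in Table 1, the sketch below enumerates candidate sets Ω_s for a single node and time instant and keeps the one with the smallest instantaneous error (9). It is a hypothetical sketch rather than the authors' implementation: the excerpt does not specify how the coefficients c_kl are weighted over a candidate subset, so this example simply rescales the Metropolis weights of the chosen nodes to sum to one, and it assumes node k always keeps its own intermediate estimate.

```python
import numpy as np
from itertools import combinations

def esls_combine(k, psi, C, neighbors, x_k, d_k):
    """Exhaustive search-based combination step for node k at one time instant (cf. Table 1).

    psi[l] holds the intermediate estimates from the adaptation step, C the combining
    coefficients of (4), and neighbors the set N_k (node k included)."""
    best_err, best_w = np.inf, None
    others = [l for l in neighbors if l != k]
    # Candidate sets Omega_s: every subset of the neighbors plus node k itself
    for r in range(len(others) + 1):
        for subset in combinations(others, r):
            omega_s = (k,) + subset
            c = np.array([C[k, l] for l in omega_s])
            c = c / c.sum()                                  # weights over Omega_s sum to one, cf. (5)
            w = sum(cl * psi[l] for cl, l in zip(c, omega_s))
            err = abs(d_k - np.vdot(x_k, w))                 # instantaneous error e_{Omega_s}(i) of (9)
            if err < best_err:
                best_err, best_w = err, w
    return best_w
```

The nested loop makes the exponential cost explicit: a node with n_k - 1 neighbors evaluates 2^(n_k - 1) candidate sets per time instant, which is why the text above flags the ESLS as expensive for small, low-power sensors.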
3.2. Sparsity-inspired link selection (SILS)

The ESLS algorithm outlined above needs to examine all possible sets to find a solution, which might result in an unacceptable computational complexity for large networks operating in time-varying scenarios. To solve the combinatorial problem with a low complexity, we propose the SILS algorithm, which is as simple as a standard diffusion LMS algorithm and is suitable for adaptive implementations and for scenarios where the parameters to be estimated are slowly time-varying.

[Fig. 2. Sparsity-aware technology: (a) the sparsity-aware algorithm; (b) the SILS algorithm.]

Fig. 2 shows the difference between the existing sparsity-aware methods and the proposed sparsity-inspired strategy. In Fig. 2(a), we can see that, after being processed by a sparsity-aware algorithm, the nodes with small error values are shrunk to zero. In contrast, the sparsity-inspired algorithm shrinks the nodes with large error values to zero, as illustrated in Fig. 2(b).

In the proposed SILS algorithm, we introduce a convex ℓ1-norm penalty term into the diffusion step in (3) to perform the link selection. Different penalty terms have been considered for this task. We have adopted the reweighted zero-attracting strategy [3] in the diffusion step in (3) because this strategy has shown an excellent performance and is simple to implement.

First, we consider the following regularization function:

    f_1(e_l(i)) = \sum_{l \in \mathcal{N}_k} \log(1 + \varepsilon |e_l(i)|),    (12)

where the error pattern e_l(i) is defined as

    e_l(i) \triangleq d_k(i) - x_k(i)^* \psi_l(i),   l \in \mathcal{N}_k,    (13)

and \varepsilon is the shrinkage magnitude. Then, the diffusion step in (3) can be transformed into the link selection method

    \omega_k(i) = \sum_{l \in \mathcal{N}_k} \Big[ c_{kl} - \rho \frac{\partial f_1(e_l(i))}{\partial e_l} \Big] \psi_l(i),    (14)

where \rho is used to control the strength of the algorithm and the derivative term is computed as

    \frac{\partial f_1(e_l(i))}{\partial e_l} = \varepsilon \frac{\mathrm{sign}(e_l(i))}{1 + \varepsilon |\xi_{\min}|}.    (15)

In (15), the parameter \xi_{\min} stands for the minimum value of e_l(i) in each group of nodes formed by node k and its neighbors. The function sign(x) is defined as

    \mathrm{sign}(x) = x / |x| if x \neq 0, and \mathrm{sign}(x) = 0 if x = 0.    (16)

To further simplify the expression in (14), we introduce the vector and matrix quantities required to describe the adaptation process. We first define a vector c that contains the combination coefficients of each group of nodes, formed by node k and its neighbors:

    c \triangleq [c_{(k,l)}]_{l \in \mathcal{N}_k}.    (17)

Then, we introduce a matrix \Psi that collects all the estimated vectors generated by the adaptation step in (3) for each group:

    \Psi \triangleq [\psi_l(i)]_{l \in \mathcal{N}_k}.    (18)

An error vector e that contains all the error values calculated through (13) for each group is expressed by

    e \triangleq [e_l(i)]_{l \in \mathcal{N}_k}.    (19)

To employ the sparsity-inspired approach, we modify the vector e in the following way: the maximum value e_l(i) in e is set to |e_l(i)|, the minimum value e_l(i) is set to -|e_l(i)|, and the remaining entries are set to zero. Finally, by inserting (15)-(19) into (14), the diffusion step becomes

    \omega_k(i) = \sum_{j=1}^{n_k} \Big[ c_j - \rho \frac{\partial f_1(e_j)}{\partial e_j} \Big] \Psi_j
                = \sum_{j=1}^{n_k} \Big[ c_j - \rho \varepsilon \frac{\mathrm{sign}(e_j)}{1 + \varepsilon |\xi_{\min}|} \Big] \Psi_j.    (20)

The proposed SILS algorithm performs the link selection by adjusting the combination coefficients through c in (20). For the neighbor node with the largest EMSE value, after our modification of e, its e_l(i) value in e is a positive number, which makes the term \rho \varepsilon \mathrm{sign}(e_j) / (1 + \varepsilon |\xi_{\min}|) in (20) positive as well. As a consequence, the combination coefficient of this node is reduced and so is its weight in building \omega_k(i). In contrast, for the neighbor node with the minimum EMSE, its e_l(i) value in e is a negative number, so the term \rho \varepsilon \mathrm{sign}(e_j) / (1 + \varepsilon |\xi_{\min}|) in (20) is negative too. As a result, the weight of the node associated with the minimum EMSE in building \omega_k(i) is increased. For the remaining neighbor nodes, the e_l(i) value in e is zero, which means the term \rho \varepsilon \mathrm{sign}(e_j) / (1 + \varepsilon |\xi_{\min}|) in (20) is zero and their weights in building \omega_k(i) are unchanged. The resulting combination coefficients still satisfy (5).
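The combination stage described by (13)-(20) can be summarized in a few lines. The sketch below is an assumption-laden illustration rather than the authors' code: it compares error magnitudes when picking the maximum and minimum entries of e, and the variable names, data layout and default parameter values (taken from the static-scenario settings quoted in the simulations) are hypothetical.

```python
import numpy as np

def sils_combine(k, psi, C, neighbors, x_k, d_k, rho=4e-3, eps=10.0):
    """Sparsity-inspired combination step for node k at one time instant (cf. Table 2).

    psi[l] holds the adaptation-step estimates, C the Metropolis coefficients of (4),
    and neighbors the set N_k (node k included)."""
    nbrs = list(neighbors)
    # Error pattern e_l(i) of (13) for every node in the group (magnitudes in this sketch)
    e = np.array([abs(d_k - np.vdot(x_k, psi[l])) for l in nbrs])
    xi_min = e.min()
    # Modified error vector: penalize the worst node, reward the best one, zero elsewhere
    e_mod = np.zeros_like(e)
    e_mod[e.argmax()] = e.max()
    e_mod[e.argmin()] = -e.min()
    # Coefficient adjustment and diffusion step of (20); the penalty and reward have equal
    # magnitude and cancel, so the coefficients still sum to one as required by (5)
    adj = rho * eps * np.sign(e_mod) / (1.0 + eps * xi_min)
    w = np.zeros_like(psi[nbrs[0]])
    for j, l in enumerate(nbrs):
        w += (C[k, l] - adj[j]) * psi[l]
    return w
```

Compared with the exhaustive search of the ESLS, this step costs only one error evaluation per neighbor, which is why the SILS has essentially the same complexity as the standard diffusion ATC algorithm.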
The SILS algorithm is summarized in Table 2.

Table 2. The SILS Algorithm
Initialize: \omega_k(-1) = 0
For each time instant i = 1, 2, ..., n
  For each node k = 1, 2, ..., N
    \psi_k(i) = \omega_k(i-1) + \mu_k x_k(i) [d_k(i) - \omega_k(i-1)^* x_k(i)]^*
  end
  For each node k = 1, 2, ..., N
    e_l(i) = d_k(i) - x_k(i)^* \psi_l(i),   l \in \mathcal{N}_k
    c = [c_{(k,l)}]_{l \in \mathcal{N}_k}
    \Psi = [\psi_l(i)]_{l \in \mathcal{N}_k}
    e = [e_l(i)]_{l \in \mathcal{N}_k}
    Find the maximum and minimum terms in e
    Modify e as e = [0 ... 0, |e_l(i)|_{max}, 0 ... 0, -|e_l(i)|_{min}, 0 ... 0]
    \xi_{\min} = \min_l e_l(i)
    \omega_k(i) = \sum_{j=1}^{n_k} [ c_j - \rho \varepsilon \frac{\mathrm{sign}(e_j)}{1 + \varepsilon |\xi_{\min}|} ] \Psi_j
  end
end

For the ESLS and SILS algorithms, we redesign the diffusion step and employ the same adaptation procedure, which means that these two algorithms can equip any diffusion-type wireless network beyond the LMS strategy. This includes the diffusion RLS strategy [11] and the diffusion conjugate gradient strategy [12].

4. SIMULATION RESULTS

In this section, we compare our proposed diffusion link selection algorithms, ESLS and SILS, with the traditional diffusion ATC algorithm [2] in terms of EMSE performance. With the network topology in Fig. 1, we introduce N = 20 nodes into the system. The length of the unknown parameter \omega_0 is M = 10 and it is generated randomly. The input signal is generated as u_k(i) = [u_k(i)  u_k(i-1)  ...  u_k(i-M+1)] with u_k(i) = x_k(i) + \rho_k u_k(i-1), where x_k(i) is a white noise process chosen such that the variance of u_k(i) is \sigma^2_{u,k} = 1. The noise samples are modeled as complex Gaussian noise with variance \sigma^2_{v,k} = 0.001. The step size for all three algorithms is \mu = 0.045. For the static scenario, the sparsity parameters of the SILS algorithm are set to \rho = 4 \times 10^{-3} and \varepsilon = 10. The results are averaged over 100 independent runs.

[Fig. 3. Network EMSE curves in a static scenario: diffusion ATC LMS without link selection, diffusion ESLS link selection, diffusion SILS link selection, and the EMSE bound.]

From Fig. 3, we can see that the ESLS has the best performance in terms of both EMSE and convergence rate, with around a 5 dB gain over the traditional diffusion ATC algorithm. The SILS is slightly worse than the ESLS, but still significantly better than the standard diffusion ATC algorithm, by about 4 dB. Regarding complexity and processing time, the SILS is as simple as the standard diffusion ATC algorithm, while the ESLS is more complex.

For the time-varying scenario, the sparsity parameters of the SILS algorithm are set to \rho = 6 \times 10^{-3} and \varepsilon = 10. The unknown vector \omega_0 is defined by the first-order Markov vector process

    \omega_0(i+1) = \omega_0(i) + z(i),    (21)

where z(i) is an independent zero-mean Gaussian vector process.

[Fig. 4. Network EMSE curves in a time-varying scenario: diffusion ATC LMS without link selection, diffusion ESLS link selection, diffusion SILS link selection, and the EMSE bound.]

Fig. 4 shows that, in the time-varying scenario, the ESLS still performs best, while the SILS has the second-best performance.
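For readers who wish to reproduce the system-identification setup of this section, a data-generation sketch is given below. The AR(1) correlation coefficient ρ_k, the number of iterations and the random seed are assumptions, since the text only specifies N = 20, M = 10, unit input variance and noise variance 0.001; the helper name generate_data is hypothetical.

```python
import numpy as np

def generate_data(N=20, M=10, num_iter=800, noise_var=1e-3, corr=0.5, seed=0):
    """System-identification data following the setup of Section 4 (a sketch)."""
    rng = np.random.default_rng(seed)
    w0 = rng.standard_normal(M) + 1j * rng.standard_normal(M)       # unknown parameter omega_0
    w0 /= np.linalg.norm(w0)
    white_std = np.sqrt(1.0 - corr ** 2)                            # keeps Var(u_k(i)) = 1 for an AR(1) input
    u = np.zeros((N, num_iter), dtype=complex)
    x = np.zeros((N, M, num_iter), dtype=complex)                   # regressors [u_k(i) ... u_k(i-M+1)]
    d = np.zeros((N, num_iter), dtype=complex)
    for k in range(N):
        for i in range(num_iter):
            white = white_std * (rng.standard_normal() + 1j * rng.standard_normal()) / np.sqrt(2)
            u[k, i] = white + (corr * u[k, i - 1] if i > 0 else 0.0)
            for m in range(M):
                x[k, m, i] = u[k, i - m] if i - m >= 0 else 0.0
            noise = np.sqrt(noise_var / 2) * (rng.standard_normal() + 1j * rng.standard_normal())
            d[k, i] = np.vdot(w0, x[k, :, i]) + noise               # measurement model (1)
    # For the time-varying scenario of (21), omega_0 would additionally be perturbed at each
    # instant by an independent zero-mean Gaussian vector z(i) (its variance is not given here).
    return x, d, w0
```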
5. CONCLUSION

In this paper, two adaptive link selection strategies have been proposed for distributed estimation in diffusion-type wireless networks. The ESLS algorithm uses an exhaustive search to perform the link selection, while the SILS employs a sparsity-inspired approach with an ℓ1-norm penalization. The ESLS algorithm chooses the best set of nodes, whereas the SILS algorithm shrinks the node with the highest error value and rewards the node with the smallest error in each group. Numerical results have shown that the two proposed algorithms achieve a better convergence rate and lower EMSE values than the algorithms in [2]. These results hold for other algorithms, including RLS and CG techniques. The ESLS and SILS algorithms can be used in any kind of diffusion-type wireless network and applied to problems of statistical inference in smart grids.

6. REFERENCES

[1] C. G. Lopes and A. H. Sayed, "Incremental adaptive strategies over distributed networks," IEEE Trans. Signal Process., vol. 48, no. 8, pp. 223–229, Aug. 2007.
[2] C. G. Lopes and A. H. Sayed, "Diffusion least-mean squares over adaptive networks: Formulation and performance analysis," IEEE Trans. Signal Process., vol. 56, no. 7, pp. 3122–3136, July 2008.
[3] Y. Gu, Y. Chen, and A. O. Hero, "Sparse LMS for system identification," Proc. ICASSP, pp. 3125–3128, Taipei, May 2009.
[4] L. Xie, D.-H. Choi, S. Kar, and H. V. Poor, "Fully distributed state estimation for wide-area monitoring systems," IEEE Transactions on Smart Grid, vol. 3, no. 3, pp. 1154–1169, September 2012.
[5] X. Zhao and A. H. Sayed, "Single-link diffusion strategies over adaptive networks," Proc. IEEE ICASSP, pp. 3749–3752, Kyoto, Japan, March 2012.
[6] R. C. de Lamare and R. Sampaio-Neto, "Adaptive reduced-rank processing based on joint and iterative interpolation, decimation and filtering," IEEE Trans. Signal Process., vol. 57, no. 7, pp. 2503–2514, July 2009.
[7] R. C. de Lamare and P. S. R. Diniz, "Set-membership adaptive algorithms based on time-varying error bounds for CDMA interference suppression," IEEE Trans. Vehicular Technology, vol. 58, no. 2, pp. 644–654, February 2009.
[8] R. Meng, R. C. de Lamare, and V. H. Nascimento, "Sparsity-aware affine projection adaptive algorithms for system identification," Proc. Sensor Signal Processing for Defence Conference, London, UK, 2011.
[9] L. Guo and Y. F. Huang, "Frequency-domain set-membership filtering and its applications," IEEE Trans. Signal Process., vol. 55, no. 4, pp. 1326–1338, April 2007.
[10] Z. Yang, R. C. de Lamare, and X. Li, "L1-regularized STAP algorithms with a generalized sidelobe canceler architecture for airborne radar," IEEE Trans. Signal Process., vol. 60, no. 2, pp. 674–686, 2012.
[11] C. G. Lopes, F. Cattivelli, and A. H. Sayed, "A diffusion RLS scheme for distributed estimation over adaptive networks," Proc. IEEE Workshop on Signal Processing Advances in Wireless Communications (SPAWC), pp. 1–5, Helsinki, Finland, June 2007.
[12] S. Xu and R. C. de Lamare, "Distributed conjugate gradient strategies for distributed estimation over sensor networks," Proc. Sensor Signal Processing for Defence 2012, London, UK, 2012.