Andrea Goldsmith Lecture PDF

100 Pages·2010·12.13 MB·English

Andrea Goldsmith, Stanford University
2010 School of Information Theory, University of Southern California, Aug. 5, 2010

Future Wireless Networks
Ubiquitous communication among people and devices:
- Next-generation cellular
- Wireless Internet access
- Wireless multimedia
- Sensor networks
- Smart homes/spaces
- Automated highways
- In-body networks
All this and more ...

Challenges
- Fundamental capacity limits of wireless networks are unknown and, worse yet, poorly defined.
- Wireless network protocols are generally ad hoc.
- Applications are heterogeneous, with hard constraints that must be met by the network.
- Energy and delay constraints change fundamental design principles.

Fundamental Network Capacity: The Shangri-La of Information Theory
- Much progress in finding the capacity limits of wireless single-user and multiuser channels.
- Limited understanding of the capacity limits of wireless networks, even for simple models.
- System assumptions such as constrained energy and delay may require new capacity definitions.
- Is this elusive goal the right thing to pursue?
(Shangri-La is synonymous with any earthly paradise: a permanently happy land, isolated from the outside world.)

Wireless Channel and Network Capacity
- Fundamental limit on data rates: the set of simultaneously achievable rates (R_1, R_2, ..., R_n) with P_e -> 0.
[Figure: a capacity region sketched over axes R_1, R_2, R_3.]
- Main drivers of channel capacity:
  - Bandwidth and power
  - Statistics and dynamics of the channel
  - What is known about the channel at TX and/or RX
  - Whether feedback is available
  - Number of antennas
  - Single hop or multi-hop

In the beginning...
- Shannon's Mathematical Theory of Communication derived fundamental data-rate limits of digital communication channels.
- Shannon capacity is independent of the transmission and reception strategy; it depends only on channel characteristics.
- Shannon capacity for stationary and ergodic channels is the channel's maximum mutual information.
- The significance of capacity comes from Shannon's coding theorem and converse, which show that capacity is the channel's maximum "error-free" data rate.

Shannon Capacity
- A discrete memoryless channel (DMC) with input X ∈ 𝒳 and output Y ∈ 𝒴 has mutual information (MI), related to the notion of entropy:
  I(X;Y) = H(Y) - H(Y|X) = H(X) - H(X|Y)
- Shannon proved the DMC has capacity equal to its maximum MI,
  C = max_{p(x)} I(X;Y),
  where the maximum is taken over all possible input distributions.

Coding Theorem
- Coding theorem: a code exists with rate R = C - ε whose probability of error approaches zero with blocklength.
- Shannon provided a random code with this property.
- Decoding is based on the notion of typical sets: sets where the probability of the realization of x^n, y^n, and (x^n, y^n) approximates that of their entropies.
  - Decode the sequence x^n that is jointly typical with the received sequence y^n.
  - For codes with large blocklengths, the probability of error approaches 0.
- Shannon's "Communication in the presence of noise" uses geometry for the AWGN coding-theorem proof:
  - The input occupies a sphere of radius sqrt(nP); the output occupies a sphere of radius sqrt(n(P+N)).
  - Capacity corresponds to the number of input messages that lie in nonoverlapping output spheres.

Converse
- Often based on Fano's inequality for message W:
  nR = H(W) = H(W|Y^n) + I(W;Y^n)
            ≤ H(W|Y^n) + I(X^n(W);Y^n)    (since X^n is a function of W)
            ≤ 1 + P_e^(n) nR + nC         (Fano's inequality; I(X^n;Y^n) ≤ nC)
- This implies P_e^(n) ≥ 1 - 1/(nR) - C/R, so as n -> ∞ the error is bounded away from zero for R > C.
- Fano's inequality also underlies the cutset bound for networks.
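The definitions above can be checked numerically. The following sketch (not from the slides; the binary symmetric channel and the grid search are illustrative assumptions) computes C = max over p(x) of I(X;Y) for a BSC with crossover probability eps, confirms it matches the closed form 1 - H2(eps), and evaluates the Fano-based converse bound for a rate above capacity:

```python
# Minimal sketch: DMC capacity by brute force, plus the Fano converse bound.
import numpy as np

def mutual_information(p_x, W):
    """I(X;Y) in bits, for input distribution p_x and channel matrix W[x][y] = p(y|x)."""
    p_xy = p_x[:, None] * W                  # joint distribution p(x, y)
    p_y = p_xy.sum(axis=0)                   # output marginal p(y)
    prod = p_x[:, None] * p_y[None, :]       # product of marginals
    mask = p_xy > 0                          # skip zero-probability terms
    return float((p_xy[mask] * np.log2(p_xy[mask] / prod[mask])).sum())

eps = 0.1                                    # BSC crossover probability (assumed)
W = np.array([[1 - eps, eps],
              [eps, 1 - eps]])

# C = max_{p(x)} I(X;Y): grid search over binary input distributions (a, 1-a).
grid = np.linspace(1e-6, 1 - 1e-6, 10001)
C = max(mutual_information(np.array([a, 1 - a]), W) for a in grid)

# Closed form for the BSC: C = 1 - H2(eps).
h2 = -(eps * np.log2(eps) + (1 - eps) * np.log2(1 - eps))
print(f"C = {C:.4f} bits/use, 1 - H2(eps) = {1 - h2:.4f}")

# Converse: any code with rate R > C and blocklength n has
# P_e^(n) >= 1 - 1/(nR) - C/R, bounded away from zero.
R, n = 0.8, 1000                             # assumed rate and blocklength
print(f"P_e lower bound at R = {R}: {1 - 1/(n*R) - C/R:.4f}")
```

The grid search stands in for the Blahut-Arimoto algorithm one would use for larger alphabets; for the symmetric BSC the maximum is attained at the uniform input.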
- The cutset bound is loose for many channels/networks.

Capacity of AWGN Channels
- Discrete-time channel y[i] = x[i] + n[i], where x[i] is the input at time i, y[i] is the output, and n[i] is an i.i.d. sample of a white Gaussian noise process with PSD N_0.
- Channel bandwidth is B; received power is P.
- Received signal-to-noise power ratio (SNR) is SNR = γ = P/(N_0 B).
- Maximum mutual information is achieved with Gaussian inputs:
  C = B log_2(1 + SNR) bps
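As a quick numerical check of the AWGN formula (the bandwidth, noise PSD, and power values below are hypothetical, chosen so the SNR comes out to 15):

```python
# Minimal sketch: AWGN capacity C = B * log2(1 + SNR), with SNR = P / (N0 * B).
import math

def awgn_capacity(P, N0, B):
    """Capacity in bits/s of an AWGN channel: bandwidth B (Hz),
    received power P (W), one-sided noise PSD N0 (W/Hz)."""
    snr = P / (N0 * B)
    return B * math.log2(1 + snr)

B = 1e6                      # 1 MHz bandwidth (assumed)
N0 = 1e-9                    # noise PSD (assumed)
P = 15 * N0 * B              # power chosen so SNR = 15 (~11.8 dB)

C = awgn_capacity(P, N0, B)
print(f"SNR = {P / (N0 * B):.1f}, C = {C / 1e6:.2f} Mbps")  # log2(16) = 4 -> 4.00 Mbps
```

Note the two roles of bandwidth: it scales the rate linearly outside the log, but also raises the noise power N_0 B inside the SNR, which is why capacity grows only logarithmically once bandwidth is large.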

Description:
Decompose the channel through transmit precoding, allocating all resources to one user. The relay decodes and multicasts the modulo sum of messages.
