Springer Series in Information Sciences 21
Editor: Manfred R. Schroeder

Springer Series in Information Sciences
Editors: Thomas S. Huang  Teuvo Kohonen  Manfred R. Schroeder
Managing Editor: H.K.V. Lotsch

1  Content-Addressable Memories
   By T. Kohonen  2nd Edition
2  Fast Fourier Transform and Convolution Algorithms
   By H.J. Nussbaumer  2nd Edition
3  Pitch Determination of Speech Signals
   Algorithms and Devices  By W. Hess
4  Pattern Analysis and Understanding
   By H. Niemann  2nd Edition
5  Image Sequence Analysis
   Editor: T.S. Huang
6  Picture Engineering
   Editors: King-sun Fu and T.L. Kunii
7  Number Theory in Science and Communication
   With Applications in Cryptography, Physics, Digital Information, Computing, and Self-Similarity
   By M.R. Schroeder  2nd Edition
8  Self-Organization and Associative Memory
   By T. Kohonen  3rd Edition
9  Digital Picture Processing
   An Introduction  By L.P. Yaroslavsky
10 Probability, Statistical Optics and Data Testing
   A Problem Solving Approach  By B.R. Frieden
11 Physical and Biological Processing of Images
   Editors: O.J. Braddick and A.C. Sleigh
12 Multiresolution Image Processing and Analysis
   Editor: A. Rosenfeld
13 VLSI for Pattern Recognition and Image Processing
   Editor: King-sun Fu
14 Mathematics of Kalman-Bucy Filtering
   By P.A. Ruymgaart and T.T. Soong  2nd Edition
15 Fundamentals of Electronic Imaging Systems
   Some Aspects of Image Processing  By W.F. Schreiber
16 Radon and Projection Transform-Based Computer Vision
   Algorithms, A Pipeline Architecture, and Industrial Applications
   By J.L.C. Sanz, E.B. Hinkle, and A.K. Jain
17 Kalman Filtering with Real-Time Applications
   By C.K. Chui and G. Chen
18 Linear Systems and Optimal Control
   By C.K. Chui and G. Chen
19 Harmony: A Psychoacoustical Approach
   By R. Parncutt
20 Group Theoretical Methods in Image Understanding
   By Ken-ichi Kanatani
21 Linear Prediction Theory
   A Mathematical Basis for Adaptive Systems
   By P. Strobach
Peter Strobach
Linear Prediction Theory
A Mathematical Basis for
Adaptive Systems
With 63 Figures
Springer-Verlag Berlin Heidelberg New York
London Paris Tokyo Hong Kong
Dr.-Ing. Peter Strobach
SIEMENS AG, Zentralabteilung Forschung und Entwicklung,
ZFE IS - Forschung für Informatik und Software,
Otto-Hahn-Ring 6, D-8000 München 83, Fed. Rep. of Germany
Series Editors:
Professor Thomas S. Huang
Department of Electrical Engineering and Coordinated Science Laboratory,
University of Illinois, Urbana, IL 61801, USA
Professor Teuvo Kohonen
Laboratory of Computer and Information Sciences, Helsinki University of Technology,
SF-02150 Espoo 15, Finland
Professor Dr. Manfred R. Schroeder
Drittes Physikalisches Institut, Universität Göttingen, Bürgerstrasse 42-44,
D-3400 Göttingen, Fed. Rep. of Germany
Managing Editor: Helmut K. V. Lotsch
Springer-Verlag, Tiergartenstrasse 17,
D-6900 Heidelberg, Fed. Rep. of Germany
ISBN-13: 978-3-642-75208-7    e-ISBN-13: 978-3-642-75206-3
DOI: 10.1007/978-3-642-75206-3
This work is subject to copyright. All rights are reserved, whether the whole or part of the material is concerned,
specifically the rights of translation, reprinting, reuse of illustrations, recitation, broadcasting, reproduction on
microfilms or in other ways, and storage in data banks. Duplication of this publication or parts thereof is only
permitted under the provisions of the German Copyright Law of September 9, 1965, in its version of June 24,
1985, and a copyright fee must always be paid. Violations fall under the prosecution act of the German Copyright
Law.
© Springer-Verlag Berlin Heidelberg 1990
Softcover reprint of the hardcover 1st edition 1990
The use of registered names, trademarks, etc. in this publication does not imply, even in the absence of a specific
statement, that such names are exempt from the relevant protective laws and regulations and therefore free for
general use.
2154/3150-543210 - Printed on acid-free paper
This work was only possible through the love,
encouragement and support of my parents.
It is to them that I dedicate this book.
Preface
Linear prediction theory and the related algorithms have matured to the
point where they now form an integral part of many real-world adaptive
systems. When it is necessary to extract information from a random
process, we are frequently faced with the problem of analyzing and
solving special systems of linear equations. In the general case these
systems are overdetermined and may be characterized by additional
properties, such as update and shift-invariance properties. Usually, one
employs exact or approximate least-squares methods to solve the
resulting class of linear equations.
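To fix ideas with generic symbols (the matrix X and the vectors a and y below are purely illustrative and do not follow the notation of any particular chapter): an overdetermined system X a ≈ y, in which X has more rows than columns, admits no exact solution in general. Its least-squares solution is the coefficient vector that minimizes the squared residual norm and is characterized by the normal equations,

$$
\hat{a} \;=\; \arg\min_{a}\,\lVert y - X a \rVert^{2}
\qquad\Longleftrightarrow\qquad
X^{\mathsf T} X\,\hat{a} \;=\; X^{\mathsf T} y .
$$

Structured, windowed, and recursive ways of forming and solving equations of this kind are precisely what the following chapters develop.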
Mainly during the last decade, researchers in various fields have
contributed techniques and nomenclature for this type of least-squares
problem. This body of methods now constitutes what we call the theory
of linear prediction. The immense interest that it has aroused stems
largely from recent advances in processor technology, which provide the
means to implement linear prediction algorithms and to operate them in
real time. The practical result has been the emergence of a new class of
high-performance adaptive systems for control, communications, and
system identification applications.
This monograph presumes a background in discrete-time digital
signal processing, including Z-transforms, and a basic knowledge of
discrete-time random processes. One of the difficulties I have
encountered while writing this book is that many engineers and computer
scientists lack knowledge of fundamental mathematics and geometry.
This was particularly striking when the understanding of a certain linear
prediction algorithm required a sophisticated mathematical derivation,
or a higher-level geometrical interpretation. Another difficulty arises
from the fact that most of the material covered here is scattered through
the signal processing literature without any special ordering. The many
contributions to the field of linear prediction theory and algorithms
generally use disparate notations and mathematical techniques, making
the material less accessible for the reader unfamiliar with the subject.
It is therefore the intention of this book to bring together the most
important research contributions to linear prediction theory and to
present them in a logical order using a common mathematical framework.
Moreover, several new and previously unpublished concepts and
approaches have been added. It is hoped that the presented material may
thus be assimilated more efficiently, and that the relationship between
the different techniques can be better appreciated.
Munich PETER STROBACH
February 1989
Acknowledgements
I wish to thank the series editor, Prof. Manfred R. Schroeder, for his
interest in this work.
I would like to express my sincere gratitude to Prof. Dieter Schütt,
chief of the information systems research department at SIEMENS AG,
Munich, FRG, whose constant encouragement has been most invaluable
to me throughout the preparation of this book. His tireless devotion and
unfailing support of his research groups have earned my highest respect.
I thank the management of SIEMENS AG, Munich, FRG, for providing
me with four weeks of paid spare time in the final phase of this work.
I am also grateful to my former M.S. student Jörg Sokat from the
University of Duisburg, FRG, for reviewing an early version of Chap. 2.
I am most indebted to Miss Deborah Hollis of Springer-Verlag,
Heidelberg, FRG, for constant support and much useful advice during the
preparation of the final manuscript. I am most thankful to the other
staff members of Springer-Verlag, Heidelberg, for their help in the
production of this book.
Special thanks go to Dipl.-Ing. M. Schielein, who has provided me
with all the necessary computer equipment. How would I have done it
without him?
The beautiful drawings in this book come from Mrs. Habecker and
her crew. Their work is greatly appreciated.
The book was typeset using the particularly useful text-processing
software SIGNUM, which one of our Ph.D. students developed as a hobby
(in fact, for typing his Ph.D. thesis). The software became a hit, but the
thesis was never completed... Thank you, Franz, for your fine software!
Contents
1. Introduction ........ 1
2. The Linear Prediction Model ........ 13
   2.1  The Normal Equations of Linear Prediction ........ 15
   2.2  Geometrical Interpretation of the Normal Equations ........ 17
   2.3  Statistical Interpretation of the Normal Equations ........ 20
   2.4  The Problem of Signal Observation ........ 23
   2.5  Recursion Laws of the Normal Equations ........ 25
   2.6  Stationarity - A Special Case of Linear Prediction ........ 26
   2.7  Covariance Method and Autocorrelation Method ........ 27
   2.8  Recursive Windowing Algorithms ........ 28
   2.9  Backward Linear Prediction ........ 33
   2.10 Chapter Summary ........ 35
3. Classical Algorithms for Symmetric Linear Systems ........ 37
   3.1  The Cholesky Decomposition ........ 37
   3.2  The QR Decomposition ........ 40
        3.2.1  The Givens Reduction ........ 42
        3.2.2  The Householder Reduction ........ 47
        3.2.3  Calculation of Prediction Error Energy ........ 51
   3.3  Some More Principles for Matrix Computations ........ 54
        3.3.1  The Singular Value Decomposition ........ 54
        3.3.2  Solving the Normal Equations by Singular Value Decomposition ........ 56
        3.3.3  The Penrose Pseudoinverse ........ 59
        3.3.4  The Problem of Computing X⁻¹Y ........ 60
   3.4  Chapter Summary ........ 61
4. Recursive Least-Squares Using the QR Decomposition ........ 63
   4.1  Formulation of the Growing-Window Recursive Least-Squares Problem ........ 63
   4.2  Recursive Least Squares Based on the Givens Reduction ........ 65
   4.3  Systolic Array Implementation ........ 72
   4.4  Iterative Vector Rotations - The CORDIC Algorithm ........ 78
   4.5  Recursive QR Decomposition Using a Second-Order Window ........ 84
   4.6  Alternative Formulations of the QRLS Problem ........ 90
   4.7  Implicit Error Computation ........ 95
   4.8  Chapter Summary ........ 100
5. Recursive Least-Squares Transversal Algorithms ........ 102
   5.1  The Recursive Least-Squares Algorithm ........ 103
   5.2  Potter's Square-Root Normalized RLS Algorithm ........ 111
   5.3  Update Properties of the RLS Algorithm ........ 114
   5.4  Kubin's Selective Memory RLS Algorithms ........ 117
   5.5  Fast RLS Transversal Algorithms ........ 120
        5.5.1  The Sherman-Morrison Identity for Partitioned Matrices ........ 121
        5.5.2  The Fast Kalman Algorithm ........ 126
        5.5.3  The FAEST Algorithm ........ 136
   5.6  Descent Transversal Algorithms ........ 143
        5.6.1  The Newton Algorithm ........ 145
        5.6.2  The Steepest Descent Algorithm ........ 148
        5.6.3  Stability of the Steepest Descent Algorithm ........ 149
        5.6.4  Convergence of the Steepest Descent Algorithm ........ 152
        5.6.5  The Least Mean Squares Algorithm ........ 154
   5.7  Chapter Summary ........ 156
6. The Ladder Form ........ 158
   6.1  The Recursion Formula for Orthogonal Projections ........ 160
        6.1.1  Solving the Normal Equations with the Recursion Formula for Orthogonal Projections ........ 162
        6.1.2  The Feed-Forward Ladder Form ........ 165