TR#188: On Training Gaussian Radial Basis Functions for Image Coding

Alex Sherstinsky and Rosalind W. Picard

A revised version appears in IEEE Trans. Neural Networks; see Technical Report #271.
The efficiency of the Orthogonal Least Squares (OLS) method for training approximation networks is examined using the criterion of energy compaction. We show that the selection of basis vectors produced by the procedure is not the most compact when the approximation is performed with a non-orthogonal basis. Hence, the algorithm does not produce the smallest possible network for a given approximation error. Specific examples are given using Gaussian Radial Basis Function (RBF) approximation networks. A new procedure that finds the most compact subset of non-orthogonal basis vectors is described and used to evaluate the performance of OLS in image coding. The new procedure also permits a comparison of the Gaussian RBFs to the Discrete Cosine Transform (DCT), an orthogonal basis commonly used in image coding. This comparison shows that, in terms of energy compaction efficiency, the Gaussian RBFs can perform close to the DCT. Differences in perceptual distortion produced by the two coding techniques are also discussed.
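For readers unfamiliar with the OLS procedure discussed above, the following is a minimal sketch of greedy OLS subset selection over a Gaussian RBF candidate basis. It is an illustrative implementation assumed from the standard formulation (orthogonalize remaining candidates against the selected ones and pick the column with the largest error-reduction ratio), not the exact code used in the report; the function names and the 1-D setup are our own.

```python
import numpy as np

def gaussian_design(x, centers, width):
    # Candidate basis: one Gaussian RBF column per candidate center.
    return np.exp(-(x[:, None] - centers[None, :]) ** 2 / (2.0 * width ** 2))

def ols_select(P, d, n_select):
    """Greedy Orthogonal Least Squares subset selection (sketch).

    At each step, pick the candidate column whose component orthogonal
    to the already-selected columns explains the largest fraction of
    the target energy, then orthogonalize the remaining candidates
    against it (Gram-Schmidt).
    """
    selected = []
    Q = P.astype(float).copy()   # candidates, progressively orthogonalized
    for _ in range(n_select):
        # Error-reduction ratio for each remaining candidate.
        num = (Q.T @ d) ** 2
        den = np.sum(Q ** 2, axis=0) * (d @ d)
        err = np.where(den > 1e-12, num / den, 0.0)
        err[selected] = -1.0     # exclude already-chosen columns
        k = int(np.argmax(err))
        selected.append(k)
        w = Q[:, k:k + 1]
        # Orthogonalize all candidates against the newly selected column.
        Q = Q - w @ (w.T @ Q) / (w.T @ w)
    return selected
```

The report's point can be seen with this sketch: because the original (non-orthogonal) Gaussian columns are what the network actually uses, the greedy orthogonalized ranking need not yield the most energy-compact subset of the original columns, which is what motivates the new selection procedure.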
