Abstract
Fuzzy logic systems (FLSs) can be designed from training data (i.e., from M given numerical input/output pairs) using supervised learning algorithms. Orthogonal least-squares (OLS) learning decomposes an FLS into a linear combination of Ms < M nonlinear fuzzy basis functions (FBFs), which OLS selects and weights to match the training data. The drawback of OLS is that the resulting system still contains information from all M initial rules derived from the training points, even though OLS has identified only the Ms most important rules. This is due to the normalization of the FBFs, and it leads to excessive computation times during further processing. Our solution is to construct new FBFs from the reduced rule base and to run OLS a second time. The resulting system not only has reduced computational complexity but also behaves very similarly to the unreduced system. The second run of OLS can be applied to a larger set of training data, which greatly improves precision. We illustrate our two-pass OLS algorithm on prediction of the Mackey–Glass chaotic time series. Extensive simulations are provided.
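The two building blocks named in the abstract can be sketched in a few lines: normalized FBFs built from rule centers, and forward OLS selection of the Ms most important basis functions by error-reduction ratio. This is a minimal illustration under assumed Gaussian membership functions with a shared width; the function names `fbf_matrix` and `ols_select` are our own, not from the paper.

```python
import numpy as np

def fbf_matrix(X, centers, sigma):
    # Fuzzy basis functions: products of Gaussian memberships for each rule,
    # normalized across all rules (this normalization is why every rule
    # contributes to every FBF, motivating the second OLS pass).
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    g = np.exp(-d2 / (2.0 * sigma ** 2))
    return g / g.sum(axis=1, keepdims=True)

def ols_select(P, y, Ms):
    # Classic forward OLS subset selection (a sketch): greedily pick the Ms
    # columns of P with the largest error-reduction ratio, orthogonalizing
    # each candidate against the columns already chosen.
    M = P.shape[1]
    selected, W = [], []
    for _ in range(Ms):
        best_j, best_err = None, -1.0
        for j in range(M):
            if j in selected:
                continue
            w = P[:, j].copy()
            for wq in W:              # Gram-Schmidt against chosen columns
                w -= (wq @ P[:, j]) / (wq @ wq) * wq
            denom = w @ w
            if denom < 1e-12:         # numerically dependent column, skip
                continue
            err = (w @ y) ** 2 / (denom * (y @ y))  # error-reduction ratio
            if err > best_err:
                best_err, best_j = err, j
        w = P[:, best_j].copy()
        for wq in W:
            w -= (wq @ P[:, best_j]) / (wq @ wq) * wq
        W.append(w)
        selected.append(best_j)
    return selected
```

In the two-pass scheme described above, `ols_select` would be run once on the M FBFs built from all training points, then `fbf_matrix` would be rebuilt from only the Ms selected rule centers before a second selection pass on (possibly more) training data.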