Abstract
In this paper, a novel fault detection method based on an adaptive interval regression model, characterized by an upper regression model (URM) and a lower regression model (LRM), is proposed. Using the proposed method, a confidence band for the measured data is constructed from data collected under the normal operating conditions of a system. The method combines the sparse model representation and computational efficiency of linear programming support vector regression (LP-SVR) with ideas from the L1-norm on approximation errors. First, the L1-norms of the upper and lower bound approximation errors are considered, and both norms, subject to their respective constraints, are integrated into LP-SVR to form new upper and lower optimization problems. The optimization problems corresponding to the URM and LRM are then solved by linear programming, and the resulting interval regression model is used to judge whether a fault has occurred. The proposed method returns an interval output rather than a point output. Finally, the efficacy of the method is demonstrated on the benchmark Tennessee Eastman problem and compared with conventional techniques such as principal component analysis (PCA), dynamic PCA (DPCA), and the one-class support vector machine (1-class SVM). The proposed method is shown to be superior to these approaches in terms of detection latency.
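To make the construction concrete, the sketch below fits a one-sided LP-SVR bound from each side of normal training data and flags a new measurement that falls outside the resulting band. This is a minimal illustration, not the paper's exact formulation: the RBF kernel, the hard one-sided constraints, the regularization weight C, and the helper names (rbf_kernel, fit_bound) are all illustrative assumptions.

```python
# Sketch of interval regression via two one-sided LP-SVR problems.
# Assumptions (not from the paper): RBF kernel, hard one-sided
# constraints, L1 regularization weight 1 vs. fit weight C.
import numpy as np
from scipy.optimize import linprog

def rbf_kernel(A, B, gamma=1.0):
    """Gaussian kernel matrix between row-sample sets A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fit_bound(K, y, C=10.0, upper=True):
    """Fit one bounding model f(x) = K @ alpha + b by linear programming.

    upper=True : min ||alpha||_1 + C * sum(f(x_i) - y_i)  s.t. f(x_i) >= y_i
    upper=False: min ||alpha||_1 + C * sum(y_i - f(x_i))  s.t. f(x_i) <= y_i
    Variables are split into nonnegative parts: x = [a+, a-, b+, b-].
    """
    n = K.shape[0]
    s = 1.0 if upper else -1.0
    ksum = K.sum(axis=0)
    # Objective: ones for the L1 term, plus the fit term C*s*sum_i f(x_i)
    # expanded over the split variables (the constant C*s*sum(y) is dropped).
    c = np.concatenate([1.0 + s * C * ksum,
                        1.0 - s * C * ksum,
                        [s * C * n, -s * C * n]])
    # One-sided constraint s*(K(a+ - a-) + (b+ - b-)) >= s*y as A_ub x <= b_ub.
    A_ub = -s * np.hstack([K, -K, np.ones((n, 1)), -np.ones((n, 1))])
    b_ub = -s * y
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, method="highs")
    z = res.x  # assumes the LP solved successfully; check res.status in practice
    alpha = z[:n] - z[n:2 * n]
    b = z[-2] - z[-1]
    return alpha, b

# Usage sketch: train the band on normal data, flag points outside it.
rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(80, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(80)

K = rbf_kernel(X, X, gamma=2.0)
aU, bU = fit_bound(K, y, upper=True)    # URM: hugs the data from above
aL, bL = fit_bound(K, y, upper=False)   # LRM: hugs the data from below

x_new, y_new = np.array([[0.5]]), 1.6   # a suspect measurement
k = rbf_kernel(x_new, X, gamma=2.0)
lo = (k @ aL + bL).item()
hi = (k @ aU + bU).item()
print("fault" if not (lo <= y_new <= hi) else "normal",
      f"(band: [{lo:.3f}, {hi:.3f}], measured: {y_new})")
```

Because both problems are linear programs, the L1 objective tends to drive most coefficients in alpha to zero, which is the sparsity and efficiency advantage of LP-SVR that the abstract refers to.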
