非线性过程监测中的数据降维及相关问题研究 (Research on Data Dimensionality Reduction and Related Problems in Nonlinear Process Monitoring)
Abstract
Monitoring industrial processes based on real-time measurement data is an effective means of keeping them running safely and smoothly. Modern industrial processes measure a large number of variables, but because of underlying mass and energy balances and other operational constraints, these variables are usually strongly correlated, either linearly or nonlinearly, so that high-dimensional process data are in fact driven by a small number of intrinsic free variables. How to use dimensionality reduction on representative process data to remove redundant information and reveal the underlying low-dimensional structure of complex process data is a fundamental problem in process monitoring. Most traditional monitoring methods are based on linear multivariate statistical techniques and perform poorly on the widely encountered nonlinear processes. Starting from the perspective of data dimensionality reduction, this dissertation focuses on remedying the deficiencies of existing nonlinear process monitoring methods and on proposing more effective ones; the effectiveness and superiority of the proposed methods are verified on the benchmark Tennessee Eastman (TE) simulation process and on data from a real wastewater treatment process. The research content and contributions of this dissertation are as follows.
     Ⅰ. A linear projection is used to approximate the nonlinear dimensionality reduction mapping implicit in the manifold learning method maximum variance unfolding (MVU), yielding the unsupervised dimensionality reduction method maximum variance unfolding projections (MVUP), and MVUP-based nonlinear fault detection and isolation methods are designed. MVUP inherits MVU's ability to unfold nonlinear structure in the data and to preserve the boundary of the data distribution after dimensionality reduction. The MVUP monitoring method has a very low online computational cost and achieves detection performance comparable to current mainstream nonlinear monitoring methods.
     Ⅱ. To address the difficulty of choosing the most suitable kernel function in popular nonlinear fault detection methods based on kernel unsupervised dimensionality reduction (KUDR), such as kernel principal component analysis (KPCA) and kernel independent component analysis (KICA), an automatic kernel learning method applicable to general KUDR is proposed. After the data are mapped into the kernel feature space induced by the learned optimal kernel, the nonlinear structure in the data is unfolded to be as linear as possible, so performing a linear dimensionality reduction method in that feature space allows KUDR to explain the nonlinear variation in the data effectively, leading to better fault detection than ordinary kernels.
     Ⅲ. The recently proposed orthogonal locality preserving projections (OLPP) method is generalized through the kernel trick into a KUDR method, kernel orthogonal locality preserving projections (KOLPP), and a KOLPP-based nonlinear fault detection method is designed. KOLPP explicitly accounts for the nonlinear structure in the data and preserves both its local and global structure; its stronger structure-preserving ability compared with other popular KUDR methods leads to better detection performance.
     Ⅳ. To overcome the deficiencies that current nonlinear fault recognition inherits from dimensionality reduction based on Fisher's discriminant criterion, a new supervised dimensionality reduction method, locality preserving discriminant analysis (LPDA), and its kernel generalization, kernel locality preserving discriminant analysis (KLPDA), are proposed for nonlinear fault recognition. Unlike Fisher's criterion, which aims at separating different fault classes globally, the (K)LPDA objective directly targets reducing local overlap between classes, which is more relevant to lowering the misclassification rate and yields better recognition performance.
     Ⅴ. Traditional methods incorporate process dynamics through extended vectors, which tends to leave the training data relatively insufficient and loses the structural information among variables; a scheme that exploits process dynamic information through extended matrices is therefore proposed. LPDA is further extended to tensor locality preserving discriminant analysis (TLPDA), which reduces the dimensionality of matrix data directly, and a dynamic fault recognition method based on extended matrices and TLPDA dimensionality reduction is designed.
     Ⅵ. Using the k-nearest-neighbor distance dk, which is suitable for nonlinear data sets, as the outlyingness measure, a fast outlier detection algorithm called neighborhood pruning (NHP) is proposed for preprocessing high-dimensional nonlinear training data sets. Existing algorithms improve efficiency only by reducing the computational cost of the dk query for each data point, whereas NHP derives upper bounds on dk for other points during each dk query and uses them to prune non-outliers directly, reducing the number of dk queries; it also optimizes the search order to increase the amount of pruning and to lower the cost of each dk query.
Process monitoring based on measurement data is very useful for maintaining process safety and stability. Although numerous variables are measured in modern industrial processes, severe dependencies usually exist among them, and high-dimensional process data are actually driven by fewer intrinsic free variables because of the underlying mass/energy balances and other operational constraints. How to perform dimensionality reduction effectively to discard redundancy and reveal the intrinsic lower-dimensional structure in complex process data is a basic issue in process monitoring. Most traditional monitoring methods are based on linear multivariate statistical methods and are not effective for the prevalent nonlinear processes. From the perspective of data dimensionality reduction, this dissertation focuses on remedying deficiencies in existing nonlinear process monitoring methods and on proposing more effective ones; the validity and superiority of the proposed methods are demonstrated on a benchmark simulation process and a real-world process. The contributions of this dissertation are summarized as follows.
     Ⅰ. A new unsupervised dimensionality reduction method named maximum variance unfolding projections (MVUP) and MVUP-based nonlinear fault detection and isolation methods are proposed. MVUP approximates the underlying nonlinear dimensionality reduction mapping of the manifold learning method maximum variance unfolding (MVU) by a linear projection, inheriting MVU's nonlinear structure unfolding and distribution boundary preserving properties. The MVUP-based monitoring method has very low online computational cost and effectiveness comparable to state-of-the-art nonlinear monitoring methods (a minimal sketch of the linear-approximation step appears after this list).
     Ⅱ. A kernel function learning method is proposed for kernel unsupervised dimensionality reduction (KUDR) methods (e.g. KPCA, KICA), which have been widely used in nonlinear fault detection. Nonlinear structure in the data is unfolded to be linear in the kernel feature space corresponding to the learned optimal kernel. Therefore, KUDR with the optimal kernel can effectively explain data variation by performing a linear method in that feature space, leading to improved detection performance (see the kernel-width sketch after this list).
     Ⅲ. A new KUDR method named kernel orthogonal locality preserving projections (KOLPP) and a KOLPP-based fault detection method are proposed. KOLPP is the kernel generalization of the OLPP method and explicitly considers the underlying nonlinear structure in the data. KOLPP has more structure-preserving power than other popular KUDR methods, leading to better detection performance (a schematic of the kernel-trick eigenproblem follows this list).
     Ⅳ. To overcome the drawbacks that existing fault recognition methods inherit from Fisher's criterion, a new supervised dimensionality reduction method named locality preserving discriminant analysis (LPDA) and its kernel generalization KLPDA are proposed for nonlinear fault recognition. (K)LPDA directly targets minimizing local overlap among different classes and can provide better recognition performance than existing methods (see the locality-graph sketch after this list).
     Ⅴ. To overcome the drawbacks of using extended vectors to incorporate process dynamic information, an extended-matrix scheme is proposed. LPDA is extended to tensor LPDA (TLPDA), which can deal with matrix data directly, and a dynamic fault recognition method based on extended matrices and TLPDA is proposed (the extended-matrix construction is sketched after this list).
     Ⅵ. With the k-nearest-neighbor distance (dk), which is effective for nonlinear data sets, as the outlyingness measure, a fast outlier detection algorithm named NeighborHood Pruning (NHP) is proposed for preprocessing training data. NHP derives upper bounds on dk for other data points while performing each dk query, and uses them to prune non-outliers and reduce the number of dk queries. The search order is optimized to increase the amount of pruning and to reduce the computational cost of each dk query (a simplified pruning sketch follows this list).
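For contribution Ⅰ, the sketch below illustrates only the linear-approximation idea, assuming the MVU embedding Y of the centered training data X has already been computed (e.g., by a semidefinite-programming solver). The function names, the least-squares fit, and the Hotelling-T²-style statistic are illustrative stand-ins, not the dissertation's exact formulation.

```python
import numpy as np

def fit_mvup_projection(X, Y):
    """Least-squares projection P minimizing ||X P - Y||_F, where Y holds the
    precomputed MVU coordinates of the centered training data X."""
    P, *_ = np.linalg.lstsq(X, Y, rcond=None)
    return P                                     # shape: n_variables x n_latent

def t2_statistic(x_new, P, T_train):
    """Hotelling-T2-style monitoring statistic in the projected (latent) space."""
    t = x_new @ P
    d = t - T_train.mean(axis=0)
    cov = np.cov(T_train, rowvar=False)
    return float(d @ np.linalg.solve(cov, d))

# Toy usage with random stand-ins for X and for the MVU embedding Y.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 10)); X -= X.mean(axis=0)
Y = np.tanh(X[:, :3]) + 0.05 * rng.standard_normal((200, 3))
P = fit_mvup_projection(X, P := None) if False else fit_mvup_projection(X, Y)
print(t2_statistic(rng.standard_normal(10), P, X @ P))
```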
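For contribution Ⅱ, the dissertation learns a data-dependent optimal kernel; its actual objective is not reproduced here. The sketch below merely grid-searches the width of an RBF kernel so that a few leading kernel principal components capture as much feature-space variance as possible; this surrogate criterion and the helper functions (rbf_kernel, centered_eigvals, select_rbf_width) are assumptions for illustration, not the proposed learning method.

```python
import numpy as np

def rbf_kernel(X, gamma):
    """Gaussian (RBF) kernel matrix for the rows of X."""
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)
    return np.exp(-gamma * d2)

def centered_eigvals(K):
    """Eigenvalues of the centered kernel matrix, in descending order."""
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    lam = np.linalg.eigvalsh(H @ K @ H)[::-1]
    return np.clip(lam, 0.0, None)

def select_rbf_width(X, n_components=3, gammas=(0.01, 0.1, 1.0, 10.0)):
    """Pick the width whose leading components explain the most feature-space variance."""
    scores = {}
    for g in gammas:
        lam = centered_eigvals(rbf_kernel(X, g))
        scores[g] = lam[:n_components].sum() / lam.sum()
    return max(scores, key=scores.get)

rng = np.random.default_rng(0)
print(select_rbf_width(rng.standard_normal((100, 5))))
```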
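For contribution Ⅲ, the sketch below outlines the kernel-trick pattern that such a generalization typically follows: an LPP-style graph-Laplacian objective is rewritten in terms of the kernel matrix K, and expansion coefficients come from the generalized eigenproblem K L K a = λ K D K a. The k-NN heat-kernel weights and the ridge term are assumptions for illustration, and the orthogonalization step that distinguishes OLPP/KOLPP is omitted.

```python
import numpy as np
from scipy.linalg import eigh
from scipy.spatial.distance import cdist

def knn_heat_weights(X, k=5, t=1.0):
    """Symmetric k-nearest-neighbor affinity matrix with heat-kernel weights."""
    D2 = cdist(X, X, "sqeuclidean")
    W = np.zeros_like(D2)
    for i in range(len(X)):
        nbrs = np.argsort(D2[i])[1:k + 1]        # skip the point itself
        W[i, nbrs] = np.exp(-D2[i, nbrs] / t)
    return np.maximum(W, W.T)

def kernel_lpp_coefficients(K, W, n_dirs=2, ridge=1e-6):
    """Expansion coefficients A from K L K a = lambda K D K a (smallest eigenvalues);
    latent coordinates of the training data are then T = K @ A."""
    D = np.diag(W.sum(axis=1))
    L = D - W
    A = K @ L @ K
    B = K @ D @ K + ridge * np.eye(len(K))
    _, vecs = eigh(A, B)
    return vecs[:, :n_dirs]

rng = np.random.default_rng(0)
X = rng.standard_normal((60, 4))
K = np.exp(-cdist(X, X, "sqeuclidean"))
A = kernel_lpp_coefficients(K, knn_heat_weights(X))
print((K @ A).shape)                             # (60, 2) latent coordinates
```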
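For contribution Ⅳ, (K)LPDA's exact objective is not reproduced; the sketch below only illustrates the general recipe of locality-based discriminant analysis that the text describes: build a within-class and a between-class neighbor graph, then seek directions that keep same-class neighbors close while pushing different-class neighbors apart. The graph construction, 0/1 weights, and eigenproblem form are illustrative assumptions.

```python
import numpy as np
from scipy.linalg import eigh
from scipy.spatial.distance import cdist

def local_class_graphs(X, y, k=3):
    """0/1 within-class and between-class k-NN adjacency matrices."""
    D2 = cdist(X, X, "sqeuclidean")
    n = len(X)
    Ww = np.zeros((n, n)); Wb = np.zeros((n, n))
    for i in range(n):
        same = np.flatnonzero((y == y[i]) & (np.arange(n) != i))
        diff = np.flatnonzero(y != y[i])
        Ww[i, same[np.argsort(D2[i, same])[:k]]] = 1.0
        Wb[i, diff[np.argsort(D2[i, diff])[:k]]] = 1.0
    return np.maximum(Ww, Ww.T), np.maximum(Wb, Wb.T)

def discriminant_directions(X, y, k=3, n_dirs=2, ridge=1e-6):
    """Directions maximizing local between-class scatter over within-class scatter."""
    Ww, Wb = local_class_graphs(X, y, k)
    Lw = np.diag(Ww.sum(1)) - Ww
    Lb = np.diag(Wb.sum(1)) - Wb
    Sw = X.T @ Lw @ X + ridge * np.eye(X.shape[1])
    Sb = X.T @ Lb @ X
    _, vecs = eigh(Sb, Sw)
    return vecs[:, ::-1][:, :n_dirs]             # largest generalized eigenvalues first

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (30, 4)), rng.normal(2, 1, (30, 4))])
y = np.array([0] * 30 + [1] * 30)
print(discriminant_directions(X, y).shape)       # (4, 2)
```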
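For contribution Ⅴ, the difference between extended vectors and extended matrices can be shown directly: the conventional scheme stacks the current and lagged samples into one long vector, whereas the extended-matrix scheme keeps the lags as columns of a small variables-by-lags matrix, preserving the variable structure. The two helper functions below are a small illustrative sketch, assuming d lags.

```python
import numpy as np

def extended_vectors(X, d):
    """Conventional augmentation: concatenate x_t, x_{t-1}, ..., x_{t-d} into one long vector."""
    n, m = X.shape
    return np.hstack([X[d - j : n - j] for j in range(d + 1)])       # (n-d) x m(d+1)

def extended_matrices(X, d):
    """Extended-matrix augmentation: keep the lags as the columns of an m x (d+1) matrix."""
    n, m = X.shape
    return np.stack([np.column_stack([X[t - j] for j in range(d + 1)])
                     for t in range(d, n)])                           # (n-d) x m x (d+1)

X = np.arange(20.0).reshape(10, 2)       # toy series: 10 samples, 2 variables
print(extended_vectors(X, 2).shape)      # (8, 6)
print(extended_matrices(X, 2).shape)     # (8, 2, 3)
```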
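For contribution Ⅵ, NHP's specific upper-bound derivation and search-order optimization are not reproduced here. The sketch below shows the simpler, well-known cutoff-based pruning for top-n distance-based outliers that the text builds on: once the running estimate of a point's dk can no longer exceed the smallest dk among the current top-n candidates, its dk query is abandoned.

```python
import numpy as np

def top_n_outliers(X, k=5, n_out=10, seed=0):
    """Top-n outliers ranked by k-NN distance dk, with cutoff-based pruning:
    a dk query is abandoned once the point can no longer beat the current cutoff."""
    n = len(X)
    order = np.random.default_rng(seed).permutation(n)  # scan order (NHP optimizes this)
    cutoff = 0.0                 # smallest dk among the current top-n candidates
    scores = {}                  # candidate outliers: index -> dk
    for i in order:
        knn = np.full(k, np.inf)         # k smallest distances from point i seen so far
        pruned = False
        for j in order:
            if j == i:
                continue
            dist = np.linalg.norm(X[i] - X[j])
            if dist < knn.max():
                knn[knn.argmax()] = dist
            if knn.max() <= cutoff:      # upper bound on dk(i) is already too small
                pruned = True
                break
        if not pruned:
            scores[i] = knn.max()        # exact dk(i)
            if len(scores) > n_out:
                scores.pop(min(scores, key=scores.get))
            if len(scores) == n_out:
                cutoff = min(scores.values())
    return sorted(scores.items(), key=lambda kv: -kv[1])

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (200, 3)), rng.normal(6, 1, (5, 3))])  # 5 planted outliers
print([idx for idx, _ in top_n_outliers(X, k=5, n_out=5)])
```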