Research on Hyperspectral Image Classification Based on Gaussian Processes
Abstract
A hyperspectral imaging spectrometer images a surface region simultaneously in hundreds of contiguous, finely divided spectral bands, yielding three-dimensional image data. Because its information is carried by many narrow band intervals, a hyperspectral image allows ground objects to be finely distinguished and identified in spectral space, which is the main reason hyperspectral imagery is widely used in both military and civilian fields.
     At present, kernel methods are drawing growing attention in hyperspectral image classification and have become a research focus, because they can handle nonlinear problems and very high feature dimensionality. In recent years the support vector machine (SVM), one such kernel method, has been widely applied to hyperspectral image classification. However, SVMs have inherent problems, such as the difficulty of choosing kernel hyperparameters and outputs without probabilistic meaning, which limit their wider adoption.
     The Gaussian process is also a kernel-based method. It admits a fully Bayesian formulation and explicit probabilistic modeling, which makes its results easier to interpret. Bayesian learning with Gaussian processes provides a paradigm for moving from the prior to the posterior distribution given training samples, and for inferring the kernel hyperparameters.
     This dissertation takes Gaussian-process-based hyperspectral image classification as its main subject. Targeting the characteristics of hyperspectral images, namely numerous bands, strong inter-band and spatial correlation, and few labeled training samples, it combines Gaussian process theory with the affinity propagation clustering algorithm, with spatially constrained kernels built from joint spectral and spatial features, with conditional random field theory, and with semi-supervised kernel theory. The main novel contributions are as follows:
     1. In supervised classification of hyperspectral images with a limited number of training samples, the "Hughes" phenomenon appears: classification accuracy first rises as the number of bands increases, then falls once the band count passes a certain point. To avoid it, band selection should precede classification. Building on band selection theory, this dissertation proposes a band-selection-based Gaussian process classification method, which first selects bands with Affinity Propagation and then classifies with a Gaussian process. Experimental results show that the method achieves good classification with relatively few bands.
     2. Hyperspectral images have not only high spectral resolution but also strong spatial adjacency between pixels. By combining band correlation and spatial correlation into a spatially constrained kernel, this dissertation proposes a spatially constrained Gaussian process classification method. Experimental results show that it partially eliminates errors caused by "same material, different spectra" and "same spectra, different materials".
     3. To exploit the spatial structure of hyperspectral images further, this dissertation combines Gaussian processes with conditional random field theory and proposes an integrated Gaussian process and conditional random field classification method. Experimental results show that it effectively reduces isolated noisy pixels and thereby improves classification accuracy.
     4. Hyperspectral image classification often suffers from insufficient labeled training samples. Semi-supervised learning can exploit a few labeled samples together with the many easily obtained unlabeled ones to improve classification and prediction accuracy. Based on the strong local spatial correlation between pixels and on the semi-supervised manifold assumption, this dissertation constructs a semi-supervised kernel and proposes a novel semi-supervised Gaussian process method for hyperspectral image classification. The method is nonlinear, so it classifies high-dimensional, nonlinear hyperspectral data well; it is also non-parametric, with only a few hyperparameters to learn, so it is fast and simple. Experimental results show a clear improvement in accuracy when only a small number of labeled training samples is available.
     The methods proposed here address the characteristics of hyperspectral images, namely numerous bands, strong spectral and spatial correlation, and scarce labeled samples, and improve on standard Gaussian process classification with higher accuracy and stronger adaptability. The dissertation closes with a summary and an outlook on future research directions.
Hyperspectral imagery (HSI) is three-dimensional imagery generated by an imaging spectrometer that images the same surface scene simultaneously in hundreds of bands, each covering a narrow spectral interval. One of the main applications of HSI is identifying and recognizing materials from this rich spectral information, which is also the basic reason for its wide use in military and civilian fields.
     Currently, kernel methods are increasingly popular in HSI classification for their ability to solve nonlinear problems and their lower sensitivity to the curse of dimensionality compared with traditional classification techniques. As one kind of kernel method, support vector machine (SVM) classifiers have been widely used for HSI classification in recent years. However, SVMs have drawbacks, such as the difficulty of hyperparameter selection and non-probabilistic outputs, which prevent their further adoption.
     Another potentially interesting kernel-based classification approach is the Gaussian process classifier (GPC). In contrast to SVM classifiers, GPCs are Bayesian classifiers and permit a fully Bayesian treatment of the classification problem at hand. They provide output probabilities rather than discriminant-function values, and they can use the evidence for automatic model selection and hyperparameter optimization.
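A minimal sketch of these two properties, using scikit-learn's GaussianProcessClassifier on synthetic data (the data and the kernel choice are illustrative assumptions, not the dissertation's experimental setup):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

# Synthetic stand-in for labeled pixel spectra: 200 samples, 20 features.
X, y = make_classification(n_samples=200, n_features=20, random_state=0)

kernel = ConstantKernel(1.0) * RBF(length_scale=1.0)
gpc = GaussianProcessClassifier(kernel=kernel, random_state=0).fit(X, y)

proba = gpc.predict_proba(X[:5])  # probabilistic outputs, rows sum to 1
print(proba.shape)
print(gpc.kernel_)                # hyperparameters after evidence maximization
```

The fitted `kernel_` reports the hyperparameters chosen by maximizing the (approximate) marginal likelihood, and `predict_proba` returns genuine class probabilities, the two advantages over SVM noted above.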
     This dissertation focuses on Gaussian process (GP) based HSI classification. Targeting HSI characteristics such as numerous bands, strong spectral and spatial correlation, and a lack of labeled samples, we combine Gaussian processes with affinity propagation (AP), kernel construction methods, conditional random fields, and semi-supervised learning, respectively, to propose several new GP-based methods. The major contributions of this dissertation are as follows:
     1. The dimensionality of HSI strongly affects the performance of many supervised classification methods, which is called the "Hughes" phenomenon. To avoid it, band selection should be performed before classification. Combining affinity propagation with Gaussian processes, a band-selection-based Gaussian process method is proposed in this dissertation: band selection by AP is followed by classification with a GPC. Experimental results show that the proposed method achieves good classification results with only a few spectral bands.
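The band-selection step can be sketched with scikit-learn's AffinityPropagation; the synthetic cube, the squared-correlation similarity, and the AP settings below are assumptions for illustration, not the dissertation's exact configuration:

```python
import numpy as np
from sklearn.cluster import AffinityPropagation

rng = np.random.default_rng(0)
# Hypothetical flattened image: 500 pixels x 40 bands, where each group of
# 5 adjacent bands shares one latent spectral factor, so bands are redundant.
factors = rng.normal(size=(500, 8))
cube = np.repeat(factors, 5, axis=1) + 0.1 * rng.normal(size=(500, 40))

# Cluster the bands themselves; similarity = squared band-to-band correlation.
sim = np.corrcoef(cube.T) ** 2
ap = AffinityPropagation(affinity="precomputed", damping=0.7,
                         max_iter=1000, random_state=0).fit(sim)
selected = ap.cluster_centers_indices_  # exemplar bands, one per cluster
print("kept", len(selected), "of 40 bands")
```

`cube[:, selected]` would then be fed to the Gaussian process classifier; AP needs no preset cluster count, so the number of retained bands emerges from the data.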
     2. HSI shows strong spectral and spatial correlations. By constructing a new spatial kernel function (SGK) for the GP, spatial relations in HSI are included, so that classification errors caused by "same material, different spectra" and "same spectra, different materials" can be partially eliminated.
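A simple way to combine the two correlations is a weighted sum of a spectral and a spatial RBF kernel; this numpy sketch (with assumed weights and bandwidths, the dissertation body defines its own SGK construction) shows that the combination is still a valid kernel:

```python
import numpy as np

def rbf(A, B, gamma):
    # Pairwise RBF (Gaussian) kernel between the rows of A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

rng = np.random.default_rng(0)
spectra = rng.normal(size=(50, 10))       # per-pixel spectral vectors
coords = rng.uniform(0, 8, size=(50, 2))  # per-pixel (row, col) positions

mu = 0.7  # assumed spectral/spatial trade-off weight
K = mu * rbf(spectra, spectra, 0.1) + (1 - mu) * rbf(coords, coords, 0.5)

# A nonnegative weighted sum of valid kernels is itself a valid (PSD) kernel.
print(np.allclose(K, K.T), np.linalg.eigvalsh(K).min() >= -1e-9)
```

Because the sum is again symmetric positive semi-definite, it can be dropped into GP training unchanged; the weight `mu` plays the role of a hyperparameter balancing spectral against spatial evidence.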
     3. To exploit the spatial structure of HSI further, we combine the GPC with conditional random fields (CRF) and propose the GPCRF method for HSI classification. Experiments on real-world hyperspectral images attest to the accuracy and robustness of the GPCRF method, which reduces image noise to some extent.
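The way a pairwise spatial term suppresses isolated noisy pixels can be illustrated with a small Potts-style smoothing of per-pixel class probabilities, solved here by iterated conditional modes (ICM); this is an illustrative stand-in with random "GP probabilities", not the GPCRF inference used in the dissertation:

```python
import numpy as np

rng = np.random.default_rng(0)
H, W, C = 20, 20, 3
# Stand-in for per-pixel GP class probabilities (random here; in the real
# method these would come from the GP classifier).
proba = rng.dirichlet(np.ones(C), size=(H, W))
unary = -np.log(proba)      # unary (data) energy per pixel and class
labels = proba.argmax(-1)   # initial, noisy labeling

beta = 0.8                  # assumed pairwise (Potts) strength
for _ in range(5):          # a few ICM sweeps
    for i in range(H):
        for j in range(W):
            nbrs = [labels[x, y]
                    for x, y in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1))
                    if 0 <= x < H and 0 <= y < W]
            cost = unary[i, j].copy()
            for c in range(C):
                cost[c] += beta * sum(int(n != c) for n in nbrs)
            labels[i, j] = cost.argmin()
```

Each sweep moves a pixel to the class minimizing its unary cost plus a penalty for disagreeing with its 4-neighbors, which is exactly the mechanism that removes isolated noisy labels.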
     4. In HSI classification, supervised methods often perform poorly because labeled training samples are hard to obtain, while plenty of unlabeled pixels are available. In semi-supervised learning, labeled samples and abundant unlabeled samples are combined to train the classifier. Based on the semi-supervised manifold assumption, a new spatial semi-supervised Gaussian process (SSGP) classification method is proposed in this dissertation. SSGP is a semi-supervised method that exploits the spatial correlations between labeled and unlabeled samples to raise classification accuracy; it is a kernel method that handles the nonlinearity of HSI well; and it is a non-parametric method with only a few hyperparameters, which can be learned from the data. Experimental results show that SSGP classifies hyperspectral images accurately and stably when only a small fraction of the training samples is labeled.
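One well-known construction in this family is the graph-deformed kernel of Sindhwani et al.; the numpy sketch below (synthetic points, an assumed neighborhood graph and regularization weight) builds such a semi-supervised kernel over labeled and unlabeled points together:

```python
import numpy as np

def rbf(A, B, gamma=0.5):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 5))  # labeled and unlabeled points pooled together

K = rbf(X, X)
# Neighborhood graph over all points encodes the manifold assumption:
# connect pairs whose base-kernel similarity is in the top 20%.
W = (K > np.quantile(K, 0.8)).astype(float)
np.fill_diagonal(W, 0)
L = np.diag(W.sum(axis=1)) - W          # graph Laplacian

lam = 0.1                               # assumed regularization weight
M = lam * L
# Deformed semi-supervised kernel: K_semi = K - K (I + M K)^{-1} M K
K_semi = K - K @ np.linalg.solve(np.eye(len(X)) + M @ K, M @ K)
```

The deformation shrinks similarity across low-density regions of the graph while keeping `K_semi` a valid kernel, so a standard GP classifier can use it unchanged and thereby benefit from the unlabeled points.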
     In this dissertation, we exploit HSI characteristics such as abundant spectral bands, strong spectral and spatial correlation, and the scarcity of labeled samples to improve the standard GPC and propose several GP-based HSI classification methods. The results achieved show that these methods can yield accurate and stable classification.
     Finally, we draw conclusions and outline future research directions.
