Table of Contents

    25 June 2014, Volume 43 Issue 6
    Concepts and Key Techniques for 30 m Global Land Cover Mapping
    2014, 43(6):  551-557. 

    Global land cover (GLC) characterization and monitoring at fine resolution is a key task and a major challenge for the earth observation and geomatics communities worldwide. Recently, the first operational 30 m GLC mapping project was completed by China. It is based on the optimal selection and processing of Landsat-like satellite imagery for full global coverage, service-oriented integration of all available reference data and auxiliary information, object-based precise land cover characterization, and knowledge-based data quality control. The key techniques developed include multi-type imagery geometric processing and radiometric reconstruction, integration of heterogeneous data and external services, object-based thematic-layer classification, and knowledge-based spatio-temporal consistency verification. Technical guidelines and software tools were further developed to support the operational 30 m GLC mapping for the years 2000 and 2010.

    Global Bathymetry Model Predicted from Vertical Gravity Gradient Anomalies
    2014, 43(6):  558-574. 

    The response function between bathymetry and vertical gravity gradient anomalies is derived. A global bathymetry model was built from vertical gravity gradient anomalies and ship soundings based on this response function. The latitude range of the model is 75°S~70°N and the resolution is 1′×1′. The accuracy of the model in the South Indian Ocean and the Northwest Pacific was evaluated against ship soundings. The results show that, in the discussed regions, the model predicted in this paper is more accurate than ETOPO1, GEBCO and DTU10; in the Northwest Pacific its accuracy is slightly better than the latest V15.1 model from SIO, and in the South Indian Ocean it is consistent with V15.1. The influence of higher-order terms and crustal isostasy was also discussed; both factors were found to have almost no effect on the result and can be ignored. The predicted model was also compared with a model formed from gravity anomalies. The comparison shows that, at wavelengths of 100 to 200 km, vertical gravity gradient anomalies are preferable to gravity anomalies. In the Northwest Pacific, the accuracy of the predicted model is improved by about 29.5% over V15.1 when these two kinds of data are combined, indicating that a bathymetry model of higher accuracy can be formed by combining gravity anomalies and vertical gravity gradient anomalies.
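    The wavenumber-domain prediction step described above can be illustrated with a minimal NumPy sketch. The response function here is a caller-supplied stand-in; the paper's derived admittance, the ship-sounding constraints, and the long-wavelength handling are all omitted.

```python
import numpy as np

def predict_bathymetry(vgg, admittance, mean_depth):
    """Invert vertical gravity gradient (VGG) anomalies to bathymetry in the
    wavenumber domain: B = F^-1[ F(vgg) / Z(k) ] + mean_depth, where Z(k)
    is an assumed response (admittance) function between bathymetry and VGG."""
    ny, nx = vgg.shape
    kx = np.fft.fftfreq(nx)
    ky = np.fft.fftfreq(ny)
    k = np.hypot(*np.meshgrid(kx, ky))      # radial wavenumber grid
    Z = np.asarray(admittance(k), dtype=float)
    Z[k == 0] = np.inf                      # no depth signal at zero wavenumber
    return mean_depth + np.fft.ifft2(np.fft.fft2(vgg) / Z).real
```

    A round trip with a synthetic seafloor and a toy admittance Z(k) = k recovers the input perturbation, which is the basic consistency check for any such transfer-function inversion.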

    Wave Number Domain Iterative Tikhonov Regularization Method for Downward Continuation of Airborne Gravity Data
    2014, 43(6):  566-574. 
    Virtual Observation Method to Ill-posed Total Least Squares Problem
    2014, 43(6):  575-581. 
    Characteristics of Position Time Series at CORS Stations in Sichuan Basin before and after Wenchuan Earthquake
    2014, 43(6):  582-589. 

    The movement characteristics of the sites before and after the Wenchuan earthquake are investigated using the daily position time series of 12 continuously operating reference stations (CORS) in Sichuan Province from March 2006 to September 2012. The displacements due to surface mass loading effects, such as atmospheric pressure loading, non-tidal ocean loading, and snow depth and soil moisture loading, were calculated and removed to reduce the root mean square (RMS) of the vertical component. A spatial filtering method based on principal component analysis (PCA) is employed to extract the common mode errors (CME) from the daily time series. Maximum likelihood estimation is also used to choose the optimal noise model and assess the model parameters of the time series. The results indicate that the spatial response of the first principal component in the PCA analysis is reduced markedly, by 20% to 40%, after the earthquake due to the effect of post-seismic deformation. The velocity field also changed obviously, especially at the PIXI, CHDU and MYAN sites, while the YAAN and QLAI sites appear to be locked. The annual amplitude in the Sichuan Basin was largest in the year just before the earthquake and gradually decreased afterwards. These results imply that the Wenchuan earthquake has potentially changed the movement characteristics of the Sichuan Basin.
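    The PCA-based spatial filtering step can be sketched in a few lines of NumPy. This is a simplified stand-in for the paper's processing chain: the loading corrections and noise-model fitting are omitted, and the number of retained components is a free parameter.

```python
import numpy as np

def remove_common_mode(X, n_pc=1):
    """Spatial filtering sketch for CORS position residuals.
    X is a (n_days, n_stations) array of daily residuals. The first n_pc
    principal components (via SVD) are taken as the common mode error (CME)
    shared by the network, and their reconstruction is subtracted."""
    Xc = X - X.mean(axis=0)                    # remove per-station offsets
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    cme = U[:, :n_pc] * s[:n_pc] @ Vt[:n_pc]   # rank-n_pc reconstruction
    return Xc - cme, cme
```

    With a synthetic annual signal shared by all stations plus station-level noise, filtering removes most of the common variance, which is the intended effect of CME extraction.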

    SVR Aided Adaptive Robust Filtering Algorithm for GPS/INS Integrated Navigation
    2014, 43(6):  590-606. 

    In a loosely coupled global positioning system and inertial navigation system (GPS/INS) integrated navigation system, the number of observations is less than the number of state parameters. It is hard to distinguish dynamic model errors from gross observation errors using observation and state residuals, because the residuals are affected by both. A robust adaptive Kalman filtering (RAKF) algorithm is put forward based on a genetic algorithm and support vector regression (GA-SVR). The algorithm addresses the limits of anomaly detection when redundant observations are lacking. Support vector regression is used to train a mapping model that predicts suboptimal observations, with its parameters optimized by the genetic algorithm. Global anomaly detection, combined with the predicted observations, autonomously chooses robust or adaptive Kalman filtering in order to adjust the contributions of the observations and the dynamic model to the results. Finally, field vehicle data are used to verify the algorithm. The results show that dynamic model errors can be distinguished from gross observation errors based on GA-SVR, and that the influence of anomalous data is greatly weakened by the RAKF algorithm, improving the reliability and accuracy of the navigation solutions.
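    The role of the predicted observation in the robust branch can be sketched with a toy one-dimensional filter. The GA-SVR predictor is replaced here by a caller-supplied `z_pred`, the dynamic model is a plain random walk, and the adaptive branch (re-weighting the dynamic model) is omitted; none of the numeric constants come from the paper.

```python
import numpy as np

def rakf_step(x, P, z, z_pred, q=0.01, r=0.04, thresh=3.0):
    """One step of a simplified 1-D robust Kalman filter.
    z_pred stands in for the GA-SVR-predicted observation: when the measured
    z deviates strongly from z_pred, z is flagged as a gross error and its
    variance r is inflated, so the filter leans on the dynamic model."""
    x_pred, P_pred = x, P + q                  # time update (random walk)
    if abs(z - z_pred) > thresh * np.sqrt(r):  # anomaly test vs. prediction
        r = r * 100.0                          # robust: down-weight observation
    K = P_pred / (P_pred + r)                  # Kalman gain
    x_new = x_pred + K * (z - x_pred)          # measurement update
    return x_new, (1.0 - K) * P_pred
```

    A gross observation (far from the predicted one) then barely moves the state, while a consistent observation is absorbed with the normal gain.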

    A Workload-distribution Based CPU/GPU MTF Compensation Approach for High Resolution Satellite Images
    2014, 43(6):  598-606. 

    A novel workload-distribution based CPU/GPU MTF compensation approach for high resolution satellite images is proposed in this paper. First, the basic GPU implementation issues are addressed. Next, three performance tuning methods, namely execution configuration optimization, memory access optimization and instruction optimization, are applied to further improve performance. We test our approach with a GF-1 panchromatic image on a CPU/GPU system consisting of an Intel Xeon E5650 CPU and an NVIDIA Tesla C2050 GPU. The experimental results show a speedup ratio of up to 42.80 times. Furthermore, a CPU/GPU workload distribution strategy is presented to fully exploit the CPU's computing power. With this strategy, the speedup ratio of MTF compensation reaches 47.82 times (a processing time of 1.62 s), which meets the requirement of near real-time MTF compensation for high resolution satellite images.
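    The underlying MTF compensation operation can be illustrated with a small CPU-only NumPy sketch of Wiener-style frequency-domain restoration; the paper's CUDA kernels, tuning steps and CPU/GPU workload split are not reproduced here, and the regularization constant is an assumption.

```python
import numpy as np

def mtf_compensate(img, mtf, k=1e-3):
    """Wiener-style MTF compensation: divide the image spectrum by the
    system MTF, with constant k regularizing frequencies where the MTF
    approaches zero so noise is not blown up."""
    F = np.fft.fft2(img)
    H = mtf                                   # MTF sampled on the FFT grid
    return np.fft.ifft2(F * np.conj(H) / (np.abs(H) ** 2 + k)).real
```

    Blurring a test image with a Gaussian MTF and then compensating should bring it measurably closer to the original, which is the basic sanity check for the filter.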

    Sparse Unmixing for Hyperspectral Image Based on Spatial Homogeneous Analysis
    2014, 43(6):  607-612. 

    The endmember abundance of hyperspectral imagery exhibits notable sparsity and smoothness of spatial distribution. Based on these two properties, a sparse unmixing algorithm built on spatial homogeneity analysis of the imagery is proposed in this paper. First, a homogeneity index is calculated through spatial homogeneity analysis. Then the spatial regularizers of the sparse regression unmixing model are weighted according to the homogeneity index. This model reflects the spatial distribution complexity of endmember abundance and makes the unmixing process more effective. Experiments on both simulated and real hyperspectral data show that the algorithm preserves the sparsity and spatial smoothness of the unmixed abundances, has good noise immunity, and improves overall unmixing accuracy.
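    The homogeneity-analysis step can be sketched as follows. The paper's exact index is not reproduced; this stand-in uses local 3×3 variance, giving homogeneous pixels a large spatial-smoothness weight and edge pixels a small one, which is the behaviour the weighted regularizer relies on.

```python
import numpy as np

def homogeneity_weights(band, eps=1e-6):
    """Per-pixel homogeneity weight sketch: low local variance (flat area)
    maps to a weight near 1, high local variance (edge/texture) to a
    smaller weight, so spatial smoothing is relaxed across boundaries."""
    pad = np.pad(band, 1, mode='edge')
    windows = np.lib.stride_tricks.sliding_window_view(pad, (3, 3))
    local_var = windows.reshape(*band.shape, 9).var(axis=-1)
    return 1.0 / (1.0 + local_var / (local_var.mean() + eps))
```

    On an image with a flat region and a sharp edge, the flat pixels receive higher weights than the edge pixels, as intended.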

    A Novel Vegetation Height Inversion Method Based on Polarimetric Interferometric Covariance Matrix Decomposition
    2014, 43(6):  613-636. 

    Vegetation height inversion results of the three-stage inversion process for polarimetric SAR interferometry are seriously affected by inaccuracies in the estimation of the underlying ground topographic phase. To solve this problem, this paper proposes a new algorithm that combines the Freeman-Durden polarimetric decomposition concept with polarimetric interferometric covariance matrix decomposition to obtain more accurate ground phase estimates, and then estimates the vegetation height by applying the RVoG model. Finally, the validity of the proposed algorithm is tested with simulated L-band PolInSAR data from ESA's PolSARProSim software and real ALOS PALSAR data covering the Amazon forest; the experimental results show that the proposed algorithm is more accurate than the traditional three-stage inversion process.

    Improved Active Contour Model for Building Roof Boundary Extraction from LiDAR Point Cloud
    2014, 43(6):  620-636. 

    Based on edge and local region information, this paper proposes a new active contour model that can process multi-spectral images, and applies it to extract building roof boundaries from LiDAR data. The input image of our model is produced with MicroStation software: we first classify the LiDAR point cloud and then convert the classified results to raster format. The model is solved by the variational level set method, and the minimal solution is the exact building roof boundary. It removes the restrictions of active contour models on initialization and image type, and is suitable for the automatic extraction of building roof boundaries of any shape. In addition, we reduce the computational time of the model by adding level set rules. Building roof boundary extraction experiments indicate that our model achieves higher accuracy in match rate, shape similarity and positional accuracy than the IAC model and the GACcolor model.

    A Distributed Multi-dimensional SOLAP Model of Remote Sensing Data and Its Application in Drought Analysis
    2014, 43(6):  627-636. 

    SOLAP has recently been applied to multi-dimensional analysis of remote sensing data. However, its computational performance faces a considerable challenge from large-scale datasets. A geo-raster cube model extended by Map-Reduce is proposed, which applies Map-Reduce (a data-intensive computing paradigm) to the OLAP field. In this model, the existing methods are adapted to a distributed environment based on multi-level raster tiles. Multi-dimensional map algebra is then introduced to decompose the SOLAP computation into multiple distributed, parallel map algebra functions on tiles with the support of Map-Reduce. Drought monitoring with remote sensing data is employed as a case study to illustrate the construction and application of the model. A prototype is also implemented, and performance testing shows the efficiency and scalability of the model.
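    The decomposition idea, namely a local map-algebra function applied to each tile independently and the partial results combined, can be sketched in miniature. Real cluster scheduling and the tile pyramid are omitted; `map_fn` and `reduce_fn` are placeholders for whatever SOLAP measure is being computed.

```python
from functools import reduce

def solap_aggregate(tiles, map_fn, reduce_fn):
    """Map-Reduce-style sketch of a SOLAP measure over raster tiles:
    map_fn runs on each tile independently (the distributable 'map'),
    reduce_fn folds the per-tile partial results into the final value."""
    return reduce(reduce_fn, (map_fn(t) for t in tiles))
```

    For example, a global sum over tiled raster data is a per-tile sum followed by an additive reduce, which is exactly the pattern that lets each tile be processed on a different node.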

    Morphing Transformation of Linear Features by Using Independent Bend Structures More Sufficiently
    2014, 43(6):  637-652. 

    This paper proposes a morphing approach for linear features that makes fuller use of their independent bend structures. First, the bend structures of the linear features are identified based on a constrained Delaunay triangulation (CDT) model and represented by bend forests. Second, corresponding bends are determined by bend matching. Some independent bend structures are found to be hidden in the higher-level bends; therefore, the CDT model is iteratively applied to corresponding bends, new bend forests of the back side are built, and bend matching is iteratively used to detect new corresponding bends. After bend matching, the start and end points of corresponding bends are used to split the linear features into corresponding segments. A linear interpolation algorithm is then used to detect corresponding points, and straight-line trajectories are used for morphing. Finally, experiments are carried out. The results show that the proposed approach improves the ability to detect the characteristic points of corresponding bends and to maintain these characteristic points during morphing. As a result, the approach produces better morphing results.
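    The final straight-line-trajectory step is simple enough to state directly. This sketch assumes the vertex correspondence has already been produced by the bend-matching and interpolation stages described above.

```python
def morph(points_a, points_b, t):
    """Straight-line morphing: given matched vertex lists of two linear
    features, interpolate each corresponding pair at parameter t in [0, 1],
    so t=0 yields the first feature and t=1 the second."""
    return [((1 - t) * xa + t * xb, (1 - t) * ya + t * yb)
            for (xa, ya), (xb, yb) in zip(points_a, points_b)]
```

    Intermediate values of t give the in-between shapes shown in typical continuous-generalization animations.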

    A Quick Flood Inundation Algorithm Based on Massive DEM Data
    2014, 43(6):  645-652. 

    Conventional flood inundation methods based on DEM data have several disadvantages. For example, the seed filling method, a popular flood inundation method, performs poorly when the data volume is huge, and its computational efficiency is low because of its many recursive operations. To resolve this problem, this paper proposes a quick flood inundation algorithm for massive DEM data, focusing on a strip-divide method and real-time raster compression storage. A comparison between the common seed filling algorithm and the strip-divide seed filling algorithm shows that the proposed algorithm greatly improves computational efficiency and solves the problem of massive DEM data analysis.
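    The baseline operation being accelerated can be sketched as an iterative seed fill. The paper's strip-divide scheme and real-time raster compression are omitted; note that replacing recursion with an explicit queue already avoids the stack-depth problem the abstract mentions.

```python
from collections import deque

def flood_fill(dem, seed, water_level):
    """Queue-based seed filling sketch: mark every 4-connected cell reachable
    from `seed` whose elevation is below `water_level` as inundated."""
    rows, cols = len(dem), len(dem[0])
    inundated = [[False] * cols for _ in range(rows)]
    q = deque([seed])
    while q:
        r, c = q.popleft()
        if not (0 <= r < rows and 0 <= c < cols):
            continue                                  # outside the raster
        if inundated[r][c] or dem[r][c] >= water_level:
            continue                                  # visited or dry land
        inundated[r][c] = True
        q.extend([(r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)])
    return inundated
```

    On a small DEM with a low basin ringed by higher terrain, only the connected basin cells below the water level are marked.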

    Syntactic Characteristics and Smart Construction Mechanism of Thematic Map Symbol
    2014, 43(6):  653-660. 

    The construction of thematic map symbols is a complex and intelligent process. Such symbols can be automatically generated and easily shared on the web through the syntactic structure of semantic symbols. The symbol types, inner structure and design patterns are expounded, and a syntactic construction theory based on a letter (thematic map primitive) - word (single thematic symbol) - sentence (combined or complex symbols) structure model is put forward for the automatic construction of thematic map symbols. A formal description of the mechanism is then given, and an object-relational-model based design pattern is developed. Finally, the formal description model is demonstrated in two application systems, where the results have been applied in integrated applications.