High-resolution remote-sensing images are increasingly applied to land-use classification problems. Land-use scenes are often complex and difficult to represent, so the recognition of multiple land-cover classes remains an open research question. We propose a classification framework based on a sparse coding-based correlaton (termed sparse correlaton) model to address this challenge. Specifically, a general mapping strategy is presented to label visual words and generate sparse coding-based correlograms, which exploit the spatial co-occurrences of visual words. A compact spatial representation without loss of discrimination is achieved through adaptive vector quantization of the correlogram. Moreover, instead of using K-means for visual word encoding as in the original correlaton model, the proposed sparse correlaton model uses sparse coding to achieve lower reconstruction error. Experiments on a public ground-truth image dataset of 21 land-use classes demonstrate that our sparse coding-based correlaton method improves land-use scene classification performance and outperforms many existing bag-of-visual-words-based methods.
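To make the pipeline concrete, here is a minimal sketch of the two stages described above: descriptors are sparse-coded against a learned dictionary, each descriptor is labeled by its dominant atom, and word-pair co-occurrences are counted at several radii. This is an illustrative reconstruction, not the authors' code; the dictionary is assumed to be pre-learned, and the radii and solver settings are assumptions.

```python
import numpy as np
from sklearn.decomposition import SparseCoder

def label_words(descriptors, dictionary, alpha=1.0):
    """Sparse-code descriptors and label each by its dominant dictionary atom.

    dictionary: (n_words, n_features) codebook, assumed already learned."""
    coder = SparseCoder(dictionary=dictionary,
                        transform_algorithm="lasso_lars",
                        transform_alpha=alpha)
    codes = coder.transform(descriptors)        # (n_desc, n_words) sparse codes
    return np.abs(codes).argmax(axis=1)         # dominant atom = visual-word label

def correlogram(labels, positions, n_words, radii=(1, 2, 4, 8)):
    """Count co-occurrences of visual-word pairs within each distance radius."""
    hist = np.zeros((len(radii), n_words, n_words))
    dists = np.linalg.norm(positions[:, None] - positions[None, :], axis=-1)
    for k, r in enumerate(radii):
        ii, jj = np.nonzero((dists > 0) & (dists <= r))
        np.add.at(hist[k], (labels[ii], labels[jj]), 1)
    return hist / max(hist.sum(), 1)            # normalized correlogram
```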
Building extraction is one of the main research topics of the photogrammetry community. This paper presents automatic algorithms for building boundary extraction from aerial LiDAR data. First, the height information generated from the LiDAR data is segmented so that the outer boundaries of aboveground objects are expressed as closed chains of oriented edge pixels. Building boundaries are then distinguished from non-building ones by evaluating their shapes. The candidate building boundaries are reconstructed as rectangles or regular polygons by applying new algorithms that follow the hypothesis-verification paradigm, including constrained searching in Hough space, an enhanced Hough transformation, and a sequential linking technique. Experimental results show that the proposed algorithms successfully extract building boundaries at rates of 97%, 85%, and 92% on three LiDAR datasets of varying scene complexity.
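The constrained Hough search can be illustrated with a simplified example: find the dominant edge orientation of a boundary chain, after which a rectangle hypothesis only needs to search that angle and its perpendicular. This sketch is not the paper's algorithm; the binning and voting scheme are assumptions for illustration.

```python
import numpy as np

def dominant_orientation(edge_points, n_theta=180):
    """Vote over a simplified Hough space (theta, rho) for boundary edge
    points; the strongest orientation constrains the rectangle search to
    theta0 and theta0 + 90 degrees."""
    thetas = np.deg2rad(np.arange(n_theta))
    votes = np.zeros(n_theta)
    for idx, t in enumerate(thetas):
        # rho of every edge point for a candidate line orientation t
        rho = edge_points[:, 0] * np.cos(t) + edge_points[:, 1] * np.sin(t)
        counts, _ = np.histogram(rho, bins=64)
        votes[idx] = counts.max()               # strong lines pile up in one bin
    return thetas[votes.argmax()]
```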
Accuracy needs to be improved when extrapolating instantaneous remote-sensing evapotranspiration (ET) values to daily ET estimates. Daily net radiation (R_n24) is a critical variable for this extrapolation. We extend the de Bruin and Stricker formula for R_n24 in the evaporative fraction method by incorporating ground radiation measurements. In addition, an improved daily transmittance formula that accounts for atmospheric conditions and solar declination is proposed for the R_n24 calculation. The coefficient for daily net longwave radiation (R_nl24) in R_n24 is regressed from the ground radiation measurements and the improved daily transmittance values. A comparative study was conducted in the Haihe River basin, China. Experimental results show that the improved method yields a root mean square error of 0.97 mm for daily ET and only about 2% underestimation of actual seasonal ET in the study area, outperforming traditional methods.
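The evaporative-fraction extrapolation itself is compact and can be sketched as follows. This assumes the daily soil heat flux is negligible and uses a nominal latent-heat constant; it shows the generic extrapolation, not the paper's regressed R_n24 coefficients.

```python
LAMBDA = 2.45e6   # latent heat of vaporization, J/kg (approx. near 20 C)

def daily_et_mm(ef_instantaneous, rn24_wm2):
    """Extrapolate the instantaneous evaporative fraction EF to daily ET.

    EF = LE / (Rn - G) at satellite overpass, assumed constant over the day;
    rn24_wm2 is daily-average net radiation in W/m^2. Returns mm/day."""
    le24 = ef_instantaneous * rn24_wm2          # daily latent heat flux, W/m^2
    # J/m^2 per day divided by J/kg gives kg/m^2, i.e. mm of water
    return le24 * 86400.0 / LAMBDA
```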
Geospatial data are the backbone of spatial analysis, but only current and accurate data can provide the appropriate framework for the successful use of GIS technology. The revision of geospatial data is still one of the major open challenges for the successful implementation of Geographic Information Systems (GIS). There is a great need for cost-efficient data revision and quality control methods in order to maintain the most faithful possible image of geographic reality. Maintaining one database per scale, without explicitly maintaining the interrelationships between the multiple-scale databases, means that updates cannot be propagated and inter-database consistency is lost. This paper proposes a multi-resolution/multi-representation database (MRDB) approach to solve this problem. An MRDB is a spatial database technology designed to store the same real-world phenomena at several levels of precision, accuracy, and resolution. Propagating updates between these different-scale datasets in an MRDB significantly improves data consistency and integrity and enables an automatic incremental update process for the datasets.
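A minimal sketch of the propagation idea follows, assuming each fine-scale feature stores links to its coarser representations; the data model and the generalization step are illustrative assumptions, not the paper's schema.

```python
from dataclasses import dataclass, field

@dataclass
class Feature:
    fid: str
    geometry: object
    coarser_links: list = field(default_factory=list)  # fids at next level up

def propagate_update(updated, levels, generalize):
    """Push an edit from the finest level to every linked coarser feature.

    levels: list of {fid: Feature} dicts ordered fine -> coarse.
    generalize: callable deriving a coarser geometry (e.g. line simplification)."""
    frontier = [updated]
    for level in levels[1:]:
        next_frontier = []
        for feat in frontier:
            for fid in feat.coarser_links:
                target = level[fid]
                target.geometry = generalize(feat.geometry)  # incremental update
                next_frontier.append(target)
        frontier = next_frontier
```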
KEYWORDS: Geographic information systems, Network architectures, Data modeling, Complex systems, Web services, Internet, Classification systems, Image processing, Internet technology, Information fusion
With the rapid development and application of Internet technology, Geographic Information Systems have entered a new age in which their main form is Geographic Information Services. Although many Geographic Information Services are now available on the Internet, their rate of adoption remains very low. To facilitate discovery, some proposals for Geographic Information Services infrastructures focus on a centralized service registry (UDDI, Universal Description, Discovery and Integration) for cataloguing their geospatial functions and characteristics. Centralized systems, however, introduce single points of failure and hotspots in the network, and are vulnerable to malicious attacks. To address this problem, this paper proposes a complex-network Peer-to-Peer approach for geospatial Web services discovery. Based on complex network theory, a Peer-to-Peer network is established that handles each peer's communication and management, and an ebRIM registry centre is embedded in each peer for the registration and querying of Geographic Information Services.
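A toy sketch of the discovery flow: each peer checks its local ebRIM-style registry, then forwards unmatched queries to its neighbours with a depth limit. The peer and query interfaces here are hypothetical placeholders, not the paper's protocol.

```python
def discover(peer, query, ttl=3, seen=None):
    """Depth-limited P2P search: local registry first, then neighbours."""
    seen = seen if seen is not None else set()
    if peer.id in seen or ttl < 0:
        return []
    seen.add(peer.id)
    hits = [svc for svc in peer.registry if query.matches(svc)]
    for neighbour in peer.neighbours:          # neighbour links follow the
        hits += discover(neighbour, query, ttl - 1, seen)  # complex-network topology
    return hits
```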
Due to increasing power consumption, heat dissipation, and other physical issues, central processing unit (CPU) architectures have been turning to multicore rapidly in recent years. A multicore processor packages multiple processor cores in the same chip, which not only offers increased performance but also presents significant challenges to application developers. In fact, most current GIS algorithms were implemented serially and cannot fully exploit the parallelism available on such multicore platforms. In this paper, we choose the Inverse Distance Weighted (IDW) spatial interpolation algorithm as an example to study how to optimize serial GIS algorithms on multicore platforms in order to maximize performance speedup. With the help of OpenMP, a threading methodology is introduced to split and share the interpolation work among processor cores. After parallel optimization, the execution time of the interpolation algorithm is greatly reduced and good speedup is achieved; for example, the speedup on an Intel Xeon 5310 is 1.943 with 2 execution threads and 3.695 with 4 execution threads. An additional comparison of outputs before and after optimization shows that the parallel optimization does not affect the final interpolation result.
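The paper parallelizes IDW with OpenMP in C; as an analogous illustration, this Python sketch applies the same work-sharing idea by splitting the output grid across worker processes. The chunking scheme and power parameter are illustrative assumptions.

```python
import numpy as np
from concurrent.futures import ProcessPoolExecutor

def idw_chunk(args):
    """Interpolate one chunk of target points from samples (xy, z)."""
    targets, xy, z, power = args
    d = np.linalg.norm(targets[:, None, :] - xy[None, :, :], axis=-1)
    w = 1.0 / np.maximum(d, 1e-12) ** power     # inverse-distance weights
    return (w @ z) / w.sum(axis=1)

def idw_parallel(targets, xy, z, power=2.0, workers=4):
    """Split target points into chunks, one per worker (mirrors the
    OpenMP loop work-sharing, here via multiprocessing)."""
    chunks = np.array_split(targets, workers)
    with ProcessPoolExecutor(workers) as pool:
        parts = pool.map(idw_chunk, [(c, xy, z, power) for c in chunks])
    return np.concatenate(list(parts))
```

Because each output point is independent, the chunked result is bitwise-identical to the serial one, which matches the pre/post-optimization output comparison reported above.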
KEYWORDS: Databases, Geographic information systems, Data modeling, Image quality, Data processing, Composites, System identification, Information fusion, Data acquisition
The revision of geographical data is still one of the major open challenges for the successful implementation of Geographic Information Systems (GIS). Nowadays, GIS are considered genuine analysis and decision-making tools, and a growing number of organizations therefore invest in such systems and add the specific information necessary for the tasks for which they are responsible. Spatial data are the backbone of GIS analysis, but only current and accurate spatial data can provide the appropriate framework for the successful use of GIS technology. Out-of-date or inaccurate spatial data contaminate GIS results in direct proportion to their obsolescence and inaccuracy. There is a great need for cost-efficient spatial data revision and quality control methods to update master and user spatial databases and maintain the most faithful possible image of geographic reality. This paper provides a review of the latest achievements in Spatio-temporal Data Revision (SDR) and compares three popular revision models in detail.
KEYWORDS: Data modeling, 3D modeling, Buildings, Data storage, Spatial resolution, Geographic information systems, 3D optical data storage, 3D metrology, Remote sensing, Computer programming
The buildings in modern cities are complex and diverse, and their quantity is huge. This poses a great challenge for constructing 3D GIS in a networked environment and ultimately realizing the Digital Earth. After analyzing the characteristics of network services for massive 3D urban building model data, this paper focuses on the organization and management of the spatial data and on the network service strategy, and proposes a progressive network transmission scheme based on the spatial resolution and the component elements of the 3D building model data. It then puts forward a multistage-linked three-dimensional spatial data organization model and a spatial index encoding method based on a full-level quadtree structure. A virtual earth platform, called GeoGlobe, was developed using this theory. Experimental results show that the proposed 3D spatial data management model and service theory can effectively provide network services for large-scale 3D urban model data, with good application results and user experience.
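A full-level quadtree index can be illustrated with a Bing-style quadkey encoding; this sketch (which assumes coordinates normalized to [0, 1)) derives a tile key with one base-4 digit per level, and is an illustration rather than the paper's encoding.

```python
def quadkey(x, y, level):
    """Encode a normalized point (x, y in [0, 1)) as a full-level
    quadtree key: one base-4 digit per level, coarse to fine."""
    tx, ty = int(x * (1 << level)), int(y * (1 << level))
    key = []
    for i in range(level, 0, -1):
        mask = 1 << (i - 1)                     # test one quadtree level's bit
        digit = (1 if tx & mask else 0) + (2 if ty & mask else 0)
        key.append(str(digit))
    return "".join(key)

# e.g. quadkey(0.6, 0.3, 3) -> "120": the key's prefix identifies every ancestor tile
```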
KEYWORDS: Data modeling, Visual process modeling, Web services, Visualization, Process modeling, Floods, Roads, Prototyping, Data processing
Web services and web service composition technology have become the primary means of realizing geospatial information sharing and interoperability. Among the various integration models and specifications for web service composition, BPEL4WS is the most typical and prevalent. However, BPEL4WS is IT-oriented: its syntax is complicated, it demands a good understanding of XML and web services specifications, its way of describing processes and defining activities differs from the conventions of particular domains, and its service contracts require early binding before process instance execution. For these reasons, BPEL4WS is not well suited to the visual modeling of geospatial processing workflows. In this paper, an abstract geospatial service chain model based on data-dependency relationships is designed, and a mapping algorithm is proposed for translating the abstract service chain model into BPEL4WS form. Geospatial experts who are not web services experts can thus intuitively model a service chain, translate the model into BPEL4WS, and execute it on a BPEL4WS engine. Based on this research, a prototype platform for the visual modeling of geospatial web service chains is built, aiming to meet the modeling demands of geospatial domain experts and common users.
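A toy sketch of the mapping idea: services in data-dependency order become a sequence of invoke activities in a BPEL-like skeleton. The element set is abbreviated and illustrative, not a complete BPEL4WS document and not the paper's mapping algorithm.

```python
def chain_to_bpel(chain):
    """chain: list of (service_name, operation) in data-dependency order.
    Emits a minimal BPEL-like <sequence> of <invoke> activities."""
    body = "\n".join(
        f'    <invoke partnerLink="{svc}" operation="{op}"/>'
        for svc, op in chain
    )
    return (f'<process name="GeoChain">\n  <sequence>\n'
            f'{body}\n  </sequence>\n</process>')

# Hypothetical chain: fetch a coverage, then run a slope analysis on it
print(chain_to_bpel([("WCS", "GetCoverage"), ("SlopeWPS", "Execute")]))
```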
Spatial image browsing and retrieval in spatial information systems are typically based on a centralized server, whose limitations are obvious in a concurrent multi-user environment. A P2P (Peer-to-Peer) approach takes full advantage of the resources and bandwidth of clients distributed around the network in order to offload the centralized server and improve delivery performance. This paper explores three key issues in forming a spatial image P2P network: the P2P network topology, the organization and identification of spatial images, and a content-locating method based on a spatial grid index. We then design an effective and reliable spatial image delivery system assisted by P2P networks. We comparatively evaluated our design by requesting an image from a World Wind server and from our P2P network formed by World Wind clients, which shows a significant improvement in delivery speed.
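The content-locating step can be sketched as hashing a tile's grid cell to a responsible peer. The grid resolution and hash scheme below are illustrative assumptions standing in for the paper's spatial grid index.

```python
import hashlib

def locate_peer(lon, lat, level, peers):
    """Map an image tile to the peer responsible for its grid cell."""
    n = 1 << level                               # cells per axis at this level
    col = int((lon + 180.0) / 360.0 * n) % n
    row = int((90.0 - lat) / 180.0 * n) % n
    cell_id = f"{level}/{row}/{col}"             # spatial grid index key
    digest = hashlib.sha1(cell_id.encode()).hexdigest()
    return peers[int(digest, 16) % len(peers)], cell_id
```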
A quantitative measure of image information content is of great importance in many image processing applications, e.g., image compression and image registration. Many commonly used metrics are defined purely mathematically; however, since the ultimate consumers of images are in most situations human observers, measures that ignore the internal mechanisms of the human visual system (HVS) may not be appropriate. This paper proposes an improved definition of the mutual information between two images based on the visual information actually perceived by human beings in different subbands of the image. This definition is sensitive to the pixels' spatial locations and correlates better with human perception than mutual information calculated purely from pixel grayscale values. Experimental results on images with different noises and on JPEG- and JPEG2000-compressed images are also given.
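For reference, the classical baseline the paper improves on — mutual information computed from the joint grayscale histogram — looks as follows; the perceptual subband weighting proposed in the paper is not attempted here.

```python
import numpy as np

def mutual_information(img_a, img_b, bins=64):
    """Classical MI (in bits) from the joint histogram of two equal-size images."""
    joint, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    pxy = joint / joint.sum()                    # joint intensity distribution
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)    # marginals
    nz = pxy > 0                                 # avoid log(0) terms
    return float((pxy[nz] *
                  np.log2(pxy[nz] / (px[:, None] * py[None, :])[nz])).sum())
```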
KEYWORDS: Data modeling, Visualization, 3D modeling, Data storage, 3D visualizations, Internet, Visual analytics, Software development, Eye models, Analytical research
Research and applications in geographical information are trending towards a global view in the spatial domain and high detail in the frequency domain. With the rapid development of network communication and distributed computing technology in recent years, it has become technically possible to provide a global high-resolution terrain service over the network. Aiming to provide a highly detailed terrain service for seamless 3D visualization and analysis over the Internet, this paper discusses the characteristics and bottlenecks of high-performance terrain data services. An extensible multi-pyramid data model is proposed to store and manage spatial data on both the client and server sides. Based on this data model, an architectural design for an adaptive terrain service is proposed to enhance the Quality of Geospatial Information Service (QoGIS). Finally, building on these studies, a prototype software system is developed that provides a real-time three-dimensional global terrain visualization service over the network.
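A minimal sketch of the pyramid idea: select the level whose ground sample distance keeps the projected texel size within a screen-space error budget. All constants and the projection model are illustrative assumptions, not the paper's adaptive service logic.

```python
def choose_level(viewer_distance_m, base_gsd_m=156543.0,
                 screen_err_px=1.5, focal_px=1000.0, max_level=20):
    """Pick the coarsest pyramid level whose projected texel size stays
    below the allowed screen-space error (pinhole projection assumed)."""
    for level in range(max_level + 1):
        gsd = base_gsd_m / (1 << level)          # meters per texel at this level
        projected_px = gsd * focal_px / max(viewer_distance_m, 1e-6)
        if projected_px <= screen_err_px:
            return level
    return max_level
```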
Airborne laser scanning (ALS) is the most recently emerged technology for acquiring Digital Terrain Models (DTMs). To generate DTMs from ALS data, terrain points and off-terrain points must be classified. To this end, a sweep-line-based filtering algorithm is proposed in this paper. The algorithm works on raw range data obtained by the ALS system, without interpolation, and comprises three steps. First, the point cloud is divided into sweep lines. Second, terrain and off-terrain points are classified within each sweep line by choosing a suitable height-difference tolerance and filtering window size. Finally, the filtering results of the individual sweep lines are collected. Experiments were conducted on five datasets covering various terrain environments, and a quantitative assessment compared the results with manual classification. Both the qualitative and quantitative experiments show that the sweep-line-based filter can remove most off-terrain points effectively. Compared with other algorithms, sweep-line-based filtering reduces the two-dimensional problem to a one-dimensional one, requiring fewer computational resources.
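A simplified sketch of the per-sweep-line classification: a sliding window along the line keeps a point as terrain when its height difference to the window minimum is within tolerance. Window size and tolerance are illustrative; the paper's exact decision rule may differ.

```python
import numpy as np

def filter_sweep_line(heights, window=15, dh_tol=0.5):
    """Label points in one sweep line: True = terrain, False = off-terrain.

    A point is kept as terrain when it lies within dh_tol meters of the
    lowest point in its 1-D neighbourhood (the local ground estimate)."""
    n = len(heights)
    keep = np.zeros(n, dtype=bool)
    half = window // 2
    for i in range(n):
        lo, hi = max(0, i - half), min(n, i + half + 1)
        keep[i] = heights[i] - heights[lo:hi].min() <= dh_tol
    return keep
```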
This paper proposes delay-constrained, bandwidth-guaranteed QoS routing in TDMA/CDMA ad hoc networks. Because the wireless bandwidth is shared among adjacent nodes and the network topology changes as nodes move, quality of service is more difficult to guarantee in ad hoc networks than in most other types of networks. QoS routing requires finding not just any route from source to destination, but a route that satisfies the end-to-end QoS requirement, often given in terms of bandwidth or delay. We improve a distributed algorithm to control delays, and give a method to calculate the bandwidth of a path that can be used to judge whether the path's bandwidth satisfies the requirement. In TDMA/CDMA ad hoc networks, this is a new scheme that satisfies both the delay constraint and the bandwidth guarantee. Simulations demonstrate the difference in call success rate between our algorithm and existing global or localized algorithms that consider only the delay constraint or only the bandwidth guarantee.
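The path-bandwidth check can be sketched as greedy TDMA slot assignment under the constraint that adjacent links on a path must use disjoint slots. This is a deliberately conservative simplification of such calculations, not the paper's method.

```python
def supports_bandwidth(free_slots, k):
    """Check whether every link on the path can reserve k slots with
    adjacent links using disjoint slot sets (greedy, conservative)."""
    prev = set()
    for slots in free_slots:                 # free TDMA slots per link, in order
        usable = sorted(slots - prev)        # adjacent links must not collide
        if len(usable) < k:
            return False
        prev = set(usable[:k])               # reserve k slots on this link
    return True

def path_bandwidth(free_slots):
    """Largest k (slots per link) the path can support end to end."""
    if not free_slots:
        return 0
    k = 0
    while supports_bandwidth(free_slots, k + 1):
        k += 1
    return k
```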
This paper investigates QoS routing in TDMA/CDMA ad hoc networks. Since the network topology may change constantly and the available bandwidth is very limited in ad hoc networks, a call is often blocked because no single path with the required bandwidth can be found. We therefore try to find multiple paths whose aggregated bandwidth meets the bandwidth requirement and whose delays are within the required delay bound, and then use these paths in parallel for the QoS transmission of the call. The proposed QoS routing can significantly reduce the system blocking probability and thus make better use of network resources. We discuss the process of searching for multiple parallel paths and propose three heuristics (based on maximum bandwidth, shortest path, and maximum ratio of bandwidth to hop count) for choosing a group of paths whose total bandwidth satisfies the requirement. Simulations demonstrate the differences in blocking rate among the three heuristics and show that the proposed algorithms outperform an existing on-demand algorithm.
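The three selection heuristics can be sketched as different orderings over candidate paths, accumulated until the aggregate bandwidth meets the requirement. The path representation is a hypothetical placeholder; delay feasibility is assumed to be checked beforehand.

```python
def select_paths(paths, required_bw, heuristic="max_bandwidth"):
    """paths: list of dicts {"bw": int, "hops": int}, each already within
    the delay bound. Accumulate paths in heuristic order until their
    total bandwidth meets the requirement; None means the call is blocked."""
    keyfuns = {
        "max_bandwidth":  lambda p: -p["bw"],            # widest first
        "shortest_path":  lambda p: p["hops"],           # fewest hops first
        "max_bw_per_hop": lambda p: -p["bw"] / p["hops"],  # best ratio first
    }
    chosen, total = [], 0
    for p in sorted(paths, key=keyfuns[heuristic]):
        chosen.append(p)
        total += p["bw"]
        if total >= required_bw:
            return chosen
    return None
```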