The Vexcel UltraCam D large-format airborne digital camera showing the multiple lens cones that produce its panchromatic and multi-spectral frame images. (Source: Vexcel).
introduced a beta version of its Yahoo! Maps software, see http://maps.yahoo.com/beta/. However, up till now, Yahoo! has not utilized imagery in this particular product. Still, it would be no surprise if it decided to do so.
ing satellite images, see www.skylinesoft.com/. Yet another similar service, entitled EarthViewer, was offered from 2001 onwards by the Keyhole Corporation, see www.keyhole.com/. This last service allowed customers to start from a world globe and to zoom rapidly, smoothly and in a very dynamic way to the specific area of interest, often at a very high resolution - down to street level within cities. This service was aimed originally at business users who paid a quite substantial subscription (annual fee) of several hundred dollars for the service. Indeed the full "enterprise" version of the software for multiple users within a large organization cost $20,000. However, in 2003, Keyhole introduced a lightweight version of the software with somewhat reduced capabilities - called Keyhole LT - which was aimed at the consumer market and which retailed at some tens of dollars. Keyhole needed to populate the very large (multi-terabyte) image and map database required to support all its various products and services. To this end, it obtained aerial photography from AirPhotoUSA; satellite imagery from i-cubed and DigitalGlobe; and map and address data from Geographic Data Technology (GDT), now owned by Tele Atlas. At that time, the Keyhole service mainly covered the U.S.A. at high resolutions.
II - Google
In October 2004, Google announced that it had acquired the Keyhole Corporation. The two companies are both based in Mountain View, California, which meant that there was minimal disruption to the staff. Since then,
April/May 2006
Viewpoint
matters have moved rapidly, with the various Keyhole products being renamed and integrated closely with Google's search engine and with Google Maps and Google Local. From the users' standpoint, the most notable feature is that the map data can be superimposed directly over the image data of the terrain. In June 2005, Google launched its Google Earth product based on the Keyhole technology. In particular, the lightweight version of the product was offered free for personal use through a simple download from Google's Web site, without any need for the user to register. The previous enhanced, business and enterprise versions of the software - now called Google Earth Plus, Google Earth Pro and Google Earth Enterprise - still have to be paid for, though at a lower level than before. However, the availability of the free version of Google Earth triggered a massive explosion of public and media interest and much favourable publicity.

The Google Earth database has continued to be developed at a rapid rate. From the specific point of view of Western Europe, many more image data suppliers have been signed up and coverage is now much more extensive than before. However, the rest of the world outside North America and Western Europe is still very poorly served - at least in terms of high-resolution imagery.

An annotated perspective view of part of Chicago showing the skyscraper buildings in the downtown area of the city - as displayed on Google Earth. (Source: Google)

@Last Software - based in Colorado - was acquired by Google. @Last's main software product is SketchUp, which allows the simple construction and modelling of 3D objects, especially buildings, from scratch. It has gained a substantial customer base among architects, graphic artists and game developers. Moreover, two plug-in versions of SketchUp had also been developed: (i) for use with ESRI's ArcGIS package; and (ii) to develop 3D content for Google Earth. One presumes that Google decided to acquire @Last Software so that this valuable tool would not fall into the hands of a competitor, see www.sketchup.com/.

III - Microsoft
The Microsoft Corporation has been active in the field of supplying spaceborne and airborne imagery for some considerable time - very much longer than Google!

III (a) - TerraServer
This involvement began with its participation in the TerraServer project that started in 1997/98. Initially this was a collaboration between Aerial Images, Compaq and Microsoft. The Aerial Images company, through an agreement with the Sovinformsputnik company, supplied Russian SPIN-2 high-resolution space imagery as the initial baseline imagery to customers using TerraServer. Compaq supplied its high-powered Alpha servers to provide the considerable computing resources that were required for this on-line service. While, for its part in the partnership, Microsoft supplied a scaled-up version of its Windows NT software and its SQL relational database management system. Essentially, at that early stage, Microsoft viewed TerraServer mainly as a research project and test bed for the development of advanced database technology. Aerial Images no longer has any stake in the TerraServer venture. Furthermore, the main source of the imagery has also been changed. The baseline imagery is now the USGS aerial photography with a 1m GSD that covers almost all of the United States. Internationally, the baseline imagery became the 15m GSD Landsat imagery, mainly supplied by EarthSat (now MDA Federal), and the 1km NASA imagery as processed by the Globe Explorer company.
These systematic coverages are supplemented by (i) more scattered higher resolution aerial photographic image coverage, mainly of the United States, supplied by AirPhotoUSA, Sanborn and other commercial aerial photography providers and (ii) by satellite coverage supplied by DigitalGlobe. Topographic map coverage at scales ranging from 1:24,000 to 1:250,000 is also available from TerraServer, together with aeronautical charts at still smaller scales, see www.terraserver.com.
A perspective view of an area in which the buildings, roads and car parks have been constructed using the SketchUp software from @Last Software - an example from the SketchUp gallery. (Source: @Last Software/Google)
group to the company's MapPoint group. The latter group had for some time been selling its MapPoint map products for travel location, navigation and planning. On 30th June 2005, Microsoft's MSN Search rolled out a beta version of its Local Search service that allowed a MapPoint map or a TerraServer-USA aerial photo image to be displayed alongside the results of a search. On 25th July 2005, MSN released the beta version of its Virtual Earth product that fully integrated MSN Search, MapPoint's Maps & Directions and TerraServer-USA into a single application, see http://virtualearth.msn.com/. Like Google Earth, Virtual Earth is being offered free as part of MSN and it too has received a great deal of favourable comment. Virtual Earth is entirely Web-based and there is no need to download and install special software as there is with Google Earth. With this product, Microsoft entered into direct head-to-head competition with Google, which had released the beta version of its Google Earth product on 23rd June 2005.

A screen shot taken from Windows Live Local, powered by Microsoft's Virtual Earth. The lower part of the screen contains an annotated aerial image of Seattle; the upper part shows the three views provided by the corresponding "street side" images. (Source: Microsoft)
Another screen shot from Windows Live Local - in this case, showing the area of Fort Washington in New York City. This gives a "bird's eye view" of the area that utilizes the geo-referenced oblique photography supplied by Pictometry. (Source: Microsoft)
(i) Pictometry - Microsoft has signed an agreement with Pictometry that will allow its geo-referenced oblique aerial photography to be used in Virtual Earth. From a West European perspective, it will be interesting to see if this agreement will be extended to cover Pictometry's European licensees - the Blom group, including Simmons Aerofilms (U.K.), CGR (Italy), GeoTec (Germany), FM-Kartta (Finland), Seficart (Iberia), etc.

(ii) ORBIMAGE - Shortly after this agreement with Pictometry was signed, Microsoft signed another similar 5-year agreement with ORBIMAGE for it to supply its global space imagery acquired by its OrbView-2 (at low resolution) and OrbView-3 (at high resolution) satellites for use in Virtual Earth. Since then, in December 2005, ORBIMAGE has taken over Space Imaging and formed the new GeoEye Corporation. Again it will be interesting to see if this means that high-resolution IKONOS space imagery will also be available for use in Virtual Earth.

(iii) EarthData - Yet another potentially important supplier of aerial imagery was signed up by Microsoft in December 2005 in the shape of the EarthData Corporation, one of the largest U.S. aerial mapping companies - again on the basis of a 5-year agreement.

However, notwithstanding the recruitment of all of these major suppliers of imagery, it is important to realize that they can only supply a part of the content that is needed to populate Virtual Earth. Thus Microsoft still needs to find other major sources of aerial image and space image data if Virtual Earth is to become fully operational, not only in the U.S.A. (which is its first objective) but world-wide. A complication with aerial imagery is of course that the image data obtained by service providers does not always remain with them - in many cases, the original films and digital data are delivered up to the customers, who then own the data. This is often the case with government-owned mapping organizations.

Microsoft has included part of the Facet company's street-level coverage of Seattle and San Francisco in its beta version of Virtual Earth under the title Street Side Views. It is also worth noting that Tele Atlas is undertaking similar surveys in Western Europe. These surveys use the mobile mapping van system developed by the Polish GeoInvent company - which Tele Atlas has bought. However, up till now, there has been no news of Microsoft making use of this European imagery.

III (d) - GeoTango
Besides data, Microsoft required additional software for the display, manipulation and visualization of the image data held in Virtual Earth. To satisfy part of this requirement, on 23rd December 2005, it bought the GeoTango company based in Toronto, Canada. GeoTango had already developed various software packages for 2D and 3D content building and visualization, see www.geotango.com/. Its GlobeView software generates a Digital Earth that allows image data and location-based information from anywhere on the Internet to be streamed and displayed on the user's screen. Its SilverEye software is designed to ease the task of generating 3D models of high-value facilities and urban landscapes. SilverEye also allows the rapid collection and display of quantities such as distance, area, volume, slope or bearing to an acceptable standard of accuracy for many purposes using a single airborne or spaceborne image. Thus it does not require the provision of a stereo-pair of images for this particular task. A third development at GeoTango is its SmartDigitizer software that allows the semi-automatic feature extraction of lines and polygons to be carried out on remotely sensed imagery. In fact, SmartDigitizer has already been incorporated into PCI's Geomatica remote sensing image processing suite. Quite obviously, all three of GeoTango's software packages could be utilized within Virtual Earth and in the Windows Live Local service that is powered by Virtual Earth.

An aerial image with a set of measurements (distance, area, height) superimposed over the image - produced by GeoTango's SilverEye software. (Source: GeoTango)

IV - Vexcel
With regard to Microsoft's proposed purchase of Vexcel, one can see immediately that certain parts of Vexcel's activities are very attractive to Microsoft - but not all. An overview of these activities was given in my profile of the Vexcel Corporation that was published in the December 2004 issue of GeoInformatics. My analysis (and opinion) of the potential value of these activities to Microsoft will be conducted under three main headings - geospatial data, software and hardware.

IV (a) - 3D Urban Model Data
As discussed above, the acquisition of image, map and terrain data to act as additional content for Virtual Earth would appear to be a high priority for Microsoft. In which case, particular attention would have been paid by Microsoft to the high-resolution 3D building and terrain model data sets that have been generated from stereo-pairs of aerial photographs by Vexcel's Mapping Services Division for numerous cities in North America and elsewhere. Although these data sets were generated primarily for use in the planning of cellular phone networks by telecomms providers, they have also been made available to other users off-the-shelf under Vexcel's Global Landscape product line.

A perspective view of downtown Chicago based on the 3D terrain model and building height data produced from aerial photographs by Vexcel's Mapping Services Division. (Source: Vexcel)

This 3D building and terrain model data and the associated aerial photographic data from which it was produced should be invaluable additions to Microsoft's data portfolio to be used in Virtual Earth, see www.vexcel.com/services/mapping/.
mass production of orthophoto, map and terrain elevation data on the scale needed to populate Virtual Earth. To outside observers, Microsoft is much more likely to purchase or license the required data from the numerous existing photogrammetric service providers. Still the UltraMap software and its associated archiving facilities are there if needed for specific purposes or projects, see www.vexcel.com/products/photogram/diap/.
will be potential buyers of these successful product lines. Still, there are some other considerations to be kept in mind. As one might expect, the marketing manager of Vexcel, Jerry Skaw, sent out a letter by e-mail to all of Vexcel's customers concerning the proposed take-over of the company by Microsoft. It included the following statements: "... the acquisition is expected to bring with it resources and support that enhance our offerings and allow us to expand in ways that greatly benefit our current customer base ... We also plan to continue to develop, sell and support our current products - many of which will contribute directly to our new role. ..." The complete text of Skaw's letter is available at http://industry.slashgeo.org.

However, potentially much more significant in this particular context is the news that Professor Vincent Tao has been appointed Director of Virtual Earth. Prof. Tao holds the Canada Research Chair in Geomatics Engineering at York University in Toronto. He is a quite outstanding photogrammetrist and has been the main driving force behind the software developments at the GeoTango company that Microsoft purchased only three months ago. His appointment to the position at Microsoft appears to have been made very shortly after its purchase of GeoTango. This puts a quite different complexion on the whole take-over saga - both regarding the proposed acquisition of Vexcel in the first place and the potential to fully exploit all the different elements of its hardware and software portfolio outlined above. I look forward with great interest to Microsoft's future developments in this area with Prof. Tao as Director of Virtual Earth.
The following links may be useful with regard to Microsoft's Virtual Earth:
Windows Live Local (powered by Virtual Earth) - http://local.live.com/
Windows Live Local Community - http://local.live.com/community/default.aspx
Windows Live Local Technology Preview (Street Side Views) - http://preview.local.live.com/
Windows Live Local / Virtual Earth Blog - http://spaces.msn.com/VirtualEarth/
MSN Virtual Earth Groups Message Board - http://groups.msn.com/MSNVirtualEarth
Gordon Petrie (g.petrie@ges.gla.ac.uk) is Emeritus Professor in the Dept. of Geographical & Earth Sciences of the University of Glasgow, Scotland, U.K.
A satellite ground receiving station that has been supplied by the Vexcel Corporation. (Source: Vexcel)
Article
ALK Technologies
ALK Technologies, the builder of FleetCenter, was founded in 1979 with headquarters in Princeton, New Jersey. The company develops solutions for corporate and consumer customers globally. ALK's CoPilot Live mobile GPS navigation solutions are available in Europe, North America and Australia as retail-branded products and as the basis for OEM navigation
The Award
The Navteq Global LBS Challenge was created to drive growth in the LBS industry by bringing together the key players in the LBS-wireless value chain. The Global LBS Challenge awarded a grand-prize winner and category winners. The grand prize went to ALK Technologies, which developed CoPilot Live FleetCenter, a fleet-tracking, messaging, reporting and optimization application that integrates with the CoPilot Live mobile navigation solution to enable real-time asset visibility.
FleetCenter integrates with CoPilot Live to provide integrated satellite navigation, tracking and fleet control using connected Windows Mobile-based devices.
horizontal services for which location will be the enabler and not necessarily the vertical service itself, like navigation. Fondrevay prefers to talk about location-enabled services instead of LBS: Navigation will continue to be a part of the service offer, however, because even as you are becoming location-aware of the things, places and even people around you, you will always want to have at least the option to find your way "there". This is fundamentally why we believe the term LBS is a misnomer. The more descriptive term is location-enabled services, because there are very few services for which integration of location does not create incremental value, but also very few services whose core value proposition is location. The term LBS is a bit narrow in scope and potentially minimizes the role location plays; location-enabled services better reflects the future role location devices can play in people's lives.
Navteq Strategy
Navteq wants to continue to push the market to embrace location as a core, lifestyle-enhancing enabler. Their nominal role in the
Interview
GP - Now that Leica Geosystems has been taken over by Hexagon AB, and given Hexagon's existing strengths in the metrology field, one might expect new product development to go towards very short-distance laser scanners that could be used indoors for metrology applications - for example, reverse engineering, prototyping and the measurement and recording of small objects. Will entry into this area be a long-term aim for Leica Geosystems and a strategic objective of Hexagon AB?
KM - One of the strategic objectives at Hexagon is to remain the leader both in the micro and macro measurement space. As the existing leader, we will continue to monitor the needs of our customers in all our markets and develop the required solutions. I think the trend will continue and even accelerate to move towards laser scanning as a mainstream tool in both these micro and macro markets. The benefits of accurate,
high definition as-built (as-is) information are not only valuable to an engineer designing a bridge retrofit, but also to an engineer who is reverse engineering an automobile part. Laser scanning in all markets for Hexagon, including those within Leica Geosystems, will continue to be a long term aim.
GP - The former HDS Division of Leica Geosystems concentrated its efforts on the manufacture, sales and support of laser scanners that operated over medium distances between 1m and 25m (with the HDS4500 instrument) and up to 300m (with the HDS2500 and HDS3000 instruments). Does the newly formed Geosystems Division have an interest in extending its product range into the area of still longer range scanners - measuring distances up to 1,500m, like the ILRIS-3D instrument produced by Optech?
KM - There are really two aspects to this question. The first aspect is a historical one. Our founder was actually a civil/structural engineer who ran a large engineering and construction management company. In that business, he experienced at first-hand the need for much better as-built (as-is) information, especially for plants and related structures, than that which was typically available using traditional as-built methods. He recognized that better as-built information allowed better retrofit design, which could significantly reduce construction costs and risks for retrofit projects. He also saw that this was a large, industry-wide problem. So this led to the development of high-accuracy scanning systems with a maximum range of about 300m. This turned out to be a sweet spot in the market in terms of the wide variety of applications in which users could benefit from the technology, and it has been one of the main reasons behind Leica Geosystems' commercial success thus far.
The second aspect is a looking-forward perspective. If you look at the fundamental characteristics of terrestrial scanning systems for much longer ranges, you will find a few inherent shortcomings that limit the types of applications for which the technology is really well-suited. One shortcoming is accuracy. At long range, the magnitude of the distance and angle errors combines with a much larger spot size and wider point-to-point spacing, such that high accuracy is just not achievable at very long ranges. Therefore, you see such systems being relegated to low-accuracy applications such as mining and terrain models. These are useful applications of the technology, but the cost of this technology today may be too high. If we can bring the right value to customers, we will certainly do it. The fact that all laser scanners are line-of-sight instruments imposes another practical constraint as you go out further. There are often obstructions between the scanner and the surface of interest, and things tend to get worse as you get further from the scanner. Even if a scanner could capture data at 500m range, you may not have a clean line-of-sight to the surface of interest. So, in practice, users have to move a scanner around a site, only getting 50m, 100m or perhaps 200m practical line-of-sight range. Yet another problem at longer ranges is the scanner's angle of incidence to horizontal surfaces. If the angle of incidence is too low, you cannot get a return, and if you do, it is just not very accurate. The bottom line is that when you look at longer range, wide-area applications, many users would begin to consider a combined solution of a Leica aerial scanner, such as the ALS50, and a terrestrial scanner such as the Leica HDS3000. This solution provides customers with the ability to map large areas and to augment points of interest with higher accuracy points.
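To put rough numbers on why angular error degrades accuracy at range (a back-of-envelope illustration using a hypothetical angular error of 60 microradians, not any manufacturer's specification), the lateral position error contributed by an angular error grows in direct proportion to the range, as does the laser spot diameter:

```latex
e \approx R\,\sigma_\theta ,\qquad
e(50\,\mathrm{m}) \approx 50 \times 60\times10^{-6}\,\mathrm{m} = 3\,\mathrm{mm},\qquad
e(500\,\mathrm{m}) \approx 500 \times 60\times10^{-6}\,\mathrm{m} = 30\,\mathrm{mm}
```

So, from the angular contribution alone, a scanner delivering a few millimetres at typical working distances would be an order of magnitude worse at very long range, even before the larger spot size and wider point-to-point spacing are taken into account.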
GP - Before the take-over by Hexagon, Leica Geosystems had already moved its production of the ALS50 airborne laser scanner from Massachusetts to its main manufacturing plant in Heerbrugg, Switzerland. Now there is "talk" within the surveying industry that the production of the HDS3000 ground-based laser scanner will be transferred from the factory in California to Heerbrugg. Would you please comment on this "talk"? I am sure that the answer will be of much interest to your existing customers!

KM - Actually, I don't think most customers care too much about where products are manufactured, as long as they are produced with excellent quality and at fair cost. I think customers care more about local service and support, especially with a technology like laser scanning, in which support can be so critical to their success. So let me address what this move means from both of these perspectives. First of all, the talk you were hearing is correct and I am very excited about this move. With this move, we will be able to fully leverage Leica Geosystems' high-end, state-of-the-art manufacturing facility in Switzerland. The Switzerland team is exceptional at quality, cost-efficiency, and long-term product serviceability. This is something that can be a challenge for a smaller manufacturing facility and especially one that is located in the San Francisco Bay area, which has very high manufacturing labor costs. Although we are moving the main manufacturing line to Switzerland, we are keeping a state-of-the-art testing, calibration, and repair facility in our San Ramon offices. So, customers in the Americas will still be able to get prompt service and support. Our European customers now also have access to a state-of-the-art European-based service, calibration, and repair facility. Asian customers will have access to both of these facilities. A final benefit of the manufacturing move for our customers is that, without the distractions of periodically supporting manufacturing issues, we can better focus our San Ramon talents on product development (R&D), marketing, service and support. This will lead to better and faster innovations for our customers. In summary, the move allows the California team to focus on product innovation and service/support and the Swiss team to do what our customers have come to rely on - produce high quality Leica Geosystems products and provide first class service and support.

GP - In aerial photogrammetry, there is much interest in data fusion, especially combining airborne laser scanned data with the image data produced by digital frame cameras and pushbroom scanners. Is there a similar strong interest in data fusion among land and engineering surveyors? If so, do you foresee further developments on the instrumentation side with calibrated photogrammetric imagers such as digital panoramic cameras being integrated into ground-based laser scanners? Presumably this would require very close co-operation between your own unit and the photogrammetric side of your company's Geospatial Imaging Division, both regarding software as well as hardware.

KM - There are currently numerous examples of data fusion in the market. One example - combining digital photographs and clouds of points - has already become common practice in laser scanning. This is done either using cameras that are integrated directly into the scanner itself or using images from external digital cameras that are not attached to the scanner. There has been a lot of activity in the industry over the last year in this area, including our own Cyclone 5.4 release.

There are many different sensors to capture imagery or spatial information, such as total stations, GPS, digital aerial cameras, and terrestrial laser scanners. The key is how to combine this information efficiently to provide intelligent data to users. Today, existing customers routinely use their total stations and GPS instruments to complement the data captured by laser scanning. I think the real key in the future is not with the sensors but with the software that can intelligently reference, measure, analyze and present this fused data. Software applications like Cyclone should not care about the type of sensor used to capture the information. The challenge from a vendor standpoint will be to continue to keep pace with the sources of new spatial and image information and make it easy for users to take full advantage of this information in their office software and in their networked and web environments.
As part of Hexagon, we are fortunate to add yet another dimension to the aspects of data fusion. As you know, Hexagon is expert in micro measurement technology. There are also data fusion needs and opportunities between the micro and macro worlds. In Hexagon, sharing R&D and other competencies is a key company value. We work closely within Hexagon to share development for both hardware and software efforts to bring more complete solutions to customers. Our customers can be assured of this.

GP - Turning next to your market leadership in ground-based laser scanners, what do you feel are the main factors that have produced this success? And where does your main competition come from - does it vary (i) from one regional market (North America, Europe, Asia, Australia) to another; (ii) within different industries (surveying and non-surveying applications); and (iii) according to the level of knowledge and sophistication of your existing users and your potential customers?

KM - Well, it is no accident that we have such a strong leadership position in the industry. There are really several key factors. (i) One is having been an industry pioneer - we were able to get off to a fast start right out of the gate. Together with early adopters, we created the growth of this market. Today scanning is already accepted as the preferred solution in a number of market segments. (ii) A second factor is that I think we have made some sound strategic decisions. We offer state-of-the-art software and hardware. The micro-chip laser offers the cleanest dataset and that helped to improve accuracy. We have the exclusive right to the two patents on the use of micro-chip lasers. The power of the Leica Cyclone software is well known. Customers expect our commitment to develop these technologies further. They trust Leica Geosystems to have the stamina and the resources to further advance this technology for their benefit. I think the market has highly valued this in Leica Geosystems. (iii) I think a third factor behind Leica Geosystems' success is that we have hit the mark in terms of the breadth and the capabilities of our hardware and software products. Laser scanners and associated software are still fairly costly to acquire, so customers really want them to be as versatile as possible in terms of the number of applications that they effectively address. The capabilities and actual performance of both our hardware and software have met this need and this has been a big plus. (iv) If you ask around, you'll find that customer support is the fourth factor. Especially with a new technology, support can be just as critical as the products themselves as far as helping to make users successful. And, in the end, this has really been the biggest factor of all: the success of so many of our customers. It is their success that fundamentally drives adoption of the technology and has continued to bring the majority of new customers to Leica Geosystems.

The main competitor in the laser scanning industry is not another manufacturer but the way customers are currently doing their as-built activities. This barrier does vary from region to region. In areas of generally higher labor costs, such as much of Europe, North America and certain parts of Asia/Pacific, laser scanning's productivity advantages are more appealing. In regions with low labor costs, we focus on industries like oil & gas, where the value in laser scanning is not so much the productivity to capture the as-built info but more in the downstream benefit of lower construction costs, lower construction risks and shorter down-times of the facility. Reducing the period of time an oil rig is down for a revamp project is just as valuable in Malaysia as it is in the North Sea or the Gulf of Mexico. From a geographic standpoint, things are somewhat more competitive in Europe than other areas, as this is where most of the other hardware vendors happen to be headquartered. These vendors tend to focus their marketing and sales efforts close to home and customers are a bit less worried about support if the vendor's headquarters are next door. In terms of vertical markets, I think we have a strong leadership position in each one except for the mining market, although we have some good successes here as well. Open pit mining is one market where the very long range systems have a good fit, as accuracy requirements are low, line-of-sight may not be a problem, and, if you are scanning vertical walls in an open pit, you don't have an angle-of-incidence problem.

The last portion of your question - about the level of sophistication of prospective customers - is an interesting one. We are fortunate in that we do well across the board, but I think that we probably tend to get an even higher share of the more sophisticated users who have done their homework. For example, there are as yet no standards for specifying laser scanners. Some vendors are much more aggressive than others with their specs and what they include or exclude from their specs. Leica Geosystems has tended to be very conservative with specs, often choosing to let the actual performance of our products significantly exceed our specs. So, newcomers are well advised to evaluate vendors' specs carefully. More sophisticated prospects will do the extra homework to make sure that they get what they want. They will talk with other users and actually test products before buying. They also tend to be more successful when they do buy, which leads to more referral-based business for their vendors. Because of this, we put a lot of effort into educating the market about the technology to help customers make well-informed decisions.

Gordon Petrie (gpetrie@ges.gla.ac.uk) is Emeritus Professor in the Dept. of Geographical & Earth Sciences of the University of Glasgow, Scotland, U.K. Visit www.leica-geosystems.com for more information about the company and its products.

An image of the interior of the central dome of the famous Hagia Sophia church in Istanbul, Turkey, that has been generated by a Leica Geosystems HDS laser scanner.
April/May 2006
17
Parallel
Many speakers drew a parallel between traditional infrastructure and SDI, and the setting couldn't have been more appropriate. The Emirates of Abu Dhabi and Dubai provide stark visual examples of the demands being placed on infrastructure in the face of the massive growth occurring in the region. The Geospatial Information & Technology Association (GITA)'s Executive Director Bob Samborski spoke on the organisation's research into return on investment for geospatial technology implementations. Several of his observations were germane to the discussion on Spatial Data Infrastructure. Particularly relevant was the assertion that sound financial analysis is fundamental to any investment in geospatial technology, and that those investments must support business objectives.
SDI Necessary?
While it is difficult to quantify the value of something as large and overarching as a Spatial Data Infrastructure, all presenters had compelling cases for why an SDI is required. According to Major General M Gopal Rao, Surveyor General of India, an SDI will save time, effort and money in finding data and will avoid unnecessary duplication. Host and keynote speaker Brigadier General Khalifa Al Romathi pointed to better outcomes through improved economic, social and environmental decision making, while others cited standardisation and support for the development of the geospatial industry itself. The need for government leadership appeared to be the consensus opinion. The role of government in the establishment of a Spatial Data Infrastructure does not, however, preclude or diminish the focus on the private sector. Both the general public and industry are expected to benefit when doing business within the framework of a fully realised SDI. Indeed, Canada's vision of Spatial Data Infrastructure specifically focuses on enabling value-added commercial activities.
Economic Benefits
But will private industry wait? Virtually all of the six nations represented in Abu Dhabi highlighted economic benefits as a primary objective of their SDI. However, they also spoke of efforts that have taken decades, and some that are expected to take just as long. In the world of information technology, this is a long time to spend defining an ontology. More to the point, Ravi Gupta, Editor-in-
Urgency
Growth of the geospatial technology industry is prompting the organisations in attendance in Abu Dhabi to move quickly, and the sense of urgency only builds as volumes of legacy data grow. The acquisition of geospatial data is accelerating, and any major re-definition of a spatial framework will have a major impact. For example, Dr. Taib's paper articulated the challenges of implementing the GDM2000 datum and the difficulty in applying such changes to the catalogues of data built upon what is now an obsolete datum. This is an issue particularly for early adopters of GIS, those who have built the largest catalogues of data on early frameworks. Global economic growth will place a premium on being able to effectively deploy and manage traditional infrastructure. This was put in the spotlight recently with the controversy surrounding the transfer of U.S. port management contracts to Dubai World Ports. The financial community has also acknowledged this trend, as evidenced by the creation of new Infrastructure Funds. In the same way, exponential growth of geospatial data and applications will challenge our ability to move spatial data seamlessly across markets and jurisdictions. How well we fare may depend upon how the global geospatial community delivers on the promise of an SDI.
Daniel Shannon (daniel.shannon@telus.com) is
Article
GNSS Update
Launch of New GNSS Receivers and Chipsets
At the moment GNSS product manufacturers are busy developing new products. Chipsets are becoming available that are capable of receiving signals from all three GNSS systems. Furthermore, the production of chipsets for the new GPS frequencies is coming up to speed. But the development of the GNSS systems themselves has not stood still either: GPS, for example, has become ten to fifteen percent more precise over the last few months. By Huibert-Jan Lekkerkerk
ESA's Director General, Mr Jean-Jacques Dordain, delivering his address at the contract signing ceremony. (source: www.esa.int).
Galileo
A contract between the European Union and Galileo Industries GmbH for the development of the first four Galileo satellites was signed in Berlin on the 19th of January this year. For the European Union, Giuseppe Viriglio, director of EU and industrial affairs at ESA, signed the contract; for Galileo Industries, CEO Gunter Stamerjohanns signed. This is an important step towards the development of an operational Galileo system. Furthermore, the European Union and Korea signed a contract for mutual cooperation in the development of Galileo, after six months of negotiation. Korea is not the first Asian country to participate in the development of Galileo, since earlier contracts were signed with both China and India. GIOVE-A, which was launched in December 2005, is fully operational and extensive tests are taking place. Ground stations in the Netherlands, Belgium and Great Britain are tracking the satellite and the broadcast signals. The great radio telescope at Chilbolton, Great Britain, is for example used to track the signals from GIOVE-A in order to gain insight into the radio environment in the satellite orbit. Furthermore, tests are performed to check whether the Galileo satellite signals interfere with other radio signals. The first experimental receivers, made by Septentrio, Belgium, are being tested with the use of GIOVE-A as well. This is an important aspect of the development of Galileo since it provides insight into the practical use of Galileo. The sister satellite of GIOVE-A, GIOVE-
Signing of the contract for the first four Galileo satellites (source: www.esa.int).
Egnos
In December 2005 the use of Egnos for controlling rail traffic was tested in South Africa. In this test only GPS, Egnos and train-bound sensors were used, eliminating the need for expensive railroad-based sensors. This was also a test of calculating and broadcasting Egnos signals over Africa; it is expected that Egnos will be expanded towards the African continent in the near future. In March this year further tests were performed in Portugal by Alcatel Alenia Space, directed at localizing GSM telephones for better response to 112 emergency calls. It is expected that in the years to come over half of all mobile telephones will use the technology tested. This technology, which uses a combination of GPS, Egnos and GSM positioning, makes exact telephone location possible, both indoors and outdoors.
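The hybrid GPS/Egnos/GSM approach described above amounts to fusing position estimates of very different quality. As a rough illustration only (the article does not describe the algorithm actually used in the trial), here is a Python sketch of the common inverse-variance weighting technique for blending two independent fixes:

```python
# Illustrative sketch: blend two independent (x, y) position estimates,
# e.g. a precise GPS/Egnos fix and a coarse GSM cell-based estimate,
# by weighting each with the inverse of its error variance. This is a
# generic fusion technique, not the method used in the Portugal test.

def fuse_positions(p1, var1, p2, var2):
    """Combine two (x, y) estimates; the smaller variance gets the larger weight."""
    w1 = 1.0 / var1
    w2 = 1.0 / var2
    x = (w1 * p1[0] + w2 * p2[0]) / (w1 + w2)
    y = (w1 * p1[1] + w2 * p2[1]) / (w1 + w2)
    fused_var = 1.0 / (w1 + w2)  # the combined estimate is tighter than either input
    return (x, y), fused_var

# A precise GPS/Egnos fix (variance 4 m^2) dominates a rough GSM cell
# estimate (variance 2500 m^2): the fused result stays near the GPS fix.
fix, var = fuse_positions((100.0, 200.0), 4.0, (160.0, 260.0), 2500.0)
```

With equal variances the result is simply the midpoint; as one estimate becomes much more precise, the fused position converges to it.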
B, is currently being put together at Alcatel Alenia Space in Italy. When complete, the satellite will be transported to ESTEC in the Netherlands for testing in the laboratories under simulated space conditions. The launch of GIOVE-B is planned for the autumn of 2006.
GPS
The GPS satellite tracking system used by the American air force was recently updated. As a result twice as much orbital information is collected, resulting in a ten to fifteen percent improvement in the precision of the GPS system. The first Block IIF satellite, which is being built by Boeing, has successfully undergone its first radio tests. In January Boeing received an order for three additional Block IIF satellites; including options this amounts to a total of nine satellites commissioned from Boeing. Boeing has a rich history of building GPS satellites, having also built the Block IIA satellites. Of these Block IIA satellites, two have been in operational service for more than 15 years, twice their design life.
GLONASS
After the successful launch of three satellites in December 2005, Russian president Putin has decided to become personally involved in the GLONASS program. As a result four additional satellites will be built in 2006, resulting in three satellites to be launched in 2006, followed by seven in 2007. Building additional satellites is no luxury, however. In the last article we already mentioned the large number of old(er) GLONASS satellites. Over the last few months, three have stopped functioning, bringing the number of active satellites down to 11. Of the satellites launched in December, two have not been activated yet, so the number of active satellites can go up to 13 if no other satellite breaks down.
First radio spectrum received from GIOVE-A (source: www.esa.int).
GNSS Receivers
It is good to see that the GNSS systems are in constant development, but without receivers there is little use for these improvements. However, the manufacturers of GNSS receivers and chipsets seem to realize this as well, and at the moment we see one new system after the other. The GNSS market is not only preparing for Galileo; even GLONASS seems back in grace after a number of years with virtually no GLONASS receivers (or satellites) available. Some highlights:
- Trimble recently introduced their new combined GLONASS and GPS receiver (R8 GNSS). Apart from being able to receive GLONASS signals, this receiver can also handle the new L2C and L5 frequencies introduced into GPS. There is no sign of Galileo-compatible receivers at Trimble at the moment;
- Leica launched a new series of receivers and reference stations supporting full GNSS capability, including the new GPS frequencies, GLONASS and Galileo;
- Topcon, a company which has always been a full GNSS supplier, brings the G3 technology, thereby choosing the same approach as Leica;
- Novatel chooses a different approach, with the GPS+ technology for GPS and GLONASS L1/L2 on the one hand and the Galileo/GPS technology on the other. The latter is capable of receiving both L1/L5 and E5 frequencies;
- Javad, who until mid-2005 had an exclusive agreement with Topcon for the development of land survey GPS receivers, is currently producing combined GPS and GLONASS products only. They have however recently announced the first products based on the new GeNiuSS chipset, which is capable of Galileo reception as well.
Huibert-Jan Lekkerkerk (hlekkerkerk@geoinformatics.com) is a freelance writer and trainer in the field of positioning and hydrography.
Chilbolton Observatory in England where GIOVE-A tests are being performed. (source: www.esa.int)
Special
Autodesk Topobase
Since the acquisition of C-Plan, Autodesk has been working hard on making Autodesk Topobase applicable to the vertical market. The main aim is to offer not only building blocks but implementation as well. Ostyn explains that this software is the final step in the Autodesk Geospatial growth path. According to Ostyn, specific solutions for utilities will be released later this year. "Raster data are getting more and more popular," said Ostyn. Either paper drawings and maps are digitized, or satellite images and aerial photos are brought in. This is why Autodesk Raster Design is quite popular. Version 2007 offers interoperability with Autodesk Map 3D 2007 and Autodesk Civil 3D 2007's DEM support. The software is also compatible with AutoCAD Electrical and supports ESRI GRID files. There is a new JPEG2000 library, and Autodesk Raster Design 2007 adds support for DTED-format elevation data from the National Imagery and Mapping Agency (NIMA).
The AJAX Viewer delivers raster based maps to almost any browser, including Safari. This viewing option ensures that any user on any platform can access designs and maps without requiring a specific browser.
CAD Talk
Let's start with an overview of the 2007 products. Autodesk has retired the AutoCAD 2002-based family, and it will not take long before exactly the same thing happens to the AutoCAD 2004-based family of products. During the plenary session on March 2nd in Hotel New York, Rotterdam, Lazeroms first introduced Autodesk Inventor 11 as one of the new next-generation products. This version is specifically meant for AutoCAD users wanting to move to 3D. True DWG compatibility, a complete concept-to-production process via fully integrated 2D/3D design solutions, and a dedication to functional design are just a few phrases applicable to Autodesk Inventor 11. Another newborn family member is AutoCAD 2007 (www.autodesk.com/autocad): a version
based on an intuitive way of working and new visual styles/rendering tools to present concepts to non-technical audiences. Conceptual design and accessibility for both experienced users and beginners were key when developing this new version. Where 2D drafting is concerned, AutoCAD LT 2007 (www.autodesk.com/autocadlt) can be used: software that features Dynamic Block Authoring and integrated Layer management tools.
Two Sessions
So far the most relevant part for CAD users. The afternoon of the press meeting, with the subtitle "Accelerate Your Ideas", was split up into two sessions: Mechanical Solutions Division (MSD) and Infrastructure Solutions Division (ISD). The last one was led by
Open Source
Open source was another hot topic this afternoon. However, Autodesk felt the need to elaborate on this one month later in Hotel Chez Gerard, London. At least this helped to clear up obscurities around new or replaced product names and the name change from the MapServer Foundation to the Open Source Geospatial Foundation (www.OSGeo.org). Officially the move towards open source started in November last year, when Autodesk released the code for MapServer Enterprise as open source software. Three months later, to be more precise on February 4th, the Open Source Geospatial Foundation (OSGeo), at that time still called the MapServer Foundation, held its first meeting in Chicago. A Board of Directors was formed which represented organisations like Mapbender (Germany), GeoServer/GeoTools (The Open Planning Project, USA), and MapGuide (Autodesk, USA). On March 6th the open source geospatial community officially announced the formation of the Open Source Geospatial Foundation. As the official press release states, the mission of this not-for-profit organization is to "support (financially, organizationally and legally) and promote the collaborative development of open geospatial technologies and data." It will also serve as an independent legal entity to which community members can contribute code, funding and other resources, secure in the knowledge that their contributions will be maintained for public benefit.
MapGuide Open Source
Roughly six months after the release of the code for MapServer Enterprise we now have MapGuide Open Source: free web mapping software composed of a Linux/Windows server, web extensions (for application development), Studio (for map authoring), viewers (both raster and vector) and a site administrator. The product is licensed under the GNU Lesser General Public License (LGPL). This enables users to develop and distribute spatial and design data over the web and can reduce the total cost of ownership for a web mapping solution. The software provides the option to auto-install and configure the Apache HTTP server, the PHP scripting language, and Tomcat, the Apache servlet engine. It works with the latest PHP, .NET, and Java tools. Furthermore, the user will find Feature Data Objects (FDO) providers for SDF and SHP in MapGuide Open Source. FDO providers for ODBC, ArcSDE, MySQL, GDAL Raster, WMS, and WFS will become available mid-2006. All this is or will be open sourced. "In contrast to AutoCAD or AutoCAD-related products!", stressed Van der Pol at the meeting on April 5. MapGuide Open Source 1.0 (a preview version, as well as documentation and an FAQ) can be downloaded via https://mapguide.osgeo.org, and is in fact called an Open Source Geospatial Foundation project.
Autodesk MapGuide Enterprise
Then there is the commercial version of MapGuide Open Source: Autodesk MapGuide Enterprise. At the moment a beta version is being tested, but the actual product will be put on the market mid-2006. Van der Pol: "Autodesk MapGuide Enterprise will be available in several languages, localized by Autodesk. Whether the open source version will be available in other languages depends on the open source community." The product will contain everything available in the Open Source version, plus additional FDO functionality (Oracle, SQL Server) and commercial-grade projection support from Mentor. In contrast to MapGuide Open Source, this version will be thoroughly tested and certified. Currently available as a preview (via www.autodesk.com/mapguidestudio) is Autodesk MapGuide Studio, a commercial authoring tool which can be used both with MapGuide Open Source and Autodesk MapGuide Enterprise. Besides a developer-friendly authoring environment (modelled on popular web development tools) the product offers streamlined authoring. Users can define rules for importing and converting data. Other features are thematic mapping definition, previewing the layout and stylization, and managing data access. Users can also integrate business logic written in PHP, ASP.NET or Java directly into the application and preview it within Studio.
AJAX Viewer
And there is even more news: besides the DWF viewer an AJAX viewer is also available. Van der Pol told the select group of journalists present in London on April 5 that this is also a free viewer that offers the same functionality as the DWF viewer (dynamic pan/zoom, scale-dependent detail etc.). The difference is that the AJAX viewer is raster-based, and panning and zooming happen a lot more smoothly. Another advantage of the AJAX viewer is that it works with both Internet Explorer and Mozilla Firefox. Development is simple: all you have to do is write your application logic within your server environment and it works with either viewer on any client.
The DWF Viewer uses an ActiveX control to display vector-based maps on Microsoft Windows systems running Internet Explorer or Firefox for viewing of maps, designs, and related data. Use of DWF technology provides printing and plotting, as well as support for a disconnected mode for taking spatial data into the field.
There is one more new product to cover before lunch: the commercial CAD/GIS tool Autodesk Map 3D 2007, available since the second week of April and equipped with functionality to publish data and stylization to the MapGuide Server. This product is built on AutoCAD 2007 and developed for users wanting to integrate CAD and GIS data throughout an organization. New in this version is the ability to directly access spatial data like SDF, ESRI Shape, Oracle Spatial, ESRI ArcSDE, SQL Server, MySQL, OGC WMS, OGC WFS, DTED, ESRI GRID and GeoTIFF.
Director ISD Northern Europe Frank Ostyn: "Instead of being a user the customer is turning into a developer. This is why open source is becoming increasingly important."
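To give a flavour of the "write your application logic on the server, let either viewer render the result" model described above, here is a minimal Python sketch that assembles a map-image request URL. The endpoint path and every parameter name are hypothetical illustrations, not MapGuide's documented interface (and MapGuide application logic would normally be written in PHP, ASP.NET or Java):

```python
# Hypothetical sketch: build the kind of GET request a thin browser
# viewer (DWF or AJAX) might send to a server-side web mapping agent.
# The path "/mapagent" and parameter names below are invented for
# illustration; consult the MapGuide Open Source documentation at
# https://mapguide.osgeo.org for the real interface.
from urllib.parse import urlencode

def build_map_request(base_url, map_name, bbox, width, height, fmt="PNG"):
    """Return a GET URL asking the (hypothetical) map agent for a raster image."""
    params = {
        "OPERATION": "GETMAPIMAGE",  # assumed operation name
        "MAPNAME": map_name,
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": fmt,
    }
    return base_url + "?" + urlencode(params)

url = build_map_request("http://example.com/mapagent", "Parcels",
                        (4.3, 51.9, 4.5, 52.0), 640, 480)
```

Because the viewer only needs a URL back, the same server-side logic serves the vector (DWF) and raster (AJAX) clients alike, which is the point Van der Pol makes above.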
It is also possible to import Civil 3D design data, and new vector, raster and 3D surface engines are provided as well. According to Van der Pol it will offer enhanced stylization and advanced labelling: just tell the software, and labelling around the corner will be done automatically, for example.
Autodesk MapGuide Studio can be used to produce attractive thematic maps and provide spatial analysis and reporting functions: here, creating buffer zones around selected parcels.
Autodesk MapGuide Studio puts data and resources close at hand and is meant to make it easier to organize and manage maps and geospatial data. The ability to preview maps provides immediate feedback when authoring and streamlines application development.
Why Open Source?
Putting 60 man-years of development into the open is something we can at least call quite remarkable. So what made Autodesk decide to take this major step? During the press meeting on March 2nd in Rotterdam, Ostyn talked about a shift from companies selling a product to buying a model of a product and fine-tuning it in-house: instead of being a user, the customer is turning into a developer. This is why open source is becoming increasingly important: companies want to exchange and use code from other companies. On April 5th in London, Ostyn's colleague Van der Pol highlighted aspects like following a trend set by companies like Sun, IBM and Red Hat (though these organisations are not exactly similar to Autodesk), improving the visibility of Autodesk in the market, and the ability to have quicker software releases (twice instead of once a year). Autodesk already started building the new MapGuide architecture two years ago. The discussion and decision to go open source was made halfway through 2005. Van der Pol admits it is still a big experiment. "What we do know is that, for example, in France and Germany there is a need for open source software. We also notice that there were 1,368 Windows Open Source downloads and 1,984 total Source Code downloads between Linux and Windows within 18 days from www.OSGeo.org last month. New developers are approaching Autodesk and are starting to build new applications on our products. We also believe that open source can be beneficial to education and (local) government by offering less expensive alternatives. However, only the future can tell whether we made the right move."
Sonja de Bruijn (sdebruijn@geoinformatics.com) is editorial manager of GeoInformatics.
Column
The phone was a big success, and the marketing people were having a good laugh. As this issue of GeoInformatics shows, the mapping industry has fully embraced the web and mobile markets. There are endless opportunities to improve productivity or help us get lost more efficiently. But what about the good old paper map? Can it survive the age of digital enlightenment? So, in the spirit of that Siemens advert, let's have a look. But before we start, please note that this column is entirely satirical and untrue. Honestly. With a paper map, what you see is what you get, in full A0-size colour. With a mobile device what you see is a map the size of a peephole, pixelated like the face of a criminal on TV. Yes, there is a zoom button, but the scale ratio resembles my phone number divided by the age of my neighbour's cat. And since you need to carry the Hubble telescope to read the screen, it makes the device slightly less portable than intended. Which is especially annoying when its battery dies half-way up the mountain. You might as well have thrown it over the first cliff and saved yourself the trouble. The paper map just keeps going. Some have lasted 4,000 years. That is three million times longer than the average battery. Granted, digital storage media don't need power, but can you still read a 20-year-old floppy disk? A paper map still works after you have dropped it from the 20th floor, driven over it in a Jeep, or even checked it in as airport luggage. Paper also has unbreakable interoperability and scalability. You can collect as many paper maps as you like, and use them for any purpose without format conversion. Okay, my Times Atlas weighs as much as a Linux server farm, but it is compatible with any table or cupboard, and does not stop working when I upgrade the furniture. It's also immune to viruses, so there is no need to set off the alarm every time someone else is having a look inside. The paper map is interoperable with any human mind, and sometimes it even produces magic. In World War I a group of Hungarian soldiers, lost in a blizzard, saved their lives with a map of the Pyrenees even though they were in Switzerland. GPS would probably have guided them into the next ravine. And what about romance? An in-car navigation system might save your marriage for various reasons, but how boring is it never to get lost in the woods anymore? Also, as a gift to your lover, a stylishly framed map, for example to commemorate the location of your first kiss, might produce further passionate lat/long events. Not every lover enjoys maps, but you must admit it beats a KML file by a mile. So, as a true explorer, buy yourself a paper map (it doesn't even matter which one). Walk until you are beyond the reach of the charged battery, and you'll be able to enjoy the whole place in peace. Actually, I just made that up. Please remember everything I just said is a joke.
Thierry Gregorius (thierry.gregorius@shell.com) is Programme Manager for Geomatics and Information Management at Shell's international headquarters in the Netherlands, and was previously Global GIS Coordinator. The views in this column are entirely personal.
P.S. As my previous column went to press, the organisers of AGI2006 changed the venue from Chelsea FC to a non-football location in London, and the date from November to September. This also made that column a joke but at least it helps a consistent theme emerge.
Value
In 1999 the Paper Industry Research Association (PIRA) valued the geographic datasets assembled over the previous 10-15 years in the then European Community at €36bn. It estimated the value was double this in the United States of America. Recent estimates indicate the value has risen to €100bn in the EU25 community. Most of this geographic data was collected before GPS became ubiquitous. Not only that, but this asset needs to be used in support of sustainable policies and development across Europe. The ability to process these data automatically in situations not envisaged by the original collection programmes (we will term it re-use) becomes an essential goal of delivering on the i2010 agenda. Already spatial data quality problems are holding back the European initiatives. There is evidence that the heavily fragmented geographic community in Europe is failing to tackle interoperability and spatial data aggregation. The 2006 eContent+ programme, focussing on making digital content more accessible, usable and exploitable, was only able to support 3 GI projects with a value of €3.5m against a total available budget of €50m. Something is holding back a spatial contribution to the European knowledge economy.
e-Government Initiatives
The demand for increasingly accurate data will be driven by the need to automate spatial data processing. In order to deliver on e-Government initiatives, or joined-up decision-making if you prefer that term, data from different sources need to be made available across the web and aggregated without human intervention. Interoperability is the start
Certifying
The supply chain for spatial data re-use starts with discovering data in registries across the web. Data in these registries needs more than
SWRL
There is much work still to be undertaken before data re-users in Europe can take advantage of the valuable geographic datasets already created. Further development of the Semantic Web Rule Language (SWRL) for use with spatial data constructs is necessary. Re-users will still be faced with the problem of semantic interoperability: the difficulty in aggregating data that were collected and tagged using different vocabularies and different perspectives on the data. To achieve semantic interoperability, systems must be able to exchange data in such a way that the precise meaning of the data is readily understandable and the data itself can be converted or translated by any GIS or web mapping tool into a form that it understands. In addition, the re-users or aggregators need access to a set of presentation standards. The Web Processing Service (WPS) is still immature. It describes individual geoprocessing services accessible via the Web.
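The vocabulary problem described above can be made concrete with a toy sketch. Assume, purely for illustration, two producers who classify the same real-world features with different terms; an aggregator translates both into one shared vocabulary. Real systems would capture such mappings in OWL ontologies and SWRL rules rather than a hard-coded dictionary:

```python
# Toy illustration of semantic interoperability: two (invented) source
# vocabularies are mapped onto one shared term set so records from both
# producers can be aggregated. The vocabulary names and terms below are
# hypothetical examples, not real national schemas.

CONCEPT_MAP = {
    # (source vocabulary, source term) -> shared term
    ("cadastre_nl", "weg"): "road",
    ("os_gb", "carriageway"): "road",
    ("cadastre_nl", "waterloop"): "watercourse",
    ("os_gb", "drain"): "watercourse",
}

def translate(record, source_vocab):
    """Return a copy of the record with its class mapped to the shared term."""
    shared = CONCEPT_MAP.get((source_vocab, record["class"]))
    if shared is None:
        raise KeyError(f"no mapping for {record['class']!r} in {source_vocab}")
    return {**record, "class": shared}

# Records from two producers end up in one comparable dataset.
merged = [
    translate({"id": 1, "class": "weg"}, "cadastre_nl"),
    translate({"id": 2, "class": "carriageway"}, "os_gb"),
]
```

The hard part in practice is not the lookup but agreeing on the mapping itself, which is precisely why the text argues for further development of SWRL and shared ontologies.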
The following data quality elements are defined in ISO 19113; not all of these will be applicable to every dataset:
- completeness: presence and absence of features, their attributes and relationships;
- logical consistency: degree of adherence to logical rules of data structure, attribution and relationships;
- positional accuracy: accuracy of the position of features;
- temporal accuracy: accuracy of the temporal attributes and temporal relationships of features;
- thematic accuracy: accuracy of quantitative attributes and the correctness of non-quantitative attributes and of the classifications of features and their relationships.
So we have the basis for making the assessments we need for re-use. What we need now are:
- a framework for making assessments;
- mechanisms for presenting data to the assessment framework.
Laser-Scan has developed a framework for making the assessments. It is called Radius Studio, see Figure 2. This is based on a common language interface and makes use of mainstream standards: W3C, OWL, ISO TC/211, OGC GML and WFS.
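As a rough sketch of what such an assessment framework does, the following Python fragment checks three of the ISO 19113 elements (completeness, positional accuracy, thematic accuracy) with invented rules over invented features. This is not Radius Studio's interface, which uses its own rule language over GML/WFS sources; it merely illustrates the idea of running declarative rules over a dataset and collecting violations:

```python
# Illustrative rule-based quality assessment. The feature schema,
# extent and code list are invented for this example.

def completeness(feature):
    """Completeness: every feature must carry the attributes we expect."""
    required = {"id", "geometry", "road_class"}
    missing = required - feature.keys()
    return [f"missing attribute {a!r}" for a in sorted(missing)]

def positional_accuracy(feature):
    """Positional check: all vertices must fall inside the dataset extent."""
    xmin, ymin, xmax, ymax = 0.0, 0.0, 1000.0, 1000.0
    return [f"vertex {p} outside extent"
            for p in feature.get("geometry", [])
            if not (xmin <= p[0] <= xmax and ymin <= p[1] <= ymax)]

def thematic_accuracy(feature):
    """Thematic check: the classification must come from the agreed code list."""
    allowed = {"motorway", "local", "track"}
    cls = feature.get("road_class")
    return [] if cls in allowed or cls is None else [f"unknown class {cls!r}"]

RULES = [completeness, positional_accuracy, thematic_accuracy]

def assess(features):
    """Run every rule over every feature; collect (feature id, message) pairs."""
    report = []
    for f in features:
        for rule in RULES:
            report.extend((f.get("id", "?"), msg) for msg in rule(f))
    return report

report = assess([
    {"id": 1, "geometry": [(10.0, 20.0)], "road_class": "motorway"},  # clean
    {"id": 2, "geometry": [(10.0, 2000.0)], "road_class": "street"},  # two violations
])
```

The value of such a framework is that the rules, not ad-hoc code, carry the quality definition, so the same assessment can be applied to any dataset presented to it.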
Future Plans
By Allison Pullen
The BIPT plans to revise and improve its corporate Web site to include the new Web-based system, while still maintaining the separate domain address, enabling users to access the information from multiple locations. Future plans also include the augmentation of possibilities for making queries within the system. Additional functionalities of GeoMedia WebMap Publisher will be incorporated, including better zoom in/zoom out capability. The process of updating the data will eventually be automated. As it exists now, the data resides in Access databases. These databases will be replaced by SQL server connections, in which new data will be generated on a daily basis.
Allison M. Lowery Pullen (allison.pullen@intergraph.com) is Corporate
User-Friendly
The BIPT had specific requirements in mind when searching for its Web-based system. First of all, the system had to be user-friendly and intuitive, requiring minimal staff training. Secondly, the system had to have a quick implementation and turnaround time. The BIPT didn't want to waste time implementing a time-consuming system when they could be up and running in approximately one day. The BIPT sought a system that would provide quick and easy online access to communication antennae information and radiation reports, reduce work hours spent answering communication antennae inquiries, and improve business processes.
When the user double-clicks on a point, he receives attribute information about the antenna. One of the attributes can be a hyperlink to a report.
Spreadsheet
Previously, when other administrations wanted to know the location of communication antennas within their territory, one of the BIPT's 250 employees had to query the database and export the results as an Excel spreadsheet. Analysis reports for the electromagnetic radiation then had to be added to the spreadsheet, and everything was then sent via e-mail or in hard copy. As one can imagine, this became a very time-consuming and daunting task. The BIPT sought to implement a Web-based system that would enable the general public to access both the location sites of communication antennas in any given territory along with the corresponding radiation reports for antennas within the area.
GeoMedia Suite of Products
The BIPT selected Intergraph to deliver an online information system (www.sites.bipt.be) to enable the general public to access communication antenna data and radiation reports in a user-friendly environment. GeoMedia and GeoMedia WebMap were used to manage approximately 7,000 individual data records, and GeoMedia WebMap Publisher to publish data to the Internet without customization and programming. BIPT also selected Intergraph technology for its open architecture, ease of use during implementation, and ability to further expand.
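The manual query-and-spreadsheet workflow described above is exactly the kind of task that lends itself to scripting. Here is a hedged Python sketch of filtering antenna records by territory and writing a machine-readable export; the record layout and report URLs are invented for illustration, not the BIPT's actual schema:

```python
# Illustrative automation of a "query the database, export a sheet"
# workflow. The antenna records and report URL pattern are hypothetical.
import csv
import io

ANTENNAS = [
    {"site": "BXL-001", "territory": "Brussels", "lat": 50.85, "lon": 4.35,
     "report_url": "http://www.sites.bipt.be/reports/BXL-001"},  # invented path
    {"site": "ANT-014", "territory": "Antwerp", "lat": 51.22, "lon": 4.40,
     "report_url": "http://www.sites.bipt.be/reports/ANT-014"},  # invented path
]

def export_territory(records, territory, out):
    """Write all antennas in one territory, with their radiation-report links."""
    writer = csv.DictWriter(out, fieldnames=["site", "lat", "lon", "report_url"])
    writer.writeheader()
    count = 0
    for r in records:
        if r["territory"] == territory:
            writer.writerow({k: r[k] for k in ("site", "lat", "lon", "report_url")})
            count += 1
    return count

buf = io.StringIO()
n = export_territory(ANTENNAS, "Brussels", buf)
```

Linking each row to its radiation report mirrors the published system's hyperlink attribute, and removes the step of manually pasting analysis reports into a spreadsheet.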
Intergraph Corporation
(www.intergraph.com) is a global provider of spatial information management (SIM) software. Security organizations, businesses and governments in more than 60 countries rely on the company's spatial technology and services to make better and faster operational decisions. Intergraph's customers organize vast amounts of complex data into understandable visual representations, creating intelligent maps, managing assets, building and operating better plants and ships, and protecting critical infrastructure and millions of people around the world.
Communications Manager with Intergraph Corporation. For more information on BIPT, visit www.bipt.be. Intergraph Corporation can be found at www.intergraph.com.
April/May 2006
Special
Blueprint
Before getting to open standards, let's take a step back to define standard. This is from Bob Sutor, the Vice President of Standards and Open Source for the IBM Corporation: "A standard is like a blueprint. It provides guidance to someone when he or she actually builds something." He goes on to note that it is a blueprint upon which many people need to agree. The Open Geospatial Consortium (OGC, www.opengeospatial.org) develops consensus on "blueprints" for software APIs. An open standard can mean merely that a standard is open to anyone to use, even though it has restrictive licensing or requires a fee. The OGC goes a bit further and defines open standards as being:
- Freely and publicly available: free of charge and unencumbered by patents and other intellectual property;
- Non-discriminatory: available to anyone, any organization, any time, anywhere, with no restrictions;
- No license fees: no charges at any time for their use;
- Vendor neutral: in terms of their content and implementation concept, not favoring any vendor over another;
- Data neutral: the standards are independent of any data storage model or format;
- Agreed to by a formal, member-based consensus process: the standards are defined, documented, and approved by a formal, member-driven consensus process. The consensus group remains in charge of changes, and no single entity controls the standard.
The key aspect of OGC open standards is that they are freely available for anyone to access and implement at any time. Software developers and development organizations,
Figure 1: Interface of GRASS 6.1. Recent releases include support for the OpenGIS Simple Features Implementation Specification.
Understanding
It is worth noting at the outset the confusion in the community over the terms open source software and open software standards. The word open is used extensively in articles, marketing materials, email lists, and blogs. But what does this term really mean? The definitions vary, sometimes referring to a software product's interfaces (Application Programming Interfaces, APIs) and sometimes to the source code. Open source refers, among other things, to whether or not the source code behind software is made available. If it is made available, and users can copy, modify and redistribute the source code without paying royalties or fees, it is termed open source. (For the complete story, visit the Open Source Initiative, www.opensource.org/.)
whether creating commercial or open source software, decide if they want to implement specific standards. It is important to realize that software packages, whether open source or proprietary, can interoperate if they all implement the same standard. There are more than a dozen approved OpenGIS Specification open standards (www.opengeospatial.org/specs/?page=specs) implemented in hundreds of packages and products.
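Interoperability here is concrete: two products that implement the same specification accept the same request. As a sketch (in Python; the server URL and layer name are hypothetical, not taken from the article), a standard WMS 1.1.1 GetMap request might be assembled like this:

```python
from urllib.parse import urlencode

def build_getmap_url(base_url, layers, bbox, width, height,
                     srs="EPSG:4326", fmt="image/png"):
    """Assemble a WMS 1.1.1 GetMap request URL.

    Any server implementing the OpenGIS WMS specification, open source
    or proprietary, should accept this same parameter set.
    """
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": ",".join(layers),
        "SRS": srs,
        "BBOX": ",".join(str(v) for v in bbox),  # minx,miny,maxx,maxy
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": fmt,
    }
    return base_url + "?" + urlencode(params)

# Hypothetical endpoint; a compliant server returns a finished map image.
url = build_getmap_url("http://example.com/wms", ["ozone"],
                       (-125.0, 25.0, -65.0, 50.0), 600, 400)
```

Swapping in a different vendor's server means changing only `base_url`, which is the whole point of the consensus "blueprint".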
Figure 2: Surface ozone concentration map created by GeoServer from data obtained via the OpenGIS Web Coverage Service Implementation Specification (WCS) and delivered to the QuickWMS viewer via the OpenGIS Web Map Service Implementation Specification (WMS). Image courtesy Fire Chemistry Unit, Rocky Mountain Research Station, Missoula, MT.
What follows is a look at three of the many open source geospatial projects that have embraced open standards, either from the start or after several years of use, based on user needs. There are perhaps several dozen other similar stories to be told.
GeoServer
GeoServer has a long history of using open standards. First developed to help leverage geographic data for urban planning tools such as traffic modeling, the project has become a sort of poster child for both open source and open standards in the geospatial arena. The Open Planning Project (TOPP), a non-profit organization, felt that a standards-based server of geospatial data was a key piece in pulling together the framework for complex traffic modeling and other needs. That platform, developers Rob Hranac and Chris Holmes determined, needed to include three key characteristics: support for open standards, ease of use, and integration of multiple geospatial formats. The format support was particularly important if the code was to be used in a variety of disciplines, such as local government. How to begin? The team found a technology core in GeoTools, an open source Java GIS development platform launched in 2001. GeoTools (www.geotools.org) seemed like just the base needed as a building block, but it did not include support for PostGIS, the open source spatial extension to the open source database PostgreSQL. That support was so important, and the tools so good, that TOPP staff spent time implementing the needed code in GeoTools. "At first," says Holmes, "we felt like we were putting in all this work and getting nothing back, but in the end we had access to all sorts of format support via code developed by others."
WFS
As GeoTools began to take shape, the team determined the value of implementing the OpenGIS Web Feature Service Implementation Specification (WFS), the specification for sharing vector data. TOPP
had already built a WFS implementation of its own. Still, the team saw the benefit of bringing that experience to the table to work collaboratively on WFS for GeoTools. At about the same time, OGC was looking for an open source reference implementation for WFS as part of its Compliance & Interoperability Test & Evaluation Initiative (CITE). TOPP was selected to provide the reference implementation and received funding to ensure full compliance of GeoServer with the specification. Web Map Service (WMS) support was added to GeoServer as well, based on work by GeoTools users in Britain, but ultimately completed by Gabriel Roldan, an Argentine programmer working for a Spanish client, see Figure 2. Other GeoServer users needed support for the OpenGIS Web Coverage Service Implementation Specification (WCS), the specification for sharing gridded data via the Web. Simone Giannecchini and Alessio Fabiani, consulting for the NATO Undersea Research Centre (www.nurc.nato.int), staff at the USDA Forest Service, and a researcher in New Caledonia (South Pacific) worked together on that effort, making the result available for every other GeoServer user.
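For comparison with the WMS request shown earlier, a WFS GetFeature request in its key-value encoding looks much the same on the wire; the endpoint and feature type name below are hypothetical:

```python
from urllib.parse import urlencode

def build_getfeature_url(base_url, type_name, bbox=None, max_features=None):
    """Assemble a WFS 1.0.0 GetFeature request URL (KVP encoding).

    Unlike WMS, which returns a finished picture, the WFS response is
    GML vector data that the client must parse and render itself.
    """
    params = {
        "SERVICE": "WFS",
        "VERSION": "1.0.0",
        "REQUEST": "GetFeature",
        "TYPENAME": type_name,
    }
    if bbox is not None:
        params["BBOX"] = ",".join(str(v) for v in bbox)
    if max_features is not None:
        params["MAXFEATURES"] = max_features
    return base_url + "?" + urlencode(params)

# Hypothetical endpoint and layer; the reply is limited to 50 features.
url = build_getfeature_url("http://example.com/wfs", "topp:roads",
                           bbox=(-105.0, 39.5, -104.5, 40.0),
                           max_features=50)
```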
MapServer
So, then why is GeoServer perhaps not as well known as another open source Web map server, MapServer? Holmes is quick to point out that WFS is just coming into widespread use, while WMS, which MapServer supports, came on the scene earlier. WFS needs a fairly robust client (the returned vector data must be "understood" and rendered on the client), while WMS "picture maps" can be seen in a browser. "With Geography Markup Language [GML] maturing and more desktop WFS clients, including open source uDig and MapBender and proprietary ones from companies like Cadcorp and ESRI, WFS, and thus GeoServer, will have a larger role in the Spatial Web," Holmes predicts. Holmes and his colleagues are excited about the newest additions to GeoServer, which include tools to manage changes to geospatial databases. While OGC's WFS-T (T for transactional) offers the blueprints for adding, editing and deleting features, formal use of such tools requires software to roll back changes and/or limit who can commit changes. These new tools, combined with others to ensure that added or changed features meet specific requirements (Are they
long enough? Do they connect to other features? etc.) will lead to a whole new kind of Spatial Web offering. It will allow a "geowiki" sort of collaboration where many members of a community can participate in building and maintaining a shared spatial database via the Web.

uDig is widely used to test Web Map Servers. "I want the same thing with WFS. I want the same thing with Catalogue (OpenGIS Catalogue Service Implementation Specification)," says Ramsey of the other specifications the product supports, see Figure 3. He is quick to point out the value of standards as a design baseline from a development standpoint. But with that comes "good news and bad news." The bad news is that "because the OGC specs tend to be more general than most implementations of GIS design, the implementation overhead we incur building the infrastructure necessary to handle them is very high." The good news is that "once we have suffered through the implementation hell we have a framework which is flexible enough to handle very odd cases, cases which cause developers with less generic models to graft onto the sides of their systems."
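The WFS-T blueprint discussed earlier boils down to posting an XML Transaction document to the server. A minimal sketch of building an Insert using only the Python standard library (the feature type name and coordinates are invented for illustration; a real deployment would layer the rollback and access-control tooling on top):

```python
import xml.etree.ElementTree as ET

WFS = "http://www.opengis.net/wfs"
GML = "http://www.opengis.net/gml"

def build_insert_transaction(type_name, lon, lat):
    """Build a minimal WFS-T Transaction inserting one point feature."""
    ET.register_namespace("wfs", WFS)
    ET.register_namespace("gml", GML)
    txn = ET.Element(f"{{{WFS}}}Transaction",
                     {"service": "WFS", "version": "1.0.0"})
    insert = ET.SubElement(txn, f"{{{WFS}}}Insert")
    feature = ET.SubElement(insert, type_name)  # hypothetical feature type
    geom = ET.SubElement(feature, "geometry")
    point = ET.SubElement(geom, f"{{{GML}}}Point")
    coords = ET.SubElement(point, f"{{{GML}}}coordinates")
    coords.text = f"{lon},{lat}"
    return ET.tostring(txn, encoding="unicode")

# The resulting document would be POSTed to a transactional WFS endpoint.
xml_body = build_insert_transaction("poi", 4.35, 50.85)
```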
map window to see them, etc.). Ramsey and his company are already known for their work in developing PostGIS, and felt strongly that an open source project would best serve the citizens of Canada. GeoConnections is the name of the geospatial program in Canada. According to this program, the free open-source product will provide a data access and maintenance tool that governments and the private sector can use regardless of budget. Users of uDig will be able to access the CGDI without buying expensive proprietary desktop GIS licenses simply to view CGDI data. Consequently, uDig will make CGDI data accessible to a much wider potential audience.
There is clearly much more to come from the marriage of open source and open standards in geospatial technologies. The demand for interoperability, flexibility and widespread distribution of products has and will continue to push these efforts. New programmers, working with new building blocks created around consensus-built standards are likely to be a key step in building national and global data infrastructures that not only reach to the far corners of the earth but are usable by their inhabitants regardless of budget or underlying technology.
Adena Schutzberg (Adena@abs-cg.com) is principal of ABS Consulting Group, Inc., based outside Boston, Massachusetts, and a consultant to OGC.
Tuning maps according to what the end user wants is the fourth option, and can be combined with the third possibility. According to Lakerveld there are essentially two types of web mapping: embedded web mapping and an application specifically meant for the end user. The Director of the Geospatial Center of Excellence explains what he means by the latter: "I am talking about providing information based on predictions of what the end user wants to know. Accordingly, only the requested information is offered, retrieved from a content management system. Offering information this way implies not only content managers working behind the screens, but also a communications specialist."
Intranet Webmapping
Web publishing is not new to Bentley: ISIS, a Dutch company that was incorporated into Bentley two years ago, started developing this technology six years ago. At that time the application was called Flexiweb. This was an environment for the management and publication of all kinds of geographic information within an organisation, making use of Internet technology. Speed was essential: being able to use, view, analyse, print and plot high-resolution vector and raster data, images, multimedia and multiple databases as fast as possible in one integrated environment was a key issue for Bentley. Database information, documents, geospatial information: all of these data are
configurable from the server. Instead of having to implement technology, all data are directly accessible from a configurable database. In a very short time the user has a complete GIS environment available on the web. This is the main difference from other web mapping products on the market.
Making Predictions
He continues: "Many organisations think offering a huge amount of information is the priority. Bentley's opinion is that it is much better to first think about what a user wants to know before putting all the information on a portal, for example. A visitor should be able to find, retrieve and make use of the information he is looking for in a very short time, without having to search or being bothered by GIS technology." Lakerveld sees Web Mapping as a first modest step for organisations towards a Service Oriented Architecture (SOA). Web Mapping is not a purpose in itself.
Four Types
Bentley offers four options in the field of web mapping. One of them is making all information available via web technology (intranet), the second being the integration of geospatial data in an enterprise system. Making data available via the Internet for several purposes is the third option within Bentley's web mapping technology.
Bentley Geospatial Enterprise server, for instance, supports user-initiated or event-driven interoperability with the ESRI ArcSDE Geodatabase. The Connector for ArcGIS supports an intelligent extract-and-post paradigm. Bentley users can retrieve Geodatabase data for use in AEC and mapping workflows, and later post the appropriate information for use by ESRI users.
Customers
(Local) authorities are the main customers in the field of web mapping, but telecoms and water are emerging markets. Lakerveld: "Amongst other things, pipeline networks, but also electric, coax, copper and fiber networks, can be published and managed with our software." Bentley just delivered a huge implementation at the Dutch Ministry of Finance, where Flash-based web mapping technology is applied to integrate geospatial data (both raster and vector) related to 2 million parcels inside the SAP environment. Another customer is the Municipality of Eindhoven, which in the year 2000 did research to determine the need for geo-related information for all workstation seats. This study, performed by ISIS, now Bentley Benelux, showed that there was tremendous interest amongst the employees. However, the average user was not always able to get the right information easily. Therefore the Municipality of Eindhoven decided to focus not only on core technology, but also to develop an interface that could seduce its users. This means: raise interest and let users themselves determine how to reach their goals, with professional support.
He regards web mapping as part of an integrated system. "This so-called system integration started about six years ago and I am convinced it is still strongly evolving. Just look at Oracle with their SOA implementation, which facilitates the development of modular business services that can be integrated and reused, for an adaptable IT infrastructure." He notes that web mapping takes place outside the traditional boundaries of the CAD and GIS environment, and with that makes geospatial content available for multiple purposes in numerous work processes.
were available with the quality level we were used to. Therefore we worked together with Bentley to define a presentation tool with interactive functionality." Within three months this service became operational. According to Tros it is greatly valued by customers and is now operational in public issues regarding zoning maps, the sense of the city, and the public relations project "city of light". The web product shows information that is retrieved from the professional data generated in the back office. However, the Internet audience only needs a small subset of this operational data. Bentley's web publishing solution retrieves this subset from the Oracle Spatial database, which means no conversions and no additional technical requirements. Here seduction works too: for the first time the professional receives compliments for the work he performs and is encouraged to deliver even more quality. Tros comments: "Our municipality is currently positioning this tool as a very important means of communication with the inhabitants and interest groups in our city. We are convinced we can achieve this by having faith in our own quality and stimulating the use of our information."
No Web Editing
Essentially, web mapping is meant to provide information that is as unambiguous as possible. In the Bentley web mapping software, viewing, redlining, making descriptions, printing and plotting are all supported, in contrast to editing. Certainly, editing and more interaction are already possible, and Lakerveld is convinced several applications in this area will become available in the near future. But Bentley is not going to follow this path, because it is not advisable from an organizational point of view. Lakerveld: "Geospatial data has to be created and managed as a service to other users of that data in several work processes (create once, use many). To be able to implement web editing, you have to implement a secure, multi-user, transaction-based environment. The browser is simply not suitable to support this completely. High-resolution editing therefore should take place on the desktop, making maximum use of the rich functionality and dedicated access to Internet services like WMS and WFS servers."
Sonja de Bruijn (sdebruijn@geoinformatics.com) is editorial manager of GeoInformatics. Of particular interest are the white papers on www.bentley.com, to be found under the vertical section, left-hand side of the homepage.
Source Information
Both the intranet and Internet Bentley web applications work with different databases. Source information can be retrieved on-the-fly. In order to maintain security there is a second database behind the firewall. Several systems and data stores are compatible with the Bentley web mapping solutions. Lakerveld: "With Bentley's web publishing technology it is possible to publish Oracle Spatial live, as well as DGN in native form. The same goes for WMS, both server and client. I think it is quite remarkable that SAP and ESRI data can also be fully integrated."
Compiling Information
Rob Tros works as an information manager with the Municipality of Eindhoven. He says: "The Intranet Webmapping environment that was created has been specifically developed for our own professionals. They are very capable of finding and compiling the information for their own needs, and find it very useful. This is why we also want to present it to our customers, other governmental bodies, or interested outsiders." He continues: "In 2004 we felt the need to provide information on the Internet to every end user in a simple, fast and safe way. No off-the-shelf products

Intranet GIS on the Internet, Municipality of Eindhoven, the Netherlands.
International Standards
travel to come to that point.

By Georges Antoine Strauch
What emerged was a software package called Cartes & Données (meaning Maps & Data), with which the user can produce maps, accompanied by a statistical report and expert counselling provided upon request. The maps produced comply with international standards, and the software has been validated by GIP RECLUS, the principal European network of geographers. Ultimately, this software was enthusiastically welcomed and rapidly settled in the sphere of Education and Research. However, Articque kept its first conception of the software as a set of items usable in a components library. The work done through Cartes & Données materialized itself in the organigram, the real skeleton behind a future client-server application or Intranet. The main idea was to use the work and knowledge of an experienced analyst and make it available to other users. This was the genesis of the integrating tool, CartoExtension, which was still under development when, in 1997, the European Community financed a project requiring this technology.
Figure 1: Thanks to the organigram (on the left corner of the screen), it is possible to display the same data with several statistical representations in one single process.
Ch@ppe d'Or
On January 26, 2006, the French chapter of the Internet Society, ISOC France, together with Adminet, the Cawa and partners, honoured personalities of the Internet in the areas of politics, science, arts, business and civil society at the Musée des Arts et Métiers in Paris. Jean-Michel Billaut, of the BNP-Paribas workshop, rewarded Strauch with the Ch@ppe d'Or. Since 1991, Strauch has been convinced that cartography should not be reduced to the simple illustration, location and display of distribution networks. He thought that instead of an Excel-like representation, the data would have much to gain from representation in maps, and that this illustration should complement and enhance a purely mathematical analysis. One should then be able to go back in the process, using these more illustrative techniques to make changes and build layer upon layer. It was by taking this idea one step further that the organigram, see Figure 1, was born. Strauch wanted to propose cartography on the France Telecom Numeris network with data and maps, respectively provided by the National Institute of Statistics and Economic Studies (INSEE) and the National Geography Institute (IGN). From 1992 through 1994, he tried to convince these companies to cooperatively launch an interactive service of statistical mapping. He encountered some difficulties with the IGN, since at that time the outlines of French municipalities were available for the "modest" price of €11,500 ($13,800).
Scheme representing the functioning of the tool developed for the Daedalus project presented for INFO 2000.
Disaster management: cooperative application to save and restore the Atlantic sea coasts - Participative web mapping supporting the citizen.
ment he wants to apply and selects a cartographic display. Once these variables are entered, he just executes the process to obtain a map, which he can save to include in his documents later.
Participative Mapping
On June 13, 1999, Articque put the first cartography of the European elections online, with a constant update of the results. One year later, the company launched FranceElectorale.com, which became the first digital electoral board to display the French electoral map, as well as election results. FranceElectorale.com is dedicated to the elections, to elected officials and to electoral forecasting, and offered all candidates the possibility to freely and easily register their campaign. They enrich an electoral map displaying the current elected officials by filling in an online form. Candidates in future elections are then able to broadcast the first news about their electoral campaign on a website totally independent of political parties. During the municipal elections of 2001, www.FranceElectorale.com achieved 500,000 visitors within a month.
and taking into account their available resources. The data on the evolution of the oil slick and its impact on these shores were immediately collected on the map without the need for Articque to manipulate them, allowing a real-time update on the http://erika.articque.com website. At that time, this initiative was relayed by television (with the French channel TF1) and newspapers such as Le Point, La Tribune, and Les Echos.
To support these users, Articque relied on Linux, Apache, Java, MySQL and its mapping engine, which had received awards from the European Community and ANVAR.
Best Coverage
In 2004 Articque was contacted by the Medical Services of the SNCF, which needed to develop an Intranet solution. Its function was to ensure the best and most thorough coverage possible of the French territory, in order to guarantee a fair and equal quality of service to each and every SNCF employee with respect to health services. The solution developed by Articque is conceived for regional administrators, enabling them to attribute each municipality to one medical sector and to one general practitioner (GP) in particular, and to optimise the number and location of the chartered GPs with respect to the SNCF personnel. The use of mapping allows the administrators to carry out their investigations into the coverage of the national territory by chartered GPs able to respond to SNCF personnel needs. It also allows them to quickly construct decision-making files. One of the main assets of the application is its ability to display the municipalities where a GP is assigned, consequently allowing the administrators to extract the ones which still need an allocation.
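The extraction step described above (displaying where a GP is assigned and pulling out the municipalities still needing one) reduces to a simple filter over the assignment table. A sketch with invented names, not SNCF data:

```python
def unassigned_municipalities(assignments):
    """List municipalities that have no chartered GP attributed yet.

    `assignments` maps municipality name -> GP name, or None when the
    sector still needs an allocation.
    """
    return sorted(m for m, gp in assignments.items() if gp is None)

# Invented coverage table for illustration.
coverage = {"Tours": "Dr. Martin", "Amboise": None, "Loches": None}
todo = unassigned_municipalities(coverage)  # ['Amboise', 'Loches']
```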
Georges Antoine Strauch (gas@articque.com) is CEO of Articque. To get more information: www.articque.com, www.cartomatique.com, www.cartesetdonnees.com, and www.mapanddata.info.
Mapping Observatory
In 2000, the CFE-CGC, the French executive trade union, came into action and asked Articque to build a custom-made application. It was the first trade union to equip itself with a Mapping Observatory of the companies. This application is a geostatistical tool conceived by Articque for the elections to come, and facilitates the decentralized entering of data by local representatives of the Union. It allows the constant update of a file containing very precise information about their militants. All this information is displayed on maps calculated in real time. Only decentralized entry can enable such a big organization to maintain files up to the minute. Yet it is in a centralized way and with geographical criteria that these data are consulted, taking into account the country as a whole, as well as regions, municipalities, towns and so on. Each level of consultation corresponds to a synthetic view of the Trade Union, such as the number of companies and the number of elected people. The Executive Trade Union acts on a local basis but pilots the project on a national level, which allows the Confederation to make strategic decisions based on synthetic and hierarchical data. To support the connection of nearly 150 simultaneous users,
Atlantic Seashore
Another major event took place in December 1999, when the sinking of the oil tanker Erika off the French coast stirred up strong emotion. Articque put online a map of the Atlantic seashore and proposed the constitution of a civilian network, the Coast Watches. Their role was to collect data to be centralized and diffused through interactive maps on the web, representing the coastal municipalities affected by the disaster and their distance to the wreckage, while also attending to their needs
Menu Commands
Moving from left to right across the menu there is: Welcome, Scratch Pad, Locate Me, Permalink, Add Pushpin, Directions, Settings, Community, Help and About. The Welcome screen is displayed to the left of the main map area. It displays news and information about MS Live Local. The Scratch Pad allows the user to store a search result, or he can click a spot on the map, and save that spot on the map to the Scratch Pad. The Scratch Pad remembers where you have been and makes returning a click away. It is also possible to mail and/or blog your Scratch Pad. The Locate Me link activates a Windows Live Local application (Location Finder) that attempts to determine your present location and launch a Windows Live Local map of that location. There are two techniques used by the Locate Me feature: 1. If the user is on a computer with a Wi-Fi card, Wi-Fi signal strength from nearby wireless access points can be used to determine location. This is generally accurate to between 50
Figure 1.
Overview
Live Local is completely web based; no plug-ins or downloads are necessary, see Figure 1. The application has all the typical features of a web-based mapping service: search, routing, and more. In this overview the unique features will be described. The interface provides interaction through search, from a menu or context-sensitive commands (right click), and offers keyboard shortcuts to assist with navigation of the map. The search features a What and a Where input area. In the What area the user may enter a specific topic such as museum, or a cuisine. In the Where area an address can be entered. Or it is possible to search from the What alone and have the Where be set to the current map view. The application delivers on the What content, as illustrated in a search for brewpubs in the Fort Collins, Colorado area, see Figure 1.
Figure 2.
Passions They Care About
- Permalink is a way to save and share your Windows Live Local experience with others;
- Pushpins are a way to add a custom pushpin to a map, much like one might have done in the past with a pin on a paper map. The Pushpin allows you to name it and also add 200 characters of text in a note describing the Pushpin. The Pushpin can also be emailed to someone else;
- Directions provide turn-by-turn directions with route highlights and maneuvers integrated into the map, whether in road or aerial view, with printing and email options available. To and from directions can be created by clicking anywhere on the map, or by traditional methods such as search results and entering an address. More on directions later in this article;
- Settings provide a way to set a limited set of behaviors as to what is saved on exit, map navigation, and searching options;
- The Community link provides interaction with others in the community through: a hyperlink to blogs and threaded discussion boards about Virtual Earth; the ability to vote on simple Virtual Earth polls; and the ability to provide free-text feedback regarding Virtual Earth;
- Help provides rich help content for the Live Local tool;
- About offers background on the technologies applied in Live Local, and credits all the data and technology providers.
Figure 3.
and 200 feet; 2. If the computer does not have a Wi-Fi card, the Internet protocol (IP) address of the computer is used to determine an approximate location of the user. (IP-based location is generally accurate to the city or county level.) In either case, if a location can be determined, the map is updated to reflect this location and a Pushpin is drawn on the map centered on the consumer's location.
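The two Locate Me techniques amount to a fallback chain that prefers the more accurate fix. A sketch with stub values standing in for the real Wi-Fi and IP lookups (the accuracies paraphrase the article; the code itself is an illustration, not Microsoft's implementation):

```python
def locate(wifi_fix=None, ip_fix=None):
    """Pick the best available position estimate (lat, lon).

    Wi-Fi signal-strength triangulation (roughly 50-200 feet) is
    preferred; the IP-address estimate (city or county level) is the
    fallback. Returns None when neither technique produced a fix.
    """
    if wifi_fix is not None:
        return {"position": wifi_fix, "accuracy": "50-200 ft"}
    if ip_fix is not None:
        return {"position": ip_fix, "accuracy": "city/county"}
    return None

# No Wi-Fi card available, so the IP-based estimate is used.
fix = locate(wifi_fix=None, ip_fix=(39.74, -104.99))
```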
Figure 4.
in knowing that if we do a good job of executing our Live initiative across all non-US markets, then our competitive position will take care of itself. We don't intend to react to our competition; we are going to respond to our customers' needs. That's what will define our competitive position.
I see from the website information that ORBIMAGE is a contributor; will there be full coverage of the earth available at the resolution of IKONOS imagery?
Microsoft and ORBIMAGE have entered into a five-year exclusive deal in which Microsoft will be the sole online distributor of worldwide satellite imagery from ORBIMAGE. The partnership between Microsoft and ORBIMAGE will enable Microsoft's mapping and location assets to offer rich satellite imagery to both consumer and business customers outside of the United States and will accelerate our ability to offer this imagery in non-US markets. Moreover, ORBIMAGE plans to launch a new satellite within the next 18-24 months which will give Microsoft and its customers access to state-of-the-art satellite imagery.
Figure 5.
Auto-Refreshed Search Results: as the user moves around the map, the search results refresh dynamically to always give the user the most relevant information that pertains to the selected map view; Yellow Page directories: these directories have been incorporated into the Windows Live Local index so that users can query the information in rich, flexible ways; Multiple Searches: give users the ability to conduct multiple local searches and have all of the relevant information show up on the map alongside one another.
One of the features I liked best is the ability to click on a place on a map, right-click and set that point as a start or an end point, and get directions to that destination without the need to enter an address for either location. One can then reverse the directions from B to A, as illustrated in Figures 2 and 3 in an area of London. The reverse directions feature is quite handy when dealing with one-way streets; the many one-way streets in the area depicted in Figures 2 and 3 looked like a difficult place for the application to choose a route.

Literal

It is interesting how literal routing software can be: point A to point B along any road. I selected a route between Moffat and Silver Cliff, Colorado, see Figure 4. I don't believe this route is passable by anything short of a high-ground-clearance vehicle, and then only in the summer. Note that the peaks listed in the map are all above 14,000 feet. This figure illustrates the high quality of the images; even at this scale landforms are easily distinguishable. Did you notice the geometric circles in the southwest of the image? At first sight these appear to be some sort of problem with the image, but the circles are actually accurate renderings of crops irrigated with a center-pivot irrigation system. Some other interesting things to point out with this image: look at the mileage and time for the route. The routing algorithms obviously account for speed by road type. Also notice all the acknowledgements in the southeast corner of the map.
What is the street source data used for routing within Live Local?
We use NAVTEQ as the underlying mapping data for about 75 percent of the maps. NAVTEQ creates the digital maps and map content that power navigation and location-based services solutions.
Other Features
Aerial Photo with labels: refers to the navigation feature that displays the aerial images with an overlay of road networks and points of interest that is correctly geo-referenced on the image;
What are the plans to add content for the European community?
The MapPoint 2006 product that will be out this summer will have expanded Eastern European coverage. We are making huge investments in Europe and very soon we will be able to zoom in to much greater detail with Live Local in many countries across Europe. Europe obviously is very important to us, and we have a long history of CD products as well as the MSN Maps and Directions service, which is the precursor to Windows Live Local. We are investing heavily to expand the Windows Live Local experience to many European countries, and obviously that will roll out over time. Many European countries will get a Windows Live Local experience this calendar year.
Figure 6.
aerial view to a Birds Eye (45-degree) view of the map. Currently, this feature covers about twenty-five percent of the United States, including major cities such as New York, LA, San Francisco, and Boston (a list of Birds Eye locations is provided in Help). The service brings up a text box notifying us that Birds Eye imagery is available when we are located in an area that has it. We navigate the Birds Eye view the same as the map or image view; in addition we have a group of thumbnails that are clickable, a large-feature/small-feature scale zoom, and a compass to rotate the view North, South, East or West. Take a look at the Birds Eye view of Niagara Falls, located on the US-Canadian border, oriented from the North, see Figure 5, and from the South, see Figure 6.

Are there plans to integrate MapPoint or Streets and Trips into Live Local?

Streets and Trips and MapPoint are to become more integrated with the Windows Live Local consumer destination on the web. In the 2006 version there is the capability to show any of your Streets and Trips views in Live Local: simply right-click on the map, choose "show in Windows Live Local", and it will open (in the US) a screen in Live Local in which you can do the same things as in Windows Live Local, such as view the aerial imagery and Birds Eye views, to help enhance your trip planning experience. This is the first of many integration points between the offline rich Win32 client application and the online service.

Be sure to take Live Local and other location services for a test drive and judge for yourself which you like best. There are a number of web-based mapping/location/routing services; some are listed below:
Mapquest - www.mapquest.com
Yahoo Maps - www.maps.yahoo.com
Google Local - www.google.com/lochp
Very interesting historical maps - www.davidrumsey.com/index.html
European sites:
http://rp.rac.co.uk/rp/routeplanner
www.multimap.com/
http://mappoint.msn.com
Developers will also want to check out the Virtual Earth Developer APIs, with free versions for both commercial and non-commercial use: the Virtual Earth APIs' easy-to-use JScript map control; the MapPoint Developer Center on MSDN (http://msdn.microsoft.com/mappoint/); and the Virtual Earth developer blog (http://blogs.msdn.com/virtualearth/) for news and other information about the Virtual Earth APIs. Via Virtual Earth (www.viavirtualearth.com) is a third-party web site sponsored by Microsoft and dedicated to Virtual Earth development. There you will find code samples, a gallery of existing Virtual Earth-powered applications, and other information to help start creating applications that incorporate a Windows Live Local-like experience by utilizing the Virtual Earth APIs. Other Virtual Earth links can be found in the article on Microsoft's proposed take-over of Vexcel, page 6 of this issue of GeoInformatics.
Live Local credits the following companies as providing content for Windows Live Local: NAVTEQ, TeleAtlas, USGS, NASA, Pictometry, Harris ImageLinks, OrbImage and EarthData.
www.navteq.com
www.teleatlas.com
http://terraserver-usa.com
http://seamless.usgs.gov
http://earthobservatory.nasa.gov
www.pictometry.com
www.harris.com
www.es-geo.com/
www.orbimage.com
www.earthdata.com

Greg Baca (gbaca@geoinformatics.com) is a freelance writer for GeoInformatics. Windows Live Local can be found at http://local.live.com.
Is there a way to set units to anything other than feet and miles?
Units will be localized when we launch in Europe.
Interview
Who are your partners and which products are developed in cooperation with these partners? What about future partnerships?
Cameron: Pacific Crest continues to work with major OEMs and dealers alike in an attempt to provide them with what the market needs. We have developed internal radios at the PCB level for our major OEM customers, as well as complete solutions at the box and accessory level for our dealers.
Which trends do you see in wireless data communication? What does the future look like?
Iyemura: In general, we see the need for programmable products that maximize value for the customer. Cameron: Wireless communications are the most successful when the product is completely transparent to the user, meaning that the product works and does not require regular maintenance. We believe in developing products in anticipation of customers' needs, before they know that they need them. Rick Gosalvez: Current trends in the various wireless data communication industries are product integration, standards-based products, and a movement towards increased range and flexibility.
From left to right: Werner Kozel, Mario Gosalvez, John Cameron, Ron Iyemura, and Rick Gosalvez.
What is the goal of Pacific Crest? Has this changed over the years? What is done to reach this goal?
John Cameron, General Manager: At Pacific Crest, we deliver rugged communications solutions to our customers. We have evolved our kits and products over the last twelve years in an effort to provide the best solution for our customers. As we have grown, we have been able to develop and type-approve radios for countries all over the world. Rick Gosalvez, Product Marketing Manager: Our goal is to be an industry leader in wireless technologies for the high-precision positioning market. To help us reach this, the following things are currently being done at a high level: frequent customer contact, advanced R&D, the construction of market solutions, and service and support.
What can you say about competition? What makes Pacific Crest different from its competitors?
Rick Gosalvez: We believe Pacific Crest's differentiating characteristics are service, support, and reliability in the field. Compatibility is another major differentiator, since our radios are already compatible with most systems out in the field.
In which markets or segments is Pacific Crest mainly active? Which new segments would you like to reach and why?
Ron Iyemura, Sales Manager Asia: Pacific Crest is mainly active in the GPS surveying market by providing data links for precision differential signal transmission. Rick Gosalvez: The high-precision positioning market is Pacific Crest's main market. This market's main trend leans towards con-
Pacific Crest Corporation (www.pacificcrest.com) gets its name from the beautiful and majestic trail located in the Sierra Nevada Mountains. The Pacific Crest headquarters and primary facilities are located in Santa Clara, California. These offices provide design, quality assurance, sales, shipping, and customer support. Pacific Crest maintains a sales office in Europe, as well as a worldwide network of reseller agencies to serve customers around the world. Service and support is available at Pacific Crest service centres in Europe, Asia, and North America.
Looking at America, Europe and Asia, what can you say about things like adoption of the technology you are offering, the saturation of these markets, and which market means booming business to you?

Aldert Kluft, Sales Manager Europe: There are many alternative technologies to RTK: spread spectrum, UHF, GSM/GPRS, and many others. Our goal is to be compatible with many of these existing technologies in order to satisfy the survey market. Pacific Crest is primarily involved in the survey industry, with new regional markets in Europe, the Middle East, Asia, and Russia, which all have growing economies. As these regions' economies continue to grow, more infrastructure and surveyors will be needed, which could mean an increase in radio sales. Basically, as these areas grow, Pacific Crest grows. Our business is inextricably tied to the growth of these countries. Pacific Crest is just one part of the surveying solution and it is our goal to be compatible with many of these other solutions. Mario Gosalvez, Sales Manager Americas: South America is beginning to open up more for RTK solutions. The efficiencies that RTK provides a North American user are beginning to be recognized in South America.

In December 2004 Trimble acquired Pacific Crest. What influence did and does this have on Pacific Crest's strategy and/or products?

Cameron: Trimble purchased Pacific Crest for its capabilities and accordingly has encouraged us to continue with our business model and plans. Rick Gosalvez: Pacific Crest's incorporation into the Trimble family is a positive thing and allows Pacific Crest to better serve the market. The merger gave us access to additional resources, financial support, and more industry experts.

Initially planned for October 2005 but opened in late Q1 2006, there now is an Authorized Service Centre in China. What effect does this have on the customer? What can you say about the Asian market in general?

Iyemura: The Asian customer base receives faster repair service, and we believe this develops satisfied customers. The Asian marketplace is growing very fast and we are aiming at supporting this. Kozel: The repair centre means quicker turnarounds on product repairs and increased customer service for customers in that region of the world. Generally, customers in Asian countries are used to a higher standard of customer service than can realistically be achieved by shipping everything back to the United States, which is counterproductive.

Is there anything you would like to add, perhaps a message to our readers?

Rick Gosalvez: The high-precision market requires the integration of technology. This market is already exhibiting signs of product integration that simplifies system complexity and improves user flexibility. Thanks to new and innovative products that allow for much easier integration, system integrators now have much more flexibility to customize their systems according to their field requirements.

Pacific Crest's machine control product, Sitecom, with two mounting options.

Sonja de Bruijn (sdebruijn@geoinformatics.com) is editorial manager of GeoInformatics. Surf to www.pacificcrest.com to find out more about the company and its products.
Article
By Huibert-Jan Lekkerkerk
Gravity Field
Satellites are equipped with very accurate atomic clocks, as was discussed in the previous article. Nonetheless, small errors remain, mainly due to variations in the earth's gravity field. As a result of relativity-related effects, the satellite clock can show small discrepancies when compared to the master clocks on earth. Furthermore, small changes in the gravitational field of the earth cause small changes in the satellite orbits. It was already shown that ground stations constantly track the satellites. These control stations determine the corrections for both orbit and clock and transmit these to the satellites once a week. This implies that satellite positions may be calculated from an almanac which is almost a week old and possibly incorrect. For GPS applications where accuracy is of the utmost importance, the correct almanac is therefore applied afterwards to the raw satellite measurements (post-processing).
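To get a feel for why even tiny clock discrepancies matter, a quick back-of-the-envelope sketch (the one-microsecond figure is an illustrative assumption, not a value from the article):

```python
# A satellite clock error translates directly into a pseudorange error,
# because the receiver measures the signal's travel time and multiplies
# it by the speed of light.
C = 299_792_458  # speed of light in m/s

def range_error(clock_error_s: float) -> float:
    """Pseudorange error (metres) caused by a satellite clock error (seconds)."""
    return C * clock_error_s

# An uncorrected clock error of one microsecond already corresponds
# to roughly 300 metres of range error:
print(round(range_error(1e-6), 1))  # ~299.8 m
```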
Selective Availability
Shortly after the GPS system was completed, tests showed that it functioned better than expected. Instead of the predicted precision of 50-100 meters for the civil signal (C/A code, Standard Positioning Service), the results were in the order of 10-20 meters. Although these results were very positive in a scientific sense, the American government felt they were a threat, mainly because all users could calculate positions with a precision almost equal to that of the military signal (P-code, Precise Positioning Service). It was thus decided in 1989 to introduce errors in the C/A-coded signals, bringing the precision artificially back to 50-100 meters. This signal degradation was called Selective Availability (SA) and was in use for over a decade, with the exception of the first Gulf War in 1991, when the American army did not have enough military GPS receivers for their
own troops. On the first of May 2000, President Clinton declared that, as a result of the broad use of GPS and DGPS, there was no further need to continue SA, and it was switched off. The switch-off was conditional, however, with the reservation that SA could be reactivated in times of emergency. To this day SA has remained switched off, even after the events of September 11.
Troposphere
The earth's atmosphere consists of a number of layers, the troposphere being the lowest layer (up to a height of approximately 13 kilometres), where the weather is formed. Since the GPS satellites orbit high above the earth, their signals need to cross the atmosphere before reaching our receiver. Factors like humidity influence the speed of light, and as such delay the GPS signals, resulting in travel-time errors in the order of tens of meters.
GPS receivers do employ an atmospheric model to correct for these delays. Local weather variations cannot be modelled, however, and will result in errors of meters in the pseudorange measurement. The amount of delay depends on the time it takes the signal to travel through the atmosphere, which in turn depends on the satellite's elevation above the horizon, see Figure 1. Satellites directly overhead cause the smallest error; as a rule of thumb, keep the elevation of the satellites used above 10° to 15° in order to reduce the potential error as much as possible. Another method by which atmospheric error can be reduced is the use of a multi-frequency receiver. The dispersive part of the atmospheric delay (in practice the ionospheric part; the troposphere itself is not frequency-dependent) depends on the frequency of the radio signal. If we measure the travel time on both the L1 and the L2 frequency, we can estimate this error to some degree. Most dual-frequency receivers use the P-code for correcting the atmospheric error. Since this code is transmitted on both frequencies (L1/L2) but has an unknown starting point, it cannot be used for determination of the absolute travel time. We can however take differential measurements, since the code starts at the same point in time on both frequencies.
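The elevation rule of thumb above can be illustrated with a crude first-order model that divides an assumed zenith delay by the sine of the elevation; the 2.4 m zenith delay is a typical textbook value, assumed here purely for illustration:

```python
import math

# First-order tropospheric "mapping": the slant path through the
# troposphere lengthens as the satellite sinks toward the horizon, so
# the delay grows roughly with 1/sin(elevation).
ZENITH_DELAY_M = 2.4  # assumed typical zenith tropospheric delay (metres)

def tropo_delay(elevation_deg: float) -> float:
    """Approximate slant tropospheric delay in metres."""
    return ZENITH_DELAY_M / math.sin(math.radians(elevation_deg))

# The delay grows quickly below ~15 degrees, which is why low
# satellites are usually masked out:
for el in (90, 30, 15, 10, 5):
    print(el, round(tropo_delay(el), 1))
```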
Figure 2: number of (predicted) sunspots for the current solar cycle. (Source: www.taborsoft.com)
Ionosphere
The ionosphere is the layer of the atmosphere reaching from 50 to 500 kilometres. The sun ionises the air in this layer, creating a layer of charged particles. A striking example of this ionisation is the polar light. The ionised particles delay the GPS signal, creating errors of up to 30 meters during the day or 6 meters at night. Large sources of ionisation are the so-called sunspots and related magnetic storms. Sunspots have an 11-year cycle, with the next peak occurring in 2011-2012, see Figure 2.
At the moment we are approximately at the minimum of the solar cycle. Ionisation is also stronger at times and in locations with a large amount of exposure to the sun (near the equator, around noon). With a small amount of ionisation the result will be measurement errors; when there is a lot of activity, the GPS signal can be influenced in such a way that reception is impossible, see Figure 3. When using DGPS systems, the effective range can be reduced by a factor of 2 to 4 as a result of solar activity. Ionospheric errors resulting from sunspots cannot be predicted, but the regular ionisation of the atmosphere can be predicted using an ionospheric model. A multi-frequency receiver can resolve these errors in the same manner as the tropospheric error.
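The dual-frequency correction mentioned above relies on the ionospheric delay scaling with the inverse square of the signal frequency. A sketch of the standard ionosphere-free pseudorange combination (the GPS L1/L2 frequencies are real; the range and delay numbers are invented for illustration):

```python
# Ionospheric delay is proportional to 1/f^2, so measuring the same
# range on two frequencies lets us eliminate it with the standard
# "ionosphere-free" combination.
F1 = 1575.42e6  # GPS L1 carrier frequency (Hz)
F2 = 1227.60e6  # GPS L2 carrier frequency (Hz)

def iono_free(p1: float, p2: float) -> float:
    """Ionosphere-free combination of L1/L2 pseudoranges (metres)."""
    g = (F1 / F2) ** 2
    return (g * p1 - p2) / (g - 1)

# Invented example: a true range of 20,000 km with a 10 m ionospheric
# delay on L1, and hence 10 * (F1/F2)^2 (about 16.5 m) on L2:
true_range = 20_000_000.0
p1 = true_range + 10.0
p2 = true_range + 10.0 * (F1 / F2) ** 2
print(iono_free(p1, p2) - true_range)  # close to zero: the delay cancels
```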
Multipath
Just as light is reflected by a shiny surface, radio signals can be reflected by things like the water surface, tanks filled with oil or water, but also by cars, ships or bridges. The reflected signals interfere with the signals received via the direct path, see Figure 4. The receiver may start using the reflected signal, which has a longer travel time, instead of the direct signal. As a result the position will be calculated incorrectly, shifting in the direction of the multipath source. Since multipath is hard to correct for, it is better to prevent it altogether. The first rule in preventing multipath is to keep the antenna as far away as possible from reflectors. Enlarging the elevation mask of the receiver can be of some help, as can changing the height of the antenna. A multipath error will last a couple of minutes and will disappear as soon as the signal is no longer reflected towards the antenna. Nowadays most professional GPS antennas have a built-in ground plate or choke ring, see Figure 6, which prevents the reception of signals reflected from under the antenna horizon.
User Errors
The main sources of error in GPS measurements are user errors or, as they are usually called, blunders. As a rule, blunders can be prevented by a consistent measurement strategy using as many control options as practically possible. Common blunders are: Measuring too close to objects, with either multipath or shielding of the horizon as a result. This results in a degraded
Figure 3: RTK GPS measurements in November 2001. The scale for X, Y and Z is 0.25 meters. The Kp index is an indication of the radio environment in the ionosphere (red = bad). (Source Kp index: http://www.sec.noaa.gov)
Figure 6: Position error through an incorrect choice of geodetic datum. In the example we read ED50 positions (centre). The WGS84 positions from the GPS receiver are some 180 meters away in terms of coordinates.
Figure 4: Through reflection of GPS signals a longer travel time is registered, resulting in position errors.
position that is difficult to detect. Large steel structures such as cranes or masts will shield the horizon just as a bridge or a tree does, a fact that is not always appreciated in the field; The use of height aiding without entering the correct antenna height above sea level. As was discussed in the previous article, the use of height aiding should be questioned these days, since under normal conditions sufficient satellites are available for a good positioning fix; Incorrect initialisation position after a cold start of the receiver. This will not result in an incorrect position, but in no position reading at all; Incorrect geodetic settings. GPS calculates all positions in WGS84 coordinates, but most receivers have the option to transform these to any other coordinate system for presentation on the screen. With most receivers the output message will however contain WGS84 coordinates. Errors resulting from the selection of an incorrect geodetic datum can be as high as hundreds of meters, see Figure 5.
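To see where offsets of this size come from, here is a hedged sketch of a simple three-parameter datum shift. The ED50-to-WGS84 translation values below are commonly quoted European averages and are assumptions for this illustration; real transformations are region-specific and often use seven parameters:

```python
import math

# Illustrative three-parameter (geocentric translation) datum shift.
# The translation values are approximate European averages for
# ED50 -> WGS84, used here only to show the order of magnitude.
DX, DY, DZ = -87.0, -98.0, -121.0  # metres (assumed, approximate)

def shift_geocentric(x: float, y: float, z: float) -> tuple:
    """Apply the assumed translation to geocentric ED50 coordinates (metres)."""
    return (x + DX, y + DY, z + DZ)

# The magnitude of the shift is on the order of the ~180 m offsets
# one sees when ED50 and WGS84 coordinates are mixed up:
print(round(math.sqrt(DX**2 + DY**2 + DZ**2), 1))  # ~178.4 m
```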
Quality Control
To gain insight into the quality of a calculated position, a number of quality control parameters are available in most GPS receivers. The most important one is probably the Dilution of Precision (DOP). The DOP describes the geometric strength of the satellite configuration, or in other words the spreading of the satellites around the horizon. When all satellites are on one side of the antenna, see Figure 7a, the receiver will calculate a high DOP value. There are a number of DOPs available, but for ordinary GPS positioning the Horizontal DOP (HDOP) and Geometric DOP (GDOP) are probably the most important ones. Next to the DOP, some receivers can calculate the so-called Line of Position Mean Error (LPME). This is an indication of the precision of the position itself and factors in other parameters, like the travel-time measurement. Some manufacturers present the user with a so-called quality figure that is said to indicate the precision of the position determination. This quality figure is usually calculated from
parameters like the HDOP and LPME. As a rule one should treat these figures with due caution, since the formula used to calculate them is generally unknown to the user.
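How the DOP follows from satellite geometry can be sketched as below; the azimuth/elevation values are invented, and real receivers use more elaborate formulations:

```python
import math

# Sketch of the GDOP calculation: each row of G holds the unit
# line-of-sight vector to one satellite (east/north/up components from
# azimuth/elevation) plus a 1 for the receiver clock term, and
# GDOP = sqrt(trace((G^T G)^-1)).

def los_row(az_deg, el_deg):
    az, el = math.radians(az_deg), math.radians(el_deg)
    return [math.cos(el) * math.sin(az),  # east
            math.cos(el) * math.cos(az),  # north
            math.sin(el),                 # up
            1.0]                          # receiver clock

def gdop(sats):
    G = [los_row(az, el) for az, el in sats]
    n = 4
    # A = G^T G (4x4)
    A = [[sum(row[i] * row[j] for row in G) for j in range(n)]
         for i in range(n)]
    # Invert A with Gauss-Jordan elimination on [A | I]
    M = [A[i][:] + [1.0 if i == j else 0.0 for j in range(n)]
         for i in range(n)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        p = M[col][col]
        M[col] = [v / p for v in M[col]]
        for r in range(n):
            if r != col:
                f = M[r][col]
                M[r] = [v - f * w for v, w in zip(M[r], M[col])]
    return math.sqrt(sum(M[i][n + i] for i in range(n)))

# Invented geometries: satellites spread around the sky versus
# satellites clustered in one quadrant.
spread = [(0, 60), (90, 30), (180, 30), (270, 30), (45, 15)]
cluster = [(30, 40), (45, 50), (60, 45), (50, 30), (40, 60)]
print(gdop(spread) < gdop(cluster))  # the clustered sky view gives the worse DOP
```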
Summary
From this article it can be seen that a large number of error sources influence GPS position determination. We should take these error sources rather seriously when performing high-quality GPS measurements. A number of the errors described in this article can be corrected using DGPS, which will be described in the next article.
Huibert-Jan Lekkerkerk (hlekkerkerk@geoinformatics.com) is a freelance writer and trainer in the field of positioning and hydrography.
Figure 7: the Dilution of Precision is high (a) when all satellites are on one side of the antenna and low (b) when there is an even geometric spreading of the satellites.
Column
If you look at maps on the Web, which I tend to do quite a bit, and not only because I have to, I notice that I take less time to look at them. It seems those maps are more businesslike: they have a purpose (just as old maps did, by the way), but you have to interact to reach your goal. You have to pan, zoom, and switch layers on and off to get that (often small) part of the map on the screen that fulfils your purpose. These maps are very efficient for that purpose, and some of them are even very well designed and can be classified as beautiful. And from a technology perspective the approach to the solutions is very clever, using the latest features of, for instance, Flash or SVG. Still, because these maps are so 'fast', so to say, your appreciation is different from when you look at maps at an exhibition. Today's mapping technology is no longer an excuse not to create good graphic quality. However, it is also a matter of design. Many of the early web maps were interesting in functionality but poor in design. This was partly because the maps produced were just bitmaps, and partly because the producers were already happy with the fact that their technology worked. Nothing new, you will say. Indeed, looking back at map production history, with the introduction of new technology, be it lithography or the plotter, the first results were always a step backwards before one could really benefit from the new technology. In this respect it is interesting to page through the ESRI map books from the last decade. These show quite an improvement. The only problem is the page size of the map book, where beautiful designs of large maps are reduced to stamps. So size is not just a Web problem. That is why some have started making online map galleries. It doesn't solve the size problem, but it allows interactivity. Here I would like to draw your attention to a recent initiative to start a journal that focuses on maps as
such. The editor rightfully claims that in today's publishing world there are limits to the use of colour, let alone if one would like to publish maps at a size other than the journal page. Have a look at the Journal of Maps at www.journalofmaps.com and see if you can enjoy the maps published. Another example where we can witness improvement, and where the joy of maps returns, are the web map services which help you find a location or get route information. MapQuest was one of the early players in this field. From a technological point of view it was quite something that you got a response within seconds for every question asked. The result: millions more maps than ever are produced. The quality of the maps? It is poor, because virtually no attention was paid to design. Today MapQuest hasn't changed that much, but there are new players. Let's pick one randomly: Google Maps. The maps one can produce with Google
Maps are automatically well designed, crisp and clear at any zoom level. They are also an example of how maps can be fully integrated into the whole idea behind the search engine. The whole Internet is a link to their maps, which in Europe are currently limited to Britain only. And interestingly enough, it is possible to create and build your own maps using the Google maps as a base map. Pleasant to see, yes, but considering the influence of the company, all maps will look like Google maps, and that again will limit our pleasure in viewing. This makes me think of the ESRI maps in our geo-community. One could always easily recognise them because of the typical legends they applied for line features, like the zig-zag line for roads. Developments in our geo-world have their ups and downs, but if you understand the environment in which our maps have to play a role, I guess we can recognize, appreciate and enjoy the beauty of maps.

Today's mapping technology is no longer an excuse not to create good graphic quality. However, it is also a matter of design.

Prof. Dr. Menno-Jan Kraak (kraak@itc.nl) is a scientist at ITC, the International Institute of Geo-Information Science and Earth Observation, Department of Geo-Information Processing, Enschede, the Netherlands.
Educational Corner
Learning anywhere.
More Fun
Audio guides in museums already provide information exactly where you need it. This is more fun than teaching yourself in advance or reading through some leaflets. More fun usually means better attention; but besides the motivational aspects, building a spatial, and thereby visual or haptic, connection between a learning object and the learning content makes for better cognition and remembrance. This concept of adapting educational content to the learner's position is called Location-based Learning (LBL). Obviously, this requires cooperation between scientists and practitioners from both the spatial information and pedagogy fields. The following paragraphs are dedicated to finding a definition of LBL and describing its characteristics, as well as some research projects. Finally, a description of an LBL-enabling platform is given.
What is LBL?

LBL does not necessarily imply being mobile. The term location-based is not really location-technology-oriented but emphasizes the spatial and cognitive relationship between a learning object and the appropriate content. Our museum scenario also works with fixed units in front of the learning object, to profit from the cognitive advantage of actually sensing the art work while getting the relevant information; no locational technologies are needed for that. Establishing a computer system in front of the art work and creating an easy-to-handle user interface should do. But there is another side to it, which promises to be more fun by taking mobility into account. With more precise positioning technologies and more powerful mobile devices available, this concept can be extended to the outdoor world. Possible scenarios are users on the move in a nature reserve while learning about plants, land use, climate and so on, in combination with a game platform. Also, travel parties can experience a virtual guided tour in combination with some tutorial support, to get away from the passive listener's role. These use cases are a form of mobile eLearning (Mobile Learning), which is didactically related to situated learning, with positional information being part of the situational context of the learner. At first glance, these scenarios sound very familiar to geographers' ears and resemble Location-based Service applications.
Another Location-based Service?
However, mobile Location-based Learning (mLBL) is different from a Location-based Service. Surely, both focus on providing information to the user related to his or her position and other context information. But mLBL is more aimed at what the user actually does with the information. You could say it is a marriage between Ubiquitous Computing and Ubiquitous Learning. So a technology which supports mLBL requires a mobile workbench which allows for cooperation with other learners and for solving tasks, as well as collaboration tools to work on the same exercise, and tutorial support. Of course, these requirements are use-case specific, since a school class has other requirements on a learning platform than tourists do. But generally, learning means more than offering an information platform. However, it is yet to be proven whether this is an idea with a long-term perspective.

Components of an mGES.

Research Projects

It is no use investing in positioning technologies and a different infrastructure for mLBL if the whole concept of positional learning with mobile devices does not prove to be an educational success in the long run. The main focus of mLBL is teaching and learning. There are already a few research projects focusing on the didactic and technical implications of location-related learning. These are learning scenarios like making first-termers familiar with their campus environment or leading pupils through a nature reserve. Under the auspices of Nesta Futurelab, Bristol, a research project was carried out with the didactical focus of connecting mobile gaming with collaborative, self-controlled and experimental learning. Mobile technologies should allow for learning scenarios outside the classroom. A mobile gaming environment for 7th formers was established to study animal behaviour. The settings were visualized by abstract PDA maps which were enriched with specific game information. In a separate control room the game activities were presented on an interactive whiteboard. One of the results was that instabilities experienced while using GPS often complicated execution of the game. But again, new technologies like SiRF III-based systems can open up new opportunities.

Motivational Effects

The projects mentioned above all have in common that they focus on groups of pupils or students of a homogeneous age group and put an emphasis on gaming and collaboration. Gaming is known to have strong motivational effects on learners with an extrinsic learning motivation, like being obliged to learn what is important to pass a test. Learning by collaboration is also considered to have positive effects on memorizing learning content, as well as on the development of social behaviour. Some of the first results indeed point to the fact that collaboration and gaming are also very attractive factors for learners when exploring internal or external environments. So far, these projects have a clear research focus.

Nature Reserves

Another example of an mLBL scenario covering environmental learning would be to use the infrastructure, educational content and pedagogical expertise already available in nature reserves. A lot of people learn about the environment by reading information on huge presentation boards. A survey undertaken in some nature reserves in Germany has shown that learning about environmental issues usually doesn't take place in this setting, and therefore the positive effect of sensing a learning object while learning is absent. Fixed computer systems are used in nature reserve centres, and of course the good old presentation board is used, although some reserves have quite ambitious projects to get visitors acquainted with the natural environment. In future learning scenarios, RFID tags on natural phenomena like special trees or geological features could send information to the learner's device, which can then be used, for instance, to perform a problem-solving task or to add new data to a specific location. Since RFID requires no visual connection between the tag and the reading device, the learner can even be guided to discover some phenomena on his own. Because the information can be sent to many devices at once, group learning is also feasible. Thereby, spatial information technology can really support didactically sound learning.
Visualizing
Environmental learning is a prominent candidate for mLBL. Visualizing ecological processes is crucial for understanding. Refining maps with own primary or secondary measuring data is a typical exercise for students of related disciplines. Combining those maps
April/May 2006
53
Educational Corner
with real learning content to give background information on the natural environment to the student and allowing for explorative on-site learning might solve the gap often experienced between theoretical knowledge from the library and field-trips. Especially, three aspects of the examples described above seem to make mLBL an interesting playground for spatial information scientists and practioners: 1) choosing the right positioning technology for specific learning scenarios, 2) creating application-specific maps which include learning content and 3) integrating didactic requirements into a geographic platform. These requirements could be covered by a mobile Geographic Educational System (mGES) which would serve as a base to provide for mLBL-scenarios.
mGES
Technological requirements still pose major challenges for Location-based Learning environments. This means that outdoor learning scenarios with an audio guide automatically adapting to your position are still some steps away. But with innovation cycles as fast as they are, let's not talk about hardware devices, but focus on the features you could expect from a platform that integrates spatial information and educational requirements. A mobile Geographic Educational System (mGES) of this kind would combine the known features of a learning platform with the components of a GIS. When exporting raster or vector maps and attribute data to a mobile device, why not export georeferenced learning content as well? Creating application-specific maps and handling large amounts of geodata would be the job of a spatial information expert, as would running spatial analysis tools and supporting mobile devices with various positioning technologies. The eLearning expert would be responsible for designing the layout of an mGES from a didactic point of view. Among other things, this would cover defining tools for tutorial support, collaboration, task-solving and evaluation. Before deciding on the functional scope of an mGES and the relevant tool support, it is crucial to define the spectrum of application scenarios which can be covered by mLBL, as well as the appropriate target groups. There do not yet seem to be clear requirements or definitions of how an integration of educational and geographic content can be achieved. Beyond the functional or tool-oriented point of view, creating didactically sound learning material adapted to the user's situational context requires cooperation between spatial information and pedagogic scientists and practitioners. Maybe it would be a good starting point to experiment with Google Earth as a base system for an mGES.
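To make that experiment concrete, georeferenced learning content could be handed to Google Earth as ordinary KML placemarks whose descriptions carry the exercise text. The sketch below is purely illustrative: the function names, coordinates and task text are invented here, and a real mGES export would of course add media links, styling and evaluation hooks.

```python
from xml.sax.saxutils import escape

def exercise_placemark(name, lat, lon, task):
    """Build a KML Placemark whose description carries a learning task.

    Note that KML orders coordinates as longitude,latitude[,altitude]."""
    return (
        "<Placemark>"
        f"<name>{escape(name)}</name>"
        f"<description>{escape(task)}</description>"
        f"<Point><coordinates>{lon},{lat},0</coordinates></Point>"
        "</Placemark>"
    )

def kml_document(placemarks):
    """Wrap placemarks in a minimal KML document Google Earth can load."""
    return (
        '<?xml version="1.0" encoding="UTF-8"?>'
        '<kml xmlns="http://www.opengis.net/kml/2.2">'
        "<Document>" + "".join(placemarks) + "</Document></kml>"
    )
```

Opening such a file in Google Earth would show the task when the learner clicks the placemark - already a crude form of location-attached learning content, without any custom platform at all.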
What's Next?
An interesting task would be to apply existing research results and integrate them into new application scenarios. Mobile Learning applications already have quite a history, for mobile workers or as a component in Blended Learning scenarios which use mobile devices as a supplement to existing face-to-face courses. An extension of those projects towards mLBL could provide additional value. So, starting from our environmental learning scenario, one possible application setting would be an export solution from an mGES to a mobile device equipped with positioning technology. First of all, it has to be decided who should learn what: first formers simply need different learning material than sixth formers. From there, the necessary content has to be identified and adapted to the specified learners via a target group analysis. Here, nature reserves are an excellent starting point, since they often have a mission to educate and possess excellent learning content. Then, media files such as audio and video have to be generated and connected to the learning content, and appropriate learning exercises and evaluation mechanisms have to be added as well. These Knowledge Nuggets can then be exported to the mobile device. So far this is business as usual, but the geographical value comes from the fact that this knowledge would be georeferenced and would include maps to visualize content and to show the positional data of the user. This scenario could be augmented by data pushed to the user's device while exploring the natural environment. By offering analysis tools and tutorial support, it can be ensured that the new knowledge is understood and worked on, to allow for good educational results.
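The "push while exploring" part of this scenario boils down to a proximity test between the learner's GPS fix and each georeferenced Knowledge Nugget. A minimal sketch follows; the KnowledgeNugget structure, its field names and the 50 m trigger radius are invented for illustration, while the distance calculation itself is the standard haversine formula.

```python
import math
from dataclasses import dataclass, field

@dataclass
class KnowledgeNugget:
    """A georeferenced unit of learning content exported from an mGES."""
    title: str
    lat: float                # WGS84 latitude in decimal degrees
    lon: float                # WGS84 longitude in decimal degrees
    media: list = field(default_factory=list)  # e.g. audio/video file names
    exercise: str = ""        # task text presented to the learner

def distance_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two fixes (haversine formula)."""
    r = 6371000.0  # mean Earth radius in metres
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def nuggets_in_range(position, nuggets, radius_m=50.0):
    """Return the nuggets close enough to the learner's position to be pushed."""
    lat, lon = position
    return [n for n in nuggets if distance_m(lat, lon, n.lat, n.lon) <= radius_m]
```

A device-side loop would call nuggets_in_range on every GPS update and present any newly triggered nugget; given the unstable GPS fixes reported by the Nesta Futurelab project, the trigger radius would need to be generous.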
Summary
As the preceding text has shown, a combination of spatial information technologies and educational applications can provide for mobile Location-based Learning scenarios. The results of mLBL research projects can already be used to find successful configurations for a mobile Geographic Educational System, which could serve as a standard platform for mobile geographic applications with an educational focus.
Anja Kipfer (Anja.Kipfer@eml.villa-bosch.de) is a Geographer and has a Master in Educational Media. She works as a Software Developer in the Spatial Information Department of the EML GmbH, Heidelberg.
Product News
High density point cloud from phase-based Leica HDS4500 scanner in PDMS.
Key features and capabilities of Leica CloudWorx 1.0 for PDMS include:
- Measure - using PDMS' own measuring tools;
- Automated clash checking - using PDMS' built-in clash management and reporting tools;
- PDMS Design Point (D-Point) placement - at pick point or centre-of-pipe, D-Point placement lets users create intelligent as-built models directly in PDMS using catalog components and objects;
- Easy point cloud management - by Scan, Limit Box, Cutplane slices and sections, HideRegion;
- Support for a variety of laser scanners - including native data formats from Leica Geosystems scanners;
- CloudWorx toolbars - access to CloudWorx operations;
- Visualization of a new design concept directly in context with reality.
Internet: www.leica-geosystems.com/hds
By means of a web conference, and a press release just before this virtual gathering in mid-March, Bentley wanted to draw attention to remarkable news indeed: MicroStation has been connected to the Google Earth service. Joe Croser, global marketing director, Bentley platform products, and Ray Bentley, lead developer, explained that this means infrastructure projects can be viewed in context. Adding models, 3D viewing, zooming in and out, turning local information and reference files off and on, turning on levels like parking or roads - all this is possible. The MicroStation-Google Earth connection means that CAD and GIS data are combined (the user first having to register CAD files for Google Earth to recognize them), which - I think everybody will agree - is quite an interesting aspect.
Leica TPS1200 total stations creating a GNSS SmartStation. Source: Leica Geosystems
Internet: www.leica-geosystems.com
Leica GeoMoS is a solution for multi-sensor structural monitoring using a range of high precision geodetic instruments from Leica Geosystems and third-party sensors. Version 2.0 of the software provides a step forward in secure data replication, synchronization and post processing. The new Leica GeoMoS Server, introduced in Leica GeoMoS 2.0, is
Thales introduced the GPSDifferential Module, a software extension for MobileMapper CE that seamlessly adds the power of post-processing to virtually any mobile GIS/mapping software application. With the GPSDifferential Module, sub-meter and even sub-foot mapping are achievable where real-time corrections are not available, or in the difficult signal environments encountered in applications such as forestry. The GPSDifferential Module fully integrates into third-party mobile GIS software applications. Behind the scenes, and without interrupting the normal workflow, it automatically logs the raw data required for reliable sub-meter post-processed differential corrections. In certain conditions, and with an external Thales GPS precision antenna, sub-foot accuracy can be consistently achieved. The Thales GPSDifferential Module is a business partner and software integrator tool that includes MobileMapper Office software for post-processing. When the GPSDifferential Module is integrated into a GIS application, raw data collection functions can be accessed that store the data in a separate raw data file later recognized by MobileMapper Office. This affects neither real-time data collection and storage nor the structure of the real-time data format. The raw data file stored by this software extension can be post-processed using MobileMapper Office, and post-processed data is exported with attributes in industry-standard formats. Software integrators find that the GPSDifferential Module merges easily with their own solutions, without the need to change the data structure to support post-processing or to create office-based post-processing software. Source: Thales Internet: www.thalesgroup.com/navigation
Trimble introduced the latest in a long line of GPS timing receivers, the Acutime Gold GPS smart antenna. Slightly larger than a baseball and housed in a rugged, environmentally sealed enclosure, the Acutime Gold provides a pulse-per-second (PPS) output syn-
Sokkia Series30R
Bluetooth wireless communication is now available for Sokkia's Series30R total station line, providing cable-free communication with data collectors. The Sokkia Field-info Xpress (SFX) function, fitted as standard, enables data transfer via the Internet using mobile phones. As the Series30R's Bluetooth wireless communications module has a dial-up function, SFX can be used without cables via a mobile phone with Bluetooth technology. This latest version of the Series30R is also equipped with enhanced surveying programs. The Sokkia Series30R total station line offers an IP66 level of dust and water resistance, and distance measuring of survey-grade accuracy, +/-(3 + 2 ppm x D) mm, from 30 cm to 350 m (Class 3R models). Source: Sokkia Internet: www.sokkia.net
Boeing has released version 4.1 of its SoftPlotter digital map production software, enabling users to deliver more accurate and more efficiently produced digital mapping products to their defense and commercial mapping customers. DigitalGlobe QuickBird sensor support for panchromatic and multispectral imagery offers SoftPlotter users full, rigorous sensor
translators are included, and Visual Basic workflow wizards provide streamlined workflow setups for batch processes. Of interest to KDMS users, SoftPlotter 4.1 provides a COM interface callable from macros and a database interface for collection of fully attributed vector map data.
NavCom Technology introduced the newly enhanced VueStar aerial survey solution, combined with its new StarPac utility software, which facilitates better integration into pre-existing workflows. The VueStar complete global navigation system is configured specifically for all aerial sur-
The Pictometry Viewer ActiveX control enables third-party software vendors and system integrators to embed Pictometry's oblique imagery and analytical tools directly into end-user applications. Using Pictometry's ActiveX control, third-party software companies can now integrate their own customized version of Pictometry's software tools, similar to Pictometry's own Electronic Field Study (EFS) software, without having to leave their native application environment. This gives users access to Pictometry's software functionality inside a third-party application. The company has been working with several technology and business partners to test, implement and deploy its ActiveX control in other geospatial-related systems. One of the first solutions in which Pictometry has successfully integrated its ActiveX control is ESRI's suite of GIS products, including ArcIMS and ArcGIS. Sample screen captures of Pictometry technology in ESRI and microDATA GIS solutions can be viewed at www.pictometry.com/pressrelease/activex.asp. Companies interested in partnering with Pictometry and using its ActiveX control can contact Pictometry Vice President Scott Sherwood. Source: Pictometry Internet: www.pictometry.com
Leica Geosystems introduces a new version of its GPS Spider software and SpiderWEB V1.3, a web-based solution for the distribution of GPS reference data via the Internet. Among the software optimizations, such as further improved data processing for network RTK, the graphical user interface of the new Spider software has been enhanced with consistent map views, now supporting loadable background maps and a graphical continuous raw data status view. In view of GPS
Industry News
Irish Dept. of Agriculture & Food's SPS Uses eSpatial Spatial Technology

eSpatial has announced that the Irish Dept. of Agriculture & Food's recent award-winning Single Payment System (SPS) uses spatial technology based on eSpatial's iSMART platform. The Dept. of Agriculture & Food, whose range of customers encompasses government departments and individual citizens, deployed the SPS Application, which recently won the award for best project in the Government to Business category of the Irish "Innovation Through Technology Awards". The winning project reused the iMAP solution built by eSpatial and Accenture, the international management-consulting and technology services company, and integrates application-form processing data with geospatial information, enabling the department to manage and process Single Farm Payment applications and payments in an integrated, web-based environment. The spatial component of the system is built on eSpatial's iSMART technology, which draws upon the spatial capability within the Oracle 8.1.7 Database and Oracle9i Application Server.

www.espatial.com

Landmark Building Added to Ordnance Survey's Data: Mapping Assembled for Senedd

The Senedd - the National Assembly for Wales' new debating chamber - now features on the nation's most detailed maps. The outline of the Cardiff Bay building is set out in fine detail in OS MasterMap. The £67 million building, designed by Lord Richard Rogers, hosted its first debate and First Minister's questions last week, and was officially opened by the Queen on St David's Day, 1 March 2006. The Senedd - 'parliament' or 'senate' in Welsh - uses natural wood and slate and is designed with energy efficiency in mind. It has won praise for its construction, winning the Building Research Establishment's highest award for sustainable construction.

www.ordnancesurvey.co.uk

www.ermapper.com

www.nttdocomo.com

ESA Joins Forces with Japan on New Infrared Sky Surveyor

A high-capability new infrared satellite, ASTRO-F, was successfully launched on 21 February by the Japan Aerospace Exploration Agency (JAXA). In a collaborative effort involving ESA and scientists across Europe, the spacecraft is now being prepared to start its mapping of the cosmos. Orbiting the Earth, ASTRO-F (to be renamed Akari, 'light', now that it is in orbit) will make an unprecedented study of the sky in infrared light, to reveal the distant phenomena hidden from our eyes that tell the story of the formation and evolution processes taking place in the universe. ASTRO-F will be in polar orbit around the Earth at an altitude of 745 kilometres. From there, after two months of system check-outs and performance verification, it will survey the whole sky in about half a year, with much better sensitivity, spatial resolution and wider wavelength coverage than its only infrared surveyor predecessor, the Anglo-Dutch-US IRAS satellite (1983). The all-sky survey will be followed by a ten-month phase during which thousands of selected astronomical targets will be observed in detail. This will enable scientists to look at these individual objects for a longer time, and thus with increased sensitivity, to conduct their spectral analysis. This second phase will end with the depletion of the liquid helium needed to cool the spacecraft telescope and its instruments down to only a few degrees above absolute zero. ASTRO-F will then start its third operations phase and continue to make observations of selected celestial targets with its infrared camera only, in a few specific infrared wavelengths.

www.esa.int

ESRI Book: Remote Sensing for GIS Managers

Hundreds of illustrations and examples merge in Remote Sensing for GIS Managers, a new book from ESRI Press that reveals the power of interpreting information gathered from aerial photography, radar, satellite and other remote-sensing methods. Readers will travel from the vast ocean depths to the far reaches of outer space as they learn everything from the basics of remote sensing to the challenges of interpreting, managing and storing the ever-increasing range of remotely sensed data available today. Designed for new and experienced users, Remote Sensing for GIS Managers is written for GIS managers, professionals and students who want to become more knowledgeable users of remote-sensing services and manage the development of innovative solutions suited to the needs and goals of their organizations. The book's case studies illustrate the use of remote sensing in national security, urban and regional planning, resource inventory and management, and scientific disciplines ranging from forestry and geology to archaeology and meteorology. Remote Sensing for GIS Managers (ISBN 1-58948-081-3, 524 pages, $69.95) is available in bookstores and from online retailers worldwide, or can be purchased at www.esri.com/esripress or by calling 1-800-447-9778. Outside the United States, contact a local ESRI distributor; see www.esri.com/international for a current distributor list. Books published by ESRI Press are distributed by the Independent Publishers Group (tel.: 1-800-888-4741, Web: www.ipgbook.com).

www.esri.com
www.topconeurope.com
www.loy.co.uk/
Terra Digital Chooses Infoterra's Pixel Factory

Infoterra France announces the delivery of its Pixel Factory photogrammetric suite to Terra Digital, an advanced geo-service provider in Germany. The Pixel Factory, developed by ISTAR (now part of Infoterra France) over the last 15 years, offers the opportunity to rapidly generate a wide range of cartographic end products, such as Digital Surface Models, Digital Terrain Models, Ortho and TrueOrtho™ photos. According to Infoterra France, its high level of automation and multi-processor architecture help it to process vast quantities of raw data in outstanding time.

www.infoterra-global.com

...widespread adoption and use of GI. It highlights the continuing maintenance of the national georeferencing infrastructure on which the market in GI can develop, plus Ordnance Survey's focus on customers and collaborative work with government, business and other stakeholders. The strategy involves the goal of ensuring that Ordnance Survey data remains a key enabler supporting new demands and requirements throughout the information industry. A major component supporting joined-up information sharing is the ongoing development of OS MasterMap as a seamless geographic database compatible with accepted web standards and ordered through an online interface.

www.ordnancesurvey.co.uk

...tor for geographic information system (GIS) software in Iraq. Since 1989, the Republic of Iraq, with the help of Atlas for GIS and Surveying Systems, has continued the arduous task of updating the country's infrastructure maps while working toward the establishment of national information and GIS centers.

www.esri.com

People

New Petroleum Industry Solutions Manager Joins ESRI

Brian Boulmay has joined ESRI as Petroleum Industry Solutions manager. With more than eight years of experience in the petroleum industry, Boulmay comes to ESRI from Shell in Houston, Texas, where he was team leader for geo-information and GIS. To meet the diverse needs for geographic technology in this industry, ESRI is expanding its petroleum team with highly qualified GIS and petroleum professionals. During his time with Shell, Boulmay served on the Petroleum User Group (PUG) steering team, assisting with the PUG conference as well as leading the PUG 3D working group, which focuses on gridding, contouring and the 3D capabilities of ESRI software. At Shell, he led a team of spatial professionals and managed local and global projects, improving the company's use of spatial technologies. A recent project involved standardizing the GIS IT architecture globally for Shell.

www.esri.com
www.digitalglobe.com
www.geoeye.com
www.vianovasystems.co.uk www.vianova-systems.fr
www.vexcel.com
www.thalesgroup.com/navigation