
Viewpoint

Microsoft's Proposed Take-Over of Vexcel


Part of a Global Heavyweight Struggle
The news that the Microsoft Corporation is in the process of trying to acquire the Vexcel Corporation was first reported on 15th March 2006 by the Daily Camera, a local newspaper published in Vexcel's home town of Boulder, Colorado. Somewhat belatedly, this leaked news was confirmed by spokesmen for Microsoft and Vexcel a day or two later. All of this triggered an outpouring of comment on numerous Web sites from Microsoft "watchers" and "bloggers". It was only too apparent from the published comments that many of these "bloggers" simply had little or no idea of what Vexcel does and why it should be attractive to Microsoft. So the publisher invited me to supply my own thoughts and opinions on this projected take-over of Vexcel. It is very much a personal viewpoint!

A Commentary by Gordon Petrie

The Vexcel UltraCam D large-format airborne digital camera showing the multiple lens cones that produce its panchromatic and multi-spectral frame images. (Source: Vexcel).

I - Introduction & Background


The background to this proposed take-over of Vexcel - which has still to receive regulatory approval from the appropriate authorities in the U.S.A. and Europe - is the intense competition between Microsoft and Google regarding their respective Internet search engines, which are so fundamental to the success of the World Wide Web. Recently the competition between these two companies has moved into the area of Earth imaging, with both companies having identified such imagery - combined with the supply of the associated maps, location-based business data and travel information - as a vital component of the services being offered by their respective search engines. Besides these two heavyweight competitors, it is worth noting that Yahoo! may also join this competition, since it has recently introduced a beta version of its Yahoo! Maps software, see http://maps.yahoo.com/beta/. However, up till now, Yahoo! has not utilized imagery in this particular product. Still, it would be no surprise if it decided to do so.

I (a) - TerraFly, Skyline & Keyhole


There have been quite a number of interactive services that, for some years, have offered extensive ground coverage based on satellite and aerial images in combination with the local business information associated with each specific image. A prominent example is TerraFly, a service covering the U.S.A. offered by Florida International University (FIU) with support from IBM, NASA and the USGS, see http://terrafly.fiu.edu/. Skyline Software Systems has offered a similar service with maps located side-by-side with the corresponding satellite images, see www.skylinesoft.com/. Yet another similar service, entitled EarthViewer, was offered from 2001 onwards by the Keyhole Corporation, see www.keyhole.com/. This last service allowed customers to start from a world globe and to zoom rapidly, smoothly and in a very dynamic way to the specific area of interest, often at a very high resolution - down to street level within cities. It was aimed originally at business users who paid a quite substantial annual subscription of several hundred dollars for the service. Indeed the full "enterprise" version of the software for multiple users within a large organization cost $20,000. However, in 2003, Keyhole introduced a lightweight version of the software with somewhat reduced capabilities - called Keyhole LT - which was aimed at the consumer market and which retailed at some tens of dollars. Keyhole needed to populate the very large (multi-terabyte) image and map database required to support all its various products and services. To this end, it obtained aerial photography from AirPhotoUSA; satellite imagery from i-cubed and DigitalGlobe; and map and address data from Geographic Data Technology (GDT), now owned by Tele Atlas. At that time, the Keyhole service mainly covered the U.S.A. at high resolutions.

II - Google

In October 2004, Google announced that it had acquired the Keyhole Corporation. The two companies are both based in Mountain View, California, which meant that there was minimal disruption to the staff. Since then, matters have moved rapidly, with the various Keyhole products being renamed and integrated closely with Google's search engine and with Google Maps and Google Local. From the users' standpoint, the most notable feature is that the map data can be superimposed directly over the image data of the terrain. In June 2005, Google launched its Google Earth product based on the Keyhole technology. In particular, the lightweight version of the product was offered free for personal use through a simple download from Google's Web site without any need for the user to register. The previous enhanced, business and enterprise versions of the software - now called Google Earth Plus, Google Earth Pro and Google Earth Enterprise - still have to be paid for, though at a lower level than before. However the availability of the free version of Google Earth triggered a massive explosion of public and media interest and much favourable publicity.

The Google Earth database has continued to be developed at a rapid rate. From the specific point of view of Western Europe, many more image data suppliers have been signed up and coverage is now much more extensive than before. However the rest of the world outside North America and Western Europe is still very poorly served - at least in terms of high-resolution imagery.

An annotated perspective view of part of Chicago showing the skyscraper buildings in the downtown area of the city - as displayed on Google Earth. (Source: Google)

II (a) - @Last Software

Two days before the news broke about Microsoft's proposed take-over of Vexcel, another much smaller company, @Last Software - like Vexcel, also based in Boulder, Colorado - was acquired by Google. @Last's main software product is SketchUp, which allows the simple construction and modelling of 3D objects, especially buildings, from scratch. It has gained a substantial customer base among architects, graphic artists and game developers. Moreover two plug-in versions of SketchUp had also been developed (i) for use with ESRI's ArcGIS package and (ii) to develop 3D content for Google Earth. One presumes that Google decided to acquire @Last Software so that this valuable tool would not fall into the hands of a competitor, see www.sketchup.com/.

A perspective view of an area in which the buildings, roads and car parks have been constructed using the SketchUp software from @Last Software - an example from the SketchUp gallery. (Source: @Last Software/Google)

III - Microsoft

The Microsoft Corporation has been active in the field of supplying spaceborne and airborne imagery for some considerable time - very much longer than Google!

III (a) - TerraServer

This involvement began with its participation in the TerraServer project that started in 1997/98. Initially this was a collaboration between Aerial Images, Compaq and Microsoft. The Aerial Images company, through an agreement with the Sovinformsputnik company, supplied Russian SPIN-2 high-resolution space imagery as the initial baseline imagery to customers using TerraServer. Compaq supplied its high-powered Alpha servers to provide the considerable computing resources that were required for this on-line service. For its part in the partnership, Microsoft supplied a scaled-up version of its Windows NT software and its SQL relational database management system. Essentially, at that early stage, Microsoft viewed TerraServer mainly as a research project and test bed for the development of advanced database technology.

Over the next few years, the TerraServer project gradually changed. In particular, the ownership changed and Microsoft no longer has any stake in the TerraServer venture. Furthermore the main source of the imagery has also been changed. The baseline imagery is now the USGS aerial photography with a 1m GSD that covers almost all of the United States. Internationally the baseline imagery became the 15m GSD Landsat imagery, mainly supplied by EarthSat (now MDA Federal), and the 1km NASA imagery as processed by the Globe Explorer company. These systematic coverages are supplemented by (i) more scattered higher resolution aerial photographic image coverage, mainly of the United States, supplied by AirPhotoUSA, Sanborn and other commercial aerial photography providers and (ii) by satellite coverage supplied by DigitalGlobe. Topographic map coverage at scales ranging from 1:24,000 to 1:250,000 is also available from TerraServer, together with aeronautical charts at still smaller scales, see www.terraserver.com.

III (b) - TerraServer-USA

From 1998 till 2003, TerraServer was hosted by Microsoft's MSN Network, after which the agreement was terminated. Since then, Microsoft has been engaged in its own quite separate TerraServer-USA operation - see http://terraserver.microsoft.com/. In fact, this project started up in 2001. This service provides free access to a large part of the vast store of geospatial data of the USGS, including the USGS Digital Orthophoto Quadrangles (DOQs) and its Digital Raster Graphics (DRGs). TerraServer-USA also hosts the NASA data sets used by NASA's World Wind service. In May 2005, the TerraServer-USA operation was transferred from the Microsoft Research group to the company's MapPoint group. The latter group had for some time been selling its MapPoint map products for travel location, navigation and planning. On 30th June 2005, Microsoft's MSN Search rolled out a beta version of its Local Search service that allowed a MapPoint map or a TerraServer-USA aerial photo image to be displayed alongside the results of a search. On 25th July 2005, MSN released the beta version of its Virtual Earth product that fully integrated MSN Search, MapPoint's Maps & Directions and TerraServer-USA into a single application, see http://virtualearth.msn.com/. Like Google Earth, Virtual Earth is being offered free as part of MSN and it too has received a great deal of favourable comment. Virtual Earth is entirely Web-based and there is no need to download and install special software, as there is with Google Earth. With this product, Microsoft entered into direct head-to-head competition with Google, which had released the beta version of its Google Earth product on 23rd June 2005.

A screen shot taken from Windows Live Local, powered by Microsoft's Virtual Earth. The lower part of the screen contains an annotated aerial image of Seattle; the upper part shows the three views provided by the corresponding "street side" images. (Source: Microsoft)

III (c) - Content Provision for Virtual Earth

Needless to say, a main concern for Microsoft has been to provide systematic image coverage of large areas of terrain in order to underpin Virtual Earth. The availability of low and medium resolution imagery of large areas has not been the problem; this is largely taken care of by using Landsat imagery. The main problem is to get access to high resolution imagery, especially for large metropolitan areas. Thus Microsoft (like Google) has signed up a large number of aerial photographic service providers, as follows:

(i) Pictometry - On 23rd May 2005, Microsoft announced a 5-year agreement with Pictometry that will allow its geo-referenced oblique aerial photography to be used in Virtual Earth. From a West European perspective, it will be interesting to see if this agreement will be extended to cover Pictometry's European licensees - the Blom group, including Simmons Aerofilms (U.K.), CGR (Italy), GeoTec (Germany), FM-Kartta (Finland), Seficart (Iberia), etc.

(ii) ORBIMAGE - Shortly after this agreement with Pictometry was signed, Microsoft signed another similar 5-year agreement with ORBIMAGE for it to supply the global space imagery acquired by its OrbView-2 (low-resolution) and OrbView-3 (high-resolution) satellites for use in Virtual Earth. Since then, in December 2005, ORBIMAGE has taken over Space Imaging and formed the new GeoEye Corporation. Again it will be interesting to see if this means that high-resolution IKONOS space imagery will also be available for use in Virtual Earth.

(iii) EarthData - Yet another potentially important supplier of aerial imagery was signed up by Microsoft in December 2005 in the shape of the EarthData Corporation, one of the largest U.S. aerial mapping companies - again on the basis of a 5-year agreement.

However, notwithstanding the recruitment of all of these major suppliers of imagery, it is important to realize that they can only supply a part of the content that is needed to populate Virtual Earth. Thus Microsoft still needs to find other major sources of aerial image and space image data if Virtual Earth is to become fully operational, not only in the U.S.A. (which is its first objective) but world-wide. A complication with aerial imagery is of course that the image data obtained by service providers does not always remain with them - in many cases, the original films and digital data are delivered up to the customers, who then own the data. This is often the case with government-owned mapping organizations.

Another screen shot from Windows Live Local - in this case, showing the area of Fort Washington in New York City. This gives a "bird's eye view" of the area that utilizes the geo-referenced oblique photography supplied by Pictometry. (Source: Microsoft)

III (d) - GeoTango

Besides data, Microsoft required additional software for the display, manipulation and visualization of the image data held in Virtual Earth. To satisfy part of this requirement, on 23rd December 2005, it bought the GeoTango company based in Toronto, Canada. GeoTango had already developed various software packages for 2D and 3D content building and visualization, see www.geotango.com/. Its GlobeView software generates a Digital Earth that allows image data and location-based information from anywhere on the Internet to be streamed and displayed on the user's screen. Its SilverEye software is designed to ease the task of generating 3D models of high-value facilities and urban landscapes. SilverEye also allows the rapid collection and display of quantities such as distance, area, volume, slope or bearing to an acceptable standard of accuracy for many purposes using a single airborne or spaceborne image; thus it does not require the provision of a stereo-pair of images for this particular task. A third development at GeoTango is its SmartDigitizer software, which allows the semi-automatic feature extraction of lines and polygons to be carried out on remotely sensed imagery. In fact, SmartDigitizer has already been incorporated into PCI's Geomatica remote sensing image processing suite. Quite obviously all three of GeoTango's software packages could be utilized within Virtual Earth and in the Windows Live Local service that is powered by Virtual Earth.

An aerial image with a set of measurements (distance, area, height) superimposed over the image - produced by GeoTango's SilverEye software. (Source: GeoTango)

IV - Vexcel

With regard to Microsoft's proposed purchase of Vexcel, one can see immediately that certain parts of Vexcel's activities are very attractive to Microsoft - but not all. An overview of these activities was given in my profile of the Vexcel Corporation that was published in the December 2004 issue of GeoInformatics. My analysis (and opinion) of the potential value of these activities to Microsoft will be conducted under three main headings - geospatial data, software and hardware.

IV (a) - 3D Urban Model Data

As discussed above, the acquisition of image, map and terrain data to act as additional content for Virtual Earth would appear to be a high priority for Microsoft. In which case, particular attention would have been paid by Microsoft to the high-resolution 3D building and terrain model data sets that have been generated from stereo-pairs of aerial photographs by Vexcel's Mapping Services Division for numerous cities in North America and elsewhere. Although these data sets were generated primarily for use in the planning of cellular phone networks by telecomms providers, they have also been made available to other users off-the-shelf under Vexcel's Global Landscape product line. This 3D building and terrain model data and the associated aerial photographic data from which it was produced should be invaluable additions to Microsoft's data portfolio to be used in Virtual Earth, see www.vexcel.com/services/mapping/.

A perspective view of downtown Chicago based on the 3D terrain model and building height data produced from aerial photographs by Vexcel's Mapping Services Division. (Source: Vexcel)


IV (b) - Terrestrial Photogrammetric Software


Regarding Vexcel's extensive range of software, the most obviously useful package in the context of Virtual Earth is the FotoG close-range photogrammetric software package. This has been used extensively both in-house and, through licensing, by various outside research and industrial organizations (including NASA and General Motors) to produce 3D CAD models directly from film and digital photographs. In the context of Virtual Earth, this software could be extremely valuable in generating urban facades and models of buildings from photographs and video images taken at ground or near-ground level, see www.vexcel.com/products/crange/fotog/. In this particular context, it is also worth noting that the Facet Technology Corporation - a mobile mapping company - has been collecting comprehensive street-level imagery of many cities in the U.S.A. using multiple digital cameras mounted in its vans, which are equipped with GPS/IMU systems to provide the necessary locational information. Already Microsoft has included part of the Facet company's street-level coverage of Seattle and San Francisco in the beta version of Virtual Earth under the title Street Side Views. It is also worth noting that Tele Atlas is undertaking similar surveys in Western Europe, using the mobile mapping van system developed by the Polish GeoInvent company - which Tele Atlas has bought. However, up till now, there has been no news of Microsoft making use of this European imagery.

IV (c) - Aerial Photogrammetric Software


Turning next to Vexcel's aerial photogrammetric software, most of its UltraMap WorkSuite was acquired when Vexcel bought the Canadian ISM company in 2004. This WorkSuite includes the DiAP digital photogrammetric workstation (DPW) and various complementary triangulation and orthophoto programs. Of course, this is a useful state-of-the-art photogrammetric suite that is being used by customers for Vexcel's airborne digital cameras as well as by the former ISM user community. However it is difficult to envisage Microsoft going in for the mass production of orthophoto, map and terrain elevation data on the scale needed to populate Virtual Earth. To outside observers, Microsoft is much more likely to purchase or license the required data from the numerous existing photogrammetric service providers. Still, the UltraMap software and its associated archiving facilities are there if needed for specific purposes or projects, see www.vexcel.com/products/photogram/diap/.


IV (d) - Remote Sensing Software


Vexcel's other software offerings are concentrated heavily on a quite different area - namely the advanced processing of remotely sensed satellite image data and especially the processing of SAR (radar) imagery, for which it is renowned internationally. This has included the extensive use of this software in the Radarsat Antarctic Mapping Project (RAMP) and in the Shuttle Radar Topography Mission (SRTM), both of which have involved the mapping of enormous areas based on SAR image data. Undoubtedly these are very valuable software products and might be useful to Microsoft in specific circumstances and projects. However, again as seen from the outside, Microsoft seems much more likely to be interested in the RAMP and SRTM data sets than in the software that has been used to create them, see www.vexcel.com/products/remote/.

IV (e) - Aerial Photogrammetric Hardware

Like the software, Vexcel's hardware products fall into two quite distinct and largely unrelated groups - used (i) for aerial photogrammetry and (ii) for satellite remote sensing respectively. The two principal aerial photogrammetric hardware products are the UltraScan high-precision photogrammetric film scanner and the UltraCam D large-format digital frame camera. The UltraScan has been an outstanding commercial success for Vexcel, with over 400 units sold. However, with the increasing sales of digital cameras, one may assume that sales of the UltraScan are now past their peak. In the case of the UltraCam, it is currently engaged in a fierce struggle with its competitors in the area of large-format airborne digital imagers - principally Intergraph (with its DMC frame camera) and Leica (with its ADS40 pushbroom scanner). Vexcel has more than kept its end up in this highly competitive market. One may assume that there are still more sales to come - though, in the end, there must be a finite number of customers who can afford to buy large-format airborne digital imagers at the present price of around $1,000,000 per unit, if a GPS/IMU system is deemed to be a necessary part of an overall system. See www.vexcel.com/products/photogram/.

IV (f) - Remote Sensing Hardware

Vexcel's remote sensing hardware systems are of a completely different character - comprising satellite ground receiving stations and their associated image processing systems. Again these are very high value items which Vexcel has sold and supported very successfully against strong competition - principally from MDA in Canada and from ViaSat and Datron (now part of L3 Communications) in the U.S.A. However it is difficult to see how the sale of satellite ground stations to other independent organisations can be of value to Microsoft in the specific context of its Virtual Earth operation, see www.vexcel.com/products/remote/.

A satellite ground receiving station that has been supplied by the Vexcel Corporation. (Source: Vexcel)

IV (g) - Overall Assessment


If the photogrammetric and remote sensing hardware systems that are currently being offered for sale by Vexcel are all state-of-the-art and fairly successful products, where will they stand if Microsoft's bid for the company does receive regulatory approval? Unlike the data sets and part of Vexcel's software portfolio, none of these hardware systems are an obvious fit into Microsoft's present wide range of activities. Indeed, in their editorial written on 29th March 2006 for the on-line edition of Directions magazine, Adena Schutzberg and Joe Francica expressed the view that ".... it is hard to imagine that Microsoft will want to be in the business of selling sensors and ground stations, so our guess, just now, is that these will be sold off in time". If this prediction is correct, then undoubtedly there will be potential buyers of these successful product lines. Still there are some other considerations to be kept in mind. As one might expect, the marketing manager of Vexcel, Jerry Skaw, sent out a letter by e-mail to all of Vexcel's customers concerning the proposed take-over of the company by Microsoft. It included the following statements: "... the acquisition is expected to bring with it resources and support that enhance our offerings and allow us to expand in ways that greatly benefit our current customer base ... We also plan to continue to develop, sell and support our current products - many of which will contribute directly to our new role. ..." The complete text of Skaw's letter is available at http://industry.slashgeo.org. However potentially much more significant in this particular context is the news that Professor Vincent Tao has been appointed Director of Virtual Earth. Prof. Tao holds the Canada Research Chair in Geomatics Engineering at York University in Toronto. He is a quite outstanding photogrammetrist and has been the main driving force behind the software developments at the GeoTango company that Microsoft purchased only three months ago. His appointment to the position at Microsoft appears to have been made very shortly after its purchase of GeoTango. This puts a quite different complexion on the whole take-over saga - both regarding the proposed acquisition of Vexcel in the first place and the potential to fully exploit all the different elements of its hardware and software portfolio outlined above. I look forward with great interest to Microsoft's future developments in this area with Prof. Tao as Director of Virtual Earth.

The following links may be useful with regard to Microsoft's Virtual Earth:
Windows Live Local (powered by Virtual Earth) - http://local.live.com/
Windows Live Local Community - http://local.live.com/community/default.aspx
Windows Live Local Technology Preview (Street Side Views) - http://preview.local.live.com/
Windows Live Local / Virtual Earth Blog - http://spaces.msn.com/VirtualEarth/
MSN Virtual Earth Groups Message Board - http://groups.msn.com/MSNVirtualEarth

Gordon Petrie (g.petrie@ges.gla.ac.uk) is Emeritus Professor in the Dept. of Geographical & Earth Sciences of the University of Glasgow, Scotland, U.K.


Article

ALK Technologies Winner of Navteq Global LBS Challenge


FleetCenter Integrates with CoPilot Live
Navteq, one of the global providers of digital map data for vehicle navigation and location-based solutions, recently announced the winners of the Navteq Global LBS Challenge. This global program challenged application developers to build innovative location-based services (LBS) that work with mobile phones or wireless handheld devices using dynamic positioning technology and Navteq maps. By Robin Wevers
Executives from wireless carriers, hardware and device manufacturers, venture capitalists and other influential players in the wireless industry served as the official judges and completed the judging in February during the 3GSM World Congress in Barcelona. The judges based their final decisions on each entry's commercial viability, unique functionality and ease of use. Overall, out of 141 registered applications from 23 European countries, FleetCenter was judged to have the superior solution when these parameters were aggregated.

CoPilot Live FleetCenter


FleetCenter integrates with CoPilot Live to provide integrated satellite navigation, tracking and fleet control using connected Windows Mobile-based devices. FleetCenter is developed in partnership with organizations that are already deploying the solutions. The products combine to provide businesses of all sizes with a combination of commercial-grade GPS navigation, real-time asset location-tracking, fleet optimization and management. Vehicle location data and other information is reported to FleetCenter via a mobile Internet connection, providing visibility of the locations of mobile assets in real time. Fleet managers can monitor particular groups of vehicles and filter by group, status or estimated time of arrival. FleetCenter integrates with ALK's CoPilot Live 6 navigation, which turns Windows Mobile-based phones, Pocket PCs and Symbian-based phones into portable satellite navigation systems, complete with turn-by-turn voice guidance, address input and street maps. CoPilot Live incorporates real-time services as standard and makes use of Internet-connected phones to combine navigation with location-specific information. In Germany the Bavarian Red Cross is deploying mobile devices equipped with CoPilot and will be using FleetCenter to manage vehicles at the 2006 World Cup.

ALK Technologies
The builder of FleetCenter, ALK Technologies, was founded in 1979 with headquarters in Princeton, New Jersey. The company develops solutions for corporate and consumer customers globally. ALK's CoPilot Live mobile GPS navigation solutions are available in Europe, North America and Australia as retail-branded products and as the basis for OEM navigation systems. The company's PC*MILER routing, mapping and fleet management solutions are used by over 22 thousand transportation, logistics and manufacturing companies worldwide. ALK's European operations are headquartered in Central London, with further offices in Germany and France. In total the company has over 120 employees.

The Award
The Navteq Global LBS Challenge was created to drive growth in the LBS industry by bringing together the key players in the LBS-wireless value chain. The Global LBS Challenge awarded a grand-prize winner and category winners. The grand prize winner was ALK Technologies, which developed CoPilot Live FleetCenter, a fleet-tracking, messaging, reporting and optimization application that integrates with the CoPilot Live mobile navigation solution to enable real-time asset visibility.



The Future of LBS


David Quin, Marketing Director of ALK Technologies Ltd., has high expectations of the LBS market: "The integration of navigation with real-time tracking and fleet management is a functionality that I believe is going to make fleet management accessible to businesses of all sizes. It is a massive growth area." He also says: "Integrated live services, for example real-time traffic information, and the ability to import databases and POI information relevant to your location or your route will become increasingly sophisticated and more accessible. As the quality and variety of content increases rapidly, location-specific content will become more compelling for users. Furthermore the increased variety of mobile devices with integrated GPS and high-speed mobile web connectivity means that connected GPS navigation solutions will become ubiquitous for consumers and businesses." Jennifer Fondrevay, Marketing Communications Director of Navteq Europe, expresses similar views: "The most important thing that is both happening and on the horizon is the integration of GPS chips into handsets. Operators in North America, South America and Asia have made the first move and are now beginning to launch services based on dynamic location in earnest. On the GSM side, we are seeing and hearing about the next generation of phones on broadband networks (UMTS and W-CDMA) also expected to include this integration. Cumulatively, it is creating a baseline and critical mass of location-ready devices, and application developers are emerging to provide the requisite solutions that will increase the data services for the wireless operators."

Location Based Information


What makes location-based information so important according to Navteq? Fondrevay's view: "Location-based information and mobility is the perfect marriage of technology and utility. End users are placed in the position of not only being able to make decisions based on where they are or where the things around them are, but also to modify and make new decisions, or even extend the decision, in real time. Location and mobility are perfectly aligned, mobility providing the need for real-time decisions and the location enabling these decisions to be geographically relevant." Quin's view on LBS: "Location-based and route-specific information is becoming accessible and meaningful enough to genuinely help businesses and consumers make informed decisions when travelling. Businesses will be able to achieve greater efficiency through being able to control their mobile team remotely and through fleet optimization. Consumers will have easy access to location-specific information when in an unfamiliar place or to avoid delays. Simply put, location-based information fully integrated with navigation makes being mobile easier."
Robin Wevers (r.r.wevers@freeler.nl) is a freelance writer of geo-ICT articles. More information can be found at www.alk.eu.com, www.alk.eu.com/copilot/cp_professional.asp or www.navteq.com.

FleetCenter integrates with CoPilot Live to provide integrated satellite navigation, tracking and fleet control using connected Windows Mobile-based devices.

Location Enabled Services


Fondrevay continues: "Navteq believes that the initial market emphasis will be turn-by-turn navigation solutions, because consumers inherently recognize the value of getting from here to there. Eventually, the focus will turn to horizontal services for which location will be the enabler and not necessarily the vertical service itself, like navigation." Fondrevay prefers to talk about location-enabled services instead of LBS: "Navigation will continue to be a part of the service offer, however, because even as you are becoming location-aware of the things, places and even people around you, you will always want to have at least the option to find your way 'there'. This is fundamentally why we believe the term LBS is a misnomer. The more descriptive term is location-enabled services, because there are very few services for which integration of location does not create incremental value, but also very few services whose core value proposition is location. The term LBS is a bit narrow in scope and potentially minimizes the role location plays; location-enabled services better reflects the future role location devices can play in people's lives."

Navteq Strategy
Navteq wants to continue to push the market to embrace location as a core, lifestyle-enhancing enabler. Their nominal role in the value chain is that of content provider at the beginning of the value chain, but they have developed relationships across the value chain. Fondrevay: "We have relationships with every segment in the wireless services value chain and this helps us to be more effective in evangelizing location. The Global LBS Challenge is one of the vehicles that we are leveraging to help move this along." The challenge, launched in 2003, is intended to stimulate the development of location-enabled applications in the wireless space. It drives new application development while surfacing developer talent and raising the quality bar. Navteq believes the challenge stimulates location awareness and brings LBS applications to the forefront.


Interview

The Future for Leica's Terrestrial Laser Scanning Business


Interview with Ken Mooyman
Ken Mooyman, a Canadian by birth (of Dutch extraction), was educated as a surveyor at Algonquin College in Ottawa, Ontario. During the early 1980s, he managed a GPS services business and a project management consulting practice. Later he worked for Trimble Navigation as the company's Director of Sales for Canada and the Western U.S.A. He then left Trimble and, for a short period in 1999, he worked as Sales Director for the Vicinity Corporation. However he quickly returned to the surveying instrumentation field, joining Cyra Technologies (based in the San Francisco area) as Vice-President of Sales. Shortly afterwards, the Cyra terrestrial laser scanning business was bought by Leica Geosystems. In 2001, Mr. Mooyman became Vice-President for European sales and support for Cyra's scanner products, based in Rijswijk in the Netherlands. He then returned to the San Francisco area business headquarters in 2003 as Senior Vice-President in charge of world-wide sales and support for what had by then become the High-Definition Surveying (HDS) Division of Leica Geosystems. Under his leadership and drive, the Division's revenues have grown spectacularly over the last two years. In recognition of his background and his successes with Cyra and Leica Geosystems, Ken Mooyman has just been appointed as head of Leica Geosystems' ground-based laser scanning business. By Gordon Petrie
Ken Mooyman, head of Leica Geosystems' ground-based laser scanning business, with examples of the company's HDS laser scanners in the background.

GP - Now that Leica Geosystems has been taken over by Hexagon AB, and given Hexagon's existing strengths in the metrology field, one might expect new product development to go towards very short-distance laser scanners that could be used indoors for metrology applications - for example, reverse engineering, prototyping and the measurement and recording of small objects. Will entry into this area be a long-term aim for Leica Geosystems and a strategic objective of Hexagon AB?
KM - One of the strategic objectives at Hexagon is to remain the leader in both the micro and macro measurement space. As the existing leader, we will continue to monitor the needs of our customers in all our markets and develop the required solutions. I think the trend will continue, and even accelerate, to move towards laser scanning as a mainstream tool in both these micro and macro markets. The benefits of accurate, high-definition as-built (as-is) information are not only valuable to an engineer designing a bridge retrofit, but also to an engineer who is reverse engineering an automobile part. Laser scanning in all markets for Hexagon, including those within Leica Geosystems, will continue to be a long-term aim.

GP - The former HDS Division of Leica Geosystems concentrated its efforts on the manufacture, sales and support of laser scanners that operated over medium distances between 1m and 25m (with the HDS4500 instrument) and up to 300m (with the HDS2500 and HDS3000 instruments). Does the newly formed Geosystems Division have an interest in extending its product range into the area of still longer range scanners - measuring distances up to 1,500m, like the ILRIS-3D instrument produced by Optech?

KM - There are really two aspects to this question. The first aspect is a historical one. Our founder was actually a civil/structural engineer who ran a large engineering and construction management company. In that business, he experienced at first hand the need for much better as-built (as-is) information, especially for plants and related structures, than that which was typically available using traditional as-built methods. He recognized that better as-built information allowed better retrofit design, which could significantly reduce construction costs and risks for retrofit projects. He also saw that this was a large, industry-wide problem. So this led to the development of high-accuracy scanning systems with a maximum range of about 300m. This turned out to be a sweet spot in the market in terms of the wide variety of applications for which users could benefit from the technology, and it has been one of the main reasons behind Leica Geosystems' commercial success thus far.

The second aspect is a looking-forward perspective. If you look at the fundamental characteristics of terrestrial scanning systems for much longer ranges, you will find a few inherent shortcomings that limit the types of applications for which the technology is really well-suited. One shortcoming is accuracy. At long range, the magnitude of the distance and angle errors combines with a much larger spot size and wider point-to-point spacing, such that high accuracy is just not achievable at very long ranges. Therefore, you see such systems being relegated to low-accuracy applications such as mining and terrain models. These are useful applications of the technology, but the cost of this technology today may be too high. If we can bring the right value to customers, we will certainly do it. The fact that all laser scanners are line-of-sight instruments imposes another practical constraint as you go out further. There are often obstructions between the scanner and the surface of interest, and things tend to get worse as you get further from the scanner. Even if a scanner could capture data at 500m range, you may not have a clean line-of-sight to the surface of interest. So, in practice, users have to move a scanner around a site, only getting 50m, 100m or perhaps 200m practical line-of-sight range. Yet another problem at longer ranges is the scanner's angle of incidence to horizontal surfaces. If the angle of incidence is too low, you cannot get a return, and if you do, it is just not very accurate. The bottom line is that when you look at longer range, wide-area applications, many users would begin to consider a combined solution of a Leica aerial scanner, such as the ALS50, and a terrestrial scanner such as the Leica HDS3000. This solution provides customers with the ability to map large areas and to augment points of interest with higher accuracy points.
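To make this range-scaling argument concrete, here is a minimal, purely illustrative Python sketch. The angular accuracy, beam divergence and exit-aperture figures used below are assumptions chosen for the example, not the specifications of any Leica or other scanner; the point is simply that both the lateral error caused by angular uncertainty and the laser footprint grow linearly with range.

# Illustrative only: assumed 60 microradian angular accuracy and
# 0.25 milliradian beam divergence, not the specs of any real scanner.

def lateral_error_m(range_m, angle_error_rad=60e-6):
    """Lateral position error caused purely by angular uncertainty."""
    return range_m * angle_error_rad

def spot_diameter_m(range_m, divergence_rad=0.25e-3, exit_aperture_m=0.01):
    """Approximate laser footprint on the target at a given range."""
    return exit_aperture_m + range_m * divergence_rad

for r in (50, 300, 1000):
    print(f"{r:>5} m: ~{lateral_error_m(r)*1000:.0f} mm angular error, "
          f"~{spot_diameter_m(r)*100:.1f} cm spot")

With these assumed figures, a 50 m range gives an angular error of a few millimetres and a spot of a couple of centimetres, while at 1,000 m the error grows to several centimetres and the spot to tens of centimetres - which is why very long-range terrestrial scanning tends to be confined to lower-accuracy applications.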


GP - In aerial photogrammetry, there is much interest in data fusion, especially combining airborne laser scanned data with the image data produced by digital frame cameras and pushbroom scanners. Is there a similar strong interest in data fusion among land and engineering surveyors? If so, do you foresee further developments on the instrumentation side with calibrated photogrammetric imagers such as digital panoramic cameras being integrated into ground-based laser scanners? Presumably this would require very close co-operation between your own unit and the photogrammetric side of your company's Geospatial Imaging Division, both regarding software as well as hardware.
KM - There are currently numerous examples of data fusion in the market. One example - combining digital photographs and clouds of points - has already become common practice in laser scanning. This is done either using cameras that are integrated directly into the scanner itself or using images from external digital cameras that are not attached to the scanner. There has been a lot of activity in the industry over the last year in this area, including our own Cyclone 5.4 release.
There are many different sensors to capture imagery or spatial information, such as total stations, GPS, digital aerial cameras and terrestrial laser scanners. The key is how to combine this information efficiently to provide intelligent data to users. Today, existing customers routinely use their total stations and GPS instruments to complement the data captured by laser scanning. I think the real key in the future is not with the sensors but with the software that can intelligently reference, measure, analyze and present this fused data. Software applications like Cyclone should not care about the type of sensor used to capture the information. The challenge from a vendor standpoint will be to continue to keep pace with the sources of new spatial and image information and make it easy for users to take full advantage of this information in their office software and in their networked and web environments.
As part of Hexagon, we are fortunate to add yet another dimension to the aspects of data fusion. As you know, Hexagon is expert in micro measurement technology. There are also data fusion needs and opportunities between the micro and macro worlds. In Hexagon, sharing R&D and other competencies is a key company value. We work closely within Hexagon to share development for both hardware and software efforts to bring more complete solutions to customers. Our customers can be assured of this.
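As a concrete illustration of one common form of this fusion - draping the colours of a digital photograph onto a cloud of scanned points - the short Python/NumPy sketch below projects 3D points into a calibrated image and samples a colour for each point. The function name and the camera intrinsics K and pose R, t are hypothetical placeholders; commercial packages such as Cyclone handle the registration and calibration workflow internally, so this is only a sketch of the underlying idea.

import numpy as np

def colorize_points(points_xyz, image_rgb, K, R, t):
    """Assign an RGB colour to each 3D point by projecting it into one photo.

    points_xyz : (N, 3) point coordinates in the world/scanner frame
    image_rgb  : (H, W, 3) photo from a calibrated camera
    K          : (3, 3) camera intrinsics; R, t : world-to-camera rotation and translation
    """
    cam = points_xyz @ R.T + t       # transform points into the camera frame
    in_front = cam[:, 2] > 0         # keep only points in front of the camera
    uvw = cam @ K.T                  # pinhole projection (homogeneous pixel coordinates)
    uv = uvw[:, :2] / uvw[:, 2:3]
    u = np.round(uv[:, 0]).astype(int)
    v = np.round(uv[:, 1]).astype(int)
    h, w = image_rgb.shape[:2]
    valid = in_front & (u >= 0) & (u < w) & (v >= 0) & (v < h)
    colours = np.zeros((points_xyz.shape[0], 3), dtype=image_rgb.dtype)
    colours[valid] = image_rgb[v[valid], u[valid]]   # sample the photo at each projected point
    return colours, valid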

GP - Before the take-over by Hexagon, Leica Geosystems had already moved its production of the ALS50 airborne laser scanner from Massachusetts to its main manufacturing plant in Heerbrugg, Switzerland. Now there is "talk" within the surveying industry that the production of the HDS3000 ground-based laser scanner will be transferred from the factory in California to Heerbrugg. Would you please like to comment on this "talk"? I am sure that the answer will be of much interest to your existing customers!
KM - Actually, I don't think most customers care too much about where products are manufactured, as long as they are produced with excellent quality and at fair cost. I think customers care more about local service and support, especially with a technology like laser scanning, in which support can be so critical to their success. So let me address what this move means from both of these perspectives. First of all, the talk you were hearing is correct and I am very excited about this move. With this move, we will be able to fully leverage Leica Geosystems' high-end, state-of-the-art manufacturing facility in Switzerland. The Switzerland team is exceptional at quality, cost-efficiency and long-term product serviceability. This is something that can be a challenge for a smaller manufacturing facility, and especially one that is located in the San Francisco Bay area, which has very high manufacturing labor costs. Although we are moving the main manufacturing line to Switzerland, we are keeping a state-of-the-art testing, calibration and repair facility in our San Ramon offices. So, customers in the Americas will still be able to get prompt service and support. Our European customers now also have access to a state-of-the-art European-based service, calibration and repair facility. Asian customers will have access to both of these facilities. A final benefit of the manufacturing move for our customers is that, without the distractions of periodically supporting manufacturing issues, we can better focus our San Ramon talents on product development (R&D), marketing, service and support. This will lead to better and faster innovations for our customers. In summary, the move allows the California team to focus on product innovation and service/support, and the Swiss team to do what our customers have come to rely on - produce high quality Leica Geosystems products and provide first class service and support.



An image of the interior of the central dome of the famous Hagia Sophia church in Istanbul, Turkey, that has been generated by a Leica Geosystems HDS laser scanner.


GP - Turning next to your market leadership in ground-based laser scanners, what do you feel are the main factors that have produced this success? And where does your main competition come from - does it vary (i) from one regional market (North America, Europe, Asia, Australia) to another; (ii) within different industries (surveying and non-surveying applications); and (iii) according to the level of knowledge and sophistication of your existing users and your potential customers?
KM - Well, it is no accident that we have such a strong leadership position in the industry. There are really several key factors. (i) One is having been an industry pioneer - we were able to get off to a fast start right out of the gate. Together with early adopters, we created the growth of this market. Today scanning is already accepted as the preferred solution in a number of market segments. (ii) A second factor is that I think we have made some sound strategic decisions. We offer state-of-the-art software and hardware. The micro-chip laser offers the cleanest dataset and that helped to improve accuracy. We have the exclusive right to the two patents on the use of micro-chip lasers. The power of the Leica Cyclone software is well known. Customers expect our commitment to develop these technologies further. They trust Leica Geosystems to have the stamina and the resources to further advance this technology for their benefit. I think the market has highly valued this in Leica Geosystems. (iii) I think a third factor behind Leica Geosystems' success is that we have hit the mark in terms of the breadth and the capabilities of our hardware and software products. Laser scanners and associated software are still fairly costly to acquire, so customers really want them to be as versatile as possible in terms of the number of applications that they effectively address. The capabilities and actual performance of both our hardware and software have met this need and this has been a big plus. (iv) If you ask around, you'll find that customer support is the fourth factor. Especially with a new technology, support can be just as critical as the products themselves as far as helping to make users successful. And, in the end, this has really been the biggest factor of all: the success of so many of our customers. It is their success that fundamentally drives adoption of the technology and has continued to bring the majority of new customers to Leica Geosystems.

The main competitor in the laser scanning industry is not another manufacturer but the way customers are currently doing their as-built activities. This barrier does vary from region to region. In areas of generally higher labor costs, such as much of Europe, North America and certain parts of Asia/Pacific, laser scanning's productivity advantages are more appealing. In regions with low labor costs, we focus on industries like oil & gas, where the value in laser scanning is not so much the productivity to capture the as-built info but more in the downstream benefit of lower construction costs, lower construction risks and shorter down-times of the facility. Reducing the period of time an oil rig is down for a revamp project is just as valuable in Malaysia as it is in the North Sea or the Gulf of Mexico. From a geographic standpoint, things are somewhat more competitive in Europe than in other areas, as this is where most of the other hardware vendors happen to be headquartered. These vendors tend to focus their marketing and sales efforts close to home, and customers are a bit less worried about support if the vendor's headquarters are next door. In terms of vertical markets, I think we have a strong leadership position in each one except for the mining market, although we have some good successes here as well. Open pit mining is one market where the very long range systems have a good fit, as accuracy requirements are low, line-of-sight may not be a problem and, if you are scanning vertical walls in an open pit, you don't have an angle-of-incidence problem. The last portion of your question - about the level of sophistication of prospective customers - is an interesting one. We are fortunate in that we do well across the board, but I think that we probably tend to get an even higher share of the more sophisticated users who have done their homework. For example, there are as yet no standards for specifying laser scanners. Some vendors are much more aggressive than others with their specs and what they include or exclude from them. Leica Geosystems has tended to be very conservative with specs, often choosing to let the actual performance of our products significantly exceed our specs. So, newcomers are well advised to evaluate vendors' specs carefully. More sophisticated prospects will do the extra homework to make sure that they get what they want. They will talk with other users and actually test products before buying. They also tend to be more successful when they do buy, which leads to more referral-based business for their vendors. Because of this, we put a lot of effort into educating the market about the technology to help customers make well-informed decisions.

Gordon Petrie (gpetrie@ges.gla.ac.uk) is Emeritus Professor in the Dept. of Geographical & Earth Sciences of the University of Glasgow, Scotland, U.K. Visit www.leica-geosystems.com for more information about the company and its products.


Conferences & Meetings

A Discussion on SDI in the UAE


Seminar on Spatial Data Infrastructure
GIS Development and the Military Survey Department of the United Arab Emirates (UAE) presented a Seminar on Spatial Data Infrastructure in Abu Dhabi, UAE, on Feb. 12, 2006. Invited speakers from Australia, Canada, India, Malaysia and the United States shared the challenges they have faced developing and implementing a Spatial Data Infrastructure (SDI). By Daniel Shannon
Parallel

Many speakers drew a parallel between traditional infrastructure and SDI, and the setting couldn't have been more appropriate. The Emirates of Abu Dhabi and Dubai provide stark visual examples of the demands being placed on infrastructure in the face of the massive growth occurring in the region. The Geospatial Information & Technology Association (GITA)'s Executive Director Bob Samborski spoke on the organisation's research into return on investment for geospatial technology implementations. Several of his observations were germane to the discussion on Spatial Data Infrastructure. Particularly relevant was the assertion that sound financial analysis is fundamental to any investment in geospatial technology, and that those investments must support business objectives.

GITA's Executive Director Bob Samborski.

SDI Necessary?

While it is difficult to quantify the value of something as large and overarching as a Spatial Data Infrastructure, all presenters had compelling cases for why an SDI is required. According to Major General M Gopal Rao, Surveyor General of India, an SDI will save time, effort and money in finding data and will avoid unnecessary duplication. Host and keynote speaker Brigadier General Khalifa Al Romathi pointed to better outcomes through improved economic, social and environmental decision making, while others cited standardisation and support for the development of the geospatial industry itself. The need for government leadership appeared to be the consensus opinion. The role of government in the establishment of a Spatial Data Infrastructure does not, however, preclude or diminish the focus on the private sector. Both the general public and industry are expected to benefit when doing business within the framework of a fully realised SDI. Indeed, Canada's vision of Spatial Data Infrastructure specifically focuses on enabling value-added commercial activities.

Urgency

Growth of the geospatial technology industry is prompting the organisations in attendance in Abu Dhabi to move quickly, and the sense of urgency only builds as volumes of legacy data grow. The acquisition of geospatial data is accelerating, and any major re-definition of the spatial framework will have a major impact. For example, Dr. Taib's paper articulated the challenges of implementing the GDM2000 datum and the difficulty in applying such changes to the catalogues of data built upon what is now an obsolete datum. This is an issue particularly for early adopters of GIS - those who have built the largest catalogues of data on early frameworks. Global economic growth will place a premium on being able to effectively deploy and manage traditional infrastructure. This was put in the spotlight recently with the controversy surrounding the transfer of U.S. port management contracts to Dubai World Ports. The financial community has also acknowledged this trend, as evidenced by the creation of new Infrastructure Funds. In the same way, exponential growth of geospatial data and applications will challenge our ability to move spatial data seamlessly across markets and jurisdictions. How well we fare may depend upon how the global geospatial community delivers on the promise of an SDI.

Economic Benefits

But will private industry wait? Virtually all of the six nations represented in Abu Dhabi highlighted economic benefits as a primary objective of their SDI. However, they also spoke of efforts that have taken decades, and some that are expected to take just as long. In the world of information technology, this is a long time to spend defining an ontology. More to the point, Ravi Gupta, Editor-in-Chief of GIS Development, reminded us of the following: "Development of a state-endorsed online spatial data system is still a distant dream in the Internet age for most developing and emerging countries." This is not to say that there is either a public or a private sector solution. As General Khalifa stated: "A partnership between government and the private sector needs to be developed to realize the benefits of SDI."

Daniel Shannon (daniel.shannon@telus.com) is Manager Data Operations with TELUS Geomatics in Edmonton, Alberta, Canada.


Article

GNSS Update
Launch of New GNSS Receivers and Chipsets
At the moment GNSS product manufacturers are busy developing new products. Chipsets are becoming available that are capable of receiving signals from all three GNSS systems. Furthermore the production of chipsets for the new GPS frequencies is coming up to speed. But the development of the GNSS systems themselves is not standing still either. One fact is that GPS has become ten to fifteen percent more precise over the last few months. By Huibert-Jan Lekkerkerk
ESA's Director General, Mr Jean-Jacques Dordain, delivering his address at the contract signing ceremony. (source: www.esa.int).

Galileo
A contract between the European Union and Galileo Industries GmbH for the development of the first four Galileo satellites was signed in Berlin on the 19th of January this year. Giuseppe Viriglio, ESA's Director of EU and Industrial Affairs, signed the contract on behalf of the European Union; CEO Gunter Stamerjohanns signed for Galileo Industries. This is an important step towards the development of an operational Galileo system. Furthermore, the European Union and Korea signed a contract for mutual cooperation in the development of Galileo, after six months of negotiation. Korea is not the first Asian country to participate in the development of Galileo, since earlier contracts were signed with both China and India. GIOVE-A, which was launched in December 2005, is fully operational and extensive tests are taking place. Ground stations in the Netherlands, Belgium and Great Britain are tracking the satellite and the broadcast signals. The large radio telescope at Chilbolton, Great Britain, is for example used to track the signals from GIOVE-A in order to gain insight into the radio environment in the satellite orbit. Furthermore, tests are performed to check whether the Galileo satellite signals interfere with other radio signals. The first experimental receivers, made by Septentrio of Belgium, are being tested with the use of GIOVE-A as well. This is an important aspect of the development of Galileo since it provides insight into the practical use of Galileo. The sister satellite of GIOVE-A, GIOVE-B, is currently being assembled at Alcatel Alenia Space in Italy. When complete, the satellite will be transported to ESTEC in the Netherlands for testing in the laboratories under simulated space conditions. The launch of GIOVE-B is planned for the autumn of 2006.

Signing of the contract for the first four Galileo satellites (source: www.esa.int).

Egnos
In December 2005 the use of Egnos for controlling rail traffic was tested in South Africa. In this test only GPS, Egnos and train-bound sensors were used, eliminating the need for expensive railroad-based sensors. It also served as a test for calculating and broadcasting Egnos signals over Africa; it is expected that Egnos will be expanded towards the African continent in the near future. In March this year further testing was performed in Portugal by Alcatel Alenia Space. This test was directed at localizing GSM telephones for better response to 112 emergency calls. It is expected that in the years to come over half of all mobile telephones will use the technology tested. This technology, which uses a combination of GPS, Egnos and GSM positioning, makes exact telephone location possible, both indoors and outdoors.



GPS
The GPS satellite tracking system used by the American air force was recently updated. As a result, twice as much orbital information is collected, resulting in a ten to fifteen percent improvement in the precision of the GPS system. The first Block IIF satellite, which is being built by Boeing, has successfully undergone its first radio tests. In January Boeing received an order for three additional Block IIF satellites. Including options, this amounts to a total of nine satellites commissioned from Boeing. Boeing has a rich history in building GPS satellites, since it also built the Block IIA satellites. Of these Block IIA satellites, two have been in operational service for more than 15 years, twice the design life.

GLONASS
After the successful launch of three satellites in December 2005, Russian president Putin has decided to become personally involved in the GLONASS program. As a result, four additional satellites will be built in 2006, resulting in three satellites to be launched in 2006, followed by seven in 2007. Building additional satellites is no luxury, however. In the last article we already mentioned the large number of old(er) GLONASS satellites. Over the last few months three have stopped functioning, bringing the number of active satellites down to 11. Of the satellites launched in December, two have not been activated yet, so the number of active satellites can go up to 13 if no other satellite breaks down.

First radio spectrum received from GIOVE-A (source: www.esa.int).

GNSS Receivers
It is good to see that the GNSS systems are in constant development, but without receivers there is little use for these improvements. The manufacturers of GNSS receivers and chipsets seem to realize this as well, and at the moment we see one new system after the other. The GNSS market is not only preparing for Galileo; even GLONASS seems back in grace after a number of years with virtually no GLONASS receivers (or satellites) available. Some highlights:
- Trimble recently introduced their new GLONASS and GPS combined receiver (R8 GNSS). Apart from being able to receive GLONASS signals, this receiver can also handle the new L2C and L5 frequencies introduced into GPS. There is no sign of Galileo-compatible receivers at Trimble at the moment;
- Leica launched a new series of receivers and reference stations as well, supporting full GNSS capability, including the new GPS frequencies, GLONASS and Galileo;
- Topcon, a company which has always been a full GNSS supplier, brings the G3 technology, thereby choosing the same approach as Leica;
- NovAtel chooses a different approach, with the GPS+ technology for GPS and GLONASS L1/L2 on the one hand and the Galileo/GPS technology on the other. The latter is capable of receiving both L1/L5 and E5 frequencies;
- Javad, who until mid-2005 had an exclusive agreement with Topcon for the development of land survey GPS receivers, is currently producing combined GPS and GLONASS products only. They have however recently announced the first products based on the new GeNiuSS chipset, which is capable of Galileo reception as well.
Huibert-Jan Lekkerkerk (hlekkerkerk@geoinformatics.com) is a freelance writer and trainer in the field of positioning and hydrography.

Chilbolton Observatory in England where GIOVE-A tests are being performed. (source: www.esa.int)


Special

Autodesk Goes Open Source


Giving Way to Two-year-old Visionary View
Over the past few months the author of this article got a bit confused by some of the announcements made by Autodesk. The release of the 2007 portfolio of products, nicely demonstrated by Marketing Manager Northern Europe Sjoerd Lazeroms at a press meeting in Rotterdam, the Netherlands, on March 2nd, was quite straightforward. But what about the Open Source version of MapGuide? And where did MapServer Enterprise, MapServer Cheetah, and the MapServer Foundation go? Time for an overview. By Sonja de Bruijn
CAD Talk
Let's start with an overview of the 2007 products. Autodesk has retired the AutoCAD 2002-based family, and it will not take long before exactly the same thing happens to the AutoCAD 2004-based family of products. During the plenary session on March 2nd in Hotel New York, Rotterdam, Lazeroms first introduced Autodesk Inventor 11 as one of the new next-generation products. This version is specifically meant for AutoCAD users wanting to move to 3D. True DWG compatibility, a complete concept-to-production process via fully integrated 2D/3D design solutions and a dedication to functional design are just a few phrases applicable to Autodesk Inventor 11. Another newborn family member is AutoCAD 2007 (www.autodesk.com/autocad): a version based on an intuitive way of working and new visual styles/rendering tools to present concepts to non-technical audiences. Conceptual design and accessibility for both experienced users and beginners were key when developing this new version. Where 2D drafting is concerned, AutoCAD LT 2007 (www.autodesk.com/autocadlt) can be used: software that features Dynamic Block Authoring and integrated layer management tools.

Two Sessions
So far the most relevant part for CAD users. The afternoon of the press meeting, with the subtitle "Accelerate Your Ideas", was split up into two sessions: Mechanical Solutions Division (MSD) and Infrastructure Solutions Division (ISD). The last one was led by Director ISD Northern Europe Frank Ostyn, who talked about the success of Civil 3D as one of the most rapidly adopted products in company history during 2005. A remarkable fact, since only now there seems to be hardware available which is capable of meeting the demands of this type of software. Civil 3D can be an aid in automating workflow, and probably won't be a total stranger to AutoCAD users since it is based on this Autodesk product. Ostyn also introduced Autodesk MapGuide Enterprise, which will be discussed later on in this article.

Autodesk Topobase
Since the acquisition of C-Plan, Autodesk has been working hard on making Autodesk Topobase applicable to the vertical market. The main aim is not only to offer building blocks but implementation as well. Ostyn explains that this software is the final step in the Autodesk Geospatial Growth path. According to Ostyn, specific solutions for utilities will be released later this year. "Raster data are getting more and more popular," said Ostyn. Either paper drawings or maps are digitized, or satellite images and aerial photos fit in. This is why Autodesk Raster Design is quite popular. Version 2007 offers interoperability with Autodesk Map 3D 2007 and Autodesk Civil 3D 2007's DEM support. The software is also compatible with AutoCAD Electrical and supports ESRI GRID files. There is a new JPEG2000 library, and Autodesk Raster Design 2007 adds support for DTED-format elevation data from the National Imagery and Mapping Agency (NIMA).

Open Source
Open source was another hot topic this afternoon. However, Autodesk felt the need to elaborate on this one month later in Hotel Chez Gerard, London. At least this helped to clear up obscurities around new or replaced product names and the name change from the MapServer Foundation to the Open Source Geospatial Foundation (www.OSGeo.org). Officially the move towards Open Source started in November last year when Autodesk released the code for MapServer Enterprise as open source software. Three months later, to be more precise on February 4th, the Open Source Geospatial Foundation (OSGeo), at that time called the MapServer Foundation, held its first meeting in Chicago. A Board of Directors was formed which represented organisations like Mapbender (Germany), GeoServer/GeoTools (The Open Planning Project, USA), and MapGuide (Autodesk, USA). On March 6 the open source geospatial community officially announced the formation of the Open Source Geospatial Foundation. As the official press release states, the mission of this not-for-profit organization is to "support (financially, organizationally and legally) and promote the collaborative development of open geospatial technologies and data." It will also serve as an independent legal entity to which community members can contribute code, funding and other resources, secure in the knowledge that their contributions will be maintained for public benefit.

MapGuide Open Source
Roughly six months after the release of the code for MapServer Enterprise we now have MapGuide Open Source: free web mapping software composed of a Linux/Windows server, web extensions (for application development), Studio (for map authoring), viewers (both raster and vector) and a site administrator. The product is licensed under the GNU Lesser General Public License (LGPL). This enables users to develop and distribute spatial and design data over the web and can reduce the total cost of ownership for a web mapping solution. The software provides the option to auto-install and configure the Apache HTTP server, the PHP scripting language, and Tomcat, the Apache servlet engine. It works with the latest PHP, .NET, and Java stylization and tools. Furthermore, the user will find Feature Data Objects (FDO) providers for SDF and SHP in MapGuide Open Source; FDO providers for ODBC, ArcSDE, MySQL, GDAL Raster, WMS, and WFS will become available mid-2006. All this is, or will be, open sourced. "In contrary to AutoCAD or AutoCAD-related products!", stressed Van der Pol at the meeting on April 5. MapGuide Open Source 1.0 (a preview version, together with documentation and an FAQ) can be downloaded via https://mapguide.osgeo.org, and is in fact called an "Open Source Geospatial Foundation project".

The DWF Viewer uses an ActiveX control to display vector-based maps on Microsoft Windows systems running Internet Explorer or Firefox for viewing of maps, designs, and related data. Use of DWF technology provides printing and plotting, as well as support for a disconnected mode for taking spatial data into the field.

Autodesk MapGuide Enterprise
Then there is the commercial version of MapGuide Open Source: Autodesk MapGuide Enterprise. At the moment a beta version is being tested, but the actual product will be put on the market mid-2006. Van der Pol: "Autodesk MapGuide Enterprise will be available in several languages localized by Autodesk. Whether the open source version will be available in other languages depends on the open source community." The product will contain everything available in the Open Source version, plus additional FDO functionality (Oracle, SQL Server) and commercial-grade projection support from Mentor. In contrary to MapGuide Open Source, this version will be thoroughly tested and certified. Currently available as a preview (via www.autodesk.com/mapguidestudio) is Autodesk MapGuide Studio, a commercial authoring tool which can be used both with MapGuide Open Source and Autodesk MapGuide Enterprise. Besides a developer-friendly authoring environment (modelled on popular web development tools), the product offers streamlined authoring. Users can define rules for importing and converting data. Other features are thematic mapping definition, previewing the layout and stylization, and managing data access. Users can also integrate business logic written in PHP, ASP.NET or Java directly into the application and preview it within Studio.

AJAX Viewer
And there is even more news: besides the DWF viewer an AJAX viewer is also available. Van der Pol told the select group of journalists present in London on April 5 that this is also a free viewer that offers the same functionality as the DWF viewer (dynamic pan/zoom, scale-dependent detail etc.). The difference is that the AJAX viewer is raster based, and panning and zooming happen a lot more smoothly. Another advantage of the AJAX viewer is that it works with both Internet Explorer and Mozilla Firefox. Development is simple: all you have to do is write your application logic within your server environment and it works with either viewer on any client.

The AJAX Viewer delivers raster-based maps to almost any browser, including Safari. This viewing option ensures that any user on any platform can access designs and maps without requiring a specific browser.

There is one more new product to cover before lunch: the commercial CAD/GIS tool Autodesk Map 3D 2007, available since the second week of April and equipped with functionality to publish data and stylization to the MapGuide Server. This product is built on AutoCAD 2007 and developed for users wanting to integrate CAD and GIS data throughout an organization. New in this version is the ability to directly access spatial data like SDF, ESRI Shape, Oracle Spatial, ESRI ArcSDE, SQL Server, MySQL, OGC WMS, OGC WFS, DTED, ESRI GRID and GeoTIFF. It is also possible to import Civil 3D design data, and new vector, raster and 3D surface engines are provided as well. According to Van der Pol it will offer enhanced stylization and advanced labelling: just tell the software, and labelling "around the corner" will be done automatically, for example.


Autodesk MapGuide Studio can be used to produce attractive thematic maps and provide spatial analysis and reporting functions, here creating buffer zones around selected parcels.

Autodesk MapGuide Studio puts data and resources close at hand and is meant to make it easier to organize and manage maps and geospatial data. The ability to preview maps provides immediate feedback when authoring and streamlines application development.

Why Open Source?
Putting 60 man-years of development into the open is something we can at least call quite remarkable. So what made Autodesk decide to take this major step? During the press meeting on March 2nd in Rotterdam, Ostyn talked about a shift from companies selling a product to buying a model of a product and fine-tuning it in-house: "Instead of being a user the customer is turning into a developer. This is why open source is becoming increasingly important: companies want to exchange and use code from other companies." On April 5th in London, Ostyn's colleague Van der Pol highlighted aspects like following a trend set by companies like Sun, IBM and Red Hat (though these organisations are not exactly similar to Autodesk), improving the visibility of Autodesk in the market, and the ability to have quicker software releases (twice instead of once a year). Autodesk already started building the new MapGuide architecture two years ago. The discussion and decision to go open source was made halfway through 2005. Van der Pol admits it is still a big experiment. "What we do know is that, for example, in France and Germany there is a need for open source software. We also notice that there were 1,368 Windows Open Source downloads and 1,984 total source code downloads between Linux and Windows within 18 days from www.OSGeo.org last month. New developers are approaching Autodesk and are starting to build new applications on our products. We also believe that open source can be beneficial to education and (local) government by offering less expensive alternatives. However, only the future can tell whether we made the right move."

Director ISD Northern Europe Frank Ostyn: "Instead of being a user the customer is turning into a developer. This is why open source is becoming increasingly important."

Sonja de Bruijn (sdebruijn@geoinformatics.com) is editorial manager of GeoInformatics.

Column

The Brave New World of Mapping


When those GPRS-enabled mobile phones came out a few years ago, Siemens promoted one of their new models with a satirical advert entitled "Waste time faster". It promised that the phone would accelerate the downloading of superfluous emails and nonsense websites, and concluded that productivity will triple "because you'll get three times as much nothing done in the same time".

The phone was a big success, and the marketing people were having a good laugh. As this issue of GeoInformatics shows, the mapping industry has fully embraced the web and mobile markets. There are endless opportunities to improve productivity or help us get lost more efficiently. But what about the good old paper map? Can it survive the age of digital enlightenment? So, in the spirit of that Siemens advert, let's have a look. But before we start, please note that this column is entirely satirical and untrue. Honestly.

With a paper map, what you see is what you get in full A0-size colour. With a mobile device what you see is a map the size of a peephole, pixelated like the face of a criminal on TV. Yes, there is a zoom button, but the scale ratio resembles my phone number divided by the age of my neighbour's cat. And since you need to carry the Hubble telescope to read the screen, it makes the device slightly less portable than intended. Which is especially annoying when its battery dies half-way up the mountain. You might as well have thrown it over the first cliff and saved yourself the trouble.

The paper map just keeps going. Some have lasted 4,000 years. That is three million times longer than the average battery. Granted, digital storage media don't need power, but can you still read a 20-year-old floppy disk? A paper map still works after you have dropped it from the 20th floor, driven over it in a Jeep, or even checked it in as airport luggage. Paper also has unbreakable interoperability and scalability. You can collect as many paper maps as you like, and use them for any purpose without format conversion. Okay, my Times Atlas weighs as much as a Linux server farm, but it is compatible with any table or cupboard, and does not stop working when I upgrade the furniture. It's also immune to viruses, so there is no need to set off the alarm every time someone else is having a look inside.

The paper map is interoperable with any human mind, and sometimes it even produces magic. In World War I a group of Hungarian soldiers, lost in a blizzard, saved their lives with a map of the Pyrenees even though they were in Switzerland. GPS would probably have guided them into the next ravine. And what about romance? An in-car navigation system might save your marriage for various reasons, but how boring is it never to get lost in the woods anymore? Also, as a gift to your lover, a stylishly framed map, for example to commemorate the location of your first kiss, might produce further passionate lat/long events. Not every lover enjoys maps, but you must admit it beats a KML file by a mile. So, as a true explorer, buy yourself a paper map (it doesn't even matter which one). Walk until you are beyond the reach of the charged battery, and you'll be able to enjoy the whole place in peace. Actually I just made that up. Please remember everything I just said is a joke.

Thierry Gregorius (thierry.gregorius@shell.com) is Programme Manager for Geomatics and Information Management at Shell's international headquarters in the Netherlands, and was previously Global GIS Coordinator. The views in this column are entirely personal.


P.S. As my previous column went to press, the organisers of AGI2006 changed the venue from Chelsea FC to a non-football location in London, and the date from November to September. This also made that column a joke but at least it helps a consistent theme emerge.


Special

Assessing Fitness for Purpose for Re-use of High-Quality Spatial Datasets
The Industry Needs to Collaborate


Data used on the web must be constantly monitored and rigorously tested to ensure fitness for purpose. The user must be guaranteed accurate results from online queries. This means the data used for analysis and decision-making must be accurate. This article looks beyond the map and focuses on how web users might access spatial data from multiple sources. The importance of data certification and spatial data quality management will also be highlighted. By Mike Sanderson and Steven Rammage
Value
In 1999 the Paper Industry Research Association (PIRA) valued the geographic datasets assembled over the previous 10-15 years in the then European Community at €36bn. It estimated that the value was double this in the United States of America. Recent estimates indicate the value has risen to €100bn in the EU25 community. Most of this geographic data was collected before GPS became ubiquitous. Not only that, but this asset needs to be used in support of sustainable policies and development across Europe. The ability to process these data automatically in situations not envisaged by the original collection programmes (we will term it re-use) becomes an essential goal of delivering on the i2010 agenda. Already spatial data quality problems are holding back the European initiatives. There is evidence that the heavily fragmented geographic community in Europe is failing to tackle interoperability and spatial data aggregation. The 2006 eContent+ programme, focussing on making digital content more accessible, usable and exploitable, was only able to support 3 GI projects with a value of €3.5m against a total available budget of €50m. Something is holding back a spatial contribution to the European knowledge economy.

e-Government Initiatives
The demand for increasingly accurate data will be driven by the need to automate spatial data processing. In order to deliver on e-Government initiatives, or joined-up decision-making if you prefer that term, data from different sources need to be made available across the web and aggregated without human intervention. Interoperability is the starting point. In order to carry out effective data aggregation activities and tackle spatial data quality issues, the industry needs to collaborate. In 2003, Laser-Scan worked with Autodesk, Intergraph, MapInfo and Oracle on an industry initiative called Open Spatial Enterprise. This was a customer-driven initiative based on customer requirements at the City of Winnipeg and Thames Water. A real-world interoperable spatial data management platform started to emerge. While the applications could all share data from the same central Oracle database, the initiative aimed to standardise differences in the handling of cartographic text or the orientation of points around a given topological definition of object geometry. This resulted in an industry standard for managing spatial, annotation, and attribute information. The resulting specifications are the subject of an OGC standard and have been adopted as a table structure in Oracle Database 10g. Subsequently other vendors like Bentley and Star Informatic joined the initiative.

Data Quality Issues
The initiative mentioned above was data centric as opposed to GIS centric. What does this mean? The mainstream IT industry has been tackling these issues for several years longer than the GI community. According to Oracle Magazine (March/April, 2005), US businesses lose $600 billion per annum on business data quality issues. The smart move for the GI industry is to make use of this experience. Han Wammes asked for the industry to revisit spatial data quality (GeoInformatics, March 2006). It can be graphically illustrated. Reading up on the attack on the Chinese Embassy in Belgrade in 1999, it is apparent that the CIA had access to maps of the area for 1992 and 1997. The 1992 map showed the embassy in a different location in the city. On the 1997 map an unlabelled building was selected as the target by comparing parallel road street numbering based on a visual inspection. Whilst the CIA was confident it had identified the Serbian Ministry of Supply, it was the Chinese Embassy that had re-located.

Figure 1: Data Quality Improvement Cycle.

Certifying
The supply chain for spatial data re-use starts with discovering data in registries across the web.



Data in these registries needs more than a statement about content and temporal validity. Organisations should be certifying their spatial data quality to allow re-users of the data to make a determination of its fitness for purpose. The first phase of spatial data collection was driven by the need for automated map production. The current phase of spatial data usage needs to support the process of making critical business decisions electronically, with all the relevant and correct information at our fingertips. We need to be sure that the data is fit for our particular purpose, because it is tested and certified to be so. This means identifying patterns in spatial data and providing data discovery tools to determine the rules behind those patterns. The next step is to check rule conformance. This process provides meaningful data certification, where data conformance can be expressed as a percentage of conforming instances. This allows the supplying organisation to quantify conformance and establish EFQM frameworks (www.eiqa.com/products/quality/model/index.asp) for investing in spatial data. It also allows the supplying organisation and the re-use organisation to assess the fitness for purpose that the data can be put to. The re-using organisation may be seen as an aggregator. It will change data for its own purposes. It may report some of these changes to the originating organisations, it may not. This data improvement cycle can be visually represented, see Figure 1.

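To make the idea of certification more concrete, the short Python sketch below expresses quality the way described above: as the percentage of feature instances that conform to a rule. It is only an illustration of the principle, not Laser-Scan's Radius Studio; the sample features and the rule (closed polygon rings plus a filled-in postal code) are invented for the example.

# Minimal sketch: spatial data quality reported as the percentage of
# feature instances that conform to a rule. The features and the rule
# below are invented purely for illustration.

def conformance(features, rule):
    """Return the share of features that satisfy `rule`, as a percentage."""
    if not features:
        return 100.0  # an empty dataset trivially conforms
    passed = sum(1 for f in features if rule(f))
    return 100.0 * passed / len(features)

# Hypothetical building footprints: every ring should be closed and every
# record should carry a postal code (a completeness/consistency rule).
buildings = [
    {"id": 1, "ring": [(0, 0), (0, 10), (10, 10), (10, 0), (0, 0)], "postcode": "B-1000"},
    {"id": 2, "ring": [(0, 0), (5, 5), (5, 0)], "postcode": ""},   # open ring, no postcode
    {"id": 3, "ring": [(2, 2), (2, 8), (8, 8), (2, 2)], "postcode": "B-1030"},
]

rule = lambda f: f["ring"][0] == f["ring"][-1] and bool(f["postcode"])

print(f"Conformance: {conformance(buildings, rule):.1f}%")  # prints 66.7%

In practice the rules themselves would come out of the data discovery step described above, and the resulting percentage is the figure a supplier could publish alongside the dataset so that a re-user can judge fitness for purpose.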
Spatial Data Quality
Spatial data quality may be described as:
- data quality overview elements: components of data quality which can only be described subjectively, such as purpose, usage and lineage (that is: the history of a dataset);
- data quality elements: components of data quality which can be expressed in quantitative form (see below).

The following data quality elements are defined in ISO 19113; not all of these will be applicable to every dataset:
- completeness: presence and absence of features, their attributes and relationships;
- logical consistency: degree of adherence to logical rules of data structure, attribution and relationships;
- positional accuracy: accuracy of the position of features;
- temporal accuracy: accuracy of the temporal attributes and temporal relationships of features;
- thematic accuracy: accuracy of quantitative attributes and the correctness of non-quantitative attributes and of the classifications of features and their relationships.

So we have the basis for making the assessments we need for re-use. What we need now are:
- a framework for making assessments;
- mechanisms for presenting data to the assessment framework.

Laser-Scan has developed a framework for making the assessments. It is called Radius Studio, see Figure 2. It is based on a common language interface and makes use of mainstream standards: W3C, OWL, ISO TC/211, OGC GML and WFS.

SWRL
There is much work still to be undertaken for data re-users in Europe to take advantage of the valuable geographic datasets already created. Further development of the Semantic Web Rule Language (SWRL) for use with spatial data constructs is necessary. Re-users will still be faced with the problem of semantic interoperability, or the difficulty in aggregating data that were collected and tagged using different vocabularies and different perspectives on the data. To achieve semantic interoperability, systems must be able to exchange data in such a way that the precise meaning of the data is readily understandable and the data itself can be converted or translated by any GIS or web mapping tool into a form that it understands. In addition, the re-users or aggregators need access to a set of presentation standards. The Web Processing Service (WPS) is still immature. It describes individual geoprocessing services accessible via the Web.

MapGuide Open Source
The recent open source announcement by Autodesk (http://usa.autodesk.com/adsk/servlet/item?siteID=123112&id=6153839) and the formalization of the MapGuide Open Source site, which is hosted by the Open Source Geospatial Foundation, is an important initiative. It creates the opportunity to create a community. Is the value of public spatial datasets in Europe not worth creating a community that is focused on assessing the fitness for purpose for re-use? We believe it is.

Mike Sanderson is CEO of Laser-Scan; Steven Rammage works as Product and Marketing Director for Laser-Scan. More information can be found at www.laser-scan.com or by sending an email to info@laser-scan.com.

Figure 2: Radius Studio Architecture.


Special

Online Information System BIPT


Use of Intergraph Products
As the Belgian regulatory body for postal services and telecommunication, the Belgian Institute for Postal Services and Telecommunications (BIPT), headquartered in Brussels, Belgium, provides a pivotal service. According to the Royal Decree of the 10th of August 2005, communication antennas must abide by radiation thresholds for electromagnetic waves for frequencies between 10 MHz and 10 GHz. In addition, operators have an obligation to share both the locations of communication antennas and the corresponding radiation reports with the general public.
By Allison Pullen

User-Friendly
The BIPT had specific requirements in mind when searching for its Web-based system. First of all, the system had to be user-friendly and intuitive, requiring minimal staff training. Secondly, the system had to have a quick implementation and turnaround time. The BIPT didn't want to waste time implementing a time-consuming system when it could be up and running in approximately one day. The BIPT sought a system that would provide quick and easy online access to communication antenna information and radiation reports, reduce work hours spent answering communication antenna inquiries, and improve business processes.

When the user double-clicks on a point, he receives attribute information about the antenna. One of the attributes can be a hyperlink to a report.

Spreadsheet
Previously, when other administrations wanted to know the location of communication antennas within their territory, one of the BIPT's 250 employees had to query the database and export the results as an Excel spreadsheet. Analysis reports for the electromagnetic radiation then had to be added to the spreadsheet, and everything was then sent via e-mail or in hard copy. As one can imagine, this became a very time-consuming and daunting task. The BIPT sought to implement a Web-based system that would enable the general public to access the location sites of communication antennas in any given territory along with the corresponding radiation reports for antennas within the area.

GeoMedia Suite of Products
The BIPT selected Intergraph to deliver an online information system (www.sites.bipt.be) to enable the general public to access communication antenna data and radiation reports in a user-friendly environment. GeoMedia and GeoMedia WebMap were used to manage approximately 7,000 individual data records, and GeoMedia WebMap Publisher to publish data to the Internet without customization and programming. BIPT also selected Intergraph technology for its open architecture, ease of use during implementation, and ability to further expand the system in the future. According to Peter Van Huffel, engineer at BIPT, "The new Web-based system has been of benefit to our organization with regard to reducing the manual labor involved in answering queries. The public can now simply access the online system and search for communication antennas using a number of different criteria, such as searching by municipality, street name, or postal code. The system was easy to implement, requiring no customization. We had the site up and running in less than a day."

Future Plans
The BIPT plans to revise and improve its corporate Web site to include the new Web-based system, while still maintaining the separate domain address, enabling users to access the information from multiple locations. Future plans also include the augmentation of possibilities for making queries within the system. Additional functionalities of GeoMedia WebMap Publisher will be incorporated, including better zoom in/zoom out capability. The process of updating the data will eventually be automated. As it exists now, the data resides in Access databases. These databases will be replaced by SQL Server connections, in which new data will be generated on a daily basis.

Allison M. Lowery Pullen (allison.pullen@intergraph.com) is Corporate Communications Manager with Intergraph Corporation. For more information on BIPT, visit www.bipt.be. Intergraph Corporation can be found at www.intergraph.com.

Intergraph Corporation
(www.intergraph.com) is a global provider of spatial information management (SIM) software. Security organizations, businesses and governments in more than 60 countries rely on the company's spatial technology and services to make better and faster operational decisions. Intergraph's customers organize vast amounts of complex data into understandable visual representations, creating intelligent maps, managing assets, building and operating better plants and ships, and protecting critical infrastructure and millions of people around the world.


Special

Two Kinds of Open Come Together


Open Source and Open Standards in Geospatial Technologies
It is difficult to read an IT magazine, speak with a programmer or read a technical blog these days and not run into the terms open source and open standards. And, while many vendors and users advocate one, the other, or both, there have been few explorations of the implications of these ideas for the geospatial community. This article will do just that, and take a step further by revealing the power of uniting the two to solve today's challenges related to using geospatial data and services. By Adena Schutzberg
Understanding
It is worth noting at the outset the confusion in the community over the terms open source software and open software standards. The word open is used extensively in articles, marketing materials, email lists, and blogs. But what does this term really mean? The definitions vary, sometimes referring to a software product's interfaces (Application Programming Interfaces, APIs) and sometimes to the source code. Open source refers to whether or not the source code behind software is made available, among other things. If it is made available, and users can copy, modify and redistribute the source code without paying royalties or fees, it is termed open source. (For the complete story, visit the Open Source Initiative, www.opensource.org/.) The opposite of open source software is proprietary software; that's when the source code is not shared. In 2006 it is possible to identify many open source geospatial software projects and point to the newly formed Open Source Geospatial Foundation (OSGeo, www.osgeo.org) which aims to formalize some of the efforts.

Figure 1: Interface of GRASS 6.1. Recent releases include support for the OpenGIS Simple Features Implementation Specification.

Blueprint
Before getting at open standards, let's take a step back to define standard. This is from Bob Sutor, the Vice President of Standards and Open Source for the IBM Corporation: "A standard is like a blueprint. It provides guidance to someone when he or she actually builds something." He goes on to note that it is a blueprint upon which many people need to agree. The Open Geospatial Consortium (OGC, www.opengeospatial.org) develops consensus on "blueprints" for software APIs. An open standard can mean that a standard is open to anyone to use, even though it has restrictive licensing or requires a fee. The OGC goes a bit further and defines open standards as being:
- Freely and publicly available: free of charge and unencumbered by patents and other intellectual property;
- Non-discriminatory: available to anyone, any organization, any time, anywhere with no restrictions;
- No license fees: no charges any time for their use;
- Vendor neutral: in terms of their content and implementation concept, and do not favor any vendor over another;
- Data neutral: the standards are independent of any data storage model or format;
- Agreed to by a formal, member-based consensus process: the standards are defined, documented, and approved by a formal, member-driven consensus process. The consensus group remains in charge of changes and no single entity controls the standard.
The key aspect of OGC open standards is that they are freely available for anyone to access and implement at any time.

Software developers and development organizations, whether creating commercial or open source software, decide if they want to implement specific standards. It is important to realize that software packages, whether open source or proprietary, can interoperate if they all implement the same standard. There are more than a dozen approved OpenGIS Specification open standards (www.opengeospatial.org/specs/?page=specs) implemented in hundreds of packages and products.

MOSS and GRASS


There is a rich history of open source geospatial software, beginning on the desktop in the early 1980s and moving to the Web in the 1990s. Two of the oldest and perhaps the most recognized names in open source desktop software are the Map Overlay and Statistical System (MOSS), which dates back to 1977, and the Geographic Resources Analysis Support System (GRASS, http://grass.itc.it/), which dates back to 1982, see Figure 1. Since both MOSS and GRASS pre-dated the Web as we know it, their developers and users did not have the advantages today's open source geospatial community enjoys, such as wikis (editable websites), Internet Relay Chat (IRC, a multiuser online typed discussion tool) and one-click downloads of files. Other open source desktop systems have grown up in GRASS' wake, including QGIS, uDig, JUMP, OpenEV and others. Open source components also arrived early. Development began on BBN's OpenMap (http://openmap.bbn.com/), with U.S. government funding, in 1987, and cleared the way for proprietary developer components. OpenMap enabled some of the earliest Web mapping in 1995. These are just a few examples of projects that underlie today's active open source geospatial efforts. They illustrate, by their longevity, geospatial power and flexibility, that the open source model works as well in the geospatial marketplace as in others. The case studies that follow highlight just three of the many open source geospatial projects that have embraced open standards, either from the start or after several years of use, based on user needs. There are perhaps several dozen other similar stories to be told.

Figure 2: Surface Ozone concentration map created by GeoServer using data delivered via OpenGIS Web Coverage Service Implementation Specification (WCS) delivered to QuickWMS viewer via OpenGIS Web Map Service Implementation Specification (WMS). Image Courtesy Fire Chemistry Unit, Rocky Mountain Research Station, Missoula, MT.


GeoServer
GeoServer has a long history of using open standards. First developed to help leverage geographic data to enable urban planning tools such as traffic modeling, the project has become a sort of poster child for both open source and open standards in the geospatial arena. The Open Planning Project (TOPP), a non-profit organization, felt that a standards-based server of geospatial data was a key piece in pulling together the framework for complex traffic modeling and other needs. That platform, developers Rob Hranac and Chris Holmes determined, needed to include three key characteristics: support for open standards, ease of use and integration of multiple geospatial formats. The format support was particularly important if the code was to be used in a variety of disciplines, such as local government. How to begin? The team found a technology core in GeoTools, an open source Java GIS development platform launched in 2001. GeoTools (www.geotools.org) seemed like just the base needed as a building block, but it did not include support for PostGIS, the spatial extension for the open source database PostgreSQL. That was so important, and the tools so good, that TOPP staff spent time implementing the needed code in GeoTools. "At first," says Holmes, "we felt like we were putting in all this work and getting nothing back, but in the end we had access to all sorts of format support via code developed by others."

WFS


As GeoTools began to take shape, the team determined the value of implementing the OpenGIS Web Feature Service Implementation Specification (WFS), the specification for sharing vector data.

TOPP had already built a WFS implementation of its own. Still, the team saw the benefit of bringing that experience to the table to work collaboratively on WFS for GeoTools. At about the same time, OGC was looking for an open source reference implementation for WFS as part of its Compliance & Interoperability Test & Evaluation Initiative (CITE). TOPP was selected to provide the reference implementation and received funding to ensure full compliance of GeoServer with the specification. Web Map Service (WMS) support was added to GeoServer as well, based on work by GeoTools users in Britain, but ultimately completed by Gabriel Roldan, an Argentine programmer working for a Spanish client, see Figure 2. Other GeoServer users needed support for the OpenGIS Web Coverage Service Implementation Specification (WCS), the specification for sharing gridded data via the Web. Simone Giannecchini and Alessio Fabiani, consulting for the NATO Undersea Research Centre (www.nurc.nato.int), staff at the USDA Forest Service and a researcher in New Caledonia (South Pacific) worked together on that effort, making the result available for every other GeoServer user.
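For readers who have not seen the two request styles side by side, the short Python sketch below builds a standard WMS 1.1.1 GetMap request (a rendered "picture map") and a WFS 1.0.0 GetFeature request (the vector features themselves, returned as GML). The endpoint and layer name are placeholders rather than a real service.

# Sketch of the two request styles, using standard OGC key-value-pair
# parameters (WMS 1.1.1 GetMap and WFS 1.0.0 GetFeature). The server URL
# and layer/type name below are placeholders, not a real deployment.
from urllib.parse import urlencode

BASE = "http://example.org/geoserver/ows"  # hypothetical endpoint

# WMS: ask the server to render an image we can show in any browser.
getmap = BASE + "?" + urlencode({
    "service": "WMS", "version": "1.1.1", "request": "GetMap",
    "layers": "topp:states", "styles": "",
    "srs": "EPSG:4326", "bbox": "-125,24,-66,50",
    "width": 800, "height": 400, "format": "image/png",
})

# WFS: ask for the vector features themselves (GML), which a richer
# client such as uDig can style, query and edit locally.
getfeature = BASE + "?" + urlencode({
    "service": "WFS", "version": "1.0.0", "request": "GetFeature",
    "typename": "topp:states", "maxfeatures": 10,
})

print(getmap)
print(getfeature)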

MapServer

So why is GeoServer perhaps not as well known as another open source Web map server, MapServer? Holmes is quick to point out that WFS is just coming into widespread use, while WMS, which MapServer supports, came on the scene earlier. WFS needs a fairly robust client (the returned vector data must be "understood" and rendered on the client), while WMS "picture maps" can be seen in a browser. "With Geography Markup Language [GML] maturing and more desktop WFS clients, including the open source uDig and Mapbender and proprietary ones from companies like Cadcorp and ESRI, WFS and thus GeoServer will have a larger role in the Spatial Web," Holmes predicts. Holmes and his colleagues are excited about the newest additions to GeoServer, which include tools to manage changes to geospatial databases. While OGC's WFS-T (T for transactional) offers the blueprints for adding, editing and deleting features, formal use of such tools requires software to roll back changes and/or limit who can commit changes. These new tools, combined with others to ensure that added or changed features meet specific requirements (Are they long enough? Do they connect to other features? etc.), will lead to a whole new kind of Spatial Web offering. It will allow a "geowiki" sort of collaboration where many members of a community can participate in building and maintaining a shared spatial database via the Web.
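As a rough illustration of what that transactional blueprint looks like on the wire, the Python sketch below posts a minimal WFS-T Insert request. The endpoint, namespace and feature type are invented for the example, and the access control, validation and rollback machinery discussed above would still have to be layered on top.

# Rough sketch of a WFS-T (transactional) Insert, the kind of call a
# "geowiki"-style client could send to add a feature. Endpoint, namespace
# and feature type (myns:roads) are hypothetical; authentication and
# change management are deliberately omitted.
import urllib.request

WFS_T_URL = "http://example.org/geoserver/wfs"  # hypothetical endpoint

transaction = """<?xml version="1.0" encoding="UTF-8"?>
<wfs:Transaction service="WFS" version="1.0.0"
    xmlns:wfs="http://www.opengis.net/wfs"
    xmlns:gml="http://www.opengis.net/gml"
    xmlns:myns="http://example.org/myns">
  <wfs:Insert>
    <myns:roads>
      <myns:geom>
        <gml:LineString srsName="EPSG:4326">
          <gml:coordinates>4.35,50.85 4.36,50.86</gml:coordinates>
        </gml:LineString>
      </myns:geom>
      <myns:name>New footpath</myns:name>
    </myns:roads>
  </wfs:Insert>
</wfs:Transaction>"""

req = urllib.request.Request(
    WFS_T_URL,
    data=transaction.encode("utf-8"),
    headers={"Content-Type": "text/xml"},
    method="POST",
)
with urllib.request.urlopen(req) as resp:
    print(resp.read().decode("utf-8"))  # the server returns a transaction response document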

Solid Starting Point


Why are open standards important in open source? Holmes notes OGC's open standards are a solid starting point. "Otherwise, you have to sit around and argue about what a feature is. OGC has already had a lot of smart people do that work and come up with a good answer. So, we start there. Our data store for GeoServer/GeoTools is based on the data access model of the WFS specification." But it is a two way street. He continues: "Open source implementations of open standards give back to the standards community by providing a free, 'open to look at' working implementation. It is far easier for programmers to explore and evaluate a specification for use with an open source project than simply seeing the results in a proprietary one."



uDig and Open Standards
Open source desktop GIS projects have gotten a bit of a jumpstart in recent years. While GRASS was the forerunner of these, today's user interface practices have substantially changed the look and feel, and enhanced ease of use. The offering with perhaps the biggest commitment to open standards is the user-friendly Desktop Internet GIS (uDig). uDig stemmed from a grant from GeoConnections Canada to create a software tool to "help ordinary computer users view, edit, and print data accessible through the Canadian Geospatial Data Infrastructure (CGDI) and local data sources." Because GeoConnections had previously chosen to implement its spatial data infrastructure on OpenGIS Standards, the new client would need to support them as well. Paul Ramsey, president of Refractions Research, notes that while the funding was contingent on supporting standards, he and his team also wanted to fill a hole in the OGC world: a true integrated client, where searching, seeing, querying and using OGC services is a transparent part of the user interface (drag 'n' drop search results into the map window to see them, etc.). Ramsey and his company are already known for their work in developing PostGIS and felt strongly that an open source project would best serve the citizens of Canada. GeoConnections is the name of the geospatial program in Canada. According to this program, the free open-source product will provide a data access and maintenance tool that governments and the private sector can use regardless of budgets. Users of uDig will be able to access the CGDI without buying expensive proprietary desktop GIS licenses simply to view CGDI data. Consequently, uDig will make CGDI data accessible to a much wider potential audience.

uDig Map Window
uDig supports WMS and the more complicated WFS. The "user friendly" part of the name shines through, since users can simply drop URLs (Web addresses) of Web services onto the uDig map window and have them added to the map. Refractions takes great pride in its product's role in OGC Web Services Initiative Phase 3. Says Ramsey, "I think I am most impressed by the different usability of the different clients in the final demonstration; simply compare the amount of interaction required to display airports using a feature portrayal service. uDig [required] one drag and drop, everything else was automated; others required up to three catalog lookups and cut-and-paste URLs."

uDig is widely used to test Web Map Servers. "I want the same thing with WFS. I want the same thing with Catalogue (OpenGIS Catalogue Service Implementation Specification)," says Ramsey of the other specifications the product supports, see Figure 3. He is quick to point out the value of standards as a design baseline from a development standpoint. But with that comes "good news and bad news." The bad news is that "because the OGC specs tend to be more general than most implementations of GIS design, the implementation overhead we incur building the infrastructure necessary to handle them is very high." The good news is that "once we have suffered through the implementation hell we have a framework which is flexible enough to handle very odd cases, cases which cause developers with less generic models to graft onto the sides of their systems."

Figure 3: uDig serves as a client to a Web Feature Server (WFS) and Web Map Server (WMS) built on MapServer & PostGIS. The desktop client is part of a prototype biogeography server, here showing Central America.

The Road Ahead
There is clearly much more to come from the marriage of open source and open standards in geospatial technologies. The demand for interoperability, flexibility and widespread distribution of products has pushed, and will continue to push, these efforts. New programmers, working with new building blocks created around consensus-built standards, are likely to be a key step in building national and global data infrastructures that not only reach to the far corners of the earth but are usable by their inhabitants regardless of budget or underlying technology.

Adena Schutzberg (Adena@abs-cg.com) is principal of ABS Consulting Group, Inc, based outside Boston, Massachusetts, and consultant to OGC.

Special

Bentley Web Mapping: All About Speed


Providing Information Based on Predictions
Bentley's Geospatial focus has always been, and still is, creating, managing and sharing geospatial data as efficiently and quickly as possible. Web mapping, or publishing, is a logical step in this respect. Viewing, redlining, printing and plotting via the web are all possibilities, except for editing. Director Geospatial Center of Excellence Matty Lakerveld explains why. By Sonja de Bruijn

Matty Lakerveld, Director Geospatial Center of Excellence Bentley Benelux.

Four Types
Bentley offers four options in the field of web mapping. One of them is making all information available via web technology (intranet), the second being the integration of geospatial data in an enterprise system. Making data available via the Internet for several purposes is the third option within Bentley's web mapping technology. Tuning maps according to what the end user wants is the fourth option, and can be combined with the third possibility. According to Lakerveld there are essentially two types of web mapping: embedded web mapping and an application specifically meant for the end user. The Director of the Geospatial Center of Excellence explains what he means by the latter: "I am talking about providing information based on predictions of what the end user wants to know. Accordingly, only the requested information is offered, which is retrieved from a content management system." Offering information this way implies not only content managers working behind the screens, but also a communications specialist.

Intranet Webmapping
Web publishing is not new or hot to Bentley: ISIS, a Dutch company that was incorporated by Bentley two years ago, started developing this technology six years ago. At that time the application was called Flexiweb. This was an environment for the management and publication of all kinds of geographic information within an organisation, making use of Internet technology. Speed was essential: being able to use, view, analyse, print and plot high-resolution vector and raster data, images, multimedia and multiple databases as fast as possible in one integrated environment is a key issue with Bentley. Database information, documents, geospatial information: all of these data are configurable from the server. Instead of having to implement technology, all data are directly accessible from a configurable database. In a very short time the user has a complete GIS environment available on the web. This is the main difference with other web mapping products in the market.

Making Predictions
He continues: "Many organisations think offering a huge amount of information is the priority. Bentley's opinion is that it is much better to first think about what a user wants to know before putting all the information on a portal, for example. A visitor should be able to find, retrieve and make use of the information he is looking for in a very short time, without having to search or being bothered by GIS technology." Lakerveld sees web mapping as the first modest step for organisations towards a Service Oriented Architecture (SOA); web mapping is not a purpose in itself. He regards web mapping as part of an integrated system: "This so-called system integration started about six years ago and I am convinced it is still strongly evolving. Just look at Oracle with their SOA implementation, which facilitates the development of modular business services that can be integrated and reused, for an adaptable IT infrastructure." In his view, web mapping takes place outside the traditional boundaries of the CAD and GIS environment and with that makes geospatial content available for multiple purposes in numerous work processes.


supports user-initiated or event-driven interoperability with the ESRI ArcSDE Geodatabase. The Connector for ArcGIS supports an intelligent extract and post paradigm. Bentley users can retrieve Geodatabase data for use in AEC and mapping workflows, and later post the appropriate information for use by ESRI users.

Customers
(Local) authorities are the main customers in the field of web mapping, but telecoms and water are emerging markets. Lakerveld: Amongst other things pipeline networks, but also electric, coax, copper and fiber networks, can be published and managed with our software. Bentley just delivered a huge implementation at the Dutch Ministry of Finance, where Flash based Web Mapping technology is applied to integrate geospatial data (both raster and Vector) related to 2 million parcels inside the SAP environment. Another customer is the Municipality of Eindhoven, who in the year 2000 did research to determine the need for geo-related information for all workstation seats. This study, performed by ISIS, now Bentley Benelux, showed that there was a tremendous interest amongst the employees. However the average user was not always able to get the right information easily. Therefore the Municipality of Eindhoven decided to focus not only on core technology, but also to develop an interface that could seduce its users. This means: raise interest and let the user himself determine how to reach his goals with professional support.

He regards web mapping as part of an integrated system. This so-called system integration started about six years ago and, he is convinced, it is still strongly evolving. Just look at Oracle with their SOA implementation, which facilitates the development of modular business services that can be integrated and reused, for an adaptable IT infrastructure. In his view, web mapping takes place outside the traditional boundaries of the CAD and GIS environment and thereby makes geospatial content available for multiple purposes in numerous work processes.

were available with the quality level we were used to. Therefore we worked together with Bentley to define a presentation tool with interactive functionality. Within three months this service became operational. According to Tros it is greatly valued by customers and is now operational for public issues regarding zoning maps, 'sense of the city' and the public relations project 'city of light'. The web product shows information that is retrieved from the professional data generated in the back office. However, the Internet audience only needs a small subset of this operational data. Bentley's web-publishing solution retrieves this subset from the Oracle Spatial database, which means no conversions and no additional technical requirements. Here seduction works too: for the first time the professional receives compliments for the work he performs and is encouraged to deliver even more quality. Tros comments: Our municipality is currently positioning this tool as a very important means of communication with the inhabitants and interest groups in our city. We are convinced we can achieve this by having faith in our own quality and by stimulating the use of our information.

No Web Editing
Essentially, web mapping is meant to provide information that is as unambiguous as possible. In the Bentley web mapping software, viewing, redlining, adding descriptions, printing and plotting are all supported, in contrast to editing. Certainly, editing and more interaction are already possible, and Lakerveld is convinced that several applications in this area will become available in the near future. But Bentley is not going to follow this path, because it is not advisable from an organizational point of view. Lakerveld: Geospatial data has to be created and managed as a service to other users of that data in several work processes (create once, use many). To be able to implement web editing, you have to implement a secure, multi-user, transaction-based environment. The browser is simply not suitable to support this completely. High-resolution editing should therefore take place on the desktop, making maximum use of the rich functionality and dedicated access to Internet services like WMS and WFS servers.
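For readers less familiar with those interfaces, a WMS GetMap request is nothing more than an HTTP call with a handful of standard parameters. The sketch below is a minimal illustration in Python, with an invented server address and layer names; it is not a Bentley API:

```python
# Minimal sketch of a WMS 1.1.1 GetMap request. The host and layer names below
# are invented for illustration; only the parameter names are standard WMS.
from urllib.parse import urlencode
from urllib.request import urlopen

params = {
    "SERVICE": "WMS",
    "VERSION": "1.1.1",
    "REQUEST": "GetMap",
    "LAYERS": "parcels,roads",        # hypothetical layer names
    "STYLES": "",
    "SRS": "EPSG:4326",               # WGS84 latitude/longitude
    "BBOX": "5.40,51.40,5.55,51.50",  # minx,miny,maxx,maxy (roughly around Eindhoven)
    "WIDTH": "800",
    "HEIGHT": "600",
    "FORMAT": "image/png",
}
url = "http://example.com/wms?" + urlencode(params)

# Fetch the rendered map image and save it locally.
with urlopen(url) as response, open("map.png", "wb") as out:
    out.write(response.read())
```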
Sonja de Bruijn (sdebruijn@geoinformatics.com) is editorial manager of GeoInformatics. Of particular interest are the white papers on www.bentley.com, to be found under the vertical section, left-hand side of the homepage.

Source Information
Both the intranet and Internet Bentley web applications work with different databases. Source information can be retrieved on the fly. In order to maintain security there is a second database behind the firewall. Several systems and data stores are compatible with the Bentley web mapping solutions. Lakerveld: With Bentley's Web Publishing technology it is possible to publish Oracle Spatial live, as well as DGN in native form. The same goes for WMS, both server and client. I think it is quite remarkable that SAP and ESRI data can also be fully integrated. For instance, Bentley Geospatial Enterprise server

Compiling Information

Intranet GIS on the Internet, Municipality of Eindhoven, the Netherlands.

Rob Tros works as an information manager with the Municipality of Eindhoven. He says: The Intranet Webmapping environment that was created has been specifically developed for our own professionals. They are very capable of finding and compiling the information for their own needs, and find it very useful. This is why we also want to present it to our customers, other governmental bodies and interested outsiders. He continues: In 2004 we felt the need to provide information on the Internet to every end user in a simple, fast and safe way. No off-the-shelf products

Special

Articque: From Statistical Mapping to Web Mapping


Cartography for Everyone
Since 1989, Articque has been betting on the Internet to promote statistical mapping. And for a few years now, several organizations have leaped at this opportunity to develop web solutions, or to have web solutions developed for them by Articque. The CEO of this French company, Georges Antoine Strauch, explains the lengths he had to travel to come to that point.
By Georges Antoine Strauch
When the project was presented for financing, ANVAR, the French National Bureau for Research, thought it was too ambitious and persuaded him to settle for the mapping software he first had in mind. The bureau then agreed to finance the remaining developments up to €175,000 ($210,000).

International Standards
What emerged is a software package called Cartes & Données (meaning Maps & Data), with which the user can produce maps, accompanied by a statistical report and, upon request, expert counselling. The maps produced comply with international standards and the software has been validated by the GIP RECLUS, the principal European network of geographers. Ultimately, this software was enthusiastically welcomed and rapidly established itself in the sphere of Education and Research. However, Articque kept its first conception of the software as a set of items usable in a components library. The work done through Cartes & Données materialized in the organigram, the real skeleton behind a future client-server application or Intranet. The main idea was to use the work and knowledge of an experienced analyst and make it available to the other users. This was the genesis of the integrating tool, CartoExtension, which was still under development when, in 1997, the European Community financed a project requiring this technology.

Figure 1: Thanks to the organigram (on the left corner of the screen), it is possible to display the same data with several statistical representations in one single process.
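As a purely illustrative sketch of the organigram idea (a chain of processing steps in which several representations branch off the same data, as in Figure 1), consider the following toy example; it does not reflect Articque's actual implementation:

```python
# Illustrative sketch of an "organigram": one data set flows through a simple
# statistical treatment, and several representation branches reuse the result.
# This is a toy model, not Articque's actual internals.

def classify(values, n_classes=3):
    """Split values into equal-interval classes (a very simple statistical treatment)."""
    lo, hi = min(values), max(values)
    step = (hi - lo) / n_classes or 1
    return [min(int((v - lo) / step), n_classes - 1) for v in values]

def choropleth(regions, classes):
    """One possible representation: a class index per region."""
    return {r: f"class {c}" for r, c in zip(regions, classes)}

def proportional_symbols(regions, values):
    """Another representation of the same data: a symbol radius per region."""
    peak = max(values)
    return {r: round(10 * v / peak, 1) for r, v in zip(regions, values)}

# The same data feeds two representation branches in one single process.
regions = ["Centre", "Bretagne", "Alsace"]   # invented example data
values = [120, 340, 95]
classes = classify(values)
print(choropleth(regions, classes))
print(proportional_symbols(regions, values))
```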

Ch@ppe d'Or
On January 26, 2006, the French chapter of the Internet Society (ISOC France), Adminet, the Cawa and partners honoured personalities of the Internet in the areas of politics, science, arts, business and civil society, at the Musée des Arts et Métiers in Paris. Jean-Michel Billaut, from the BNP-Paribas workshop, awarded Strauch the Ch@ppe d'Or. Since 1991, Strauch has been convinced that cartography should not be reduced to the simple illustration, location and display of distribution networks. He thought that, instead of an Excel-like representation, data would have much to gain from being represented in maps, and that this illustration should complement and enhance a purely mathematical anal-

ysis. One should then be able to go back in the process, using these more illustrative techniques to make changes and build layer upon layer. It was by taking this idea one step further that the organigram, see Figure 1, was born. Strauch wanted to propose cartography on the France Telecom Numeris network, with data and maps provided respectively by the National Institute of Statistics and Economic Studies (INSEE) and the National Geographic Institute (IGN). From 1992 through 1994, he tried to convince these institutes to cooperatively launch an interactive service of statistical mapping. He encountered some difficulties with the IGN, since at that time the outlines of French municipalities were available for the modest price of €11,500 ($13,800).

One Day Users


Since the end of 1998, Cartes & Données On Line, soon to become MakeYourMap.net, has been available on the web. Through this service, Articque targets 'one-day users', such as the executive finishing off a report, the CEO preparing the presentation of his company's results on a global scale, or the sales manager wanting to analyse the accuracy of his commercial sectors. The application can be used in Education and Research, Distribution and Franchise, Transport and Local Communities, to name a few examples. The user chooses a map, then one or two types of data, either from his own computer or from the data available on the server. When asked, he defines the type of statistical treatment


Scheme representing the functioning of the tool developed for the Daedalus project presented for INFO 2000.

Disaster management: cooperative application to save and restore the Atlantic sea coasts - Participative web mapping supporting the citizen.

he wants to apply and selects a cartographic display. Once these variables are entered, he simply executes the process to obtain a map that he can save and include in his documents later.

Participative Mapping
On June 13, 1999, Articque put the first cartography of the European elections online, with a constant update of the results. One year later, the company launched FranceElectorale.com, which became the first digital electoral board to display the French electoral map as well as election results. FranceElectorale.com is dedicated to the elections, to elected officials and to electoral forecasting, and offers all candidates the possibility to freely and easily register their campaigns. They enrich an electoral map displaying the current elected officials by filling in an online form. Candidates in future elections are then able to broadcast the first news about their electoral campaign on a website totally independent of political parties. During the municipal elections of 2001, www.FranceElectorale.com attracted 500,000 visitors within a month.

and taking into account their available resources. The data on the evolution of the oil slick and its impact on these shores were immediately collected on the map without the need for Articque to manipulate them, allowing a real-time update on the http://erika.articque.com website. At that time, this initiative was relayed by television (with the French channel TF1) and newspapers such as Le Point, La Tribune, and Les Echos.

users, Articque relied on Linux, Apache, Java, MySQL and its mapping engine, awarded by the European Community and ANVAR.

Best Coverage
In 2004 Articque was contacted by the Medical Services of the SNCF, which needed an Intranet solution. Its function was to ensure the best and most thorough coverage possible of the French territory, in order to guarantee a fair and equal quality of health services to each and every SNCF employee. The solution developed by Articque is conceived for regional administrators, enabling them to: attribute each municipality to one medical sector and, in particular, to one general practitioner (GP); and improve the number and location of the chartered GPs with respect to the SNCF personnel. The use of mapping allows the administrators to carry out their investigations into the coverage of the national territory by chartered GPs able to respond to the needs of SNCF personnel. It also allows them to quickly construct decision-making files. One of the main assets of the application is its ability to display the municipalities to which a GP is assigned, consequently allowing the administrators to extract the ones which still need an allocation.
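The allocation task described above (attach every municipality to its nearest chartered GP and flag the ones left uncovered) can be pictured in a few lines of code. The names, coordinates and 30 km threshold below are invented for illustration:

```python
# Toy sketch of the allocation described above: attach each municipality to the
# nearest chartered GP and flag those with no GP within reach. All names,
# coordinates and the 30 km threshold are invented for illustration.
from math import radians, sin, cos, asin, sqrt

def km(a, b):
    """Great-circle distance in kilometres between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(radians, (*a, *b))
    h = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(h))

gps = {"GP Tours": (47.39, 0.69), "GP Orléans": (47.90, 1.90)}
municipalities = {"Amboise": (47.41, 0.98), "Blois": (47.59, 1.33), "Chartres": (48.45, 1.49)}

for town, pos in municipalities.items():
    # Find the nearest GP and decide whether the town is covered.
    name, dist = min(((g, km(pos, gpos)) for g, gpos in gps.items()), key=lambda x: x[1])
    status = name if dist <= 30 else "needs an allocation"
    print(f"{town}: {status} ({dist:.0f} km)")
```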
Georges Antoine Strauch (gas@articque.com) is CEO of Articque. To get more information: www.articque.com, www.cartomatique.com, www.cartesetdonnees.com, and www.mapanddata.info.

Mapping Observatory
In 2000, the CFE CGC, the French Executive Trade Union, came into action and asked Articque to build a custom-made application. It was the first trade union to equip itself with a Mapping Observatory of companies. This application is a geostatistical tool conceived by Articque for the elections to come, and it facilitates the decentralized entering of data by local representatives of the Union. It allows the constant updating of a file containing very precise information about their militants. All this information is displayed on maps calculated in real time. Only decentralized entries can enable such a big organization to maintain files that are up to the minute. Yet it is in a centralized way, and with geographical criteria, that these data are consulted, taking into account the country as a whole, as well as regions, municipalities, towns and so on. Each level of consultation corresponds to a synthetic view of the Trade Union, such as the number of companies and the number of elected people. The Executive Trade Union acts on a local basis but pilots the project on a national level, which allows the Confederation to make strategic decisions based on synthetic and hierarchical data. To support the connection of nearly 150 simultaneous

Atlantic Seashore
Another major event took place in December 1999, when the sinking of the oil tanker Erika on the French coast stirred up strong emotion. Articque put online a map of the Atlantic seashore and proposed the constitution of a civilian network, the Coast Watches. Their role was to collect data to be centralized and diffused through interactive maps on the web, representing the coastal municipalities affected by the disaster and their distance to the wreckage, while also attending to their needs


Special

Review Live Local and Interview with Microsoft


Providing Live Connections to People, Information and Passions They Care About
This article provides an overview of Live Local as well as some of the author's experiences with the application. Interviews with Tom Bailey, the Director of Marketing for Live Local, and Ashley Johnson of Waggener Edstrom Worldwide, the Windows Live Local PR team for Microsoft, are also included. Bailey was able to answer several questions directed specifically at current and future plans for MS Live Local in the European community. By Greg Baca
Microsoft's description of Live Local: Windows Live Local uses location as the way in which people interact with the Internet so that they can more easily find, discover and plan activities relevant to that location. It takes information consumers want (weather, traffic, hotels, restaurants, entertainment, photos) and brings it together in ways that enable people to answer the question "What's it like there?"

Menu Commands
Moving from left to right across the menu there are: Welcome, Scratch Pad, Locate Me, Permalink, Add Pushpin, Directions, Settings, Community, Help and About. The Welcome screen is displayed to the left of the main map area; it displays news and information about MS Live Local. The Scratch Pad allows the user to store a search result, or to click a spot on the map and save that spot to the Scratch Pad. The Scratch Pad remembers where you have been and makes returning a click away. It is also possible to mail and/or blog your Scratch Pad. The Locate Me link activates a Windows Live Local application (Location Finder) that attempts to determine your present location and launch a Windows Live Local map of that location. There are two techniques used by the Locate Me feature: 1. If the user is on a computer with a Wi-Fi card, Wi-Fi signal strength from nearby wireless access points can be used to determine location. This is generally accurate to between 50

Figure 1.

Overview
Live Local is completely web-based - no plug-ins or downloads are necessary, see Figure 1. The application has all the typical features of a web-based mapping service: search, routing, and more. In this overview the unique features will be described. The interface provides interaction through search, from a menu or context-sensitive commands (right-click), and offers keyboard shortcuts to assist with navigation of the map. The search features a What area and a Where area. In the What area the user may enter a specific topic, such as a museum or a cuisine. In the Where area an address can be entered. It is also possible to search from the What and have the Where set to use the current map view. The application delivers on the What content, as illustrated in a search for brewpubs in the Fort Collins, Colorado area, see Figure 1.

Figure 2.


Pushpins are a way to add a custom pushpin to a map, much like one might have done in the past with a pin on a paper map. A Pushpin can be given a name and up to 200 characters of text in a note describing it, and it can also be emailed to someone else. Directions provide turn-by-turn directions, with route highlights and maneuvers integrated into the map whether in road or aerial view, and with printing and email options available. To and from directions can be created by clicking anywhere on the map, or by traditional methods such as search results and entering an address; more on directions later in this article. Settings provide a way to set a limited set of behaviors regarding what is saved on exit, map navigation, and searching options. The Community link provides interaction with others in the community through a hyperlink to blogs and threaded discussion boards about Virtual Earth, the ability to vote in simple Virtual Earth polls, and the ability to provide free-text feedback regarding Virtual Earth. Help provides rich help content for the Live Local tool. About offers background on the technologies applied in Live Local, and credits all the data and technology providers.

Interview with Ashley Johnson


Please tell me about Live Local.
Windows Live Local was launched on Thursday, December 8, 2005. It is a free online local search and mapping service that combines maps, unique bird's eye imagery for many US cities, advanced driving directions, Yellow Pages and other local search tools, enabling people to learn, discover and explore a specific location. Windows Live Local is powered by Virtual Earth, delivering a core set of functionality that combines maps and directions, immersive Bird's Eye and aerial imagery and local search. Our product roadmap includes a community approach where people can create their own location-specific view and share it with others.

Why the name change from Virtual Earth?


This name change was part of the broader Windows Live strategy change launched on November 1, 2005. The vision for Windows Live is that it will help people have richer lives by providing live connections to people, information and passions they care about. Windows Live Local helps to deliver on this by combining immersive aerial imagery, customizable map annotations, innovative driving directions and the ability to share local search results with others. Windows Live Local falls under the Information Services tech pillar of Windows Live. The other pillars are Communication Services, Safety and Protection, as well as Storage and Roaming.

Figure 3.

and 200 feet; 2. If the computer does not have a Wi-Fi card, the Internet protocol (IP) address of the computer is used to determine an approximate location of the user (IP-based location is generally accurate to the city or county level). In either case, if a location can be determined, the map is updated to reflect this location and a Pushpin is drawn on the map centered on the consumer's location.
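The two-tier behaviour described above boils down to a simple fallback: try Wi-Fi positioning first, then fall back to an IP lookup, keeping track of the very different accuracies. A rough sketch follows; the two lookup functions are hypothetical placeholders, not real Microsoft APIs:

```python
# Rough sketch of the Locate Me fallback described above. Both lookup functions
# are hypothetical placeholders standing in for a Wi-Fi positioning service and
# an IP geolocation service; they are not real Microsoft APIs.

def wifi_position():
    """Return (lat, lon) derived from nearby access-point signal strengths, or None."""
    return None  # pretend no Wi-Fi card is present

def ip_position():
    """Return an approximate (lat, lon) derived from the client IP address."""
    return (40.585, -105.084)  # e.g. a city-level fix for Fort Collins, CO

def locate_me():
    fix = wifi_position()
    if fix is not None:
        return fix, "about 50-200 feet"       # Wi-Fi accuracy quoted in the article
    return ip_position(), "city/county level"  # IP-based accuracy quoted in the article

position, accuracy = locate_me()
print(position, accuracy)  # the map would then be centred here and a Pushpin drawn
```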

Rest of the Menu


A Permalink is a permanent URL which encapsulates the current state of the Virtual Earth browser window, including the map view, the map style, zoom level, open searches, and Scratch Pad contents. Creating a Permalink is a way to save and share your Windows Live Local experience with others.
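Conceptually, a Permalink simply serializes the view state into query parameters. The sketch below uses an example host and invented parameter names purely to show the idea; it does not document Microsoft's actual URL scheme:

```python
# Illustrative sketch of encoding map state into a shareable URL, as a Permalink
# does conceptually. The base URL and parameter names are invented; they do not
# document Microsoft's actual scheme.
from urllib.parse import urlencode

state = {
    "center": "43.08,-79.07",   # map centre (lat, lon)
    "zoom": 15,                 # zoom level
    "style": "birdseye",        # road / aerial / bird's eye
    "search": "brewpubs",       # open search, if any
}
permalink = "http://example.com/map?" + urlencode(state)
print(permalink)
```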

How is Live Local different from Terra Server?


Today the aerial imagery in the Virtual Earth platform comes from Microsoft's TerraServer project (terraserver.microsoft.com). TerraServer has been digitizing publicly available aerial images taken by U.S. Geological Survey's National Aerial Photography Program (NAPP) for use on the Web. The frequency in which these images are updated varies.

What applications or web services do you see as the major competition?


Windows Live Local is part of a broader strategy to deliver such rich and innovative experiences to our users whether that is two or two million people together across PCs, devices and the web. We are confident

Figure 4.


in knowing that if we do a good job of executing our Live initiative across all non-US markets, then our competitive position will take care of itself. We don't intend to react to our competition; we are going to respond to our customers' needs. That's what will define our competitive position.

I see from the website information that ORBIMAGE is a contributor - will there be full coverage of the earth available at the resolution of IKONOS imagery?
Microsoft and ORBIMAGE have entered into a five-year exclusive deal in which Microsoft will be the sole online distributor of worldwide satellite imagery from ORBIMAGE. The partnership between Microsoft and ORBIMAGE will enable Microsoft's mapping and location assets to offer rich satellite imagery to both consumer and business customers outside of the United States, and will accelerate our ability to offer this imagery in non-US markets. Moreover, ORBIMAGE plans to launch a new satellite within the next 18-24 months, which will give Microsoft and its customers access to state-of-the-art satellite imagery.

Figure 5.

Are there plans to incorporate European cities for address matching?


We are continually updating Virtual Earth with new imagery provided by our partners. Ensuring our customers have the best possible information from this new service is critical to us. Not only are we regularly updating the existing aerial imagery, we are also adding new data such as bird's eye and global satellite imagery at 15 meters. We have announced agreements with Pictometry Intl Corp for low-level 45-degree or bird's eye imagery, and with ORBIMAGE and Harris to provide international satellite photos from around the world at 15-meter resolution. With this imagery, we are planning to cover 80 percent of the US population within the next two years, with European data available later this calendar year. Live Local now includes imagery from the USGS, Harris, Pictometry (for Bird's Eye views), and others.

Auto-Refreshed Search Results: as the user moves around the map, the search results refresh dynamically to always give the user the most relevant information that pertains to the selected map view; Yellow Page directories: these directories have been incorporated into the Windows Live Local index so that users can query the information in rich, flexible ways; Multiple Searches: give users the ability to conduct multiple local searches and have all of the relevant information show up on the map alongside one another.

with a center pivot irrigation system. Some other interesting things to point out with this image: look at the mileage and time for the route; the routing algorithms obviously account for speed by road type. Also notice all the acknowledgements in the southeast corner of the map.
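The remark about the routing algorithm accounting for speed by road type can be made concrete with a tiny shortest-path example: the same network yields a different route depending on whether edges are weighted by distance or by travel time (distance divided by an assumed speed per road type). The network, distances and speeds below are invented:

```python
# Tiny illustration of routing by travel time rather than distance. The network,
# distances (km) and assumed speeds (km/h) per road type are invented.
import heapq

# (from, to, km, road_type)
edges = [("A", "B", 30, "mountain"), ("B", "C", 30, "mountain"),
         ("A", "D", 45, "highway"), ("D", "C", 45, "highway")]
speed = {"mountain": 25, "highway": 100}

def shortest(weight):
    """Dijkstra from A to C with the given edge-weight function."""
    graph = {}
    for u, v, km, kind in edges:
        w = weight(km, kind)
        graph.setdefault(u, []).append((v, w))
        graph.setdefault(v, []).append((u, w))
    dist, heap = {"A": 0}, [(0, "A")]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue
        for v, w in graph[u]:
            if d + w < dist.get(v, float("inf")):
                dist[v] = d + w
                heapq.heappush(heap, (d + w, v))
    return dist["C"]

print("shortest by distance:", shortest(lambda km, kind: km), "km")          # 60 km, via B
print("fastest by time:", shortest(lambda km, kind: km / speed[kind]), "h")  # 0.9 h, via D
```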

Bird's Eye Views


Bird's Eye views are likely the most interesting and unique feature of the Live Local service. Bird's Eye allows us to change from road or

One of the features I liked best is the ability to right-click a place on the map, set that point as the start or end point of a route, and get directions without the need to enter an address for either location. One can then reverse the directions from B to A, as illustrated in Figures 2 and 3 for an area in London. The reverse-directions feature is quite handy when dealing with one-way streets; the many one-way streets in the area depicted in Figures 2 and 3 looked like a difficult place for the application to choose a route.

Questions and answers from Tom Bailey


Can you tell me what the Virtual Earth/Live Local relationship is?
The Virtual Earth brand is the platform upon which the consumer destination site is built. You will see "Powered by Virtual Earth" in the header of the Windows Live Local website. The reason this is important is that we have a business-to-business focused service which we call the Virtual Earth Platform. This is a service we sell to many commercial and government entities that wish to integrate mapping and location-based services into their Internet and intranet solution offerings. That's where the relationship is; all of the mapping and location services that drive the various experiences are built on the Virtual Earth Platform.

Literal
It is interesting how literal routing software can be: point A to point B along any road. I selected a route between Moffat and Silver Cliff, Colorado, see Figure 4. I don't believe this route is passable by anything short of a high-ground-clearance vehicle, and then only in the summer. Note that the peaks shown on the map are all above 14,000 feet. This figure illustrates the high quality of the images; even at this scale landforms are easily distinguishable. Did you notice the geometric circles in the southwest of the image? At first sight these appear to be some sort of problem with the image, but the circles are actually accurate renderings of crops irrigated

What is the street source data used for routing within Live Local?
We use NAVTEQ as the underlying mapping data for about 75 percent of the maps. NAVTEQ creates the digital maps and map content that power navigation and location-based services solutions.

Other Features
Aerial Photo with Labels: refers to the navigation feature that displays the aerial images with an overlay of road networks and points of interest, correctly geo-referenced on the image;


What Geocoding Engine does Live Local use?


We have our own geocoding service, which we have built and refined over the years, and we are also beginning to integrate best-of-breed geocoders alongside the homegrown ones. As we broaden out to new languages and new services beyond what we serve today (mostly North America and Western Europe), we are looking at other geocoders to cover other regions, such as Eastern Europe, Japan and China.

What are the plans to add content for the European community?
The MapPoint 2006 product, which will be out this summer, will have expanded Eastern European coverage. We are making huge investments in Europe, and very soon we will be able to zoom in to much greater detail with Live Local in many countries around Europe. Europe obviously is very important to us, and we have a long history of CD products as well as the MSN Maps and Directions service, which is the precursor to Windows Live Local. We are investing heavily to expand the Windows Live Local experience to many European countries, and obviously that will roll out over time. Many European countries will get a Windows Live Local experience this calendar year.

Figure 6.

aerial view to a Bird's Eye (45-degree) view of the map. Currently, this feature covers about twenty-five percent of the United States, including major cities such as New York, LA, San Francisco, and Boston (a list of Bird's Eye locations is provided in Help). The service brings up a text box notifying us that Bird's Eye imagery is available when we are located in an area with such coverage. We navigate the Bird's Eye view the same way as the map or image view; in addition we have a group of clickable thumbnails, a large-feature/small-feature scale zoom, and a compass to rotate the view North, South, East or West. Take a look at the Bird's Eye view of Niagara Falls, located on the US-Canadian border, oriented from the North, see Figure 5, and from the South, see Figure 6.

Be sure to take Live Local and other location services for a test drive and judge for yourself which you like best. There are a number of web-based mapping/location/routing services; some are listed below: Mapquest - www.mapquest.com; Yahoo Maps - www.maps.yahoo.com; Google Local - www.google.com/lochp; very interesting historical maps - www.davidrumsey.com/index.html. European sites: http://rp.rac.co.uk/rp/routeplanner, www.multimap.com/ and http://mappoint.msn.com. Developers will also want to check out the Virtual Earth Developer APIs (free versions for both commercial and non-commercial use): the Virtual Earth APIs' easy-to-use JScript Map control, the MapPoint Developer Center on MSDN (http://msdn.microsoft.com/mappoint/), and the Virtual Earth developer blog (http://blogs.msdn.com/virtualearth/) for news and other information about the Virtual Earth APIs.
Greg Baca (gbaca@geoinformatics.com) is a freelance writer for GeoInformatics. Windows Live Local can be found at http://local.live.com.

Are there plans to integrate MapPoint or Streets and Trips into Live Local?
Streets and Trips and MapPoint are to become more integrated with the Windows Live Local consumer destination on the web. In the 2006 versions there is the capability to show any of your Streets and Trips views in Live Local: simply right-click on the map, choose "show in Windows Live Local", and it will open (in the US) a Live Local screen in which you can do the same things as in Windows Live Local, such as view the aerial imagery and Bird's Eye views, to help enhance your trip-planning experience. This is the first of many integration points between the offline rich Win32 client application and the online service.

Via Virtual Earth (www.viavirtualearth.com) is a third-party web site sponsored by Microsoft and dedicated to Virtual Earth development. There you will find code samples, a gallery of existing Virtual Earth-powered applications, and other information to help you start creating applications that incorporate a Windows Live Local-like experience by utilizing the Virtual Earth APIs. Other Virtual Earth links can be found in the article on Microsoft's Proposed Take-Over of Vexcel, page 6 of this issue of GeoInformatics. Live Local credits the following companies as providing content for Windows Live Local: NAVTEQ, TeleAtlas, USGS, NASA, Pictometry, Harris ImageLinks, OrbImage and EarthData: www.navteq.com, www.teleatlas.com, http://terraserver-usa.com, http://seamless.usgs.gov, http://earthobservatory.nasa.gov, www.pictometry.com, www.harris.com, www.es-geo.com/, www.orbimage.com, www.earthdata.com.

Is there a way to set units to anything other than feet and miles?
Units will be localized when we launch in Europe.


Interview

Our Product Focus Has Always Been Compatibility and Ease of Use


Pacific Crest Corporation Active in High-Precision Positioning Market
With its worldwide distribution network, Pacific Crest Corporation obviously has quite a lot of customers in the field of wireless communication. In this interview several managers from the organisation share their views on the market, the competition and the future of wireless communication. By Sonja de Bruijn
solutions. Customers want longer range, more flexibility, more reliability, and easier setup. Machine control and offshore markets are growing market segments for Pacific Crest.

Who are your partners and which products are developed in cooperation with these partners? What about future partnerships?
Cameron: Pacific Crest continues to work with major OEMs and dealers alike in an attempt to provide them with what the market needs. We have developed internal radios at the PCB level for our major OEM customers, as well as complete solutions at the box and accessory level for our dealers.

Which trends do you see in wireless data communication? What does the future look like?
Iyemura: In general, we see the need for programmable products that maximize the value for the customer. Cameron: Wireless communications are most successful when the product is completely transparent to the user, meaning that the product works and does not require regular maintenance. We believe in developing products in anticipation of customers' needs, before they know that they need them. Rick Gosalvez: Current trends in the various wireless data communication industries are product integration, standards-based products, and a movement towards increased range, flexibility,

From left to right: Werner Kozel, Mario Gosalvez, John Cameron, Ron Iyemura, and Rick Gosalvez.

What is the goal of Pacific Crest? Has this changed over the years? What is done to reach this goal?
John Cameron, General Manager: At Pacific Crest, we deliver rugged communications solutions to our customers. We have evolved our kits and products over the last twelve years in an effort to provide the best solution for our customers. As we have grown, we have been able to develop and type-approve radios for countries all over the world. Rick Gosalvez, Product Marketing Manager: Our goal is to be an industry leader in wireless technologies for the high-precision positioning market. To help us reach this, the following things are currently being done at a high level: frequent customer contact, advanced R&D, the construction of market solutions, and service and support.

What can you say about competition? What makes Pacific Crest different from its competitors?
Rick Gosalvez: We believe Pacific Crest's differentiating characteristics are service, support, and reliability in the field. Compatibility is another major differentiator, since our radios are already compatible with most systems out in the field.

In which markets or segments is Pacific Crest mainly active? Which new segments would you like to reach and why?
Ron Iyemura, Sales Manager Asia: Pacific Crest is mainly active in the GPS surveying market by providing data links for precision differential signal transmission. Rick Gosalvez: The high-precision positioning market is Pacific Crest's main market. This market's main trend leans towards convenience-oriented

Pacific Crest Corporation (www.pacificcrest.com) gets its name from the beautiful and majestic trail located in the Sierra Nevada Mountains. The Pacific Crest headquarters and primary facilities are located in Santa Clara, California. These offices provide design, quality assurance, sales, shipping, and customer support. Pacific Crest maintains a sales office in Europe, as well as a worldwide network of reseller agencies to serve customers around the world. Service and support is available at Pacific Crest service centres in Europe, Asia, and North America.


and compatibility. Future products will exhibit the following qualities: increased power and flexibility, smaller size, easier use, and lower power consumption.

In December 2004 Trimble acquired Pacific Crest. What influence did and does this have on Pacific Crests strategy and/or products?
Cameron: Trimble purchased Pacific Crest for its capabilities and accordingly has encouraged us to continue with our business model and plans. Rick Gosalvez: Pacific Crest's incorporation into the Trimble family is a positive thing and allows Pacific Crest to better serve the market. The merger gave us access to additional resources, financial support, and more industry experts.

What are the advantages and disadvantages of radio frequency?


Iyemura: The advantage of a stand-alone radio link is that you can use it anywhere, without dependency on a pre-established infrastructure. It also provides the most accurate solution. The disadvantage of UHF (the most popular RTK solution) is that it is prone to interference, especially in populated areas. Werner Kozel, Senior Manager of Engineering and Customer Service: An advantage of narrow-band radio frequency technologies is that they offer longer range with better propagation than most spread-spectrum radios. However, a license is required prior to operation of narrow-band systems. Rick Gosalvez: Radio frequency users are the beneficiaries of many useful advantages, such as mobility (flexibility), range, lower installation costs, and easier maintenance. Access to these advantages requires the right equipment and a continual power source. All users must adhere to regional regulations.

economies. As these regions' economies continue to grow, more infrastructure and surveyors will be needed, which could mean an increase in radio sales. Basically, as these areas grow, Pacific Crest grows. Our business is inextricably tied to the growth of these countries. Pacific Crest is just one part of the surveying solution and it is our goal to be compatible with many of these other solutions. Mario Gosalvez, Sales Manager Americas: South America is beginning to open up more for RTK solutions. The efficiencies that RTK provides a North American user are beginning to be recognized in South America.

Initially planned for October 2005 but opened in late Q1 2006, there is now an Authorized Service Centre in China. What effect does this have on the customer? What can you say about the Asian market in general?
Iyemura: The Asian customer base receives faster repair service, and we believe this develops satisfied customers. The Asian marketplace is growing very fast and we are aiming to support this. Kozel: The repair centre means quicker turnarounds on product repairs and increased customer service for customers in that region of the world. Generally, customers in Asian countries are used to a higher standard of customer service than can realistically be achieved by shipping everything back to the United States, which is counterproductive.

Are your products Galileo compatible?


Rick Gosalvez: Yes, because the radio will pass forward any information it is given, independently of the corrections a device is receiving. In order to accommodate the additional bandwidth of Galileo satellite corrections, users have to use the proper radio configuration. Pacific Crest's radios are compatible with anything that can be sent. You simply have to have two Pacific Crest radios properly configured on either side, make sure they are on the same frequency, and then you are ready to go.

Is there anything you would like to add, perhaps a message to our readers?
Rick Gosalvez: The high-precision market requires the integration of technology. This market is already exhibiting signs towards product integration that simplifies system complexity and improves user flexibility. Thanks to new and innovative products that allow for much easier integration, system integrators now have much more flexibility to customize their systems according to their field requirements.
Sonja de Bruijn (sdebruijn@geoinformatics.com) is editorial manager of GeoInformatics. Surf to www.pacificcrest.com to find out more about the company and its products.

Pacific Crest's machine control product, Sitecom, with two mounting options.

Looking at America, Europe and Asia, what can you say about things like the adoption of the technology you are offering, the saturation of these markets, and which market means booming business for you?
Aldert Kluft, Sales Manager Europe: There are many alternative data link technologies for RTK: spread spectrum, UHF, GSM/GPRS, and many others. Our goal is to be compatible with many of these existing technologies in order to satisfy the survey market. Pacific Crest is primarily involved in the survey industry, with new regional markets in Europe, the Middle East, Asia and Russia, which all have growing


Article

Part 3: Error Sources


Practical Satellite Navigation
In the previous articles the position determination of GPS was discussed. The GPS position based on the C/A code nowadays has a precision of approximately 5 - 20 meters. With the P-code, more precise results can be achieved (1 - 5 meters). The difference in precision between the C/A and P-code is largely due to the length of the code and the broadcasting of the P-code on two frequencies. There are, however, a number of error sources that influence the precision of the GPS position and can degrade it by meters. This article gives a brief overview of a number of large error sources that influence the position determination.

By Huibert-Jan Lekkerkerk

Figure 1: effect of satellite elevation on the path travelled.

Gravity Field
Satellites are equipped with very accurate atomic clocks, as was discussed in the previous article. Nonetheless there are still small errors at work, mainly due to variations in the gravity field of the earth. As a result of relativity-related errors, the satellite clock can show small discrepancies when compared to the master clocks on earth. Furthermore, small changes in the gravitational field of the earth will cause small changes in the satellite orbits. It was already shown that ground stations are constantly tracking the satellites. These control stations determine the corrections for both orbit and clock and transmit these to the satellites once a week. This implies that it is possible to calculate satellite positions based on an almanac which is almost a week old and possibly incorrect. For GPS applications where accuracy is of utmost importance, the correct almanac is therefore applied afterwards to the raw satellite measurements (post-processing).

Selective Availability
Shortly after the GPS system was completed, tests showed that the system functioned better than expected. Instead of the predicted precision of 50 - 100 meters for the civil signal (C/A code, Standard Positioning Service), the results were in the order of 10 - 20 meters. Although these results were very positive in a scientific sense, the American government felt they were a threat. The main reason for this was that all users could calculate positions with a precision that was almost equal to that of the military signal (P-code, Precise Positioning Service). It was thus decided in 1989 to introduce errors in the C/A coded signals, bringing the precision artificially back to 50 - 100 meters. This signal degradation was called Selective Availability (SA) and was in use for over a decade, with the exception of the first Gulf war in 1991, when the American army did not have enough military GPS receivers for their

own troops. On the first of May 2000, President Clinton declared that, as a result of the broad use of GPS and DGPS, there was no further need to continue SA and it was switched off. This switch-off was, however, conditional, with the reservation that it could be turned back on in times of emergency. To date SA has remained switched off, even after the events of September 11.

Troposphere
The earth's atmosphere consists of a number of layers, the troposphere being the lowest layer (up to a height of approximately 13 kilometres), where the weather is formed. Since the GPS satellites orbit high above the earth, their signals need to cross the atmosphere before reaching our receiver. Factors like humidity influence the speed of light and as such delay the GPS signals, resulting in travel-time errors in the order of tens of meters.


GPS receivers do employ an atmospheric model to correct for these delays. Local weather variations cannot be modelled, however, and will result in errors of meters in the pseudorange measurement. The amount of delay depends on the time it takes the signal to travel through the atmosphere, which in turn depends on the satellite elevation above the horizon, see Figure 1. Satellites directly overhead will cause the smallest error; as a rule of thumb, keep the elevation of the satellites used above 10 to 15 degrees in order to reduce the potential error as much as possible. Another method by which the tropospheric error can be reduced is the use of a multi-frequency receiver. It has been demonstrated that the amount of delay depends on the frequency of the radio signal. If we measure the travel time for both the L1 and the L2 frequency, we can estimate the tropospheric error to some degree. Most dual-frequency receivers use the P-code for correcting the atmospheric error. Since this code is transmitted on both frequencies (L1/L2) but has an unknown starting point, it cannot be used for determination of the absolute travel time. We can, however, take differential measurements, since the code starts at the same point in time for both frequencies.
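A common first-order way to picture this elevation dependence is to scale a zenith delay by 1/sin(elevation) and to discard satellites below the mask. The 2.3 m zenith value in the sketch below is a typical sea-level figure and an assumption, not a number taken from this article:

```python
# First-order sketch of the elevation dependence of the tropospheric delay:
# slant delay ~ zenith delay / sin(elevation). The 2.3 m zenith value is a
# typical sea-level figure (an assumption, not taken from this article).
from math import sin, radians

ZENITH_DELAY_M = 2.3
ELEVATION_MASK_DEG = 15          # rule of thumb from the text: 10 to 15 degrees

def slant_delay(elevation_deg):
    return ZENITH_DELAY_M / sin(radians(elevation_deg))

for el in (90, 45, 15, 10, 5):
    used = "used" if el >= ELEVATION_MASK_DEG else "rejected by mask"
    print(f"elevation {el:2d} deg: ~{slant_delay(el):5.1f} m tropospheric delay ({used})")
```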

Figure 2: number of (predicted) sun spots for the current solar cycle. (source: www.taborsoft.com )

Ionosphere
The ionosphere is the layer of the atmosphere reaching from 50 to 500 kilometres altitude. The sun ionises the air in this layer, creating a layer of charged particles. A striking example of this ionisation is the polar light. The ionised particles delay the GPS signal, creating errors of up to 30 meters during the day or 6 meters at night. Large sources of ionisation are the so-called sunspots and the related magnetic storms. These sunspots have an 11-year cycle, with the next peak occurring in 2011 - 2012, see Figure 2.

At the moment we are approximately at the minimum of the solar cycle. Strong ionisation also occurs throughout the year in locations with a large amount of exposure to the sun (near the equator, around noon). With a small amount of ionisation the problem will be measurement errors; when there is a lot of activity, the GPS signal can be influenced in such a way that reception is impossible, see Figure 3. When using DGPS systems the effective range can, as a result of solar activity, be reduced by a factor of 2 to 4. Ionospheric errors as a result of sunspots cannot be predicted, but the regular ionisation of the atmosphere can be predicted using an ionospheric model. A multi-frequency receiver can resolve these errors in the same manner as with the tropospheric error.
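Because the ionospheric delay scales with the inverse square of the signal frequency, the dual-frequency correction can be written as the standard ionosphere-free combination of the L1 and L2 pseudoranges. A small numerical sketch, with invented pseudorange and delay values:

```python
# Sketch of the standard ionosphere-free combination of L1/L2 pseudoranges.
# The ionospheric delay scales with 1/f^2, so a weighted difference of the two
# measurements removes (most of) it. Pseudorange and delay values are invented.
F_L1 = 1575.42e6  # Hz
F_L2 = 1227.60e6  # Hz

def iono_free(p1, p2):
    """Ionosphere-free pseudorange from L1 and L2 measurements (metres)."""
    g1, g2 = F_L1 ** 2, F_L2 ** 2
    return (g1 * p1 - g2 * p2) / (g1 - g2)

true_range = 22_000_000.0                 # metres, invented
iono_l1 = 8.0                             # metres of ionospheric delay on L1, invented
iono_l2 = iono_l1 * (F_L1 / F_L2) ** 2    # same electron content gives a larger delay on L2

p1 = true_range + iono_l1
p2 = true_range + iono_l2
print(iono_free(p1, p2) - true_range)     # ~0: the first-order ionospheric error is removed
```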

Multipath
Just as light is reflected by a shiny surface, radio signals can be reflected by things like the water surface, tanks filled with oil or water, but also by cars, ships or bridges. The reflected signals will interfere with the signals that are received via a direct path, see Figure 4. The receiver may start using the reflected signal, which has a longer travel time, instead of the direct signal. As a result the position will be calculated incorrectly, with the position shifting in the direction of the multipath source. Since multipath is hard to correct for, it is better to prevent it altogether. The first rule in preventing multipath is to keep the antenna as far away as possible from reflectors. Enlarging the elevation mask of the receiver can be of some help, as can changing the height of the antenna. A multipath error will last a couple of minutes and will disappear as soon as the signal is no longer reflected towards the antenna. Nowadays most professional GPS antennas have a built-in ground plane or choke ring, see Figure 5, which prevents the reception of reflected signals from below the antenna horizon.

User Errors
The main sources of error in GPS measurements are user errors or, as they are usually called, blunders. As a rule, blunders can be prevented by a consistent measurement strategy using as many control options as practically possible. Common blunders are: Measuring too close to objects, resulting in either multipath or shielding of the horizon. This results in a degraded

Figure 3: RTK GPS measurements in November 2001. The scale for X, Y and Z is 0.25 meters. The Kp index is an indication of the radio environment in the ionosphere (red = bad). (source Kp index: http://www.sec.noaa.gov)


Figure 6: Position error through an incorrect choice of geodetic datum. In the example we read ED50 positions (centre). The WGS84 positions from the GPS receiver are shifted by 180 meters in coordinates.

Figure 4: Through reflection of GPS signals a longer travel time is registered, resulting in position errors.

position that is difficult to detect. Large steel structures such as cranes or masts will shield the horizon just as a bridge or a tree does, a fact that is not always appreciated in the field; The use of height aiding without entering the correct antenna height above sea level. As was discussed in the previous article, the use of height aiding should be questioned these days, since sufficient satellites are available for a good positioning fix under normal conditions; An incorrect initialisation position after a cold start of the receiver. This will not result in an incorrect position, but in no position reading at all; Incorrect geodetic settings. GPS calculates all positions in WGS84 coordinates, but most receivers have the option to transform these to any other coordinate system for presentation on the screen. With most receivers the output message will, however, contain WGS84 coordinates. Errors as a result of the selection of an incorrect geodetic datum can be as high as hundreds of meters, see Figure 6.
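The order of magnitude of that last error can be checked with the classic three-parameter (translation-only) shift between ED50 and WGS84. The translation values below are commonly quoted approximations that vary by region; they are an assumption, not taken from this article:

```python
# Rough check of the size of the ED50/WGS84 datum error using a simple
# three-parameter (translation-only) shift applied to Earth-centred (ECEF)
# coordinates. The translation values are commonly quoted approximations
# (they vary by region) and are an assumption, not taken from this article.
from math import sqrt

# Approximate ED50 -> WGS84 translation in metres (dX, dY, dZ).
DX, DY, DZ = -87.0, -98.0, -121.0

def shift(x, y, z):
    """Apply the translation to geocentric (ECEF) coordinates in metres."""
    return x + DX, y + DY, z + DZ

# Example: shifting an invented ECEF point from ED50-aligned to WGS84-aligned axes.
print(shift(3920000.0, 300000.0, 5000000.0))

# The size of the shift, i.e. the error made by reading WGS84 coordinates
# against an ED50 map without transforming them:
print(round(sqrt(DX**2 + DY**2 + DZ**2)))   # ~178 m, of the order quoted above
```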

Quality Control
To gain insight into the quality of a calculated position, there are a number of quality control parameters available in most GPS receivers. The most important one is probably the Dilution of Precision (DOP). The DOP describes the geometric strength of the satellite configuration, or in other words the spreading of the satellites around the horizon. When all satellites are on one side of the horizon, see Figure 7a, the receiver will calculate a high DOP value. There are a number of DOPs available, but for ordinary GPS positioning the Horizontal DOP (HDOP) and the Geometric DOP (GDOP) are possibly the most important ones. Besides the DOP, some receivers have the ability to calculate the so-called Line of Position Mean Error (LPME). This is an indication of the precision of the position itself and will factor in other parameters like the travel-time measurement. Some manufacturers present the user with a so-called quality figure that is said to indicate the precision of the position determination. This quality figure is usually calculated from

Figure 5: antenna with choke ring to prevent multipath (source: www.ipgp.jussieu.fr).

parameters like the HDOP and LPME. As a rule one should treat these figures with due caution, since the formula used to calculate them is generally unknown to the user.
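For the mathematically inclined, the DOP values discussed above follow directly from the geometry matrix built from the unit vectors pointing to the satellites. A compact sketch, with invented satellite directions:

```python
# Sketch of how DOP follows from satellite geometry: build the design matrix G
# from unit line-of-sight vectors (east, north, up components plus a clock
# column) and take square roots of the diagonal of (G^T G)^-1. Directions invented.
import numpy as np

def dops(unit_vectors):
    G = np.hstack([np.asarray(unit_vectors, dtype=float), np.ones((len(unit_vectors), 1))])
    Q = np.linalg.inv(G.T @ G)
    hdop = np.sqrt(Q[0, 0] + Q[1, 1])
    vdop = np.sqrt(Q[2, 2])
    gdop = np.sqrt(np.trace(Q))
    return hdop, vdop, gdop

# Satellites spread around the sky (low DOP) versus all to one side (high DOP).
spread = [(0.9, 0.0, 0.44), (-0.9, 0.0, 0.44), (0.0, 0.9, 0.44), (0.0, -0.9, 0.44), (0.0, 0.0, 1.0)]
one_side = [(0.9, 0.1, 0.42), (0.8, 0.3, 0.52), (0.85, -0.2, 0.49), (0.7, 0.4, 0.59), (0.75, 0.0, 0.66)]

for label, sats in (("spread", spread), ("one-sided", one_side)):
    h, v, g = dops(sats)
    print(f"{label}: HDOP={h:.1f} VDOP={v:.1f} GDOP={g:.1f}")
```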

Summary
From this article it can be seen that there are a large number of error sources influencing GPS position determination. We should take these error sources rather seriously when performing high-quality GPS measurements. A number of the errors described in this article can be corrected using DGPS, which will be described in the next article.
Huibert-Jan Lekkerkerk (hlekkerkerk@geoinformatics.com) is a freelance writer and trainer in the field of positioning and hydrography.

Figure 7: the Dilution of Precision is high (a) when all satellites are on one side of the antenna and low (b) when there is an even geometric spreading of the satellites.


Column

About Maps and Beauty


Recently I visited an exhibition of historical maps. Here I observed two things: how pleasant it was to take time to look at the maps, and how beautiful these engraved maps are. Probably that is why many people like maps anyhow. Can I enjoy today's maps as well? The answer is mixed.

If you look at maps on the Web, which I tend to do quite a bit, and not only because I have to, I notice that I take less time to look at them. It looks as if those maps are more businesslike: they have (just as old maps, by the way) a purpose, but you have to interact to reach your goal. You have to pan, zoom, and switch layers on and off to get that (often small) part of the map on the screen that fulfils your purpose. These maps are very efficient for that purpose, and some of them are even very well designed and can be classified as beautiful. And if you look from a technology perspective, the approach to the solutions is very clever, using the latest features of, for instance, Flash or SVG. Still, because these maps are so 'fast', so to say, your appreciation is different from when you look at maps at an exhibition. Today's mapping technology is no longer an excuse not to create good graphic quality. However, it is also a matter of design. Many of the early web maps were interesting in functionality but poor in design. This was partly because the maps produced were just bitmaps, and partly because the producers were already happy with the fact that their technology worked. Nothing new, you will say. Indeed, looking back at map production history, with the introduction of new technology, be it lithography or the plotter, the first results were always a step backwards before one could really benefit from the new technology. In this respect it is interesting to page through the ESRI map books from the last decade. These show quite an improvement. The only problem is the page size of the map book, where beautiful designs of large maps are reduced to stamps. So size is not just a Web problem. That is why some have started making online map galleries. It doesn't solve the size problem, but it allows interactivity. Here I would like to draw your attention to a recent initiative to start a journal that focuses on maps as

such. The editor rightfully claims that in today's publishing world there are limits to the use of colour, let alone if one would like to publish a map at a size other than the journal page. Have a look at the Journal of Maps at www.journalofmaps.com and see if you can enjoy the maps published there. Another example where we can witness improvement and where the joy of maps returns is the web map services which help you find a location or get route information. MapQuest was one of the early players in this field. From a technological point of view it was quite something that you got a response within seconds to every question asked. The result: millions more maps than ever are produced. The quality of the maps? It is poor, because virtually no attention was paid to design. Today MapQuest hasn't changed that much, but there are new players. Let's pick one randomly: Google Maps. The maps one can produce with Google

Prof. Dr. Menno-Jan Kraak (kraak@itc.nl) is a scientist at ITC, the International Institute of Geo-Information Science and Earth Observation, Department of Geo-Information Processing, Enschede, the Netherlands.

that again will limit our pleasure in viewing. This makes me think of the ESRI maps in our geo-community. One could always easily recognise them because of the typical legends they applied for line-features, like the zig-zag line for roads. Developments in our geo-world have their ups and downs, but if you understand the environment in which our maps have to play a role I guess we can recognize, appreciate and enjoy the beauty of maps.


Maps are automatically well designed, crisp and clear at any zoom level. They are also an example of how the maps are fully integrated into the whole idea behind the search engine. The whole Internet is a link to their maps, which in Europe are currently limited to Britain only. And interestingly enough it is possible to create and build your own maps using the Google maps as a base map. Pleasant to see, yes, but considering the influence of the company all maps will look like Google maps, and


Educational Corner

Towards a Mobile Geographic Educational System


Why Location Matters in Education
With Galileo just around the corner and data-transmitting technologies like RFID available, new spatial applications come to mind. One of the striking new ideas is the question why we still can't learn at the location where we actually are, to get background information on our current experiences. By Anja Kipfer
Classification of mobile Location-based Learning.

Learning anywhere.

More Fun
Audio guides in museums already provide information exactly where you need it. This is more fun than teaching yourself in advance or reading through some leaflets. More fun usually means better attention, but besides the motivational aspects, building a spatial, and thereby visual or haptic, connection between a learning object and the learning content makes for better cognition and remembrance. This concept of adapting educational content to the learner's position is called Location-based Learning (LBL). Obviously, this requires cooperation between scientists and practitioners from both the spatial information and pedagogy fields. The following paragraphs are dedicated to finding a definition of LBL and describing its characteristics, as well as some research projects. Finally, a description of an LBL-enabling platform is given.

really location technology-oriented but emphasizes the spatial and cognitive relationship between a learning object and the appropriate content. Our museum scenario also works with fixed units in front of the learning object to profit from the cognitive advantage of actually sensing the art work while getting the relevant information no locational technologies needed for that. Establishing a computer

system in front of the art work and creating an easy-to-handle user-interface should do. But there is another side to it which promises to be more fun by taking mobility into account. With more precise positional technologies and more powerful mobile devices available this concept can be extended to the outdoor world. Possible scenarios are users on the move in a nature reserve while learning on plants, land usages, climate and so on in combination with a game platform. Also, travel parties can experience a virtual guided tour in combination with some tutorial support to get away from the passive listeners role. These user cases are a form of mobile eLearning (Mobile Learning) which is didactically related to situation learning with positional information being part of the situational context of the learner. At a first glance, these scenarios sound very familiar also to geographers ears and resemble Location-based Service-applications.

Collaborative learning with mLBL.

What is LBL?
LBL does not necessarily imply being mobile. The term location-based is not


Another Location-based Service?
However, mobile Location-based Learning (mLBL) is different from being a Location-based Service. Surely both put a focus on providing information to the user related to her or his position and other context information. But mLBL is more aimed at what the user actually does with the information. You could say it is a marriage between Ubiquitous Computing and Ubiquitous Learning. So, a technology which supports mLBL requires a mobile workbench which allows for cooperation with other learners and for solving tasks, as well as collaboration tools for working on the same exercise and tutorial support. Of course, these requirements are use-case specific, since a school class has other requirements on a learning platform than tourists do. But generally, learning means more than offering an information platform. However, it is yet to be proven whether this is an idea with a long-term perspective.

Research Projects
It is no use investing in positioning technologies and a different infrastructure for mLBL when the whole concept of positional learning with mobile devices will not prove to be an educational success in the long run. The main focus of mLBL is teaching and learning. There are already a few research projects focusing on the didactic and technical implications of location-related learning. These are learning scenarios like making first-termers familiar with their campus environment or leading pupils through a nature reserve. Under the auspices of Nesta Futurelab, Bristol, a research project was carried out with the didactical focus of connecting mobile gaming with collaborative, self-controlled and experimental learning. Mobile technologies should allow for learning scenarios outside the classroom. A mobile gaming environment for 7th formers was established to study animal behaviour. The settings were visualized by abstract PDA maps which were enriched with specific game information. In a separate control room the game activities were presented on an interactive whiteboard. One of the results was that instabilities experienced while using GPS often complicated execution of the game. But again, new technologies like SiRF III-based systems can open up new opportunities.

Mobile Campus Game
Within the framework of the EU research initiative MobiLearn at the University of Zurich, the prototype MobileGame was developed. By means of a mobile campus game, first-year students should get an easier entry into university life by performing different tasks concerning important people and places or scheduled or ad-hoc events. Task-solving took place at specific locations, which the learners worked through collaboratively within teams but competitively between teams. With an electronic orientation rally, students were led across the campus via digital outdoor and indoor university maps on a handheld PC. Testing the effects of collaborative outdoor gaming on learning was the focus of the project Ambient Wood, led by the Interact Labs in Brighton. Target groups were 11- and 12-year-old school kids. In a digitally augmented environment they collaboratively explored the natural setting in pairs. The pupils used probe tools to collect georeferenced measuring data. The kids' positions were determined via GPS and location-specific tasks had to be solved.

Components of an mGES.

Motivational Effects
The projects mentioned above all have in common that they focus on groups of pupils or students of a homogeneous age and put an emphasis on gaming and collaboration. Gaming is known to have strong motivational effects on learners with an extrinsic learning motivation, such as being obliged to learn what is important to pass a test. Learning by collaboration is also considered to have positive effects on memorizing learning content as well as on the development of social behaviour. Some of the first results indeed point to the fact that collaboration and gaming are also very attractive factors for learners when exploring indoor or outdoor environments. So far, these projects have a clear research focus.

Nature Reserves
Another example of an mLBL scenario covering environmental learning would be to use the infrastructure, educational content and pedagogical expertise that are already available in nature reserves. A lot of people learn about the environment by reading information on huge presentation boards. A survey undertaken in some nature reserves in Germany has shown that learning about environmental issues usually doesn't take place in this setting, and therefore the positive effect of sensing a learning object while learning is absent. Fixed computer systems are used in nature reserve centres and of course the good old presentation board is used, although some reserves have quite ambitious projects to get visitors acquainted with the natural environment. In future learning scenarios, RFID tags on natural phenomena like special trees or geological features could send information to the learner's device that can then be used, for instance, to perform a problem-solving task or to add new data to a specific location. Since RFID requires no visual connection between the tag and the reading device, the learner can even be guided to discover some phenomena on his own. Because the information can be sent to many devices at once, group learning is also feasible. Thereby, spatial information technology can really support didactically sound learning.
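To make the RFID-triggered scenario sketched above a little more concrete, the following minimal Python sketch shows one way a learner's device might map a detected tag to a georeferenced piece of learning content. It is only an illustration under assumed names; the tag IDs, coordinates and content records are invented, and no particular RFID reader API is implied.

```python
# Minimal sketch: map an RFID tag detected in the field to georeferenced
# learning content. All tag IDs, coordinates and texts are invented examples.
from dataclasses import dataclass

@dataclass
class KnowledgeNugget:
    tag_id: str          # ID broadcast by the RFID tag on the phenomenon
    lat: float           # WGS84 latitude of the tagged feature
    lon: float           # WGS84 longitude of the tagged feature
    title: str
    task: str            # problem-solving task shown to the learner

# A tiny, hard-coded content store; a real platform would export this from a GIS.
CONTENT = {
    "tag-0042": KnowledgeNugget("tag-0042", 52.2200, 7.0050,
                                "300-year-old oak",
                                "Estimate the tree's height using its shadow."),
}

def on_tag_detected(tag_id: str) -> None:
    nugget = CONTENT.get(tag_id)
    if nugget is None:
        print(f"No learning content registered for {tag_id}")
        return
    print(f"{nugget.title} at ({nugget.lat}, {nugget.lon}): {nugget.task}")

if __name__ == "__main__":
    on_tag_detected("tag-0042")
```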

Visualizing
Environmental learning is a prominent candidate for mLBL. Visualizing ecological processes is crucial for understanding them. Refining maps with one's own primary or secondary measurement data is a typical exercise for students of related disciplines. Combining those maps with real learning content to give background information on the natural environment to the student, and allowing for explorative on-site learning, might bridge the gap often experienced between theoretical knowledge from the library and field trips. In particular, three aspects of the examples described above seem to make mLBL an interesting playground for spatial information scientists and practitioners: 1) choosing the right positioning technology for specific learning scenarios, 2) creating application-specific maps which include learning content and 3) integrating didactic requirements into a geographic platform. These requirements could be covered by a mobile Geographic Educational System (mGES), which would serve as a base to provide for mLBL scenarios.

mGES
Technological requirements still pose major challenges for Location-based Learning environments. This means that outdoor learning scenarios with an audio guide automatically adapting to your position are still some steps away. But with innovation cycles ever so fast, let's not talk about hardware devices but focus on the features you could expect from a platform that integrates spatial information and educational requirements. A mobile Geographic Educational System (mGES) like that would require a combination of the known features of a learning platform and the components of a GIS. When exporting raster or vector maps and attribute data to a mobile device, why not export georeferenced learning content as well? Creating application-specific maps and handling large amounts of geodata would be on the desk of a spatial information expert, as would running spatial analysis tools and supporting mobile devices with various positioning technologies. The eLearning expert would be responsible for designing the layout of an mGES from a didactical point of view. This would, among other things, cover defining tools for tutorial support, collaboration, task-solving and evaluation. Before deciding on the functional scope of an mGES and the relevant tool support, it is crucial to define the spectrum of application scenarios which can be covered by mLBL as well as the appropriate target groups. There don't seem to be clear requirements or definitions yet on how an integration of educational and geographic content can be achieved. Apart from the functional or tool-oriented point of view, creating didactically sound learning material adapted to the user's situational context requires cooperation between spatial information and pedagogic scientists and practitioners. Maybe it would be a good starting point to experiment with Google Earth as a base system for an mGES.

mLBL in a nature reserve.

What's Next?
An interesting task would be to apply existing research results and integrate them into new application scenarios. Mobile Learning applications already have quite some history, for mobile workers or as a component in Blended Learning scenarios which use mobile devices as a supplement to existing face-to-face courses. An extension of those projects towards mLBL could provide additional value. So, starting from our environmental learning scenario, one possible application setting would be an export solution from an mGES to a mobile device which is equipped with positioning technology. First of all, it has to be decided who should learn what: first formers simply need different learning material than sixth formers. Going from there, the necessary content has to be identified and adapted to the specified learners via a target group analysis. Here, nature reserves are an excellent starting point since they often have a mission to educate and possess excellent learning content. Then, media files like audio and video have to be generated and connected to the learning content. Appropriate learning exercises and evaluation mechanisms have to be added as well. These Knowledge Nuggets can be exported to the mobile device. This is business as usual, but the geographical value comes with the fact that this knowledge would be georeferenced and include some maps to visualize content and allow for showing the positional data of the user. This scenario could be augmented by data pushed to the user's device while exploring the natural environment, as illustrated in the sketch below. By offering analysis tools and tutorial support it can be ensured that the new knowledge is understood and worked on to allow for good educational results.
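As a rough illustration of the "data pushed while exploring" idea mentioned above (not part of any product described in this article), the sketch below triggers an exported Knowledge Nugget when the GPS position reported by the device comes within a chosen radius of the nugget's georeference. The coordinates, radius and content are made up for the example.

```python
# Proximity trigger sketch: push a georeferenced Knowledge Nugget to the learner
# when the device's GPS position is within a given radius. Example data only.
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS84 positions."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

NUGGETS = [  # (lat, lon, title) triples exported from a hypothetical mGES
    (52.2201, 7.0049, "Peat bog formation"),
    (52.2210, 7.0100, "Glacial erratic boulder"),
]

def check_position(lat, lon, radius_m=25.0):
    for n_lat, n_lon, title in NUGGETS:
        if haversine_m(lat, lon, n_lat, n_lon) <= radius_m:
            print(f"Pushing nugget to device: {title}")

if __name__ == "__main__":
    check_position(52.2202, 7.0050)   # close to the first nugget -> pushed
```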

Summary
As the preceding text has shown, a combination of spatial information technologies and educational applications can provide for mobile Location-based Learning scenarios. The results of mLBL research projects can already be used to find successful configurations for a mobile Geographic Educational System, which could serve as a standard platform for mobile geographic applications with an educational focus.
Anja Kipfer (Anja.Kipfer@eml.villa-bosch.de) is a Geographer and has a Master in Educational Media. She works as a Software Developer in the Spatial Information Department of the EML GmbH, Heidelberg.


Product News

Océ Introduces Five Black & White Systems


Océ has launched a new family of multifunctional black & white systems. The Océ VarioPrint 1055/1065/1075 delivers 55, 62 and 72 pages per minute (ppm) and the Océ VarioPrint 2062/2075 62 and 72 ppm, to cover the speed range segments 4 and 5. The new Océ systems offer:
- Fully integrated fingerprint-based printing via Océ TouchToPrint: easy, direct job access via the optional built-in sensor;
- Easy PDF document handling using any USB memory stick: print from and scan to the Océ Pocket Mailbox;
- Serverless follow-me printing with the Océ SMART Mailbox: documents available at the point of need, without any IT infrastructure changes, securing a 99% uptime guarantee for the print infrastructure;
- Océ IntuiGraph user interface with big buttons, a scroll wheel and a wizard button.
They also come with a number of features that enhance everyday work processes:
- Powerful job processing thanks to the embedded Océ Genie or Genie Pro controller;
- Smart scan profiles for quick and integrated document digitisation workflows;
- Smart@email: scan to email with LDAP and Microsoft Active Directory support;
- Océ PRISMA workflow software support to optimise and integrate office document operations with printroom and ERP operations.
The print engine and scan technology are based on Océ Copy Press and Image Logic technology, both of which have been further fine-tuned and continue to be exponents of the renowned Océ quality, reliability and productivity. The new systems are backward-compatible with all previous Océ VarioPrint models.

Source: Océ Internet: www.oce.com

Sokkia Introduces MONMOS Industrial Measuring System


Sokkia Europe extended its industrial range with the new MONMOS industrial measuring system, consisting of the new NET1100M 3D station and the 3-DIM Observer controller/software package. The new motorized MONMOS system features Sokkia's original servomotor drive mechanism and control algorithms using angle information obtained directly from the encoders. In addition, the EDM has been improved; it features the latest digital signal and sophisticated optical technologies. This results in an expanded measuring range of 300 m, the widest in the MONMOS series ever. The NET1100M is able to measure larger objects than before without moving the instrument, which results in more accurate measurements. The target illumination function provides the user with a better view in poor light conditions, even when doing these long-range measurements. The NET1100M features timesaving semi-automatic control when used in conjunction with 3-DIM Observer, the special controller package developed by Germany-based GLM (Sokkia's industrial partner). It realizes high-precision 3D measurements for deformation of landslides, tunnels, ships, turbines, dams, buildings and road surfaces; tunnel profiling; construction supervision; shape and dimensional measurement of domes or train bogies; large-scale part installation in factories; and more. The new MONMOS system is available now through Sokkia's exclusive dealer network. Source: Sokkia Europe Internet: www.sokkia.net

Release of Leica CloudWorx 1.0 for PDMS


At the SPAR 2006 conference in Houston, Texas, Leica Geosystems announced the immediate availability of Leica CloudWorx 1.0 for PDMS. This point cloud solution is designed for PDMS users who want to take advantage of accurate, laser-scanned as-built data directly in PDMS. Leica CloudWorx 1.0 for PDMS is the latest addition to the Leica CloudWorx suite of products that enable professionals to use rich, as-built point cloud data directly in their native desktop design and visualization platform. PDMS is part of AVEVA's VANTAGE Plant Design family.

Key features and capabilities of Leica CloudWorx 1.0 for PDMS include:
- Measure: using PDMS' own measuring tools;
- Automated clash checking: using PDMS' built-in clash management and reporting tools;
- PDMS Design Point (D-Point) placement: at pick point or center-of-pipe, D-Point placement lets users create intelligent as-built models directly in PDMS using catalog components and objects;
- Easy point cloud management: by Scan, Limit Box, Cutplane slices and sections, HideRegion;
- Support for a variety of laser scanners: including native data formats from Leica Geosystems scanners;
- CloudWorx toolbars: access to CloudWorx operations;
- Visualization of a new design concept directly in context with reality.

High density point cloud from phase-based Leica HDS4500 scanner in PDMS.

Source: Leica Geosystems Internet: www.leica-geosystems.com/hds

Leica Geosystems Launches Britain's First Commercial RTK Network

Since mid-January 2006, the Leica Geosystems SmartNet service has been live and available as a broadcast correction service to subscribers via GSM or GPRS technology in Great Britain. The service is a partnership between Leica Geosystems and Ordnance Survey. SmartNet is based on raw data from the Ordnance Survey network of GPS base stations. This network, known as OS Net, comprises around 90 permanent, nationally deployed GPS reference stations. However, OS Net is commercially available only via partners. Data from each of these base stations around the country is received over the Internet at a highly secure location in London Docklands. Here it is processed using Leica SpiderNet, Leica Geosystems' advanced network calculation software, and made available to users when they dial up or log in. Source: Leica Geosystems Internet: www.leica-geosystems.com


Bentley Connects MicroStation to Google Earth Service


By means of a web conference halfway through March, and a press release issued just before this virtual gathering, Bentley wanted to draw attention to some remarkable news indeed: MicroStation has been connected to the Google Earth service. Joe Croser, global marketing director, Bentley platform products, and Ray Bentley, lead developer, explained that this means infrastructure projects can be viewed in context. Adding models, 3D viewing, zooming in and out, turning local information and reference files off and on, turning on levels like parking or roads: all this is possible. The MicroStation-Google Earth connection means that CAD and GIS data are combined (the user first having to register CAD files for Google Earth to recognize them), which, I think everybody agrees, is quite an interesting aspect.

The MicroStation model content is available to users of the Google Earth service. For example:
- All included levels available to the MicroStation user are persisted in the KML file, so the Google Earth user can easily switch parts of the model on and off as desired;
- Saved views in MicroStation are transferred to the KML file, so the Google Earth user can easily move through pre-configured perspectives in the model;
- Embedded links within a MicroStation file are automatically published as Google Earth Placemarks, allowing the Google Earth viewer to quickly navigate to supporting project data;
- The geometries of MicroStation GeoGraphics users who have defined the coordinate system for their designs are automatically exported to the correct locations in Google Earth;
- MicroStation raster imagery can be published to Google Earth to replace or augment the Google Earth imagery.
Bentley also decided to publish some Frequently Asked Questions (FAQ) at www.bentley.com/en-US/Products/MicroStation/Google+Earth+Tools/FAQ.htm. Why does Bentley want to make 3D modelling cooler for a wider audience? What impact does the acquisition of @Last by Google have on Bentley's plans? These are just some of the questions Bentley provides an answer to. Bentley SELECT subscribers can download the new connection software now for use with MicroStation V8 2004 Edition. The capability is delivered within MicroStation V8 XM Edition. To view an online demonstration of the MicroStation and Google Earth connection, view an eSeminar on how to publish DGN and DWG models to the Google Earth environment, or to learn more, go to www.bentley.com/earthtools.
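To give a feel for what a published Placemark looks like at the file level, here is a minimal sketch that writes a single KML Placemark. It is not Bentley's export code; the element name and coordinates are invented for the example, and only the general KML conventions (lon,lat,alt coordinates in WGS84) are taken as given.

```python
# Minimal sketch: write one Google Earth Placemark as a KML file.
# Not Bentley's implementation; element name and coordinates are invented.
from xml.sax.saxutils import escape

def placemark_kml(name: str, lon: float, lat: float, alt: float = 0.0) -> str:
    # KML expects coordinates as lon,lat,alt in WGS84.
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<kml xmlns="http://earth.google.com/kml/2.1">\n'
        '  <Placemark>\n'
        f'    <name>{escape(name)}</name>\n'
        f'    <Point><coordinates>{lon},{lat},{alt}</coordinates></Point>\n'
        '  </Placemark>\n'
        '</kml>\n'
    )

if __name__ == "__main__":
    with open("element.kml", "w", encoding="utf-8") as f:
        f.write(placemark_kml("Parking level P1", -0.1276, 51.5072))
```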


New GNSS Technology Leica Geosystems


Leica Geosystems extended its GPS product portfolio with the launch of the Leica GX1230 GG and Leica ATX1230 GG sensors and the GRX1200 GG Pro sensor for reference station networks. The new ultra-precise GNSS measurement engine now supports both GPS L2C signals and GLONASS satellites. Users of these Leica GNSS solutions now have up to 100 percent more satellites available than with GPS alone. Additionally, these systems are designed to track future GNSS signals, such as GPS L5 and Galileo, safeguarding the investment. Leica's SmartTrack+ guarantees very low signal noise, high sensitivity and fast re-acquisition, and tracks all available signals. Features of the GRX1200 GG Pro sensor include Internet connectivity via HTTPS, onboard generation of RINEX files and the prompt FTP push of high-quality raw and RINEX data. It is configurable using a Web interface or GPS Spider, affording full operation whether working remotely or in the office. The lowest power consumption in its class reduces infrastructure requirements. The new ATX1230 GG sensor is fully compatible with the Leica TPS1200 total stations, creating a GNSS SmartStation.

Source: Leica Geosystems Internet: www.leica-geosystems.com

Leica Geosystems Launches Leica GeoMoS Version 2.0

Leica GeoMoS is a solution for multi-sensor structural monitoring using a range of high-precision geodetic instruments from Leica Geosystems and third-party sensors. Version 2.0 of the software provides a step forward in secure data replication, synchronization and post-processing. The new Leica GeoMoS Server, introduced in Leica GeoMoS 2.0, is a tool for easy database replication and synchronization between up to five clients and one server. Easy configuration and control of distributed monitoring systems from a single server, secure data storage, centralized messaging and system integrity monitoring are just some of the benefits according to Leica Geosystems. It is also stated that Leica GeoMoS Office, new in Leica GeoMoS 2.0, allows fast and easy access to data recorded by remote monitoring stations by using automatic data distribution via FTP. The office database is used as a backup database for the measurement stations and allows for offline analysis and post-processing. There are several other enhancements to the limit checks and messaging tools, updates to the online help, and additional functionality concerning data editing and post-processing.

Source: Leica Geosystems Internet: www.leica-geosystems.com

Thales GPSDifferential Module

Thales introduced the GPSDifferential Module, a software extension for MobileMapper CE that seamlessly adds the power of post-processing to virtually any mobile GIS/mapping software application. With the GPSDifferential Module, sub-meter and even sub-foot mapping are achievable where real-time corrections are not available, or in the difficult signal environments encountered in applications such as forestry. The GPSDifferential Module is a software extension that fully integrates into third-party mobile GIS software applications. Behind the scenes and without interrupting the normal workflow, it automatically logs the raw data that is required for reliable sub-meter post-processed differential corrections. In certain conditions, and with an external Thales GPS precision antenna, accuracy up to sub-foot level can be consistently achieved. The Thales GPSDifferential Module is a business partner and software integrator tool that includes MobileMapper Office software for post-processing. When the GPSDifferential Module is integrated into a GIS application, raw data collection functions can be accessed that will store the data in a separate raw data field that is later recognized by MobileMapper Office. This affects neither real-time data collection storage nor the structure of the real-time data format. The raw data file stored by this software extension can be post-processed using MobileMapper Office. Post-processed data is exported with attributes in industry-standard formats. Software integrators find that the GPSDifferential Module easily merges with their own solutions without the need to change the data structure to support post-processing or to create office-based post-processing software. Source: Thales Internet: www.thalesgroup.com/navigation

Smart GPS Timing Antenna Trimble


Trimble introduced the latest in a long line of GPS timing receivers, the Acutime Gold GPS smart antenna. Slightly larger than a baseball and housed in a rugged, environmentally sealed enclosure, the Acutime Gold provides a pulse-per-second (PPS) output synchronized to UTC within 15 nanoseconds (one sigma). It is a solution for adding GPS timing and synchronization to any application where ease of installation and long-term reliability are critical. The new Acutime Gold GPS smart antenna is directly compatible with applications built around previous-generation Acutime 2000 receivers. The antenna can be used for precise timing and network synchronization, including broadband wireless applications. It provides an independent timing source, within the firewall, for any application such as network fault detection systems and synchronization of wireless networks. Source: Trimble Internet: www.trimble.com

Intergraph Announces MAP2PDF for GeoMedia and Digital Cartographic Studio

MAP2PDF versions for the Intergraph product lines GeoMedia and Digital Cartographic Studio (DCS) are now generally available. GeoPDF embeds geospatially referenced data for map coordinate readouts, distances and bearing information in PDF format. The new MAP2PDF will export geospatial data from GeoMedia or DCS to a georegistered PDF with layers and feature attributes. This GeoPDF can be easily distributed and used in connected or disconnected modes with TerraGo's free Adobe Reader software. Users are able to view maps, turn layers on and off, query attributes, display coordinates, GPS track, and create redlines and notes. Source: Intergraph Internet: www.intergraph.com


PCI Geomatics Develops SAR Polarimetry Workstation


PCI Geomatics announces the development and release of the SAR Polarimetry Workstation (SPW) as part of the Canadian Space Agency (CSA) Earth Observation Application Development Program (EOADP). The SPW is available as an add-on module to Geomatica 10. Polarimetric synthetic aperture radar (POLSAR) data has greater information content than the more commonly available non-polarimetric SAR data. There is solid experimental evidence that this data can be exploited in a variety of important tasks, including sea-ice detection and classification, agricultural crop monitoring, and forest type and harvest mapping. In addition, POLSAR data can be collected day or night and under atmospheric conditions, such as cloud cover, that prevent the collection of optical data. The availability of POLSAR data, particularly from satellite-borne sensors, will soon increase significantly, and there will be considerable interest in establishing truly operational applications of this data. The SPW is a tool that will help make this happen. The SPW can directly read data products from the AIRSAR, ENVISAT ASAR, CV-580 and SIR-C systems. It is also able to read synthetic RADARSAT-2 data products and will be able to read the real data products once they become available.

PCI Geomatics Offers Geomatica 10 for Linux and Solaris Users


PCI Geomatics released Geomatica 10 for Linux and Solaris users. This latest version emphasizes automation, productivity, and support for more than 100 geospatial data formats. Geomatica 10 offers solutions for various geomatics processing requirements, while maintaining interoperability with outside software packages. Geomatica 10 offers enhanced charting and new atmospheric correction algorithms for hyperspectral data. In particular, Geomatica 10 takes full advantage of Oracle 10g and, according to PCI Geomatics, makes uploading and downloading large quantities of data in multiple formats effortless. Geomatica 10 for Linux and Solaris is a complete release, featuring the latest version, 10.0.1.

Source: PCI Geomatics Internet: www.pcigeomatics.com/g10

Sokkia Series30R
Bluetooth wireless communication is now available for Sokkia's Series30R total station line, providing cable-free communication with data collectors. The Sokkia Field-info Xpress (SFX) function, fitted as standard, enables data transfer via the Internet using mobile phones. As the Series30R's Bluetooth wireless communications module has a dial-up function, SFX can be used without cables using a mobile phone with Bluetooth technology. This latest version of the Series30R is also equipped with enhanced surveying programs. The Sokkia Series30R total station line offers an IP66 level of dust and water resistance, and distance measuring capabilities of survey-grade accuracy +/-(3 + 2 ppm x D) mm from 30 cm to 350 m (Class 3R models). Source: Sokkia Internet: www.sokkia.net

Source: PCI Geomatics Internet: www.pcigeomatics.com/products/radar_data.html or www.pcigeomatics.com/products/products_overview.html

Boeing's SoftPlotter 4.1 Software Release Enhances Digital Map Production


Boeing has released version 4.1 of its SoftPlotter digital map production software, enabling users to provide more accurate and efficiently produced digital mapping products to their defense and commercial mapping customers. DigitalGlobe QuickBird sensor support for panchromatic and multispectral imagery offers SoftPlotter users full, rigorous sensor model and triangulation support, while digital aerial camera support for the Intergraph Z/I DMC, Vexcel UltraCAM and Airborne Data Systems digital cameras allows users to process imagery directly from these devices. SoftPlotter's new multi-window capability allows all viewing tools to display multiple stereo and monoscopic views of imagery, with geosynchronous cursor movement in all views. New AutoCAD, MicroStation and ESRI translators are included, and Visual Basic workflow wizards provide streamlined workflow setups for batch processes. Of interest to KDMS users, SoftPlotter 4.1 provides a COM interface callable from macros and a database interface for collection of fully attributed vector map data.

Source: Boeing Internet: http://sismissionsystems.boeing.com/products

Release of NovAtel Galileo-ready Receiver & Antenna

NovAtel released its first production-standard Galileo-ready receiver and antenna. The new L1L5E5a receiver, offering 16-channel tracking of GPS L1/L5, Galileo L1/E5a and SBAS signals on a Euro form-factor card, is packaged in a EuroPak enclosure. The complementary 704X passive antenna offers access to multiple Global Navigation Satellite Systems (GNSS), including the GPS, Galileo and GLONASS frequencies. Currently, the Galileo functionality of the L1L5E5a receiver is available only to customers authorized by the European Space Agency (ESA), due to an intense test campaign that ESA is conducting with GIOVE-A, the first Galileo test satellite, launched on December 28, 2005. According to Tony Murfin, NovAtel's VP Business Development, the EuroPak-L1L5E5a receiver is ideally suited for customers, such as government agencies and universities, who want early access to the new GPS GEO L1 and L5 satellite signals and the Galileo L1 and E5a signals for research purposes. The L1L5E5a receivers, which were initially developed under contract with the Canadian Space Agency, were first demonstrated in May 2005 and have undergone engineering testing.

Source: NovAtel Internet: www.novatel.com

Matrox TripleHead2Go External Upgrade Offers Support for 3 Monitors at a Time


NavCom Announces Enhanced VueStar Aerial Survey System


NavCom Technology introduced the newly enhanced VueStar aerial survey solution, which now includes the new StarPac utility software to facilitate better integration into pre-existing workflows. The complete VueStar global navigation system is configured specifically for aerial survey applications. It utilizes the global satellite-based StarFire Network to provide precise positioning worldwide without the need for RTK base stations or GPS post-processing. This new system introduces three significant enhancements: improved GPS signal processing, StarPac Mission software, and the optional Event Latch Interface. The new 12-channel, dual-frequency GPS receiver computes real-time positions at up to 25 times a second, reacquires GPS signals faster, and includes improved troposphere modelling which better compensates for changes in altitude. Providing both pre-mission and post-mission processing, the new StarPac mission software offers critical tools for mission planning, as well as RINEX conversion, datum conversion, trajectory plotting and output of position data in a number of formats, along with a quality figure of merit for each position. Source: NavCom Technology Internet: www.navcomtech.com

Pictometry Viewer ActiveX Control

The Pictometry Viewer ActiveX control enables third-party software vendors and system integrators to embed Pictometry's oblique imagery and analytical tools directly into end-user applications. Using Pictometry's ActiveX control, third-party software companies can now integrate their own customized version of Pictometry's software tools, similar to Pictometry's own Electronic Field Study (EFS) software, without having to leave their native application environment. This gives users access to Pictometry's software functionality in a third-party application. The company has been working with several technology and business partners to test, implement and deploy its ActiveX control in other geospatial-related systems. One of the first sets of solutions where Pictometry has successfully integrated its ActiveX control is ESRI's suite of GIS products, including ArcIMS and ArcGIS. Sample screen captures of Pictometry technology in ESRI and microDATA GIS solutions can be viewed at www.pictometry.com/pressrelease/activex.asp. Companies that are interested in partnering with Pictometry and using its ActiveX control can contact Pictometry Vice President Scott Sherwood. Source: Pictometry Internet: www.pictometry.com

New Spider and SpiderWEB software Leica Geosystems

Leica Geosystems introduces a new version of its GPS Spider software and SpiderWEB V1.3, a web-based solution for the distribution of GPS reference data via the Internet. Besides software optimizations, such as further improved data processing for network RTK, the graphical user interface of the new Spider software has been enhanced with consistent map views, now supporting loadable background maps and a graphical continuous raw data status view. In view of GPS monitoring applications, Leica GPS Spider now supports the recently introduced Leica GMX902 monitoring GPS receiver. New coordinate post-processing complements the previously available real-time positioning to support the monitoring of slow-moving objects. Both real-time positioning and post-processing now support data rates of up to 20 Hz, as can be provided by GMX902 or GRX1200 series GPS receivers, for the detection of high-frequency object motion. Leica SpiderWEB V1.3 allows GPS network administrators to keep track of users, data and downloads. GPS network users can download GPS RINEX observation data for single or multiple stations with just a few mouse clicks. Leica SpiderWEB complements the Leica Geosystems reference station software product portfolio, consisting of Leica GPS Spider and Leica GNSS QC. For demo user access, log in to Leica's demo server at the following web address: http://spiderweb.leica-geosystems.com Source: Leica Geosystems Internet: www.nrs.leica-geosystems.com and www.leica-geosystems.com

Pentax Launches R-300NX Total Station


Pentax introduces its new R-300NX Series Total Station, measuring up to 270 m using the non-prism function in the long measurement mode.
Non-prism measurement range (reflectorless, from 1.5 m):
- Normal range mode: 90 m
- Long range mode: 270 m (NX models only)
Reflectorless accuracy:
- R-322NX and R-323NX: 1.5 to 200 m: (5 + 2 ppm x D) mm; 200 to 270 m: (7 + 10 ppm x D) mm
- R-325NX, R-335NX and R-315NX: 1.5 to 200 m: (5 + 3 ppm x D) mm; 200 to 270 m: (7 + 10 ppm x D) mm
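For readers unfamiliar with this notation, a distance-dependent accuracy of the form (a + b ppm x D) simply adds b millimetres per kilometre of range to the constant a. The small generic sketch below evaluates the (5 + 2 ppm x D) figure quoted above at a range of 100 m; the chosen distance is just an example, not a Pentax specification.

```python
# Worked example of a distance-dependent EDM accuracy spec of the form (a + b ppm x D).
# Generic illustration only; a = 5 mm and b = 2 ppm are the figures quoted above.
def edm_accuracy_mm(a_mm: float, b_ppm: float, distance_m: float) -> float:
    return a_mm + b_ppm * 1e-6 * distance_m * 1000.0  # distance converted to mm

if __name__ == "__main__":
    # At 100 m, (5 + 2 ppm x D) works out to 5 + 0.2 = 5.2 mm.
    print(edm_accuracy_mm(5, 2, 100.0))
```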

Source: Pentax Internet: www.pentaxsurveying.com


Industry News

Facts / Figures / Contracts


NTT DoCoMo Achieves 2.5Gbps Packet Transmission in 4G Field Experiment
NTT DoCoMo, Inc. has achieved 2.5 Gbps packet transmission in the downlink while moving at 20 km/h. The fourth-generation (4G) radio access field experiment took place in Yokosuka, Kanagawa Prefecture on December 14, 2005. DoCoMo achieved a maximum 1 Gbps speed in a similar field experiment on May 9, 2005. This time, by increasing the number of MIMO transmission antennas from four to six and by using 64-QAM, the data volume per transmission was increased from four bits to six bits. As a result, DoCoMo achieved a maximum speed of 2.5 Gbps, which is faster than the International Telecommunication Union Radiocommunication Sector (ITU-R)'s proposed standard. Frequency spectrum efficiency, which is expressed as information bits per second per Hertz, was also increased from 10 bits per second per Hertz during the last experiment to 25 bits per second per Hertz. This figure is the maximum frequency spectrum efficiency for 4G as defined by WINNER. Building on the success of the field trials, DoCoMo will continue its research and development in order to actively contribute to the global standardization of 4G.

www.nttdocomo.com
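For readers who want to relate the spectrum-efficiency figure to the raw data rate, the small sketch below does the arithmetic. The 100 MHz channel bandwidth is an assumption made here for illustration; it is consistent with the quoted 25 bit/s/Hz but is not stated in the announcement above.

```python
# Spectral efficiency = data rate / channel bandwidth, in bits per second per Hertz.
data_rate_bps = 2.5e9   # 2.5 Gbps downlink, quoted above
bandwidth_hz = 100e6    # assumed ~100 MHz channel (not stated in the text)

print(data_rate_bps / bandwidth_hz)  # prints 25.0, matching the quoted 25 bit/s/Hz
```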

Online Multimedia Presentations OGC

The Open Geospatial Consortium, Inc. (OGC) announces the availability of two online multimedia demonstrations documenting the milestones achieved in the OGC Web Services Phase 3 Initiative (OWS-3). The focus of the presentations is to share the OWS-3 goals and to provide a synopsis of the final demonstration. The presentations are available as interactive Flash and Web-streamed video and illustrate the use of a variety of draft and approved OpenGIS standards in an emergency response to a fictitious wildfire threat in Southern California. The presentations are available at www.opengeospatial.org/demo/ows3/

Landmark Building Added to Ordnance Survey's Data Mapping Assembled for Senedd

The Senedd - the National Assembly of Wales' new debating chamber - now features on the nation's most detailed maps. The outline of the Cardiff Bay building is set out in fine detail in OS MasterMap. The £67 million building, designed by Lord Richard Rogers, hosted its first debate and First Minister's questions last week, and was officially opened by the Queen on St David's Day, 1 March 2006. The Senedd - parliament or senate in Welsh - uses natural wood and slate and is designed with energy efficiency in mind. It has won praise for its construction, winning the Building Research Establishment's highest award for sustainable construction.

www.ordnancesurvey.co.uk

Terralink International Adopts ER Mapper

Terralink International has selected the ER Mapper image processing application to assist in the datum/projection conversion of existing image catalogues. Terralink will be converting its assets from NZMG49 to NZTM GD2000 and vice versa using the NTv2 grid transformation. Many of the datasets used by Terralink are more than 30 GB in size. Many of these datasets will be split into 1:50k topographic sheets to facilitate management and tracking.

www.ermapper.com

ESA Joins Forces with Japan on New Infrared Sky Surveyor

A high-capability new infrared satellite, ASTRO-F, was successfully launched on 21 February by the Japan Aerospace Exploration Agency (JAXA). In a collaborative effort involving ESA and scientists across Europe, the spacecraft is now being prepared to start its mapping of the cosmos. Orbiting the Earth, ASTRO-F (to be renamed Akari, meaning 'light', now that it is in orbit) will make an unprecedented study of the sky in infrared light, to reveal the distant phenomena hidden from our eyes that tell the story of the formation and evolution processes taking place in the universe. ASTRO-F will be in polar orbit around the Earth at an altitude of 745 kilometres. From there, after two months of system check-outs and performance verification, it will survey the whole sky in about half a year, with much better sensitivity, spatial resolution and wider wavelength coverage than its only infrared surveyor predecessor, the Anglo-Dutch-US IRAS satellite (1983). The all-sky survey will be followed by a ten-month phase during which thousands of selected astronomical targets will be observed in detail. This will enable scientists to look at these individual objects for a longer time, and thus with increased sensitivity, to conduct their spectral analysis. This second phase will end with the depletion of the liquid helium needed to cool down the spacecraft telescope and its instruments to only a few degrees above absolute zero. ASTRO-F will then start its third operations phase and continue to make observations of selected celestial targets with its infrared camera only, in a few specific infrared wavelengths.

www.esa.int

ESRI Book Remote Sensing For GIS Managers

Hundreds of illustrations and examples merge in Remote Sensing for GIS Managers, a new book from ESRI Press that reveals the power of interpreting information gathered from aerial photography, radar, satellite, and other remote-sensing methods. Readers will travel from the vast ocean depths to the far reaches of outer space as they learn everything from the basics of remote sensing to the challenges of interpreting, managing, and storing the ever-increasing range of remotely sensed data available today. Designed for new and experienced users, Remote Sensing for GIS Managers is written for GIS managers, professionals, and students who want to become more knowledgeable users of remote-sensing services and manage the development of innovative solutions suited to the needs and goals of their organizations. The book's case studies illustrate the use of remote sensing in national security, urban and regional planning, resource inventory and management, and scientific disciplines ranging from forestry and geology to archaeology and meteorology. Remote Sensing for GIS Managers (ISBN 1-58948-081-3, 524 pages, $69.95) is available in bookstores and from online retailers worldwide or can be purchased at www.esri.com/esripress or by calling 1-800-447-9778. Outside the United States, contact a local ESRI distributor; see www.esri.com/international for a current distributor list. Books published by ESRI Press are distributed by Independent Publishers Group (tel.: 1-800-888-4741, Web: www.ipgbook.com).

www.esri.com

Geauga County, OH Awards RFP to Pictometry

Pictometry International Corp. has won a Request for Proposal (RFP) from Geauga County, OH to provide oblique aerial imaging and software for the county and its local municipalities. Pictometry's county-wide licensing agreement enables all municipalities in a participating county to have free access to the digital aerial photography and GIS software. Located in the upper, northeast region of Ohio, Geauga County encompasses 404 square miles and has a population of over 94,000 residents. The county was founded in 1806 and has 16 townships. The company is hosting its first annual user conference, Pictometry FutureView 2006, from October 29 - November 1, 2006 in Orlando, FL.

www.pictometry.com

PICAB New Topcon Dealer in North of Sweden

Topcon Europe Positioning BV has a new partner in the north of Sweden. In a new agreement with the Swedish company PICAB, Topcon aims to provide its customers with a strong basis for further expansion of its sales and support capabilities for survey and GPS positioning products (total station and GPS+ sales) in this part of Sweden. It is the new company PICAB Positioning, within the PICAB Group, that will take care of positioning solutions for construction, survey and GIS. PICAB Positioning is managed by Börje Israelsson, previously sales manager for Topcon Scandinavia in the north of Sweden. PICAB Group is one of the leading companies in system development (design and development of activities and custom GIS applications) and GIS production (map production, digitization, GIS analysis, GPS measurements, etc.), and will now actively broaden its business in a new field by supplying positioning solutions for the construction, survey and GIS fields.

www.topconeurope.com

Loy Surveys Replace Survey Equipment

Loy Surveys, Chartered Land Surveyors, have placed an order with Trimble dealer Survey Solutions to replace their entire fleet of survey equipment. The new Trimble S6 robotic/reflex total stations and Trimble R8 GNSS receivers, together with the introduction of a new standard Trimble TCS2 handheld interface, will provide real-time computing in the field. Trimble R8 GNSS now supports both the next-generation GPS L2C and L5 signals and GLONASS L1/L2 signals.

www.loy.co.uk/

Irish Dept. of Agriculture & Food's SPS Uses eSpatial Spatial Technology

eSpatial has announced that the Irish Dept. of Agriculture & Food's recent award-winning Single Payment System (SPS) uses spatial technology based on eSpatial's iSMART platform. The Dept. of Agriculture & Food, whose range of customers encompasses government departments and individual citizens, deployed the SPS application, which recently won the award for best project in the Government to Business category in the Irish 'Innovation Through Technology Awards'. The winning project reused the iMAP solution built by eSpatial and Accenture, the international management-consulting and technology services company, and integrates application-form processing data with geospatial information, enabling the department to manage and process Single Farm Payment applications and payments in an integrated, web-based environment. The spatial component of the system is built on eSpatial's iSMART technology, which draws upon the spatial capability within the Oracle 8.1.7 Database and Oracle9i Application Server.

www.espatial.com

Terra Digital Chooses Infoterra's Pixel Factory

Infoterra France announces the delivery of its Pixel Factory photogrammetric suite to Terra Digital, an advanced geo-service provider in Germany. The Pixel Factory, developed by ISTAR (now part of Infoterra France) over the last 15 years, offers the opportunity to rapidly generate a wide range of cartographic end products such as Digital Surface Models, Digital Terrain Models, Ortho and TrueOrtho photos. According to Infoterra France, its high level of automation and multi-processor architecture help to process vast quantities of raw data in outstanding time.

www.infoterra-global.com

GeoEye to Supply European Commission with OrbView-3 Imagery

GeoEye's partner European Space Imaging (EUSI) received an additional contract from the European Commission to supply OrbView-3 high-resolution, map-accurate imagery. The contract was awarded on April 5, 2006 and is valued at $2.15 million (EUR 1.8 million) over the next four years. The imagery will be collected by the company's OrbView-3 high-resolution earth-imaging satellite. Together with its recently renewed three-year contract to supply imagery from GeoEye's IKONOS high-resolution satellite, EUSI continues to be one of the largest suppliers of commercial satellite imagery to the European Commission. The imagery will be processed and delivered to the European Commission by EUSI, located in Munich, Germany. The contract requires GeoEye to begin imagery collections over 24 specific European sites to support the European Commission's agricultural subsidy controls. The European Union is using satellite imagery as a tool to verify farmers' declarations and claims for subsidies. In addition, the European Commission will use OrbView-3 imagery in support of national security-related projects.

NGA Awards ClearView Contract to DigitalGlobe

DigitalGlobe has been awarded a $12 million satellite imagery capacity contract modification by the National Geospatial-Intelligence Agency (NGA). This ClearView contract enables the NGA to acquire additional commercial imagery from DigitalGlobe's QuickBird satellite - the world's highest resolution commercial imaging satellite.

www.digitalglobe.com

Vianova Systems Delivers 10,000th Copy of Novapoint

Vianova Systems (Oslo, Norway) announced the delivery of the 10,000th copy of Novapoint, the company's software solution for the design of transportation infrastructure and the management of infrastructure assets, to Grontmij NV (www.grontmij.com), the Netherlands. With a team of 320 people and an 80% market share in the Scandinavian countries, Vianova Systems has recently extended its reach with business operations in Spain, the UK/Ireland and France.

www.vianovasystems.co.uk www.vianova-systems.fr

100 Thales ProMark3 Systems Ordered by Brazil's Agrarian Reform Institute

Brazil's cadastral agency responsible for land regulation, sustainable development of rural lands and topographic surveying of government land (INCRA) has purchased 100 Thales ProMark3 GPS survey systems. INCRA's choice of the ProMark3 GPS survey system marks the first time the agency has selected Thales equipment to conduct its nationwide GPS surveying effort. INCRA officials were impressed with the overall design of the ProMark3, including its light weight and screen capabilities. Headquartered in Brasilia, the Instituto Nacional de Colonização e Reforma Agrária employs more than 4,000 people nationwide. The mission of the 35-year-old public agency is to create development opportunities for Brazil's rural populations. It is responsible for placing families in rural areas, land use regulation, and topographic surveys of government lands.

www.thalesgroup.com/navigation

Vexcel Delivers End-To-End Aerial Mapping System to COWI

COWI A/S (Denmark) has purchased a complete aerial mapping solution from Vexcel. The system includes two Vexcel UltraCam large-format digital aerial cameras and the Vexcel UltraMap server, a turnkey hardware and software solution for digital image archiving, cataloguing and post-processing. The procurement of these combined products provides COWI with Vexcel's fully digital approach to photogrammetric operations.

www.vexcel.com

Ordnance Survey UK Highlights GI Strategy

Ordnance Survey UK has published a free headline summary of its strategy for providing geographic information (GI) for Great Britain over the next two years. The paper - available to download at www.ordnancesurvey.co.uk/aboutus - outlines the key elements involved in supporting the widespread adoption and use of GI. It highlights the continuing maintenance of the national georeferencing infrastructure on which the market in GI can develop, plus Ordnance Survey's focus on customers and collaborative work with government, business and other stakeholders. The strategy involves the goal of ensuring that Ordnance Survey data remains a key enabler to support new demands and requirements throughout the information industry. A major component to support joined-up information sharing is the ongoing development of OS MasterMap as a seamless geographic database compatible with accepted web standards and ordered through an online interface.

www.ordnancesurvey.co.uk

Atlas New ESRI Software Distributor in Iraq

ESRI announced that Atlas for GIS and Surveying Systems Co. Ltd., a former user, is now its distributor for geographic information system (GIS) software in Iraq. Since 1989, the Republic of Iraq, with the help of Atlas for GIS and Surveying Systems, has continued the arduous task of updating the country's infrastructure maps while working toward the establishment of national information and GIS centers.

www.esri.com

People

New Petroleum Industry Solutions Manager Joins ESRI

Brian Boulmay has joined ESRI as Petroleum Industry Solutions manager. With more than eight years of experience in the petroleum industry, Boulmay comes to ESRI from Shell in Houston, Texas, where he was team leader for geo-information and GIS. To meet the diverse needs for geographic technology in this industry, ESRI is expanding its petroleum team with highly qualified GIS and petroleum professionals. During his time with Shell, Boulmay served on the Petroleum User Group (PUG) steering team, assisting with the PUG conference as well as leading the PUG 3D working group, which focuses on gridding and contouring and the 3D capabilities of ESRI software. With Shell, he led a team of spatial professionals and managed local and global projects, improving the company's use of spatial technologies. A recent project was working to standardize the GIS IT architecture globally for Shell.

www.esri.com

GeoEye Names Paolo Colombi Vice President of International Sales

GeoEye appointed Mr. Paolo E. Colombi as vice president of International Sales. Mr. Colombi has over 25 years of experience in sales, operations and international management in high-technology and telecommunications systems and services. He will be directly responsible for all international business development and sales, and will serve as a resource to more than a dozen Regional Affiliates and Regional Distributors around the globe.

www.geoeye.com

Conferences & Meetings

Laser-Scan User & Partner Conference 2006

Laser-Scan's 2006 User & Partner Conference will be held from Tuesday 27th to Thursday 29th June. The theme of this year's conference is Spatial Data Supply Chain: Delivering ROI. Organisations across the world in many different market sectors, such as government, defence and utilities, have invested significantly in departmental geospatial data holdings over the last 10-15 years. Delegates can register online at www.laser-scan.com/conference/index.htm.

