The Havoc of Non-Interoperability

An Open GIS Consortium (OGC) White Paper

Mark Reichardt - Executive Director,

Outreach and Community Adoption Program,

Open GIS Consortium (OGC)

 

 

Abstract: A recent Delphi Group study, "The Value of Standards" (June 2003)[i], gathered the responses of more than 800 end users, software vendors, and service providers to identify current attitudes and expectations for software standards. Delphi's conclusions were striking: "There is a clear and sudden shift in attitudes towards software standards. The climate of economic constraint and risk aversion along with the mandate to integrate systems on both sides of the firewall has created a sea change in the sense of imperative to adopt software standards. ... The results portray a shifting landscape where standards will provide the foundation for long term advances in the way software is built, bought and deployed." [Emphasis mine.] The Open GIS Consortium (OGC)[ii], an innovative public-private partnership, has worked for a decade to bring about such change in the domain of digital geographic information and geoprocessing. "Economic constraint and risk aversion," always significant drivers in the geospatial domain, are increasingly important as agencies and businesses face recession and national security threats. In this paper, we look at how standards are creating a new geospatial information space and discuss how the methods employed by the Open GIS Consortium to enable interoperability in the spatial technology domain provide a model for the way standards will be built, adopted, and deployed.

 

1         Introduction – The Potential for Havoc

It is always good to take a positive approach in trying to convey an idea, and indeed, this paper is about very encouraging progress. So I address "havoc" at the outset to be done with it. By "havoc," I refer mainly to Webster's "great confusion and disorder," though I shall also make reference to Webster's "wide and general destruction" definition.[iii]

 

Our world is going through a communications revolution on top of a computing revolution, and the many technology issues this involves frequently cause confusion in corporate technology decision-making. In a period of rapid change, it has been difficult for people to stay sufficiently informed to make good decisions about technology. The technology has been immature as well as overwhelming in volume, hype, and rate of appearance in new products. Thus, in hindsight, we often see that resources have been applied less effectively than they might have been. This sense of confusion and disorder has been amplified by the latest phase in the communications revolution, in which almost all computers have been attached to a vast network. The Net is potentially a wonderful thing, but besides unleashing evils like viruses and spam, it has shown that our applications often don't work very well together. That is, they are often non-interoperable.

 

Non-interoperability impedes the sharing of data and the sharing of computing resources, causing organizations to spend much more than necessary on data, software, and hardware. Given that the Delphi report finds organizations operating under "economic constraint," the problem of non-interoperability is one that clearly needs to be resolved quickly.

 

The report also states that organizations are risk-averse. Non-interoperability increases technology risks, which are a function of 1) the probability that a technology will not deliver its expected benefit and 2) the consequence to the system (and users) of the technology not delivering that benefit. Risk assessment must take into account evolving requirements and support costs.[iv] Some technology risks derive from being locked in to one vendor, others from choosing a standard that the market later abandons.

 

The most dire risks associated with non-interoperability are real-world risks. Today, lives and property depend on digital information flowing smoothly from one information system to another. Public safety, disaster management, and military applications increasingly depend on communication between dissimilar systems used by groups with different but related missions. No single organization produces all the data (so it's inconsistent) and no single vendor provides all the systems (so the systems use different system architectures, which are usually based on different proprietary interfaces). Thus, there is the potential for real world havoc.

 

In this section, we first consider the particularly difficult interoperability challenges of geographic information and geoprocessing software. Then we look at a scenario that illustrates the dangerous but all too common real world trouble that ensues when those challenges are not met.

 

1.1        Sources of Geoprocessing Non-Interoperability

Few kinds of information are more complex than information about the location and shape of, and the relationships among, geographic features and phenomena. One reason is that there are many fundamentally different kinds of geoprocessing systems, that is, systems for creating, storing, retrieving, processing, and displaying geospatial data. These include vector and raster geographic information systems (GIS) and systems for Earth imaging (imaging devices on satellites and airplanes), computer-aided design (CAD) (for roads, sewers, bridges, etc.), navigation, surveying, cartography, location-based services (delivered, for example, via cell phones that can give directions and report what's nearby), facilities management, and so on. Numerous vendors work within each of these technology domains, and until they joined OGC they did not consult with their competitors to form agreements on how data should be structured and how systems might communicate. This lack of communication, coupled with the many different ways of measuring and mathematically representing the Earth, produced a complex and non-interoperable geoprocessing environment. Added to that "havoc" are the user-side semantic issues: without coordination, no two highway departments, for example, will use the same attribute schemas, measurement types, and data types in describing a road. Their "metadata" (data describing their data sets) will also use different schemas, making automated data discovery and data sharing difficult.
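
To make the semantic problem concrete, here is a minimal sketch assuming two hypothetical highway departments; every field name, code list, and unit below is invented for illustration, not taken from any real agency. It shows why records describing the same road cannot be merged automatically until someone writes and maintains a crosswalk between the two schemas.

```python
# Hypothetical road records from two highway departments that never coordinated
# their attribute schemas. All field names, codes, and units are invented.

county_a_road = {
    "RD_NAME": "Route 9",
    "SURF_TYP": 2,        # coded value: 2 = asphalt in County A's local code list
    "LEN_MI": 4.3,        # length in statute miles
    "LANES": 2,
}

county_b_road = {
    "RoadName": "Route 9",
    "Surface": "ASPHALT",  # free-text value, not a code
    "LengthKm": 6.92,      # length in kilometres
    "LaneCount": 2,
}

# A human-maintained crosswalk is required before any automated sharing can occur:
# the keys differ, one surface attribute is coded and the other is text, and the
# length fields use different units.
FIELD_CROSSWALK = {"RD_NAME": "RoadName", "SURF_TYP": "Surface",
                   "LEN_MI": "LengthKm", "LANES": "LaneCount"}
SURFACE_CODES = {1: "GRAVEL", 2: "ASPHALT", 3: "CONCRETE"}

def to_county_b(record_a):
    """Translate a County A record into County B's schema (illustration only)."""
    out = {}
    for a_key, b_key in FIELD_CROSSWALK.items():
        value = record_a[a_key]
        if a_key == "LEN_MI":
            value = round(value * 1.60934, 2)            # miles -> kilometres
        elif a_key == "SURF_TYP":
            value = SURFACE_CODES.get(value, "UNKNOWN")  # decode the local code list
        out[b_key] = value
    return out

print(to_county_b(county_a_road))  # matches county_b_road only because the crosswalk exists
```

Multiply this small example by thousands of feature types, agencies, and vendors, and the scale of the coordination problem becomes clear.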

 

1.2        Scenario

Suppose a gasoline truck hits a utility pole where a state highway intersects a county road. Gasoline spills and burns, some of it running into a storm drain that empties into a stream. The utility pole, owned by the electric utility and used also by a cable company and a phone company, falls amid a tangle of wires. Traffic backs up in all directions. People are injured and the fire is spreading to nearby properties.

 

In considering the information sharing one would like to see in this scenario, we begin by merely listing the government and private entities that might have and/or urgently need spatial information: the state and local police, the ambulance company, the local fire department, the company that employs the truck driver, the company that does the hazardous material (HAZMAT) transportation monitoring, the state and local highway departments, the local sewer department, field engineering and customer service groups at each of the "wires" companies, the traffic reporters at the local news broadcasting stations, the state department of environmental protection, the owner of the burning property, and perhaps others, including federal authorities such as the Federal Emergency Management Agency (FEMA), the Environmental Protection Agency (EPA), and the National Transportation Safety Board (NTSB). Currently, some of these information flows, particularly those that require only a phone call or that work through proprietary interfaces in tightly coupled systems, work smoothly. But most of the information sharing that involves digital spatial data cannot happen in real time. It often takes hours or days because no single technology provider has "tightly coupled" all those systems, nor have all of these providers yet implemented the new OGC Web Services standards that enable "loose coupling" of multiple vendors' applications.
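
As a rough sketch of what such loose coupling looks like in practice, the fragment below composes an OGC Web Map Service (WMS) GetMap request over ordinary HTTP. The query parameters follow the WMS standard, but the server URL, layer names, and bounding box are hypothetical, chosen only to fit the scenario above.

```python
# Sketch of loose coupling through an OGC Web Map Service (WMS) GetMap request.
# The endpoint, layer names, and bounding box are hypothetical; the query
# parameters (SERVICE, VERSION, REQUEST, LAYERS, BBOX, ...) follow the WMS standard.
from urllib.parse import urlencode

WMS_ENDPOINT = "http://gis.example.gov/wms"  # hypothetical emergency-management server

params = {
    "SERVICE": "WMS",
    "VERSION": "1.1.1",
    "REQUEST": "GetMap",
    "LAYERS": "roads,storm_drains,power_lines",  # hypothetical layer names
    "STYLES": "",
    "SRS": "EPSG:4326",                           # WGS 84 longitude/latitude
    "BBOX": "-72.60,41.55,-72.50,41.65",          # area around the incident
    "WIDTH": "800",
    "HEIGHT": "600",
    "FORMAT": "image/png",
}

url = WMS_ENDPOINT + "?" + urlencode(params)
print(url)

# Any client that understands the open WMS interface (a police dispatch console,
# a utility's outage-management system, a reporter's web browser) could issue the
# same request against any conforming server, regardless of which vendor built it.
# Fetching the map image would be a single call such as urllib.request.urlopen(url),
# omitted here because the endpoint above is fictitious.
```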

 

Now imagine a much broader disaster such as a major flood, an earthquake, an explosion, a building collapse in a downtown area, a natural gas pipeline explosion, or a sudden national epidemic. Consider the impact of non-interoperable data on services such as power, water, sewage, and transportation, and consider the impact on safety and on repair costs. Suffice it to say that all "spatial data infrastructure" stakeholder groups, along with the vendors who serve them, have a responsibility to work together to establish interoperable geoprocessing that will help agencies plan for, mitigate, and respond to such real-world havoc. As the HAZMAT carriers say, "Information is safety."

 

2         The Good News: Open Standards Conquer the Havoc

As the Delphi report states, "There is a clear and sudden shift in attitudes towards software standards." It appears that both intra-enterprise interoperability and inter-enterprise interoperability are now seen to be much more important than just a year or two ago. This is easy to understand given people’s experience with non-interoperability havoc and the sudden obviousness of the following logic:

1.       Computing means to store, retrieve, and process data.

2.       To avoid havoc (and to enjoy many positive benefits), our computer systems need to be able to communicate.

3.       Communication means transmitting or exchanging through a common system of symbols, signs, or behavior.

4.       Standardization means agreeing on a common system.

5.       Therefore, we should promote standardization and employ standards in our computer systems.

This realization is perhaps induced by the Internet and Web, whose open standards (HTTP, TCP/IP, XML, etc.) and extraordinary success give us a taste of what interoperability is all about.

 

2.1        Open Standards in the Geospatial World

To begin this discussion, we must first define the term open standard. OGC defines an open standard as one that:

  1. Is created in an inclusive, international, participatory industry process.
  2. Is owned in common.
  3. Has free rights of distribution. That is, anyone can share it with anyone, free of charge.
  4. Is free and openly available to the public, in all its details.
  5. Does not discriminate, in the license or the standard, against persons or groups.
  6. Is technology neutral: no provision of the license may be predicated on any individual technology or style of interface.

By this definition, a de facto standard established by one company, an exclusive group of companies, or a government is not an open standard, even if it is published and available for use by anyone at no charge. The Web must not depend on proprietary standards and the same applies to the "Spatial Web," which OGC defines as the set of all Web-based geoinformation and geoprocessing resources that are accessible through open interfaces.

 

Open standards are developed by non-exclusive industry consortia and task forces (like the OGC, the World Wide Web Consortium (W3C), the Open Mobile Alliance (OMA), the Internet Engineering Task Force (IETF), and others) as interlocking parts of interoperability frameworks and reference models. These organizations' framework and reference model documents guide developers and integrators in designing customer-specific open architectures, which specify the open data models (information schemas) and open interfaces, protocols, etc. that will meet the needs of particular enterprises based on their user needs, including business models and work flows.

 

Open standards address user needs that can only be met by cooperation among system vendors. Overall, users want to maximize the value of past and future investments in systems and data.[v] [vi] In the geospatial world, that general statement points to the following user needs:

  1. The need to share and reuse data in order to decrease costs (avoid redundant data collection), obtain additional or better information, and increase the value of data holdings
  2. The need to choose the best tool for the job and the related need to reduce technology and procurement risks (i.e., the need to avoid being locked in to one vendor)
  3. The need to leverage investments in software and data, such as enabling more people to benefit from using geospatial data across applications without the need for additional training

 

It happens that the open framework that addresses these basic needs (documented in more detail in any open geoprocessing architecture) makes it possible for vendors to address a whole new array of user needs that require a standards foundation. These additional user needs include:

  1. The need to organize geographic data stored in text and on video, audio, and other media
  2. The need to access and process on-line sensor data (a sensor is always someplace) from multiple sources
  3. The need for Location Based Services that are portable across devices, networks, and providers
  4. The need to apply different symbology to data for different applications
  5. The need to take advantage of grid computing for geoprocessing applications

The solutions that vendors will offer to fill these needs must have a standards platform that enables them to establish new markets and new opportunities for growth.

 

3         Enabling a New Information Space

Information technology standards are business enablers and channelers just like highways, air traffic rules, business laws, and HAZMAT transportation regulations. The five-point list above demonstrates how a platform of open standards enables innovation and proliferation of new capabilities. Recognizing this, vendors in OGC's consensus process give up their proprietary "lock" on customers in favor of the chance to participate in a greatly expanded market.

 

Just as the World Wide Web opened up a whole new information space, the OGC-enabled Spatial Web opens up a vastly expanded geospatial information space. Few people a decade ago could imagine the Web-enabled information space. In the same way, few outside of OGC today imagine the greatly expanded geospatial information space that will result from a platform of open standards for geoprocessing. What is and isn't known about location, proximity, spatial distribution, and extent (of assets, suppliers, customers, service providers, purchases, risks, opportunities, etc.) is tremendously important. More important, however, is that our information systems are largely blind to such information and incapable of useful spatial calculation and presentation. OGC members believe that "spatial enablement" will have a profound and largely positive impact in the public and private sectors, similar to the impact of the Web itself. Ultimately, we believe that spatial enablement will drive new business opportunities and allow new human activities.

 

4         Enabling Other New Information Spaces: The OGC Model

One might argue that governments, industries, professions, and disciplines have an absolute obligation to their stakeholders to organize consensus-based strategic "imagineering" for the purpose of creating the shared information framework that will optimally support their work in the future. OGC's experience suggests that this happens best in an inclusive, structured, consensus-based specification process with ample input from prototyping in testbeds and real-world testing in pilot projects. OGC's "Interoperability Initiatives" are testbeds, pilot projects, and other short-term, intensive, multi-participant "spiral engineering" activities to develop, test, and promote the use of OpenGIS Specifications. Specifications developed initially in testbeds typically are completed in the OGC Technical Committee, tested in commercial products in pilot projects, and then approved by the OGC Technical Committee and Planning Committee. Interoperability Initiatives give technology user organizations an opportunity to steer the direction of technology: the user interoperability requirements they provide are the main guiding factor in these initiatives. Other technology domains could use the same methods to develop standards that are quickly implemented in commercial products and that are tailored to users' interoperability needs.

 

To ignore this opportunity, leave interoperability to vendors' de facto standards, or hire consultants to build a system from the top down is to condemn stakeholders to more years of havoc. Vendors and consultants will play essential roles, of course, because the actual development, maintenance, customization, and service of software require special skills. Success, however, lies in the ability to engage these experts and other stakeholders in the process mentioned above.

 

It should be added that data models are an important part of the information space. Geospatial data models are complex and heterogeneous. OGC has developed an XML encoding for spatial data, the Geography Markup Language (GML), that, when used with XML tools, makes it possible to resolve many of the difficulties associated with incompatible data models. The XML tools (prototyped in OGC's Geospatial One-Stop Transportation Pilot and Critical Infrastructure Protection Initiative Phase 2 pilot project) map GML-encoded data from a local model to the national model and vice versa. The data thus becomes "as useful as possible" to the data-sharing partner who uses a different model. Certain elements of one model cannot map to the other, but the XML tools make these inconsistencies plain in all their details, so that it is easy for data managers to focus on the critical schema elements that don't map. This makes both data sharing and data coordination much easier. It is already happening that different disciplines, industry sectors, localities, and professions are forming data committees to manage data coordination, sometimes in the context of setting up data consortia that negotiate data contributions, access, pricing, and so on. This work, too, is part of the consensus work that builds the information space.
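
The sketch below illustrates the idea in simplified form, assuming an invented local application schema and an invented national one; the element names and the crosswalk are hypothetical, and the real pilots used purpose-built XML tooling against actual GML application schemas. What the sketch preserves is the key behavior described above: mappable elements are translated, and elements with no counterpart are reported rather than silently dropped.

```python
# Simplified illustration of mapping a GML-encoded feature from a hypothetical
# "local" schema to a hypothetical "national" schema. Element names and the
# crosswalk are invented; only the GML namespace is real.
import xml.etree.ElementTree as ET

LOCAL_ROAD_XML = """
<Road xmlns:gml="http://www.opengis.net/gml">
  <rdName>Route 9</rdName>
  <surfTyp>asphalt</surfTyp>
  <plowPriority>1</plowPriority>
  <gml:centerLineOf>
    <gml:LineString>
      <gml:coordinates>-72.55,41.60 -72.52,41.62</gml:coordinates>
    </gml:LineString>
  </gml:centerLineOf>
</Road>
"""

# Crosswalk from local element names to national element names (hypothetical).
LOCAL_TO_NATIONAL = {"rdName": "roadName", "surfTyp": "surfaceType"}
# Note: "plowPriority" has no counterpart in the national model.

def map_to_national(local_xml):
    """Rename mappable elements; report the ones with no national counterpart."""
    root = ET.fromstring(local_xml)
    unmapped = []
    for element in root.iter():
        tag = element.tag
        if tag.startswith("{") or tag == "Road":  # skip namespaced GML geometry and the root
            continue
        if tag in LOCAL_TO_NATIONAL:
            element.tag = LOCAL_TO_NATIONAL[tag]
        else:
            unmapped.append(tag)                  # flagged for data managers to review
    return ET.tostring(root, encoding="unicode"), unmapped

national_xml, unmapped = map_to_national(LOCAL_ROAD_XML)
print(national_xml)
print("No national counterpart:", unmapped)       # -> ['plowPriority']
```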

 

Today and in the future, user requirements will almost always involve interoperability, that is, communication between different systems. It is thus necessary to pay attention to:

  1. The requirements of multiple users and classes of users
  2. The details of technical communication standards
  3. The ideas, needs, and stated directions of multiple technology providers

 

The specification process should involve thoughtful consideration of what the community does now, what its members may want to accomplish in the years ahead, and what kind of information flows will be necessary to enable those accomplishments. From these requirement statements, participants construct a framework of interoperability specifications that will support all current and future work. Such an effort is intellectually challenging, socially rewarding, and empowering for the present and future community. In addition to the process recommended above, it is important to emphasize the crucial role that users play in the move towards open standards. With so much at stake, imagine what could be accomplished if a large number of people each took a small step and insisted on open standards in procurements.

 

Much more needs to be written about this process, which is at the creative leading edge of standards setting. Additional topics might include:

1.       How can OGC's successful technology-steering-through-standards-setting model be employed in broader information technology (IT) domains?

2.       How can such a process be employed as a tool of industrial or economic policy?

3.       How can such a process be employed as a tool of procurement policy? What guidelines can best leverage the process to ensure fair procurements that satisfy application needs and yield maximum value for the customer?

4.       How can vendors in a consortium like OGC optimize their participation to best create, enter, and stabilize markets?

5.       What guidelines, regulations, best practices, etc. might minimize the potential for conflict and abuse in standards activities (anti-trust and anti-competitive behaviors, IPR issues)? Currently, for better or worse, businesses influence policy decisions of local, national, and international governments. What role can a consortium such as OGC play in ensuring that such influence is moderated to have a wholesome effect?

6.       What guidelines can make a standards process most useful as a tool for unification (markets, regions, nations, continents)?

 

 

 

 

 

 

OpenGIS, OGC, and OGC User are registered trademarks and service marks or trademarks and service marks of Open GIS Consortium, Inc.  Copyright 2003 by the Open GIS Consortium, Inc. 



Notes

 

[i] A Delphi Survey, “The Value of Standards”, ©2003 Delphi Group, Ten Post Office Square, Boston, MA 02109

[ii] Open GIS Consortium (OGC):  http://www.opengis.org

[iii] Merriam Webster, Webster's Ninth New Collegiate Dictionary, 1984

[iv] Dave Brown, Technology and Engineering Department, Defense Acquisition University, <davebrown@dau.edu>, "Evolutionary Acquisition and Spiral Development" (presentation, ca 2002)

[v] "The Importance of Going 'Open,'” an Open GIS Consortium (OGC) White Paper, September, 2003

[vi] Chuck Heazel, "An Architecture Approach for Web-Enabled Systems," an unpublished article written for OGC, August, 2003