Primary Mapping Economic Analysis

By Larry L. Amos, Carl D. Shapiro, and Donald H. Zoller

Mapping activities at the U.S. Geological Survey are undergoing major programmatic and technological transitions that will continue during the next several years. Programmatically, the USGS plans to have initial national coverage of primary-scale quadrangle maps available by the end of fiscal year 1990. There is now a growing demand for revision of previously completed primary maps where data have become outdated. As a result, the USGS is placing more emphasis and resources on map revision efforts.

A technological revolution also is taking place as computer storage and manipulation of map data are rapidly changing and improving the ways in which data can be used. In response to new requirements created by this changing technology, the USGS is digitizing its maps to the extent resources permit. The development and increased use of geographic information systems, which provide the technology for automated manipulation and analysis of these digitized map data, have intensified pressure on the USGS to provide more digitized cartographic data. To respond to this increasing demand, the USGS plans to implement advanced technology being developed by the Department of Defense. This advanced technology, called Advanced Cartographic Systems (ACS), will greatly increase production capacity and will allow the USGS to achieve significant cost savings over the next decade. An Advanced Cartographic Systems cost-effectiveness analysis, prepared in 1986, demonstrated that ACS is cost effective over a wide range of production levels. To help determine what production level is appropriate, the USGS initiated a primary mapping economic analysis. Because there is a time limit on the procurement of hardware and software required for the implementation of ACS, a decision on this production level has to be made in the near future.

Methodology

The assumption that primary map information is a public good is central to the methodology in this analysis. Because many parties can use the same map information at the same time, the benefits from the information in each map can be summed among its many users.
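In notational terms (the symbols below are introduced here only for illustration and are not the authors' own): if $b_{q,i}$ is the benefit that user $i$ derives from the map information in quadrangle $q$, the public-good assumption implies that the quadrangle's total benefit is

$$B_q = \sum_i b_{q,i},$$

the sum over all of its users rather than the value to any single one of them.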

The model developed in this analysis examines the benefits and costs of revising primary map information in each 7.5-minute quadrangle in a sample of five States. A series of map revision cycles is examined to identify the best revision cycle for each quadrangle. The results obtained in the sample States are then extrapolated statistically to the Nation. The optimal ACS production capacity is derived directly from the optimal revision cycles.

The methodology incorporates the multiplicity of use into the benefit calculations. Rather than examining a small number of applications with dramatic benefits, a larger number of applications with smaller calculated benefits is studied. The existence of a large number of applications makes it possible to calculate a lower, conservative estimate of benefits for each application because the sum of these conservative estimates is still large enough to make the results meaningful.

Benefits in this analysis are defined as the amount that an organization would spend to update map information if current primary maps were not available. This definition provides a conservative estimate of the benefits for each application. The consumer surplus (that is, the difference between the amount a user is willing to pay and the amount that the user actually pays) is not included in this definition.

A decision tree was developed to model the different procedures by which map users could update primary map information if the USGS were not able to do so. The cost of implementing each of the branches of the decision tree was estimated from USGS map production records. As a result, the decision tree provided estimated costs of various levels of map revision that ranged from no activity to the maximum of a full map revision for eight layers of map information.

The decision-tree model was applied to 216 case studies in 5 States (Connecticut, Florida, Illinois, Oregon, and Utah). For each case study, information was collected on the appropriate branches of the decision tree and the geographic area where the application was conducted. The costs examined in this analysis included both the capitalization costs for ACS between fiscal years 1989 and 2000 and the production costs for revision of primary maps. Optimal map revision cycles were calculated for each of the 5,416 quadrangles in the 5 sample States by evaluating benefits and costs for 20 different map revision cycles, ranging from a 1-year cycle to a 20-year cycle.
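The cycle-evaluation step lends itself to a compact illustration. The sketch below (Python, with hypothetical benefit and cost functions and dollar figures that stand in for the decision-tree estimates; none of these numbers are drawn from the actual model) simply scores each candidate cycle from 1 to 20 years and keeps the one with the greatest net benefit.

```python
# Illustrative sketch of selecting the optimal revision cycle for one
# quadrangle: evaluate net benefit for every cycle length from 1 to 20
# years and keep the maximum. The benefit and cost functions below are
# hypothetical placeholders, not the decision-tree estimates themselves.

def optimal_revision_cycle(benefit_per_year, cost_per_year, max_cycle=20):
    """Return (cycle, net_benefit) maximizing annual net benefit."""
    best_cycle, best_net = 1, float("-inf")
    for cycle in range(1, max_cycle + 1):
        net = benefit_per_year(cycle) - cost_per_year(cycle)
        if net > best_net:
            best_cycle, best_net = cycle, net
    return best_cycle, best_net

# Hypothetical shapes: benefits decline as map data grow stale between
# revisions; a single revision cost is spread over the cycle length.
cycle, net = optimal_revision_cycle(
    benefit_per_year=lambda c: 60_000 - 3_000 * c,
    cost_per_year=lambda c: 50_000 / c,
)
print(cycle, round(net))  # with these placeholder numbers, a 4-year cycle wins
```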

The calculated best map revision cycles were projected to the Nation by classifying geographic areas in the five States and extrapolating the results to similar geographic areas in the Nation. Four dichotomous categories that could significantly affect the benefits or costs of map revision were selected for this extrapolation. These categories were urban/rural, Federal lands/non-Federal lands, energy lands/non-energy lands, and coastal areas/inland areas. These 4 dichotomous categories combine to form 16 unique conditions. Within each of the unique conditions, the mean optimal revision cycle was calculated in the five States. This result was then projected so that areas within the same classification category have the same optimal revision cycle throughout the Nation.
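To make the extrapolation step concrete, the sketch below (Python; the attribute keys and data layout are assumptions made for illustration, not the study's actual data structures) groups sample quadrangles into the 16 condition classes formed by the four dichotomous categories, averages the optimal cycle within each class, and assigns that average to national quadrangles in the same class.

```python
# Illustrative sketch of the national extrapolation: quadrangles are keyed
# by four yes/no attributes (16 possible classes); the mean optimal cycle
# of each class in the sample States is assigned nationwide.
from statistics import mean

CATEGORIES = ("urban", "federal_land", "energy_land", "coastal")  # assumed keys

def condition_class(quad):
    """Reduce a quadrangle to its dichotomous condition class."""
    return tuple(bool(quad[key]) for key in CATEGORIES)

def extrapolate_cycles(sample_quads, national_quads):
    """Assign each national quadrangle the mean optimal cycle of its class."""
    cycles_by_class = {}
    for quad in sample_quads:
        cycles_by_class.setdefault(condition_class(quad), []).append(quad["optimal_cycle"])
    mean_by_class = {cls: mean(vals) for cls, vals in cycles_by_class.items()}
    return {quad["id"]: mean_by_class.get(condition_class(quad))
            for quad in national_quads}
```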

Results

The optimal revision cycles in the five sample States range from 1 year in selected urban and urbanizing areas to 5 years in most rural and remote areas. The results show that developing areas outside the traditional urban core may require primary map revisions as frequently as do urban core areas. Applications in urban and urbanizing areas and on Federal lands are key variables that suggest more frequent map revision in this model. Frequent optimal revision cycles in the Western States, where most Federal land is located, provide evidence of the large benefits from Federal applications.

Within the five sample States, net benefits are maximized at a production level of approximately 1,200 quadrangles revised per year. At this production level, the present value of aggregate net benefits between 1989 and 2000 averages approximately $70,000 a quadrangle and corresponds to an average revision cycle of 4.5 years.
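For reference, the present-value figure cited above has the standard discounted form shown below; the symbols and the discount rate $r$ are notational assumptions, not values taken from the analysis:

$$PV_q = \sum_{t=1989}^{2000} \frac{B_{q,t} - C_{q,t}}{(1+r)^{t-1989}},$$

where $B_{q,t}$ and $C_{q,t}$ are the benefits and costs attributed to quadrangle $q$ in year $t$.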

The patterns noted in the five States continue when the data are extrapolated to the conterminous United States, as shown in figure 1. The importance of primary map revision in urbanizing areas is shown dramatically in the Eastern United States, where large population corridors extending beyond the traditional central city require more frequent revision than do rural areas. The optimal production level associated with the extrapolated national model is approximately 12,500 maps revised per year. Sensitivity analyses, in which benefits are arbitrarily doubled or halved, show a range in the optimal map revision capacity for the U.S. Geological Survey of 10,000 to 17,000 maps a year.

Standards for Digital Cartographic Data Exchange

By Hedy J. Rossmeissl

The U.S. Geological Survey assumed leadership in developing, defining, and maintaining Federal earth science standards in February 1980 with the signing of a memorandum of understanding between the National Bureau of Standards (now the National Institute of Standards and Technology) and the USGS. Under the terms of this memorandum, the USGS has sought to develop standards for digital cartographic data exchange through the Standards Working Group of the Federal Interagency Coordinating Committee on Digital Cartography and through grants to the National Committee for Digital Cartographic Data Standards, which has operated under the auspices of the American Congress on Surveying and Mapping since 1982. A proposed "Standard for Digital Cartographic Data," published in the January 1988 issue of The American Cartographer, represents a merger of the efforts of these two groups.

The proposed standard consists of four parts: definitions and references, spatial data transfer specification, digital cartographic data quality, and cartographic features. The purpose of the proposed standard has evolved during its developmental period from an initial emphasis on data structure and content to a less restricted focus on transfer of digital cartographic data between dissimilar systems.

Following publication of the proposed standard, empirical testing was begun. A technical review board, chaired by the USGS, was established and charged with oversight of changes to the proposed standard resulting from testing. Testing involves several levels of effort. First, a conceptual model of spatial data is needed to permit exchange of information between dissimilar systems. Once a conceptual model has been developed, the transfer process begins, which consists of taking data from a sender system, putting the data into standard form, and then extracting the data from the standard form into a receiver system.
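The transfer path being tested can be pictured with a short sketch (Python; the record layout and function names are hypothetical and are not the encoding defined by the proposed standard): features leave the sender's native structure, pass through a neutral standard form, and are rebuilt in the receiver's own model.

```python
# Illustrative sender -> standard form -> receiver path. The neutral record
# layout here is a stand-in, not the proposed standard's actual encoding.

def to_standard_form(native_features):
    """Sender side: translate native features into neutral records."""
    return [{"id": f["id"], "geometry": f["geom"], "attributes": f["attrs"]}
            for f in native_features]

def from_standard_form(records, build_feature):
    """Receiver side: rebuild each record in the receiving system's model."""
    return [build_feature(r["id"], r["geometry"], r["attributes"])
            for r in records]

# Usage: the two dissimilar systems never see each other's native structures,
# only the neutral records exchanged between them.
sender_data = [{"id": 1, "geom": [(0, 0), (1, 1)], "attrs": {"type": "road"}}]
received = from_standard_form(to_standard_form(sender_data),
                              build_feature=lambda i, g, a: (i, g, a))
```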

The first phase of testing the proposed standard was conducted by the Federal agencies most active in its development, including the U.S. Bureau of the Census, the Defense Mapping Agency, the Naval Ocean Research and Development Activity, and the USGS.

The second phase of testing, involving a cross section of government, university, and private-sector organizations, began in September 1988. The review board sought a broad spectrum of participants to thoroughly test application of the standard in as many discipline areas as possible. Submittal of the "Standard for Digital Cartographic Data" to the National Institute of Standards and Technology is expected to take place after completion of the second phase of testing in early 1989. At that point, the standard will be reviewed by the National Institute of Standards and Technology for adoption as a formal Federal Information Processing Standard.

Geographic Information Systems Research Laboratories: Bringing Data, Systems, and Users Together

By John C. Houghton

The essence of geographic information systems (GIS) technology is the ability to analyze the spatial aspects of data; the broad definition of GIS technology also encompasses data collection, storage, updating, and output. GIS analysis typically uses several different data sets, such as images and remotely sensed data, line data from topographic maps, the areas covered by an aquifer, a volcano, or a hazardous waste site, and a wide variety of other information.

The rapid evolution of GIS technology can be attributed to several factors. One factor is the rapidly increasing capabilities of computer and peripheral hardware. Another factor is the growing availability and sophistication of commercial GIS software. A third factor is the rapidly growing base of GIS users, which serves to defray the cost of hardware and software development.

The large number of users has an important advantage in addition to making software and hardware less expensive. The salient feature of a GIS is its ability to integrate many different kinds of data sets, and this ability makes a GIS a natural vehicle for bringing together heterogeneous disciplines. Teams conducting GIS studies include scientists not only from the USGS but also from other Federal agencies, national laboratories, State and local governments, nonprofit institutions, academia, and the private sector. These scientists provide different discipline perspectives and contribute a wide variety of data.

The rapid increase in GIS use is also due, in part, to the existence of GIS research laboratories in the USGS. A GIS laboratory was established in Reston, Va., in 1985 and is now being used by nearly 100 scientists. GIS laboratories in Denver, Colo., and Menlo Park, Calif., were established during 1988. Other USGS field offices are also adding to their spatial data processing capabilities. These GIS laboratories offer centralized facilities where new equipment and techniques can be investigated and the results passed on to others, where users with different backgrounds can work and learn together, and where expensive equipment can be shared by the many users.

In addition to earth-science research work in support of the Survey's mission, the GIS laboratories also perform an important educational function. The Reston GIS laboratory, for example, provides demonstrations on previous GIS projects, assists in the design of new projects, and conducts tours for more than 200 people a month. The GIS laboratories are also beginning to monitor and catalog digital data collected for particular projects.

Several GIS projects have been undertaken in cooperation with other agencies. In a project led by the Environmental Protection Agency to help restore Chesapeake Bay, the USGS helped develop a spatial data base to test GIS techniques in analyzing the sources, transport, and impact of contamination on a highly urbanized watershed. In one scenario, GIS software was used to locate areas that were potentially affected by a source of lead contamination. In another cooperative project with the Environmental Protection Agency, GIS software was used to characterize a Superfund site during its 30-year life. One outcome was the ability to help position a monitoring well by determining the location of certain site features and their proximity to existing topographic and cultural-historical features. A demonstration of GIS technology was also developed with the U.S. Forest Service. The Copper Basin area of Prescott National Forest, Ariz., was used as a case study to evaluate the potential impact of a proposed land exchange. A GIS will be used as the foundation for a future environmental impact statement to describe a copper and molybdenum mining operation.

In another cooperative effort, the Survey is providing assistance to the U.S. Fish and Wildlife Service in the development of a comprehensive GIS data base for the Arctic National Wildlife Refuge. The U.S. Bureau of Land Management, the State of Alaska Department of Natural Resources, and the North Slope Borough are also actively participating in a cooperative effort to conduct biological and geological studies to investigate the area's vast potential oil and gas resources and to protect its wildlife.

These cooperative efforts demonstrate the broad utility of GIS technology and the importance of providing centralized facilities where data and ideas can be exchanged and developed.

Geographic Information System Modeling

By Louis T. Steyaert

Global changes in the Earth's natural system are current and critical issues of scientific investigation. Detecting and understanding the causes and implications of those changes demand sophisticated research tools and large earth-science data bases. One research approach that is under investigation by the USGS is the use of geographic information system (GIS) technology for global monitoring and scientific modeling.

A GIS is a computer hardware and software system designed to collect, store, manipulate, analyze, and display geographically referenced data. USGS scientists are investigating methods to incorporate the analysis of space- and time-dependent data into the GIS process. This research includes the use of GIS in combination with other modeling tools, such as statistical and numerical methods, as part of an overall systems approach to earth-science modeling involving, for example, land-surface monitoring, hydrologic and climatic processes, or impact assessments.

GIS modeling would allow scientists to go one step further into scientific inquiry.

As the community of GIS users continues to grow, earth scientists are investigating more complex data bases and developing quantified models for more diversified applications. Data available for GIS analysis have expanded rapidly in recent years and now include digital cartographic data (base categories of information and unique thematic maps of earth-science data), historical data (hydrologic, climatic, and geologic data), and daily satellite data for the entire globe. Some GIS applications involving these data require scientists to investigate and analyze the data across time, in three dimensions, or on a global scale. Therefore, in addition to fundamental GIS map applications such as map overlays and polygon or point reclassifications, GIS applications are incorporating surface, volume, and temporal analyses as well as interfaces with complex hydrological and meteorological flow modeling or economic modeling. The development of a systems approach for the integrated use of GIS technology and statistical and numerical methods will provide modeling capabilities that will enhance many operational monitoring systems, including those designed to help in understanding and tracking global change.
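The fundamental map operations mentioned above, overlay and reclassification, can be sketched in a few lines. The sketch below is a toy Python illustration on co-registered grids; the cell codes and variable names are assumptions, and a production GIS would of course operate on georeferenced raster or vector data.

```python
# Toy illustration of two basic GIS operations: reclassification and
# cell-by-cell map overlay on small, co-registered grids.

def reclassify(grid, mapping, default=0):
    """Replace each cell value according to a lookup table."""
    return [[mapping.get(value, default) for value in row] for row in grid]

def overlay(grid_a, grid_b, combine):
    """Combine two grids of identical shape cell by cell."""
    return [[combine(a, b) for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(grid_a, grid_b)]

# Hypothetical codes: flag cells that lie over an aquifer AND are urban.
aquifer = [[0, 1],
           [1, 1]]          # 1 = aquifer present
landuse = [[1, 1],
           [0, 2]]          # 1 = urban, 2 = forest (assumed codes)
urban = reclassify(landuse, {1: 1})                    # keep urban cells only
at_risk = overlay(aquifer, urban, lambda a, u: int(a and u))
# at_risk == [[0, 1], [0, 0]]
```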

Recent trends in GIS applications underscore the need for statistical analysis procedures. For example, techniques for exploratory data analysis are essential to the investigation of complex data structures. Such analysis capabilities represent the first step in developing models of earth-science processes involving, for example, climatic and hydrologic systems.
