
Parallel Processing: A Powerful Tool for the Earth Sciences

By Rick MacDonald

The complex processes of nature, the still mysterious forces at work deep beneath the Earth, and the puzzling phenomena of natural hazards are excellent candidates for the application of new computer technologies. Earth science has the wide range and complexity of disciplines that challenge the leading edge of computer technology, and each new research venture in earth science demands more and more computing resources. Research interests that can benefit from better computational tools include projects in geology, hydrology, seismology, volcanology, geochemistry, geography, and cartography. Much can be learned, for example, from constructing dynamic computer models of the Earth's core and mantle, which allow unprecedented insight into the forces that drive the Earth's tectonic plates. Other computer models can predict how toxic and radioactive contaminants behave as they travel in ground-water systems. Three-dimensional geographic information systems, which can be used to integrate almost infinite permutations and combinations of geologic, hydrologic, and topographic information, allow scientists to analyze relations among the Earth's properties and to understand better the processes at work.

Supercomputers are one of the exciting new and developing technologies that can provide the data-handling resources to unlock the mysteries of the Earth's processes. Newer supercomputers employ a wide variety of "architectures," including scalar, vector, parallel, vector-parallel, and massively parallel. Scalar and vector architectures are familiar to most earth and computer scientists—these are the more standard computing environments that use conventional programming and a sequential processor in analyzing data.

Parallelism, however, introduces a high degree of complexity into the effective use of computers. One reason is that sufficient software tools have not yet been developed for parallel processors. Another is that scientists have been trained to view computers as sequential processors and to program them accordingly; because of this sequential mindset, they have been slow to conceptualize the mathematical algorithms that would help them solve problems more effectively through parallel processing.

Parallel Processing

Parallelism in computer systems is not a new concept. Operating systems have been designed for years with the capability for either simulated or actual parallel operation, and forms of parallelism have been considered in hardware design ever since the early days of computers. Parallelism has been defined as "a collection of processing elements that can communicate and cooperate to solve large problems fast." Parallel processing systems can be characterized by the number of processor elements (CPU's) they have, how much memory they have, how the memory is distributed, how the processor elements communicate, what form of interconnection network they possess, and how the processor elements are synchronized. Some examples of parallelism are as follows:

• Timesharing—Multiple users are served by a computer, allowing more than one user at the same time to access the computer.

• Batch processing—Multiple jobs share time on a single processor.

• Multiprocessing—Multiple jobs are distributed among independent processors.

• Vectorization—Identical computations on elements in an array are performed simultaneously by using specialized hardware.

• Multiple functional units—Computation is shared between functional units designed to work in parallel within the same processor.

• Pipelining—Parts of one computation overlap parts of the previous computation when they use different hardware resources.

• Concurrentization—A single job is programmed to run on multiple processors in parallel.
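
The last of these, concurrentization, is the form that demands the new algorithmic thinking discussed above. As a rough illustration in modern terms (a hypothetical sketch, not drawn from any USGS code), the following Python fragment splits one job, a sum of squares, into chunks that independent processor elements work on at the same time. The chunking helper and the per-chunk work are ordinary sequential code; the decomposition step is where the rethinking happens.

```python
from multiprocessing import Pool


def split_chunks(data, workers):
    """Divide one job's data into roughly equal chunks, one per worker."""
    size = max(1, len(data) // workers)
    return [data[i:i + size] for i in range(0, len(data), size)]


def partial_sum_of_squares(chunk):
    """The sequential work a single processor element performs on its chunk."""
    return sum(x * x for x in chunk)


def parallel_sum_of_squares(data, workers=4):
    """Concurrentization: one job distributed across multiple processors."""
    with Pool(workers) as pool:
        partials = pool.map(partial_sum_of_squares, split_chunks(data, workers))
    return sum(partials)


if __name__ == "__main__":
    values = list(range(1000))
    assert parallel_sum_of_squares(values) == sum(v * v for v in values)
```

The same decomposition pattern appears, with different chunk shapes, in the vectorization and pipelining entries above; what changes is whether the hardware or the programmer does the splitting.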

Until recently, the cost of circuitry inhibited the use of more parallelism in computer architectures. But with the cost of hardware components undergoing dramatic reductions, parallelism has become a viable way to speed up computer systems. Because the laws of physics ultimately limit how fast a single processor can run, parallelism may be the only way to continue increasing the speed of computers.

USGS Activities in Parallel Processing

The USGS has begun several projects to explore the potential benefits of parallel processing in the earth sciences. For example, a project is underway in which fundamental earth-science algorithms, such as fluid flow through porous media and development of evenly spaced data grids from randomly dispersed satellite data of the Earth's surface, are being examined in a generic parallel processing architecture. Much already has been published about the mathematical algorithms that are used in sequential processing of earth-science data. The new USGS project is examining methods and techniques that enable scientists to apply the algorithms to general parallel processing platforms. In this way, a knowledge base will be established that will form the kernel for future parallel processing efforts in the USGS.
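
To make the gridding example concrete, here is a minimal, hypothetical sketch (not the USGS implementation) of one standard approach, inverse-distance weighting, for building an evenly spaced grid from randomly dispersed samples. Every grid node depends only on the scattered input data and not on any other node, so the node loop is exactly the kind of computation that maps naturally onto a parallel architecture, with each processor element taking a block of nodes.

```python
def idw_value(x, y, samples, power=2.0):
    """Inverse-distance-weighted estimate at one grid node.

    samples is a list of (sx, sy, value) scattered observations.
    Each node's estimate is independent of every other node's, so in a
    parallel setting each node (or block of nodes) can be assigned to
    its own processor element.
    """
    num = den = 0.0
    for sx, sy, v in samples:
        d2 = (x - sx) ** 2 + (y - sy) ** 2
        if d2 == 0.0:
            return v  # node coincides with a sample point
        w = 1.0 / d2 ** (power / 2.0)
        num += w * v
        den += w
    return num / den


def grid(samples, nx, ny, dx=1.0, dy=1.0):
    """Evenly spaced nx-by-ny grid from randomly dispersed samples."""
    return [[idw_value(i * dx, j * dy, samples) for i in range(nx)]
            for j in range(ny)]
```

The flow-through-porous-media problem has the same character at a coarser grain: the domain is divided into subregions whose boundary values are exchanged between otherwise independent computations.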

As industry continues to incorporate this kind of architecture, more and more so-called general-purpose computers can be expected to take on the advanced characteristics of leading edge platforms that use parallel processing. Thus, knowledge about how these architectures can be used most effectively in earth-science research will be necessary.


The Administrative Division provides administrative direction and coordination in support of the scientific and technical programs of the U.S. Geological Survey. This support includes policy guidance and program direction and provides leadership and authority for various administrative management and technical support functions, including personnel, manpower utilization, finance, administrative management systems, management analysis, records management, procurement and contract negotiation, property and facilities management, motor vehicle management, security, and safety. The Division also manages the development, maintenance, and operation of the financial management system for the entire Department of the Interior. These functions are carried out at the National Center in Reston, Va., and through Regional Management Offices in Denver, Colo., and Menlo Park, Calif.


Credit Card Sales and
Bankcards for Small
Purchase Procurement

By Charlotte H. Goodson,
Wendy R. Hassibe,1 and
Betty B. Brodes

As part of a continuing effort to use technology advances in automation in the areas of financial management, the U.S. Geological Survey is testing two unique programs for the use of bankcards and credit cards. These innovative initiatives include credit card sales to the general public for USGS products and the use of bankcards by Federal employees for small purchase procurements.

Credit Card Sales

The USGS offers a multitude of cartographic products and book reports for sale at 19 offices throughout the United States. These products include aerial photographs, topographic maps, mineral and energy resource maps, satellite images, selected separate sheets for the National Atlas, scientific reports, and many other products of interest to the general public and specialized users. The USGS began using credit cards for sales during fiscal year 1989 with the idea of improving service to its many customers.

Because the USGS had been considering the use of bankcards for several years, the bureau was ready when the Treasury Department offered support in

1 Of the National Mapping Division.
