
The Circum-Pacific Map Project (CPMP) is a cooperative endeavor of the Circum-Pacific Council for Energy and Mineral Resources and the USGS. The council is a nonprofit, nongovernmental organization founded to encourage international cooperation in the study of the geology and resources of the Pacific basin and the lands that form its rim. The final 61 maps of the CPMP, when complete, will portray more than one-half of the Earth's surface.

The objectives of the CPMP are to (1) outline the distribution of resources in the Pacific basin, (2) depict the relation between the latest geologic and tectonic data and known energy and mineral resources, (3) aid in the exploration for new resources, (4) compile new basin-wide geologic and resource data sets, (5) relate oceanic to continental geology, (6) focus on gaps in knowledge and encourage research to complete them, and (7) provide a mechanism for scientific cooperation among Pacific nations.

The CPMP is directed by a Council Map Committee. Scientific and technical coordination, cartography, and, since 1990, publication are carried out by the USGS. Although maps are the main products, the planning, compiling, and publishing of geoscience data sets are of equal value. The American Association of Petroleum Geologists previously published, and still distributes, the earlier map products.

Data are compiled by six panels of the Council Map Committee, whose members include international experts who live and work in the Pacific region. The CPMP has several unique and innovative aspects:

• Unlike most previous compilations, the maps include geologic and resource data for both land and ocean areas and have projection points in the mid-Pacific.

• A new series of equal-area, 1:10,000,000- and 1:7,000,000-scale base maps depicts data with minimal distortion.

• Base map information and most of the already published data sets on the thematic maps are computerized, and computer technology is being used in preparing the thematic maps.

• Several new data sets have been specially prepared for the map series; among these are sea-floor sediment, manganese nodule distribution, sedimentation rates, earthquake first-motion solutions, plate-motion vectors, Deep Sea Drilling Project borehole columns, and oceanic crustal ages.

• A new international network of voluntary and nonreimbursed scientific cooperation has been established among Pacific nations.

The circum-Pacific region is divided into seven areas: the four quadrants of the Pacific basin, the Arctic Ocean basin, all of Antarctica, and the entire Pacific basin. Base maps for each region have a scale of 1:10,000,000 on a Lambert azimuthal equal-area projection and are individually centered to minimize distortion. The base map for the entire basin is a single sheet at a scale of 1:17,000,000. Base maps and color geographic maps are part of the published series. In addition, there are thematic maps for each region and plate-tectonic, geodynamic, geologic, energy-resources, mineral-resources, and tectonic maps for the entire basin.
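The equal-area property of these base maps comes from the projection itself. The spherical forward equations of the Lambert azimuthal equal-area projection can be sketched as follows; the function name and the use of a single mean Earth radius are illustrative choices, not details taken from the map series:

```python
import math

def lambert_azimuthal_equal_area(lat, lon, lat0, lon0, radius=6371.0):
    """Project (lat, lon) onto a plane centered at (lat0, lon0).

    Angles are in degrees; x and y are returned in the same units as
    the radius (kilometres by default). Spherical form of the projection.
    """
    phi, lam = math.radians(lat), math.radians(lon)
    phi0, lam0 = math.radians(lat0), math.radians(lon0)
    dlam = lam - lam0
    # The k' scale factor is what makes the mapping equal-area.
    k = math.sqrt(2.0 / (1.0 + math.sin(phi0) * math.sin(phi)
                         + math.cos(phi0) * math.cos(phi) * math.cos(dlam)))
    x = radius * k * math.cos(phi) * math.sin(dlam)
    y = radius * k * (math.cos(phi0) * math.sin(phi)
                      - math.sin(phi0) * math.cos(phi) * math.cos(dlam))
    return x, y
```

Centering each regional base map individually, as the series does, keeps points near the projection center where the distortion of this projection is smallest.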

Publication of the base and geographic maps began in 1977, and thematic map publication began in 1981. Special themes of tectonostratigraphic, sea-floor materials, and natural hazards are compiled and published for the entire Pacific basin. In addition, a special map showing global change in the Pacific basin is being compiled, and a color proof of a map showing sedimentary basins in the southeast quadrant is being reviewed. Thirty-six maps are now printed, and 25 more are in various stages of completion.


Information Systems Activities

Mass Storage System for Earth Science Data

By Joe Aquilino and Tod Huffman

For the past 25 years, the USGS has used computer technology to help collect and store large volumes of cartographic, geologic, hydrologic, and other earth science data. In the past, these data were stored in mainframe computers. As data requirements and computing technology became more sophisticated, data-collection and data-base activities became more distributed, that is, centered around local or regional computer networks. Within these networks, minicomputers, microcomputers, and new high-performance workstations have become the primary means for data collection and storage.

As these distributed computer data-base environments expand, conventional approaches to data storage and archiving are becoming inadequate. There is, therefore, an increasing need to upgrade, expand, and modernize the storage, management, and retrieval of archival data. A central USGS data storage facility of the future must consider the following:

• The volumes of data for use in nationwide archival and retrieval systems are increasing.

• Providing standardized and systematic archival services for centrally stored, as opposed to distributed, data is more effective and efficient and less expensive.

During 1990, the USGS developed a prototype mass storage system. The design for the mass storage system focuses on a simple, straightforward solution to storing and retrieving very large files in a networked computing environment. The primary requirements of this design include the following:

• A storage system that has the potential for organizing, storing, and managing any set of data files, regardless of where the client is located geographically.

• Virtually unlimited file storage capability (no system-imposed expansion limits). Initially, this capability will approximate 1 terabyte (10¹² bytes) of storage capacity, which can be expanded simply by adding units of storage hardware.

• A very large file server accessible to any local area subnetwork via USGS telecommunications facilities.

• Access to the very large file server by using a simple file transfer, management, and storage system command structure and syntax. Regardless of the diverse computing environments of users, the connection between client and file server will be explicit. All data file transmissions to and from the server, as well as queries and data histories, will be activated by the user; the user will have full control over the archive process.

• Compatibility with various communication networks and storage hardware.

• Good file security mechanisms that allow the owner to control access to data.

• The ability to share data files among diverse computer environments.

• A storage server that remains constant over evolving generations of client hardware and operating systems.

• Accessibility, 24 hours a day and 7 days a week, at a low cost per gigabyte (10⁹ bytes) of storage.

This mass storage system will eventually replace data archiving functions now associated with the existing mainframe computer. The system will embody all the advantages available in the current mainframe storage environment as well as vastly increase file storage capabilities, improve efficiencies in the use of file system hardware, have a new hierarchical library structure, and reduce storage costs.

For the scientific workstation and distributed data-base activities in nonmainframe environments, such as UNIX, VAX/VMS, Macintosh, and MS/DOS microcomputers, this mass storage system will have filename, directory structure, and command syntax conventions that are familiar to users of these environments, support common network standards and file transport facilities, allow sharing of data between these environments, and improve file security.

The mass storage system user, whether using UNIX, VAX, Macintosh, MS/DOS, or IBM-like mainframes, will learn one file server interface that will work the same for all systems. Standard commands will be available for connecting to the mass storage system; establishing or listing the contents of the user directory and detailed file descriptions and histories; storing and retrieving files or hierarchical groups of files; adding, changing, and deleting files or directories; and invoking help commands. Through a file import-export capability, archived files can be stored in and retrieved from diverse system and storage media formats, and data can be shared and disseminated across various computing environments and data formats.
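A client for such a server might look like the following sketch. Every name here is hypothetical (the actual command set and host names were not published in this article), and an in-memory dictionary stands in for the archive so the sketch is runnable:

```python
class MassStoreClient:
    """Illustrative client for a central file server of the kind described.

    All names are hypothetical; a dict stands in for the remote archive
    so the uniform store/retrieve/list/delete interface can be shown.
    """

    def __init__(self, host):
        self.host = host          # server address on the network (placeholder)
        self._archive = {}        # path -> file contents (mock storage)

    def store(self, path, data):
        """Archive a file under a hierarchical path, as one command."""
        self._archive[path] = bytes(data)

    def retrieve(self, path):
        """Fetch an archived file back to the client."""
        return self._archive[path]

    def listing(self, prefix=""):
        """List archived files, optionally below one directory."""
        return sorted(p for p in self._archive if p.startswith(prefix))

    def delete(self, path):
        """Remove a file from the archive."""
        del self._archive[path]
```

The point of the design is that this one interface would behave identically whether the client is a UNIX workstation, a Macintosh, or a mainframe; only the transport underneath differs.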

Mass storage system software for the existing mainframe and a robot tape storage device will be used to test a 1 terabyte mass storage system during 1991. A sample of representative data-base activities and scientific disciplines will be used to evaluate the system. This mass storage prototype will become the basis for developing the hardware, software, and basic data backup and archiving techniques necessary to handle the very large USGS data bases of the future.

Electronic Mail

By Paul Celluzzi

Computer electronic mail (Email) is fast becoming a vital part of everyday business life. Email, which has evolved from simple message sending to complex information exchange, provides an efficient method of transferring information in electronic form. This information may be simple text messages, complex documents, graphics, facsimile, or binary programs. In the future, Email will even include voice and video annotation.

Email is an effective method for moving documents, messages, and data and has improved information flow and strengthened communication within the USGS. Throughout the bureau, scientists are recognizing the advantages offered by Email technology for communicating with colleagues and for conducting joint research from geographically dispersed locations.

The USGS is investigating technology that will allow information of all types to be moved across different computer systems, application environments, and organizational boundaries. The diverse computer hardware and communication systems in use, however, complicate achieving a bureauwide system. Because each system has its own proprietary Email, implementing a single Email system is impractical. Instead, an alternative is to integrate the many separate USGS systems and connect to non-USGS systems. This approach gives each user the freedom to choose the Email system that best addresses local applications, provides for minimal disruption to existing user applications, preserves the investment in existing software, and avoids expensive retraining of users, who are able to continue to use their familiar local Email.

A central Email switch will provide the necessary protocol conversion and mail routing when a user is sending information from one Email system to another. Also, a common central directory of user names and addresses will give USGS employees the ability to transfer electronic information to any user, regardless of computer system or Email software, simply by knowing the destination address. The sender will not need to know where the destination user is physically located, what computer system is being used, or on what Email system the destination user is registered.
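The switch-and-directory idea can be sketched in a few lines. The user names, system names, and addresses below are invented placeholders, and a plain dictionary stands in for the shared directory:

```python
# Hypothetical sketch of a central mail switch: a shared directory maps
# each user name to a home Email system and local address, so the sender
# needs neither detail. Per-system gateways would then do the protocol
# conversion; here the routed envelope is simply returned.

DIRECTORY = {
    # user name -> (home Email system, local address) -- placeholders
    "jsmith": ("PROFS", "JSMITH AT MAINFRAME-A"),
    "mjones": ("UNIX", "mjones@unixhost.example.gov"),
}

def route(sender, recipient, body):
    """Look up the recipient centrally and address the message for
    the gateway of that recipient's home system."""
    system, address = DIRECTORY[recipient]   # central directory lookup
    return {"from": sender, "to": address, "via": system, "body": body}
```

Because the lookup happens at the switch, adding a new Email system means adding one gateway and some directory entries, not reconfiguring every sender.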

Email is an end-user technology that, to be truly successful, must conform to design criteria that address user needs. The system must be universally accessible to users, allow transmission of different data types, be easy and inexpensive to use, be executable from within the user's local computing environment, and support interfaces to Email systems external to the USGS.

In the future, all USGS Email systems will support a standard convention for mail interchange that is part of a broader communication protocol suite known as GOSIP (Government Open Systems Interconnection Profile, discussed in "Open Systems Communication Standards," p. 88).


Scientific interest in the Arctic is high for several reasons: (1) climate models suggest that the Arctic will be one of the first areas to respond to changing climate, (2) the magnitude of environmental change will be greater in the Arctic than in other areas of the Earth's surface, and (3) the Arctic scientific community is a relatively small group that needs improved access to data and information from remote locations.

The USGS is an active member of the Arctic Environmental Data Directory Working Group, which is sponsored by the Interagency Arctic Research Policy Committee (IARPC). The working group is composed of representatives from government agencies and academia. A goal of the working group is to establish easy access to, and hence improve the dissemination of, earth science data and information about the Arctic.

As a first step, the working group developed the Arctic Environmental Data Directory, which contains more than 300 references to Arctic data sets maintained by U.S. Government agencies and other institutions. To meet the data management goals of the IARPC, the working group developed a pilot study, known as the Arctic Data Interactive (ADI), to integrate information to be published using compact disc-read only memory (CD-ROM) technology. The ADI prototype includes the following multimedia elements:

• Arctic Environmental Data Directory,

• Bibliographic information,

• Full text of research reports and short papers (including illustrations), and

• Arctic data sets.

The project will also develop an electronic journal prototype that will include a mix of textual, numeric, and spatial data and related software for data analysis. The data will be in standard formats to correspond with other applications software such as spread sheets, graphics, and image processing.

The design of the ADI prototype is based on the concept of hypertext technology. Hypertext, also known as hypermedia, is defined in the computer and information science literature as a software environment for developing nonsequential data-base management systems. Hypertext techniques create associative links between structured and unstructured information that may include data, text, graphics, imagery, and sound. A hypertext link, conceptually similar to a footnote or a parenthetical phrase, directs the reader to related points or topics for further research.

A hypertext system is characterized by a user interface having icons (graphic representations) and multiple windows on a computer monitor. Icons for different functions allow readers to browse through information by following associative links between bibliographies, numeric data, textual information, and spatial imagery.
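The associative-link structure described above can be sketched as a small data structure. The class and attribute names are illustrative, not taken from the ADI software:

```python
# Minimal sketch of hypertext-style associative links: each node holds
# content of some media type plus named links to related nodes, and
# following a link works like the footnote jump described in the text.

class Node:
    def __init__(self, title, content, kind="text"):
        self.title = title
        self.content = content
        self.kind = kind          # e.g. text, data, graphic, imagery, sound
        self.links = {}           # anchor phrase -> target Node

    def link(self, anchor, target):
        """Create an associative link from an anchor phrase to a node."""
        self.links[anchor] = target

    def follow(self, anchor):
        """Jump to the related node, as a reader selecting an icon would."""
        return self.links[anchor]
```

Browsing, in this model, is just repeated calls to `follow`, moving between bibliographies, numeric data, text, and imagery along whatever links the compiler of the disc created.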

The goal of USGS experimentation with hypermedia technology is to integrate a broad range of multimedia formats into one information system.