APRIL 21–30, 1992
MAY 1–10, 1992
These images illustrate temporal changes in vegetation conditions during the early portion of the growing season in North America. The changes in tone reflect the amount of photosynthetic activity in each area. Clouds and snow are depicted in white, and black represents areas where data are missing. The information is derived from 1-kilometer AVHRR data from NOAA.
In the last year, the EROS Data Center (the USGS satellite research and archiving facility in Sioux Falls, S. Dak.) purchased the required Landsat TM data covering the conterminous 48 States and completed the processing—including registration and distribution to the partners—of nearly half of the 600 scenes.
The first land-cover classifications developed by the partners are expected in early 1995. The tentative goal is to have national coverage by late 1996.
Monitoring and Change Analysis
A multiple-resolution land characteristics monitoring system will support a broad range of environmental assessment and earth system process studies. The monitoring system will use coarse-resolution data having high temporal frequencies, such as AVHRR data, to identify anomalous landscape conditions. The assessment of anomalies is then based on higher resolution images, such as those from the Landsat program.
The multiple-resolution land characteristics monitoring system depends on the development of the global and regional land characteristics databases. The former will be related to the global 1-kilometer AVHRR data; the scale of the latter will be commensurate with that of the Landsat data. As these databases evolve, so will the structure and capabilities of the multiple-resolution land characteristics monitoring system.
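The two-stage design described above can be sketched in outline. The grid, index values, and threshold below are hypothetical illustrations, not part of the USGS system: coarse-resolution readings are compared against a long-term baseline, and cells that depart too far are queued for assessment with higher resolution imagery.

```python
# Sketch of a two-stage, multiple-resolution monitoring pass.
# A coarse grid of greenness-index readings (e.g., from AVHRR
# composites) is compared against a long-term baseline; anomalous
# cells are flagged for high-resolution (Landsat) follow-up.
# All numbers here are invented for illustration.

def flag_anomalies(current, baseline, threshold=0.15):
    """Return indices of coarse-grid cells whose greenness index
    departs from the baseline by more than the threshold."""
    flagged = []
    for idx, (now, normal) in enumerate(zip(current, baseline)):
        if abs(now - normal) > threshold:
            flagged.append(idx)
    return flagged

# Hypothetical 1-kilometer composite values for six cells.
baseline = [0.55, 0.60, 0.48, 0.52, 0.58, 0.50]
current  = [0.54, 0.30, 0.47, 0.53, 0.80, 0.49]

for cell in flag_anomalies(current, baseline):
    print(f"cell {cell}: order a high-resolution scene for assessment")
```

The coarse pass is cheap enough to run frequently over a continent; only the handful of flagged cells incur the cost of acquiring and interpreting high-resolution scenes.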
Thomas R. Loveland is a remote-sensing scientist at the USGS EROS Data Center in Sioux Falls, S. Dak.
National Spatial Data Infrastructure
The National Spatial Data Infrastructure (NSDI) is being established as a means to find, cooperatively produce, and use geospatial data as an information resource for the Nation. The NSDI consists of the organizations and individuals who generate or use geospatial data, the technologies that facilitate the use and transfer of geospatial data, and the actual data themselves. The Federal Geographic Data Committee (FGDC) coordinates the Federal Government's development of the NSDI.
On April 11, 1994, President Clinton signed Executive Order 12906, Coordinating Geographic Data Acquisition and Access: The National Spatial Data Infrastructure. The order directs all Federal agencies to contribute to the development of the NSDI and lays out key activities that Federal agencies must conduct in conjunction with State and local governments, academia, and the private sector to ensure the evolution and growth of the NSDI. Agencies are called upon to:
• Contribute to a national geospatial data clearinghouse and use that clearinghouse to determine data availability before starting new data collection projects.
• Document data sets according to metadata standards and support public access to data.
• Cooperatively develop data content standards and other geospatial data standards as necessary.
• Develop a plan for a national digital geospatial data framework.
• Develop strategies to cooperate more fully with State and local governments, the private sector, and other non-Federal organizations to share costs and improve the efficiency of geospatial data acquisition.
Under Executive Order 12906, "geospatial data" means information that identifies the geographic location and characteristics of natural or constructed features and boundaries on the Earth. This information may be derived from, among other things, remote sensing, mapping, and surveying technologies. Statistical data may be included in this definition at the discretion of the collecting agency.
The Geological Survey Geographic Data Committee (GSGDC) was charged with leading the development of the bureau's response to the Executive Order. An NSDI Action Plan Committee delivered a strategy to the GSGDC at the end of August, and an Action Plan Implementation Team was activated in January 1995 to carry the plan through its initial steps.
Content Standards for Digital Geospatial Metadata
On June 8, 1994, the FGDC, under the chairmanship of Secretary of the Interior Bruce Babbitt, approved the Content Standards for Digital Geospatial Metadata. The standard is a common set of terms and definitions for documenting important aspects of geospatial data, including identification, data quality, spatial reference, spatial data organization, feature and attribute definitions, and distribution. The standard was developed over the last 2 years with input from numerous groups. The USGS began to develop software to assist in the creation of metadata compliant with the standard. The USGS also sponsors training and workshop sessions to educate data producers and users on the value of metadata and the metadata standards.
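A metadata record organized along the major sections named above might be sketched as follows. The section keys mirror the standard's broad divisions, but the field names and values here are illustrative placeholders, not the standard's actual element names:

```python
# Minimal sketch of a geospatial metadata record grouped by the major
# sections of the content standard. Field names and sample values are
# hypothetical, chosen only to show the shape of such a record.

REQUIRED_SECTIONS = [
    "identification", "data_quality", "spatial_reference",
    "spatial_data_organization", "entity_and_attribute", "distribution",
]

record = {
    "identification": {"title": "Hypothetical 1:24,000 quadrangle data",
                       "originator": "U.S. Geological Survey"},
    "data_quality": {"completeness": "all features captured"},
    "spatial_reference": {"projection": "UTM zone 14N"},
    "spatial_data_organization": {"model": "vector"},
    "entity_and_attribute": {"road": "class, surface type"},
    "distribution": {"contact": "EROS Data Center"},
}

def missing_sections(rec):
    """List the required sections absent from a metadata record."""
    return [s for s in REQUIRED_SECTIONS if s not in rec]

print(missing_sections(record))  # an empty list: every section is present
```

A completeness check of this kind is the sort of task the compliance software mentioned above could automate for data producers.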
National Geospatial Data Clearinghouse
A network-based clearinghouse for geospatial data is being developed to provide both metadata and geospatial data. Instead of centralizing all information, the Internet is used to link the sites where data are produced or maintained. By using this approach, data producers can control and maintain information provided about their data. The Internet then is used to find what data exist, the quality and condition of those data, and the terms for obtaining them. The USGS has been particularly active in clearinghouse efforts. Servers for geospatial data and metadata within the USGS include the National Digital Cartographic Data Base servers at the EROS Data Center, which recorded nearly 40,000 downloads during their first 3 months, and the Distributed Spatial Data Library, which serves as the testbed for Wide-Area Information Server (WAIS) software development.
Continued development of spatial enhancement to the WAIS software, together with workshops and training sessions, has helped spread knowledge of these new capabilities.
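The clearinghouse model sketched above, in which metadata stays at the producing site and a search fans out across the sites, can be outlined in a few lines. The node names and catalog entries below are invented for illustration; they stand in for independent servers reached over the Internet:

```python
# Sketch of a distributed clearinghouse search. Each node keeps its
# own catalog of metadata; a query visits every node and merges the
# hits, reporting where each matching data set is held. The nodes
# and holdings below are hypothetical.

NODES = {
    "node-a": [{"title": "Land cover, Region 1", "theme": "land cover"},
               {"title": "Hydrography, Region 1", "theme": "hydrography"}],
    "node-b": [{"title": "Land cover, Region 2", "theme": "land cover"}],
}

def search_clearinghouse(nodes, theme):
    """Query every node's catalog and report where matches are held."""
    hits = []
    for node, catalog in nodes.items():
        for entry in catalog:
            if entry["theme"] == theme:
                hits.append((node, entry["title"]))
    return hits

for node, title in search_clearinghouse(NODES, "land cover"):
    print(f"{title} -- held at {node}")
```

Because each producer maintains its own catalog, an entry changes the moment the producer updates it; no central index has to be rebuilt.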
NSDI Competitive Cooperative Agreements Program
The NSDI Competitive Cooperative Agreements Program was established in 1994 as a mechanism for the FGDC to use in forming partnerships with the non-Federal sector to assist in the development of the NSDI. This program provides cooperative funding to State and local government agencies, institutions of higher education, and (or) private organizations to encourage resource-sharing projects through the use of technology, networking, and enhanced interagency coordination efforts. Funding for the first year totaled $250,000 in nine individual awards. Awards were targeted toward two major elements of the NSDI. One element involved the development of the National Geospatial Data Clearinghouse to increase awareness and use of geospatial data; the second involved developing and furthering the use of FGDC-endorsed standards. Awards were given to:
• Texas–Texas Water Development Board: An Internet Node for the State of Texas.
• North Carolina–North Carolina Center for Geographic Information and Analysis: North Carolina Geographic Data Clearinghouse.
• Wisconsin–Wisconsin Land Information Board: Wisconsin NSDI Clearinghouse Initiative.
• Florida–Florida State University: The Integration of Citizens and State and Local Governments into the NSDI Initiatives.
• Iowa–Iowa Department of Natural Resources: Establishing a National Geospatial Data Clearinghouse Node in Iowa.
• New Mexico–Earth Data Analysis Center, University of New Mexico: Conversion of New Mexico's Resource Geographic Information System Metadata to FGDC Metadata Standards.
• Minnesota–Alexandria Technical College, Minnesota: Geospatial Data Standards Education.
• Montana–Natural Resource Information System, Montana State Library: Montana GIS Data Clearinghouse.
• New Jersey–New Jersey Department of Environmental Protection: Contributing New Jersey's GIS User Network and Geographic Information to the NSDI.
National Digital Geospatial Data Framework
The framework will draw on themes of geospatial data commonly collected by the public and private sectors. A Framework Working Group, organized by the FGDC, is identifying the purpose, goals, and content of the framework, how it should work, and the reasons why organizations should participate. The framework is a basic, consistent set of digital geospatial data and supporting services that will provide a geospatial foundation to which an organization may add detail and attach attribute information. Organizations will be able to use such a framework to accurately register and compile other themes of data and to link the results of applications to the landscape. Implementation will be phased, the goal being to have an initial geospatial data framework in place by the year 2000.
Thomas M. McCulloch is clearinghouse coordinator on the Federal Geographic Data Committee staff.
Internet: Earth Science Link to the Information Superhighway
The development of high-speed computer networks available to the general public provides new opportunities for the U.S. Geological Survey (USGS) to distribute the results of its research, including reports, data sets, and maps. Computer networks also allow the public to get questions answered more rapidly and inexpensively than ever before. The largest of these high-speed networks is the Internet.
The Internet was developed by the Department of Defense Advanced Research Projects Agency during the 1970's to study computer networking technology. At that time, it was called ARPAnet and was available to a small number of researchers. In the 1980's, it was expanded and supported by the National Science Foundation to enhance communication within the academic and Government research community. During this period, the main backbone of the Internet was called NSFnet. In 1993, the Internet was opened for commercial (nonresearch) uses. The commercialization of the Internet has created new methods and opportunities for communicating with millions of users. Today, the Internet is a collection of over 20,000 individual computer networks.
One historical problem with the Internet has been the difficulty of finding and viewing the information that it contains. Because it was originally created to support the research community, a high level of technical expertise was needed to operate the system. A recent technical development has helped solve this problem. A program called Mosaic, written at the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign, now provides a method by which casual users can access the Internet and get useful information and services. Mosaic uses text, images, audio, and animation to communicate information through a computer. The user no longer needs to know the technical details of access to the Internet. Mosaic can access the many different types of information services through a graphical user interface.
When accessing the Internet through Mosaic, a user can take advantage of active references to additional information that are embedded within each document. These “links” are a little like reference citations in a paper document, but they are much easier to use. Instead of tracking down the additional information in some other publication, the
user simply clicks a mouse button to access the new material. The network of documents created by these links, called the World Wide Web (WWW), is not a publication or a database. The WWW is a virtual library within the Internet that contains publications, databases, and access to products. This library does not physically exist; it is created by many computers on the Internet working together. The USGS WWW Library is implemented as a distributed computer system; many USGS computers participate by providing access through Mosaic to the data that they contain. The WWW is not just one library but a collection of interaccessible libraries.
These virtual libraries conceptually consist of “rooms”: a reference room, a general reading room, and a special collections room. The reference room of the USGS portion of the WWW contains the official publications of the USGS. It is the central location for finding USGS information products on the Internet. The documents in the WWW reference room, unlike those in traditional library reference rooms, may circulate. Users may “borrow” a document, making a complete duplicate of the document for their full use. Many copies of a given document eventually may exist in satellite WWW libraries around the world. Because these copies may have been modified or altered, users must know where to find the original version of any given USGS document, data set, or electronic product. The WWW reference room provides access to that original version. All documents in the reference room of the USGS WWW Library have been approved for publication by the Director.
The USGS WWW Library, like any traditional library, also contains books, maps, data sets, and other products that support the mission of the USGS but were not created by the USGS, such as historical documents or duplicate copies of reports or data sets. Many may have been created at the USGS but have been extended by others to create valuable new products.
They are held in the general reading room, which is analogous to the “stacks” of a paper library. The USGS uses computers and networks in many phases of its research. Unfinished works in progress are not generally released to the public until they have been completed and granted Director's approval. These resources are kept in the special collections room of the USGS WWW Library. Access is restricted, much as access to rare books is restricted in a paper library or as