
To fully optimise the scientific value of an international research project such as JGOFS and its successors, proper data management practices are essential. They should aim to ensure the production of quality-controlled, homogeneously formatted and extensively documented datasets, as well as their rapid, worldwide dissemination and long-term stewardship within the World Data Centre (WDC) system. JGOFS data management is the responsibility of each national participant, without a centralized clearinghouse for the data. As a result, many data are still not fully available for the modelling and synthesis phase. The DMTT, a consortium of national data management offices (DMOs), is working hard to put together a single database (the so-called International JGOFS Master Dataset), in a single location (within the WDC system, thanks to an initiative of PANGAEA / WDC-MARE), and in a single format. This should be achieved by adapting previously developed tools, especially from the US-JGOFS DMO (for the user query interface) and from ODV/PANGAEA (for dataset visualization and metadata handling). In this framework, the major past and current DMTT activities include:

production of nationally approved JGOFS cruise inventories;

preparation of a list of core parameters, associated units, methodology, and quality-control criteria (a minimal illustrative sketch of such checks follows this list);

preparation of metadata standards to describe the datasets;

production of CD-ROMs and/or on-line databases (JGOFS and JGOFS-related);

adaptation of existing data management tools, such as user interface (e.g., J-LAS) and visualization package (e.g., ODV-derived), to be incorporated into the International CD-ROM dataset;

collection of relevant references associated with the datasets;

interactions within the DMTT and with other competent bodies (e.g., NODCs, WDCs, ICES, IODE, national data managers not involved in the DMTT);

interactions with other JGOFS WGs and TTs;

preparation of recommendations for proper data management to the JGOFS SSC, the JGOFS parent bodies (IGBP and SCOR), and (inter)national funding agencies, in preparation for the future marine biogeochemistry programme(s).
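As an illustration of the core-parameter item above, the sketch below shows what simple, homogeneous quality-control checks could look like once parameters, units, and acceptable value ranges have been agreed. The parameter names, units, and ranges are invented for illustration only and are not the DMTT's actual criteria.

```python
# Minimal sketch of per-parameter quality control; parameter names, units,
# and valid ranges are hypothetical, not the DMTT's agreed criteria.

CORE_PARAMETERS = {
    "temperature": {"unit": "deg C", "valid_range": (-2.0, 40.0)},
    "salinity":    {"unit": "psu",   "valid_range": (0.0, 42.0)},
    "chlorophyll": {"unit": "ug/l",  "valid_range": (0.0, 100.0)},
}

def quality_flag(parameter: str, value: float) -> int:
    """Return a simple flag: 1 = good, 4 = out of range, 9 = unknown parameter."""
    spec = CORE_PARAMETERS.get(parameter)
    if spec is None:
        return 9
    low, high = spec["valid_range"]
    return 1 if low <= value <= high else 4

# Flag a few hypothetical measurements from one station.
measurements = [("temperature", 18.3), ("salinity", 35.1), ("chlorophyll", -0.5)]
for name, value in measurements:
    unit = CORE_PARAMETERS.get(name, {}).get("unit", "?")
    print(f"{name}: {value} {unit} -> flag {quality_flag(name, value)}")
```

In practice, such checks would be applied uniformly by each DMO before data are merged into the master dataset, so that a single flag convention travels with every value.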

We hope that DMTT representatives and modellers can work together (nationally and internationally) during the remaining period of the JGOFS project. Your inputs regarding the needs of the modelling community (e.g., parameters collected, quality, resolution, and accessibility) will be very welcome.

PANGAEA AND THE WORLD DATA CENTER FOR MARINE ENVIRONMENTAL SCIENCES - FACILITIES FOR THE FINAL GLOBAL DATA SYNTHESIS OF JGOFS DATA

Michael Diepenbroek, Hannes Grobe, Rainer Sieger

The World Data Centre for Marine Environmental Sciences (WDC-MARE, http://www.pangaea.de) is aimed at collecting, scrutinizing, and disseminating data related to global change in the fields of environmental oceanography, marine geology, paleoceanography, and marine biology. WDC-MARE uses the scientific information system PANGAEA (Network for Geosciences and Environmental Data) as its operating platform. PANGAEA in addition serves significant amounts of terrestrial, lacustrine, and glacial data. Essential services supplied by WDC-MARE / PANGAEA are project data management, data publication, and the distribution of visualization and analysis software (freeware products). Among the recent data management projects are the final global data synthesis for the Joint Global Ocean Flux Study (JGOFS) and the International Marine Global Change Study (IMAGES). Together with the WDC for Paleoclimatology, Boulder, WDC-MARE forms the essential backbone within the IGBP/PAGES Data System (Eakin, 2002).

Organization of data management includes quality control and publication of data and the dissemination of metadata according to international standards. Data managers are responsible for the acquisition and maintenance of data. The data model used reflects the information processing steps in the earth science fields and can handle any related analytical data. A relational database management system (RDBMS) is used for information storage. Users access data from the database via web-based clients, including a simple search engine (pangaVista) and a data-mining tool (ART). With its comprehensive graphical user interfaces and built-in functionality for import, export, and maintenance of information, PANGAEA is a highly efficient system for scientific data management and data publication. WDC-MARE / PANGAEA is operated as a permanent facility by the Centre for Marine Environmental Sciences at the University of Bremen.
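To make the data-publication side more concrete, the following sketch reads a PANGAEA-style export, assuming a tab-delimited table preceded by a metadata header enclosed in /* ... */ markers; the exact layout and the sample content are assumptions for illustration, not a statement of the authoritative export format.

```python
# Minimal sketch of splitting a PANGAEA-style export into its metadata header
# and data table. The header delimiters and column names are assumed for
# illustration; consult the actual export documentation for the real format.
import csv
import io

SAMPLE_EXPORT = """\
/* DATA DESCRIPTION:
Citation: Example cruise dataset (hypothetical)
Parameter(s): Depth water [m]; Temp [deg C]
*/
Depth water [m]\tTemp [deg C]
10\t18.3
50\t15.7
"""

def read_export(text: str):
    """Separate the metadata block from the tab-separated data rows."""
    metadata, data_lines, in_header = [], [], False
    for line in text.splitlines():
        if line.startswith("/*"):
            in_header = True
            continue
        if line.startswith("*/"):
            in_header = False
            continue
        (metadata if in_header else data_lines).append(line)
    reader = csv.DictReader(io.StringIO("\n".join(data_lines)), delimiter="\t")
    return metadata, list(reader)

meta, rows = read_export(SAMPLE_EXPORT)
print(meta)   # metadata lines describing the dataset
print(rows)   # list of {column name: value} dictionaries
```

Keeping metadata and data in one self-describing file is one way such a system can guarantee that documentation travels with the measurements when datasets are redistributed.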
