Found 16 result(s)
The OpenMadrigal project seeks to develop and support an on-line database for geospace data. The project has been led by MIT Haystack Observatory since 1980, but now has active support from Jicamarca Observatory and other community members. Madrigal is a robust, web-based system capable of managing and serving archival and real-time data, in a variety of formats, from a wide range of ground-based instruments. Madrigal is installed at a number of sites around the world. Data at each Madrigal site is locally controlled and can be updated at any time, but metadata shared between Madrigal sites allows searching of all Madrigal sites at once from any Madrigal site. Data is local; metadata is shared.
GEOFON seeks to facilitate cooperation in seismological research and in earthquake and tsunami hazard mitigation by providing rapid transnational access to seismological data and source parameters of large earthquakes, and by keeping these data accessible in the long term. It pursues these aims by operating and maintaining a global network of permanent broadband stations in cooperation with local partners, facilitating real-time access to data from this network and those of many partner networks and plate boundary observatories, and providing a permanent and secure archive for seismological data. It also archives and makes accessible data from temporary experiments carried out by scientists at German universities and institutions, thereby fostering cooperation, encouraging the full exploitation of all acquired data, and serving as the permanent archive for the Geophysical Instrument Pool Potsdam (GIPP). It also organises the exchange of real-time and archived data with partner institutions and international centres.
ICES is an intergovernmental organization whose main objective is to increase the scientific knowledge of the marine environment and its living resources and to use this knowledge to provide unbiased, non-political advice to competent authorities.
GEOMAR Helmholtz Centre for Ocean Research Kiel is one of the leading marine science institutions in Europe. GEOMAR investigates the chemical, physical, biological, and geological processes in the oceans, as well as their interactions with the seafloor and the atmosphere. OceanRep is an open access digital collection containing the research output of GEOMAR staff and students. Included are journal articles, conference papers, book chapters, theses, and more, with full text where available. Research data are linked to the publication entries.
SHARE - Stations at High Altitude for Research on the Environment - is an integrated project for environmental monitoring and research in the mountain areas of Europe, Asia, Africa, and South America. It responds to calls by international and intergovernmental institutions to improve environmental research and policies for adaptation to the effects of climate change.
The Humanitarian Data Exchange (HDX) is an open platform for sharing data across crises and organisations. Launched in July 2014, the goal of HDX is to make humanitarian data easy to find and use for analysis. HDX is managed by OCHA's Centre for Humanitarian Data, which is located in The Hague. OCHA is part of the United Nations Secretariat and is responsible for bringing together humanitarian actors to ensure a coherent response to emergencies. The HDX team includes OCHA staff and a number of consultants who are based in North America, Europe and Africa.
The Earth System Grid Federation (ESGF) is an international collaboration with a current focus on serving the World Climate Research Programme's (WCRP) Coupled Model Intercomparison Project (CMIP) and supporting climate and environmental science in general. Data is searchable and available for download at the Federated ESGF-CoG Nodes (https://esgf.llnl.gov/nodes.html).
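ESGF nodes expose a RESTful search API for locating datasets by facet. As a rough sketch, the snippet below builds a query URL against one index node; the node hostname, facet names, and facet values shown here are illustrative assumptions, not a definitive client for any particular node.

```python
from urllib.parse import urlencode

# Assumed index node; other ESGF nodes expose the same search endpoint path.
ESGF_SEARCH = "https://esgf-node.llnl.gov/esg-search/search"

def build_esgf_query(**facets):
    """Build a search URL for an ESGF index node.

    Facet names (project, variable_id, frequency, ...) follow the
    project's controlled vocabularies; the values below are examples.
    """
    params = {"format": "application/solr+json", "limit": 10}
    params.update(facets)
    return ESGF_SEARCH + "?" + urlencode(params)

# Example: monthly near-surface air temperature from CMIP6.
url = build_esgf_query(project="CMIP6", variable_id="tas", frequency="mon")
```

Fetching the resulting URL returns a JSON document listing matching datasets and their download endpoints.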
ETH Data Archive is ETH Zurich's long-term preservation solution for digital information such as research data, digitised content, archival records, or images. It serves as the backbone of data curation and for most of its content, it is a “dark archive” without public access. In this capacity, the ETH Data Archive also archives the content of ETH Zurich’s Research Collection which is the primary repository for members of the university and the first point of contact for publication of data at ETH Zurich. All data that was produced in the context of research at the ETH Zurich, can be published and archived in the Research Collection. An automated connection to the ETH Data Archive in the background ensures the medium to long-term preservation of all publications and research data. Direct access to the ETH Data Archive is intended only for customers who need to deposit software source code within the framework of ETH transfer Software Registration. Open Source code packages and other content from legacy workflows can be accessed via ETH Library @ swisscovery (https://library.ethz.ch/en/).
The THREDDS Data Server (TDS) is a web server that provides metadata and data access for scientific datasets, using OPeNDAP, OGC WMS and WCS, HTTP, and other remote data access protocols. Unidata is a diverse community of over 250 institutions vested in the common goal of sharing data and the tools to access and visualize that data. For more than 25 years Unidata has been providing data, tools, and support to enhance earth-system education and research. In an era of increasing data complexity, accessibility, and multidisciplinary integration, Unidata provides a rich set of services and tools.
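A TDS typically serves the same dataset through several service endpoints distinguished by URL prefix (OPeNDAP under /thredds/dodsC/, plain HTTP download under /thredds/fileServer/, WMS under /thredds/wms/). As a minimal sketch, the helper below maps a dataset path onto those conventional prefixes; the server hostname and dataset path are placeholders, and a real server's catalog determines which services are actually enabled.

```python
def tds_access_urls(server, dataset_path):
    """Map a TDS dataset path onto the conventional service endpoints.

    server: base URL of the TDS host (placeholder below).
    dataset_path: catalog-relative dataset path (placeholder below).
    """
    base = server.rstrip("/")
    return {
        "opendap": f"{base}/thredds/dodsC/{dataset_path}",    # remote subsetted access
        "download": f"{base}/thredds/fileServer/{dataset_path}",  # whole-file HTTP
        "wms": f"{base}/thredds/wms/{dataset_path}",          # OGC map tiles
    }

urls = tds_access_urls("https://tds.example.org", "grib/NCEP/GFS/Best")
```

The OPeNDAP URL can then be opened directly by OPeNDAP-aware clients (e.g. netCDF-based tools) without downloading the full file.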
The LRIS portal is the first element of scinfo.org.nz, a new repository of authoritative New Zealand science datasets and information. It has been created in response to a growing expectation that government and publicly funded science data should be readily available in authoritative human- and machine-readable forms.
The Social Science Data Archive (SSDA) is still active and maintained as part of the UCLA Library Data Science Center. SSDA Dataverse is one of several archiving options offered by SSDA; data can also be archived by SSDA itself, by ICPSR, by the UCLA Library, or by the California Digital Library. The Social Science Data Archive serves the UCLA campus as an archive of faculty and graduate student survey research. We provide long-term storage of data files and documentation. We ensure that the data are usable in the future by migrating files to new operating systems. We follow government standards and archival best practices. The mission of the Social Science Data Archive has been and continues to be to provide a foundation for social science research, with faculty support throughout an entire research project involving original data collection or the reuse of publicly available studies. Data Archive staff and researchers work as partners throughout all stages of the research process: beginning when a hypothesis or area of study is being developed, during grant and funding activities, while data collection and/or analysis is ongoing, and finally in long-term preservation of research results. Our role is to provide a collaborative environment where the focus is on understanding the nature and scope of the research approach and on management of research output throughout the entire life cycle of the project. Instructional support, especially support that links research with instruction, is also a mainstay of operations.
The CCHDO provides access to standard, well-described datasets from reference-quality repeat hydrography expeditions. It curates high-quality full water column Conductivity-Temperature-Depth (CTD), hydrographic, carbon, and tracer data from over 2,500 cruises from ~30 countries. It is the official data center for CTD and water sample profile data from the Global Ocean Ship-Based Hydrographic Investigations Program (GO-SHIP), as well as for WOCE, US Hydro, and other high-quality repeat hydrography lines (e.g. SOCCOM, HOT, BATS, WOCE, CARINA).
The Marine Data Archive (MDA) is an online repository specifically developed to independently archive data files in a fully documented manner. The MDA can serve individuals, consortia, working groups and institutes to manage data files and file versions for a specific context (project, report, analysis, monitoring campaign), as a personal or institutional archive or back-up system and as an open repository for data publication.
To help flatten the COVID-19 curve, public health systems need better information on whether preventive measures are working and how the virus may spread. Facebook Data for Good offers maps on population movement that researchers and nonprofits are already using to understand the coronavirus crisis, using aggregated data to protect people's privacy.
Kochi Core Center (KCC) houses one of the three International Ocean Discovery Program (IODP) core repositories, accompanied by images and X-ray CT scanning data viewable through the Virtual Core Library. It also hosts Japan Agency for Marine-Earth Science and Technology (JAMSTEC) marine core samples and associated analytical data for general scientific or educational use, once two years have passed since collection of the core samples.
The Barrow area on the North Slope of Alaska is one of the most intensely sampled locations in the Arctic with research sites dating back to the 1940s. The Barrow Area Information Database (BAID) is a resource for learning about the types of data collection activities in the region. The BAID team collaborates with scientists and the local community to compile and share this information via online web mapping applications.