  • * at the end of a keyword allows wildcard searches
  • " quotes can be used for searching phrases
  • + represents an AND search (default)
  • | represents an OR search
  • - represents a NOT operation
  • ( and ) set grouping priority
  • ~N after a word specifies the desired edit distance (fuzziness)
  • ~N after a phrase specifies the desired slop amount
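For illustration, the short sketch below builds example query strings combining the operators listed above. The query terms themselves are hypothetical; only the operators (*, quotes, +, |, -, parentheses, ~N) come from the list, and how the search form interprets each string is assumed rather than verified here.

```python
# Illustrative query strings for the search syntax listed above.
# The search terms are hypothetical examples; only the operators
# (*, "", +, |, -, parentheses, ~N) come from the list above.
example_queries = [
    'climat*',                    # wildcard: matches climate, climatology, ...
    '"research data"',            # phrase search
    'ocean + temperature',        # AND search (the default)
    'geology | geophysics',       # OR search
    'genomics - human',           # NOT operation
    '(kelp | seaweed) + survey',  # parentheses set priority
    'paleomagnetism~2',           # fuzzy word match, edit distance 2
    '"glacier mass balance"~3',   # phrase search with a slop of 3
]

for query in example_queries:
    print(query)
```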
Found 222 result(s)
Note (stated 2019-10-30): Dash is no longer available; researchers are advised to store their research data at Dryad (https://www.re3data.org/repository/r3d100000044). Dash is an open data publication platform for upload, access, and re-use of research data. Submissions to Dash may come from researchers at participating UC campuses, researchers in earth science and ecology (DataONE), and researchers submitting to the UC Press journals Elementa and Collabra. Self-service depositing of research data through Dash fulfills publisher, funder, and data management plan requirements regarding data sharing and preservation. When researchers publish their datasets through Dash, the datasets are issued a DOI (DataCite) to optimize citability and are publicly available for download and re-use under a CC BY 4.0 or CC0 license. Deposited data are preserved in Merritt, the California Digital Library’s preservation repository.
The BCDC serves the research data obtained, and the data syntheses assembled, by researchers within the Bjerknes Centre for Climate Research. It is also open to all interested scientists, regardless of institution. All data from the different disciplines (e.g. geology, oceanography, biology, the modelling community) will be archived in a long-term repository, interconnected, and made publicly available by the BCDC. BCDC collaborates with many international data repositories and actively archives metadata and data at those repositories, ensuring quality and FAIRness. BCDC's main focus is on data management services for externally and internally funded projects in the field of climate research; it provides data management plans and ensures that data is archived according to best practices in the field. These services range from project work for small externally funded projects to state-of-the-art data management for research infrastructures on the ESFRI roadmap (e.g. the RI ICOS – Integrated Carbon Observation System), and BCDC provides products and services for the Copernicus Marine Environmental Monitoring Service. In addition, BCDC advises various communities on data management services, e.g. IOC UNESCO, OECD, IAEA and various funding agencies. BCDC will become an Associated Data Unit (ADU) under IODE, the International Oceanographic Data and Information Exchange, a worldwide network that operates under the auspices of the Intergovernmental Oceanographic Commission of UNESCO, and aims at becoming part of the ICSU World Data System.
The Magnetics Information Consortium (MagIC) improves research capacity in the Earth and Ocean sciences by maintaining an open community digital data archive for rock magnetic, geomagnetic, archeomagnetic (archaeomagnetic) and paleomagnetic (palaeomagnetic) data. Different parts of the website allow users access to archive, search, visualize, and download these data. MagIC supports the international rock magnetism, geomagnetism, archeomagnetism (archaeomagnetism), and paleomagnetism (palaeomagnetism) research and endeavors to bring data out of private archives, making them accessible to all and (re-)useable for new, creative, collaborative scientific and educational activities. The data in MagIC is used for many types of studies including tectonic plate reconstructions, geomagnetic field models, paleomagnetic field reversal studies, magnetohydrodynamical studies of the Earth's core, magnetostratigraphy, and archeology. MagIC is a domain-specific data repository and directed by PIs who are both producers and consumers of rock, geo, and paleomagnetic data. Funded by NSF since 2003, MagIC forms a major part of https://earthref.org which integrates four independent cyber-initiatives rooted in various parts of the Earth, Ocean and Life sciences and education.
Etsin is a research data finder that contains descriptive information – that is, metadata – on research datasets. In the service you can search and find data from various fields of research. A researcher, research group or organisation can use Etsin to publish information on their datasets and offer them for wider use. The metadata contained in Etsin makes it easy for anyone to discover the datasets. Etsin assigns a permanent URN identifier to datasets, making it possible to link to the dataset and gather merit through its publication and use. The metadata enables users to search for datasets and evaluate the potential for reuse. Etsin includes a description of the dataset, keywords and various dataset identifiers. The dataset information includes, for example, its subject, language, author, owner and how it is licensed for reuse. A good description of data plays an important role in its discoverability and visibility. Etsin encourages comprehensive descriptions by adopting a common set of discipline-independent metadata fields and by making it easy to enter metadata. Etsin only collects metadata on datasets, not the data themselves. Anyone may browse and read the metadata. Etsin can be used with a browser or through an open interface (API). The service is discipline-independent and free to use. Etsin is a service provided by the Ministry of Education and Culture to actors in the Finnish research system. The service is produced by CSC – IT Center for Science (CSC). Customer service contacts and feedback are available through servicedesk@csc.fi. The service maintenance window is on the fourth Monday of every month between 4 and 6 PM (EET); during that time, the service is out of use.
The WDC is concerned with the collection, management, distribution and utilization of data from Chinese provinces, autonomous regions and counties, including: resource data (management, distribution and utilization of land, water, climate, forest, grassland, minerals, energy, etc.); environmental data (pollution, environmental quality, change, natural disasters, soil erosion, etc.); biological resources (animals, plants, wildlife); social economy (agriculture, industry, transport, commerce, infrastructure, etc.); population and labor; and geographic background data on scales of 1:4M, 1:1M, 1:(1/2)M, 1:2500, etc.
The Lamont-Doherty Core Repository (LDCR) contains one of the world’s most important and unique collections of scientific samples from the deep sea. Sediment cores from every major ocean and sea are archived at the Core Repository. The collection contains approximately 72,000 meters of core composed of 9,700 piston cores; 7,000 trigger weight cores; and 2,000 other cores such as box, kasten, and large diameter gravity cores. We also hold 4,000 dredge and grab samples, including a large collection of manganese nodules, many of which were recovered by submersibles. Over 100,000 residues are stored and are available for sampling where core material is expended. In addition to physical samples, a database of the Lamont core collection has been maintained for nearly 50 years and contains information on the geographic location of each collection site, core length, mineralogy and paleontology, lithology, and structure, and more recently, the full text of megascopic descriptions.
The South African Marine Information Management System (MIMS) is an Open Archival Information System (OAIS) repository that plays a multifaceted role in archiving, publishing, and preserving marine-related datasets. As an IODE-accredited Associate Data Unit (ADU), MIMS serves as a national node for the IODE of the IOC of UNESCO. It archives and publishes collections and subsets of marine-related datasets for the National Department of Forestry, Fisheries, and the Environment (DFFE) and its regional partners. As an IOC member organization, DFFE is committed to supporting the long-term preservation and archival of marine and coastal data for South Africa and its regional partners, promoting open access to data, and encouraging scientific collaboration. Tasked with the long-term preservation of South Africa's marine and coastal data, MIMS functions as an institutional data repository. It provides primary access to all data collected by the DFFE Oceans and Coastal Research Directorate and acts as a trusted broker of scientific marine data for a wide range of South African institutions. MIMS hosts the IODE AFROBIS Node, an OBIS Node that coordinates and collates data management activities within the sub-Saharan African region. As part of the OBIS Steering Group, MIMS represents sub-Saharan Africa on issues around biological (biodiversity) data standards. It also facilitates data and metadata publishing for the region through the GBIF and OBIS networks. Operating on the Findable, Accessible, Interoperable, and Reusable (FAIR) data principles, MIMS aligns its practices to maximize ocean data exchange and use while respecting the conditions stipulated by the Data Provider. By integrating various functions and commitments, MIMS stands as a vital component in the marine and coastal data landscape, fostering collaboration, standardization, and accessibility in alignment with international standards and regional needs.
The Digital Repository of Ireland (DRI) is a national trusted digital repository (TDR) for Ireland’s social and cultural data. We preserve, curate, and provide sustained access to a wealth of Ireland’s humanities and social sciences data through a single online portal. The repository houses unique and important collections from a variety of organisations including higher education institutions, cultural institutions, government agencies, and specialist archives. DRI has staff members from a wide variety of backgrounds, including software engineers, designers, digital archivists and librarians, data curators, policy and requirements specialists, educators, project managers, social scientists and humanities scholars. DRI is certified by the CoreTrustSeal, the current TDR standard widely recommended for best practice in Open Science. In addition to providing trusted digital repository services, the DRI is also Ireland’s research centre for best practices in digital archiving, repository infrastructures, preservation policy, research data management and advocacy at the national and European levels. DRI contributes to policy making nationally (e.g. via the National Open Research Forum and the IRC), and internationally, including European Commission expert groups, the DPC, RDA and the OECD.
IDOC-DATA is a department of IDOC (Integrated Data & Operation Center), which has existed since 2003 as a satellite operations center and data center for the Institute of Space Astrophysics (IAS) in Orsay, France. Since then, it has operated within OSUPS (Observatoire des Sciences de l'Univers de l'Université Paris-Saclay – the first French university in the Shanghai ranking), which includes three institutes: IAS, AIM (Astrophysique, Interprétation, Modélisation – IRFU, CEA) and GEOPS (Geosciences Paris-Saclay). IDOC participates in the space missions of OSUPS and its partners, from mission design to long-term scientific data archiving. For each phase of the missions, IDOC offers services in the scientific themes of OSUPS, and its activities are accordingly divided into three departments: IDOC-INSTR (instrument design and testing), IDOC-OPE (instrument operations), and IDOC-DATA (data management and the data value chain: producing the different levels of data constructed from observations of these instruments and making them available to users for ergonomic and efficient scientific interpretation). IDOC-DATA's responsibilities include: building access to these datasets; offering the corresponding services such as catalogue management, visualization tools and software pipeline automation; and preserving the availability and reliability of this hardware and software infrastructure, its confidentiality where applicable, and its security.
ZENODO builds and operates a simple and innovative service that enables researchers, scientists, EU projects and institutions to share and showcase multidisciplinary research results (data and publications) that are not part of the existing institutional or subject-based repositories of the research communities. ZENODO enables researchers, scientists, EU projects and institutions to: easily share the long tail of small research results in a wide variety of formats, including text, spreadsheets, audio, video, and images, across all fields of science; display their research results and get credited by making the research results citable and integrating them into existing reporting lines to funding agencies such as the European Commission; and easily access and reuse shared research results.
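As a rough illustration of accessing shared results programmatically, the sketch below queries Zenodo's public records API. The endpoint, parameters, and response fields follow Zenodo's published REST API rather than the description above, so treat them as assumptions; the search term is purely illustrative.

```python
# A minimal sketch of retrieving openly shared Zenodo records through its
# public REST API. Endpoint, parameters, and response fields are taken from
# Zenodo's documented records API and are assumptions relative to the text
# above; the search term "climate" is purely illustrative.
import requests

response = requests.get(
    "https://zenodo.org/api/records",
    params={"q": "climate", "size": 5},
    timeout=30,
)
response.raise_for_status()

for hit in response.json().get("hits", {}).get("hits", []):
    metadata = hit.get("metadata", {})
    print(metadata.get("title"), "-", metadata.get("doi"))
```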
Neotoma is a multiproxy paleoecological database that covers the Pliocene-Quaternary, including modern microfossil samples. The database is an international collaborative effort among individuals from 19 institutions, representing multiple constituent databases. There are over 20 data-types within the Neotoma Paleoecological Database, including pollen microfossils, plant macrofossils, vertebrate fauna, diatoms, charcoal, biomarkers, ostracodes, physical sedimentology and water chemistry. Neotoma provides an underlying cyberinfrastructure that enables the development of common software tools for data ingest, discovery, display, analysis, and distribution, while giving domain scientists control over critical taxonomic and other data quality issues.
The University of Waterloo Dataverse is a data repository for research outputs of our faculty, students, and staff. Files are held in a secure environment on Canadian servers. Researchers can choose to make content available to the public, to specific individuals, or to keep it private.
The Coriolis Data Centre handles operational oceanography measurements made in situ, complementing the measurement of the ocean surface made using instruments aboard satellites. This work is realised through the establishment of permanent networks with data collected by ships or autonomous systems that are either fixed or drifting. This data can be used to construct a snapshot of water mass structure and current intensity.
The CLARINO Bergen Center repository is the repository of CLARINO, the Norwegian infrastructure project whose goal is to implement the Norwegian part of CLARIN. The ultimate aim is to make existing and future language resources easily accessible for researchers and to bring eScience to humanities disciplines. The repository includes INESS, the Norwegian Infrastructure for the Exploration of Syntax and Semantics. This infrastructure provides access to treebanks, which are databases of syntactically and semantically annotated sentences.
TemperateReefBase is a resource for temperate reef researchers worldwide to use and contribute data. Unique in its role as a one-stop-shop for global temperate reef data, TemperateReefBase was initially established by IMAS in collaboration with the Kelp Ecology Ecosystem Network (KEEN). KEEN was instigated through a National Center for Ecological Analysis and Synthesis (NCEAS) working group which assembled experts from around the world to examine the impacts of global change on kelp-bed ecosystems worldwide. The group has assembled significant global data for kelps, other seaweeds and associated species including fishes, and has embarked on unprecedented global experiments and surveys in which identical experiments and surveys are being conducted at sites in kelp beds around the world to determine global trends and examine the capacity of kelps to respond to disturbance in the face of climate change and other anthropogenic stressors. The TemperateReefBase Data Portal is an online discovery interface showcasing temperate reef data collected from around the globe. The portal aims to make this data freely and openly available for the benefit of marine and environmental science as a whole. The TemperateReefBase Data Portal is hosted and maintained by the Institute for Marine and Antarctic Studies at the University of Tasmania, Australia.
ArrayExpress is one of the major international repositories for high-throughput functional genomics data from both microarray and high-throughput sequencing studies, many of which are supported by peer-reviewed publications. Data sets are submitted directly to ArrayExpress and curated by a team of specialist biological curators. In the past (until 2018) datasets from the NCBI Gene Expression Omnibus database were imported on a weekly basis. Data is collected to MIAME and MINSEQE standards.
The World Glacier Monitoring Service (WGMS) collects standardized observations on changes in mass, volume, area and length of glaciers with time (glacier fluctuations), as well as statistical information on the distribution of perennial surface ice in space (glacier inventories). Such glacier fluctuation and inventory data are high priority key variables in climate system monitoring; they form a basis for hydrological modelling with respect to possible effects of atmospheric warming, and provide fundamental information in glaciology, glacial geomorphology and quaternary geology. The highest information density is found for the Alps and Scandinavia, where long and uninterrupted records are available. As a contribution to the Global Terrestrial/Climate Observing System (GTOS, GCOS), the Division of Early Warning and Assessment and the Global Environment Outlook of UNEP, and the International Hydrological Programme of UNESCO, the WGMS collects and publishes worldwide standardized glacier data.
DataverseNO (https://dataverse.no) is a curated, FAIR-aligned national generic repository for open research data from all academic disciplines. DataverseNO commits to facilitate that published data remain accessible and (re)usable in a long-term perspective. The repository is owned and operated by UiT The Arctic University of Norway. DataverseNO accepts submissions from researchers primarily from Norwegian research institutions. Datasets in DataverseNO are grouped into institutional collections as well as special collections. The technical infrastructure of the repository is based on the open source application Dataverse (https://dataverse.org), which is developed by an international developer and user community led by Harvard University.
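Because DataverseNO is built on the open source Dataverse software, its holdings can in principle be queried through the standard Dataverse Search API. The sketch below assumes that API's usual endpoint, parameters, and field names, with dataverse.no as the host and an illustrative search term.

```python
# A minimal sketch of querying a Dataverse installation's Search API, using
# dataverse.no as the host. Endpoint, parameters, and field names follow the
# open source Dataverse software's Search API and are assumptions here; the
# search term "ocean" is purely illustrative.
import requests

response = requests.get(
    "https://dataverse.no/api/search",
    params={"q": "ocean", "type": "dataset", "per_page": 5},
    timeout=30,
)
response.raise_for_status()

for item in response.json()["data"]["items"]:
    print(item.get("name"), "-", item.get("global_id"))
```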
The CLARIN/Text+ repository at the Saxon Academy of Sciences and Humanities in Leipzig offers long-term preservation of digital resources, along with their descriptive metadata. The mission of the repository is to ensure the availability and long-term preservation of resources, to preserve knowledge gained in research, to aid the transfer of knowledge into new contexts, and to integrate new methods and resources into university curricula. Among the resources currently available in the Leipzig repository are a set of corpora of the Leipzig Corpora Collection (LCC), based on newspaper, Wikipedia and Web text. Furthermore, several REST-based web services are provided for a variety of NLP-relevant tasks. The repository is part of the CLARIN infrastructure and part of the NFDI consortium Text+. It is operated by the Saxon Academy of Sciences and Humanities in Leipzig.
jPOSTrepo (Japan ProteOme STandard Repository) is a repository for sharing MS raw/processed data. It consists of a high-speed file upload process, a flexible file management system and easy-to-use interfaces. Users can release their "raw/processed" data via this site with a unique identifier number for paper publication. Users can also suspend (or "embargo") their data until their paper is published. File transfer from a user's computer to the repository server is very fast (roughly ten times faster than usual file transfer) and uses only a web browser – it does not require installing any additional software.
The Maize Genetics and Genomics Database (MaizeGDB) focuses on collecting data related to the crop plant and model organism Zea mays. The project's goals are to synthesize, display, and provide access to maize genomics and genetics data, prioritizing mutant and phenotype data and tools, structural and genetic map sets, and gene models. MaizeGDB also aims to make the Maize Newsletter available, and to provide support services to the community of maize researchers. MaizeGDB is working with the Schnable lab, the Panzea project, the Genome Reference Consortium, and iPlant Collaborative to create a plan for archiving, disseminating, visualizing, and analyzing diversity data. MaizeGDB is short for Maize Genetics/Genomics Database. It is a USDA/ARS-funded project to integrate the data found in MaizeDB and ZmDB into a single schema, develop an effective interface to access these data, and develop additional tools to make data analysis easier. The long-term goal is a true next-generation online maize database.
The Abacus Data Network is a data repository collaboration involving Libraries at Simon Fraser University (SFU), the University of British Columbia (UBC), the University of Northern British Columbia (UNBC) and the University of Victoria (UVic).
The project brings together national key players providing environmentally related biological data and services to develop the 'German Federation for Biological Data' (GFBio). The overall goal is to provide a sustainable, service-oriented, national data infrastructure facilitating data sharing and stimulating data-intensive science in the fields of biological and environmental research.