Filter categories: Subjects, Content Types, Countries, AID systems, API, Certificates, Data access, Data access restrictions, Database access, Database access restrictions, Database licenses, Data licenses, Data upload, Data upload restrictions, Enhanced publication, Institution responsibility type, Institution type, Keywords, Metadata standards, PID systems, Provider types, Quality management, Repository languages, Software, Syndications, Repository types, Versioning

  • * at the end of a keyword allows wildcard searches
  • " quotes can be used for searching phrases
  • + represents an AND search (default)
  • | represents an OR search
  • - represents a NOT operation
  • ( and ) imply priority
  • ~N after a word specifies the desired edit distance (fuzziness)
  • ~N after a phrase specifies the desired slop amount
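Put together, these operators compose Elasticsearch-style query strings. A few illustrative examples, sketched in Python (the search terms are hypothetical; only the operators come from the list above):

```python
# Illustrative query strings for the operators described above.
# The search terms are hypothetical examples; only the operators
# (*, "", +, |, -, parentheses, ~N) come from the list.

queries = [
    'climat*',                     # wildcard: climate, climatology, ...
    '"research data"',             # exact phrase
    'ocean + temperature',         # AND (also the default)
    'glacier | ice',               # OR
    'satellite - radar',           # NOT: satellite but not radar
    '(ocean | marine) + biology',  # parentheses set priority
    'oceanografy~2',               # fuzzy term, edit distance 2
    '"sea level rise"~3',          # phrase with slop of 3
]

for q in queries:
    print(q)
```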
Found 21 result(s)
Arquivo.pt is a research infrastructure that preserves millions of files collected from the web since 1996 and provides a public search service over this information. It contains information in several languages. Periodically it collects and stores information published on the web, then processes the collected data to make it searchable, providing a “Google-like” service for searching the past web (English user interface available at https://arquivo.pt/?l=en). This preservation workflow is performed through a large-scale distributed information system and can also be accessed through an API (https://arquivo.pt/api).
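A minimal sketch of querying that API, assuming the TextSearch endpoint and parameter names from the public documentation at https://arquivo.pt/api (the search term is an arbitrary example):

```python
# Minimal sketch of querying the Arquivo.pt TextSearch API.
# Endpoint, parameters, and response fields (textsearch, q, maxItems,
# response_items) are assumptions; see https://arquivo.pt/api.
import requests

resp = requests.get(
    "https://arquivo.pt/textsearch",
    params={"q": "biodiversity", "maxItems": 5},
    timeout=30,
)
resp.raise_for_status()
for item in resp.json().get("response_items", []):
    print(item.get("title"), item.get("linkToArchive"))
```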
DBpedia is a crowd-sourced community effort to extract structured information from Wikipedia and make this information available on the Web. DBpedia allows you to ask sophisticated queries against Wikipedia, and to link the different data sets on the Web to Wikipedia data. We hope that this work will make it easier for the huge amount of information in Wikipedia to be used in new and interesting ways. Furthermore, it might inspire new mechanisms for navigating, linking, and improving the encyclopedia itself.
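Since DBpedia exposes a public SPARQL endpoint at https://dbpedia.org/sparql, such queries can be issued programmatically. A minimal sketch (the query itself is an illustrative example, not taken from the source):

```python
# Minimal sketch of a SPARQL query against DBpedia's public endpoint.
# The query is an illustrative example, not taken from the source.
import requests

query = """
PREFIX dbo: <http://dbpedia.org/ontology/>
SELECT ?person ?birthDate WHERE {
  ?person a dbo:Scientist ;
          dbo:birthDate ?birthDate .
} LIMIT 5
"""

resp = requests.get(
    "https://dbpedia.org/sparql",
    params={"query": query, "format": "application/sparql-results+json"},
    timeout=30,
)
resp.raise_for_status()
for row in resp.json()["results"]["bindings"]:
    print(row["person"]["value"], row["birthDate"]["value"])
```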
Biological collections are replete with taxonomic, geographic, temporal, numerical, and historical information. This information is crucial for understanding and properly managing biodiversity and ecosystems, but is often difficult to access. Canadensys, operated from the Université de Montréal Biodiversity Centre, is a Canada-wide effort to unlock the biodiversity information held in biological collections.
The Government is releasing public data to help people understand how government works and how policies are made. Some of this data is already available, but data.gov.uk brings it together in one searchable website. Making this data easily available means it will be easier for people to make decisions and suggestions about government policies based on detailed information.
The ArcGIS Living Atlas of the World is a unique collection of worldwide geographic information. It contains maps, apps, and data layers that support you in your work. Coronavirus resources: https://coronavirus-resources.esri.com/
Earthdata, powered by EOSDIS (Earth Observing System Data and Information System), is a core capability in NASA’s Earth Science Data Systems Program. It provides end-to-end capabilities for managing NASA’s Earth science data from satellites, aircraft, field measurements, and various other programs. EOSDIS uses the metadata and service discovery tool Earthdata Search, https://search.earthdata.nasa.gov/search. The capabilities of EOSDIS constituting the EOSDIS Science Operations are managed by NASA's Earth Science Data and Information System (ESDIS) Project. These capabilities include generation of higher-level (Level 1-4) science data products for several satellite missions, and archiving and distribution of data products from Earth observation satellite missions as well as aircraft and field measurement campaigns. The EOSDIS science operations are performed within a distributed system of many interconnected nodes: Science Investigator-led Processing Systems (SIPS) and distributed, discipline-specific Earth science Distributed Active Archive Centers (DAACs) with specific responsibilities for production, archiving, and distribution of Earth science data products. The DAACs serve a large and diverse user community by providing capabilities to search and access science data products and specialized services.
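As a rough illustration of the discovery side, the Earthdata Search interface is backed by NASA's public Common Metadata Repository (CMR) search API. A minimal sketch, assuming the cmr.earthdata.nasa.gov endpoint and its Atom-style JSON response (field names should be checked against the current documentation):

```python
# Sketch of a collection search against NASA's Common Metadata Repository
# (CMR), the service behind Earthdata Search. Response field names follow
# the Atom-style JSON format and should be verified against the docs.
import requests

resp = requests.get(
    "https://cmr.earthdata.nasa.gov/search/collections.json",
    params={"keyword": "sea surface temperature", "page_size": 5},
    timeout=30,
)
resp.raise_for_status()
for entry in resp.json()["feed"]["entry"]:
    print(entry.get("short_name"), "-", entry.get("title"))
```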
The department specializes in developing complex distributed systems for satellite data processing. Its main task is the development, validation, and implementation of various satellite data processing methods in the form of information services and systems.
The NCBI Short Genetic Variations database, commonly known as dbSNP, catalogs short variations in nucleotide sequences from a wide range of organisms. These variations include single nucleotide variations, short nucleotide insertions and deletions, short tandem repeats, and microsatellites. Short genetic variations may be common, thus representing true polymorphisms, or they may be rare. Some rare human entries have additional information associated with them, including disease associations, genotype information, and allele origin, as some variations are somatic rather than germline events. ***NCBI will phase out support for non-human organism data in dbSNP and dbVar beginning on September 1, 2017***
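Records can also be retrieved programmatically; a minimal sketch using NCBI's E-utilities esummary endpoint (the rs number is an arbitrary example, and the response field names are assumptions to verify against the E-utilities documentation):

```python
# Sketch of looking up one dbSNP record via NCBI E-utilities (esummary).
# The rs number is an arbitrary example; response field names are
# assumptions to check against the E-utilities documentation.
import requests

resp = requests.get(
    "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esummary.fcgi",
    params={"db": "snp", "id": "328", "retmode": "json"},  # rs328
    timeout=30,
)
resp.raise_for_status()
doc = resp.json()["result"]["328"]
print(doc.get("snp_class"), doc.get("chr"), doc.get("spdi"))
```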
<<<!!!<<< USHIK was archived because some of the metadata are maintained by other sites and there is no need for duplication. The USHIK metadata registry was a neutral repository of metadata from an authoritative source, used to promote interoperability and reuse of data. The registry did not attempt to change the metadata content but rather provided a structured way to view data for the technical or casual user. For complete information see: https://www.ahrq.gov/data/ushik.html >>>!!!>>>
The National Archives is home to millions of historical documents, known as records, which were created and collected by UK central government departments and major courts of law. Data of the former National Digital Archive of Datasets (NDAD) collection, which was active from 1997 to 2010 and preserved and provided online access to archived digital datasets and documents from UK central government departments, is integrated. The service provides access to records held by The National Archives and more than 2,500 other archives.
As the third oceanographic center of the World Data Center system, following WDC-A of the United States and WDC-B of Russia, WDC-D for Oceanography boasts long-term and stable sources of domestic marine basic data. The State Oceanic Administration now has long-term observations obtained from fixed coastal ocean stations, offshore and oceanic research vessels, and moored and drifting buoys. More and more marine data have become available from Chinese-foreign marine cooperative surveys, analysis and measurement of laboratory samples, reception by the satellite ground station, aerial telemetry and remote sensing, the GOOS program, global ships-of-opportunity reports, etc. More marine data are being and will be obtained from the ongoing “863” program, one of the state key projects of the Ninth Five-Year Plan, and from Seasat No. 1, which is scheduled to be launched next year. Through many years’ effort, WDC-D for Oceanography has established formal marine data exchange relationships with over 130 marine institutions in more than 60 countries and maintains close data exchange with over 30 major national oceanographic data centers. The established China Oceanic Information Network has joined the international marine data exchange system via the Internet. Through these channels, a large amount of data has been acquired through international exchange, which, together with the marine data collected at home over many years, has given WDC-D for Oceanography over 100 years of global marine data amounting to more than 10 billion bytes. In the meantime, a vast amount of work has been done on the standardized and normalized processing and management of the data, and a series of national and professional standards have been formulated and implemented successively. Moreover, appropriate standards and norms are being formulated as required.
The World Data Centre section provides software, data catalogue information, and data produced by IPS Radio and Space Services over the past few decades. You can download data files, plot graphs from data files, check data availability, and retrieve data sets and station information.
The Clouds and the Earth’s Radiant Energy System (CERES) is a key component of the Earth Observing System (EOS) program. CERES instruments provide radiometric measurements of the Earth’s atmosphere from three broadband channels. CERES products include both solar-reflected and Earth-emitted radiation from the top of the atmosphere to the Earth's surface.
B2SAFE is a robust, safe and highly available service which allows community and departmental repositories to implement data management policies on their research data across multiple administrative domains in a trustworthy manner. It is a solution to:
  • provide an abstraction layer which virtualizes large-scale data resources
  • guard against data loss in long-term archiving and preservation
  • optimize access for users from different regions
  • bring data closer to powerful computers for compute-intensive analysis
The World Stress Map (WSM) is a global compilation of information on the crustal present-day stress field, maintained since 2009 at the Helmholtz Centre Potsdam GFZ German Research Centre for Geosciences. It is a collaborative project between academia and industry that aims to characterize the crustal stress pattern and to understand the stress sources. All stress information is analysed and compiled in a standardized format and quality-ranked for reliability and comparability on a global scale. The WSM is an open-access public database and is used by various academic and industrial institutions working in a wide range of Earth science disciplines such as geodynamics, hazard assessment, hydrocarbon exploitation and engineering.
>>>!!!<<< As stated 2017-06-27: the website http://researchcompendia.org is no longer available; the repository software is archived on GitHub at https://github.com/researchcompendia >>>!!!<<< The ResearchCompendia platform is an attempt to use the web to enhance the reproducibility and verifiability—and thus the reliability—of scientific research. We provide the tools to publish the "actual scholarship" by hosting data, code, and methods in a form that is accessible, trackable, and persistent. Some of our short-term goals include: to expand and enhance the platform, including adding executability for a greater variety of coding languages and frameworks, and enhancing output presentation; to expand usership and to test the ResearchCompendia model in a number of additional fields, including computational mathematics, statistics, and biostatistics; and to pilot integration with existing scholarly platforms, enabling researchers to discover relevant Research Compendia websites when looking at online articles, code repositories, or data archives.
Merritt is a curation repository for the preservation of and access to the digital research data of the ten-campus University of California system and external project collaborators. Merritt is supported by the University of California Curation Center (UC3) at the California Digital Library (CDL). While Merritt itself is content agnostic, accepting digital content regardless of domain, format, or structure, it is being used for management of research data, and it forms the basis for a number of domain-specific repositories, such as the ONEShare repository for earth and environmental science and the DataShare repository for life sciences. Merritt provides persistent identifiers, storage replication, fixity audit, complete version history, a REST API, a comprehensive metadata catalog for discovery, ATOM-based syndication, curatorially defined collections, access control rules, and data use agreements (DUAs). Merritt content upload and download may each be curatorially designated as public or restricted. Merritt DOIs are provided by UC3's EZID service, which is integrated with DataCite. All DOIs and associated metadata are automatically registered with DataCite and are harvested by Ex Libris Primo and the Thomson Reuters Data Citation Index (DCI) for high-level discovery. Merritt is also a member node in the DataONE network; curatorially designated data submitted to Merritt are automatically registered with DataONE for additional replication and federated discovery through the ONEMercury search/browse interface.
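Because Merritt DOIs are registered with DataCite, their registration metadata can be fetched from the public DataCite REST API; a minimal sketch with a placeholder DOI:

```python
# Sketch of fetching registration metadata for a Merritt-minted DOI from
# the public DataCite REST API. The DOI below is a placeholder, not a
# real Merritt identifier.
import requests

doi = "10.xxxx/example"  # placeholder: substitute a real Merritt DOI
resp = requests.get(f"https://api.datacite.org/dois/{doi}", timeout=30)
resp.raise_for_status()
attrs = resp.json()["data"]["attributes"]
print(attrs.get("titles"), attrs.get("publisher"), attrs.get("publicationYear"))
```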
The Norwegian Meteorological Institute supplies climate observations, weather data, and forecasts for the country and surrounding waters (including the Arctic). In addition, commercial services are provided to fit customers' requirements. Data are served through a number of subsystems (information provided in the repository link) and cover data from internal services of the institute, from external services operated by the institute, and from research projects in which the institute participates. Further information is provided on the landing page, which also contains entry points to some of the data portals operated.
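As one example of such a subsystem, a forecast can be fetched from the institute's public Locationforecast API; a minimal sketch, assuming the api.met.no endpoint and its documented GeoJSON response (the coordinates are an arbitrary example):

```python
# Sketch of fetching a forecast from MET Norway's public Locationforecast
# API. api.met.no requires an identifying User-Agent header; the
# coordinates (Oslo) and contact address are arbitrary examples.
import requests

resp = requests.get(
    "https://api.met.no/weatherapi/locationforecast/2.0/compact",
    params={"lat": 59.91, "lon": 10.75},
    headers={"User-Agent": "example-client/0.1 contact@example.org"},
    timeout=30,
)
resp.raise_for_status()
first = resp.json()["properties"]["timeseries"][0]
print(first["time"], first["data"]["instant"]["details"]["air_temperature"])
```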
The Argo observational network consists of a fleet of 3000+ autonomous profiling floats deployed by about a dozen teams worldwide. WHOI has built about 10% of the global fleet. The mission lifetime of each float is about 4 years. During a typical mission, each float reports a profile of the upper ocean every 10 days. The sensors onboard record fundamental physical properties of the ocean: temperature and conductivity (a measure of salinity) as a function of pressure. The depth range of the observed profile depends on the local stratification and the float's mechanical ability to adjust its buoyancy. The majority of Argo floats report profiles between 1 and 2 km depth. At each surfacing, measurements of temperature and salinity are relayed back to shore via satellite. Telemetry is usually received every 10 days, but floats at high latitudes that are iced over accumulate their data and transmit the entire record the next time satellite contact is established. With current battery technology, the best-performing floats last 6+ years and record over 200 profiles.
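For programmatic access to these profiles, a minimal sketch using the third-party argopy Python library; the DataFetcher interface and the example float number are assumptions based on argopy's documentation:

```python
# Sketch of pulling all profiles for one Argo float with the third-party
# argopy library (pip install argopy). The DataFetcher interface and the
# WMO float number are assumptions based on argopy's documentation.
from argopy import DataFetcher

ds = DataFetcher().float(6902746).to_xarray()  # xarray.Dataset of profiles
print(ds["TEMP"].mean().values, ds.sizes)
```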
The World Glacier Monitoring Service (WGMS) collects standardized observations on changes in the mass, volume, area and length of glaciers over time (glacier fluctuations), as well as statistical information on the distribution of perennial surface ice in space (glacier inventories). Such glacier fluctuation and inventory data are high-priority key variables in climate system monitoring; they form a basis for hydrological modelling with respect to possible effects of atmospheric warming, and provide fundamental information in glaciology, glacial geomorphology and Quaternary geology. The highest information density is found for the Alps and Scandinavia, where long and uninterrupted records are available. As a contribution to the Global Terrestrial/Climate Observing System (GTOS, GCOS), the Division of Early Warning and Assessment and the Global Environment Outlook of UNEP, and the International Hydrological Programme of UNESCO, the WGMS collects and publishes worldwide standardized glacier data.