  • * at the end of a keyword allows wildcard searches
  • " quotes can be used for searching phrases
  • + represents an AND search (default)
  • | represents an OR search
  • - represents a NOT operation
  • ( and ) group terms to set precedence
  • ~N after a word specifies the desired edit distance (fuzziness)
  • ~N after a phrase specifies the desired slop amount (example queries combining these operators are sketched below)
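As a rough illustration of how these operators combine, the short Python sketch below just prints a handful of hypothetical query strings written in this Lucene-style syntax. The search terms are made up for the example and are not drawn from the actual index.

```python
# Illustrative query strings for the repository search box, assuming the
# Lucene-style operators listed above. All terms are placeholder examples.
example_queries = [
    'genom*',                            # wildcard: matches "genome", "genomics", ...
    '"earthquake engineering"',          # quotes: phrase search
    'marine + data',                     # +: AND (also the default between terms)
    'cancer | oncology',                 # |: OR
    'galaxy -cluster',                   # -: NOT, matches "galaxy" but excludes "cluster"
    '(seismic | earthquake) + archive',  # parentheses set grouping/precedence
    'genome~1',                          # ~N after a word: fuzzy match within edit distance 1
    '"open data portal"~3',              # ~N after a phrase: slop of up to 3 positions
]

if __name__ == "__main__":
    for query in example_queries:
        print(query)
```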
Found 76 result(s)
Note: intrepidbio.com has expired. Intrepid Bioinformatics serves as a community for genetic researchers and scientific programmers who need to achieve meaningful use of their genetic research data – but can’t spend tremendous amounts of time or money in the process. The Intrepid Bioinformatics system automates time consuming manual processes, shortens workflow, and eliminates the threat of lost data in a faster, cheaper, and better environment than existing solutions. The system also provides the functionality and community features needed to analyze the large volumes of Next Generation Sequencing and Single Nucleotide Polymorphism data, which is generated for a wide range of purposes from disease tracking and animal breeding to medical diagnosis and treatment.
The nationally recognized National Cancer Database (NCDB)—jointly sponsored by the American College of Surgeons and the American Cancer Society—is a clinical oncology database sourced from hospital registry data that are collected in more than 1,500 Commission on Cancer (CoC)-accredited facilities. NCDB data are used to analyze and track patients with malignant neoplastic diseases, their treatments, and outcomes. Data represent more than 70 percent of newly diagnosed cancer cases nationwide and more than 34 million historical records.
The FishNet network is a collaborative effort among fish collections around the world to share and distribute data on specimen holdings. There is an open invitation for any institution with a fish collection to join.
The GeoPortal.rlp allows the central search and visualization of geo data. Within the spatial data infrastructure of Rhineland-Palatinate, the GeoPortal.rlp serves as the central, service-oriented exchange between users and providers of geo data, providing access to geo data over the electronic network. The GeoPortal.rlp first went online on January 8th, 2007; a site relaunch followed on February 2nd, 2011.
Galaxies, made up of billions of stars like our Sun, are the beacons that light up the structure of even the most distant regions in space. Not all galaxies are alike, however. They come in very different shapes and have very different properties; they may be large or small, old or young, red or blue, regular or confused, luminous or faint, dusty or gas-poor, rotating or static, round or disky, and they live either in splendid isolation or in clusters. In other words, the universe contains a very colourful and diverse zoo of galaxies. For almost a century, astronomers have been discussing how galaxies should be classified and how they relate to each other in an attempt to attack the big question of how galaxies form. Galaxy Zoo (Lintott et al. 2008, 2011) pioneered a novel method for performing large-scale visual classifications of survey datasets. This webpage allows anyone to download the resulting GZ classifications of galaxies in the project.
The Site Survey Data Bank (SSDB) is a repository for site survey data submitted in support of International Ocean Discovery Program (IODP) proposals and expeditions. SSDB serves different roles for different sets of users.
SOHO, the Solar & Heliospheric Observatory, is a project of international collaboration between ESA and NASA to study the Sun from its deep core to the outer corona and the solar wind. SOHO was launched on December 2, 1995. The SOHO spacecraft was built in Europe by an industry team led by prime contractor Matra Marconi Space (now EADS Astrium) under overall management by ESA. The twelve instruments on board SOHO were provided by European and American scientists.
The Marine Data Portal is a product of the “Underway” data initiative of the German Marine Research Alliance (Deutsche Allianz Meeresforschung - DAM) and is supported by the marine science centers AWI, GEOMAR and Hereon of the Helmholtz Association. This initiative aims to improve and standardize the systematic data collection and data evaluation for expeditions with German research vessels and marine observation. It supports scientists in their data management duties and fosters (data) science through FAIR and open access to marine research data. AWI, GEOMAR and Hereon develop this marine data hub (Marehub) to build a decentralized data infrastructure for processing, long-term archiving and dissemination of marine observation and model data and data products. The Marine Data Portal provides user-friendly, centralized access to marine research data, reports and publications from a wide range of data repositories and libraries in the context of German marine research and its international collaboration. The Marine Data Portal is developed by scientists for scientists in order to facilitate findability of and access to marine research data for reuse. It supports machine-readable and data-driven science. Please note that the quality of the data may vary depending on the purpose for which it was originally collected.
Research Data Centres offer secure access to detailed microdata from Statistics Canada's surveys, to Canadian census data, and to an increasing number of administrative data sets. The search engine was designed to help you more easily find which dataset, among all the surveys available in the RDCs, best suits your research needs.
PAGER (Prompt Assessment of Global Earthquakes for Response) is an automated system that produces content concerning the impact of significant earthquakes around the world, informing emergency responders, government and aid agencies, and the media of the scope of the potential disaster. PAGER rapidly assesses earthquake impacts by comparing the population exposed to each level of shaking intensity with models of economic and fatality losses based on past earthquakes in each country or region of the world. Earthquake alerts – which were formerly sent based only on event magnitude and location, or population exposure to shaking – now will also be generated based on the estimated range of fatalities and economic losses. PAGER uses these earthquake parameters to calculate estimates of ground shaking by using the methodology and software developed for ShakeMaps. ShakeMap sites provide near-real-time maps of ground motion and shaking intensity following significant earthquakes. These maps are used by federal, state, and local organizations, both public and private, for post-earthquake response and recovery, public and scientific information, as well as for preparedness exercises and disaster planning.
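To make the exposure-based approach described above more concrete, here is a minimal Python sketch of that kind of calculation: population exposed at each shaking-intensity level is combined with per-intensity fatality rates and summed. The exposure figures and rates are invented placeholders, not PAGER's calibrated, country-specific loss models.

```python
# Minimal sketch of an exposure-based fatality estimate, assuming hypothetical
# inputs. Real PAGER models are empirically calibrated per country/region.

# Hypothetical population exposed at each Modified Mercalli Intensity (MMI) level
exposure_by_mmi = {6: 2_000_000, 7: 500_000, 8: 80_000, 9: 5_000}

# Hypothetical fatality rate per exposed person at each intensity level
fatality_rate_by_mmi = {6: 0.00001, 7: 0.0002, 8: 0.002, 9: 0.01}

def estimated_fatalities(exposure: dict, rates: dict) -> float:
    """Sum expected fatalities over intensity bins: exposed population x rate."""
    return sum(pop * rates.get(mmi, 0.0) for mmi, pop in exposure.items())

if __name__ == "__main__":
    total = estimated_fatalities(exposure_by_mmi, fatality_rate_by_mmi)
    print(f"Estimated fatalities (illustrative): {total:.0f}")
```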
The Museum is committed to open access and open science, and has launched the Data Portal to make its research and collections datasets available online. It allows anyone to explore, download and reuse the data for their own research. Our natural history collection is one of the most important in the world, documenting 4.5 billion years of life, the Earth and the solar system. Almost all animal, plant, mineral and fossil groups are represented. These datasets will continue to grow: under the Museum's ambitious digital collections programme we aim to have 20 million specimens digitised in the next five years.
GLOBE (Global Collaboration Engine) is an online collaborative environment that enables land change researchers to share, compare and integrate local and regional studies with global data to assess the global relevance of their work.
Public Opinion in the European Union. The Standard Eurobarometer was established in 1973. Since then, the European Commission has been monitoring the evolution of public opinion in the Member States, thus helping the preparation of texts, decision-making and the evaluation of its work. Our surveys and studies address major topics concerning European citizenship: enlargement, social situation, health, culture, information technology, environment, the Euro, defence, etc. Each survey consists of approximately 1000 face-to-face interviews per country; reports are published twice yearly. Special Eurobarometer reports are based on in-depth thematic studies carried out for various services of the European Commission or other EU Institutions and integrated in the Standard Eurobarometer's polling waves. Flash Eurobarometers are ad hoc thematic telephone interviews conducted at the request of any service of the European Commission; they enable the Commission to obtain results relatively quickly and to focus on specific target groups, as and when required. The qualitative studies investigate in depth the motivations, feelings and reactions of selected social groups towards a given subject or concept, by listening to and analysing their way of expressing themselves in discussion groups or with non-directive interviews. Reproduction is authorised, except for commercial purposes, provided the source is acknowledged.
Note: This site is going away on April 1, 2021. General access to the site has been disabled and community users will see an error upon login. Socrata’s cloud-based solution allows government organizations to put their data online, make data-driven decisions, operate more efficiently, and share insights with citizens.
The Johns Hopkins Research Data Repository is an open access repository for Johns Hopkins University researchers to share their research data. The Repository is administered by professional curators at JHU Data Services, who will work with depositors to enable future discovery and reuse of their data, and ensure it is Findable, Accessible, Interoperable and Reusable (FAIR). More information about the benefits of archiving data can be found here: https://dataservices.library.jhu.edu/
This database contains references to publications that include numerical data, general information, comments, and reviews on atomic line broadening and shifts, and is part of the collection of the NIST Atomic Spectroscopy Data Center https://www.nist.gov/pml/quantum-measurement/atomic-spectroscopy/atomic-spectroscopy-data-center-contacts.
The Southern California Earthquake Data Center (SCEDC) operates at the Seismological Laboratory at Caltech and is the primary archive of seismological data for southern California. The 1932-to-present Caltech/USGS catalog maintained by the SCEDC is the most complete archive of seismic data for any region in the United States. Our mission is to maintain an easily accessible, well-organized, high-quality, searchable archive for research in seismology and earthquake engineering.
The Environmental Change Network is the UK’s long-term environmental monitoring and research (LTER) programme. We make regular measurements of plant and animal communities and their physical and chemical environment. Our long-term datasets are used to increase understanding of the effects of climate change, air pollution and other environmental pressures on UK ecosystems.
GeneCards is a searchable, integrative database that provides comprehensive, user-friendly information on all annotated and predicted human genes. It automatically integrates gene-centric data from ~125 web sources, including genomic, transcriptomic, proteomic, genetic, clinical and functional information.
The Brown Digital Repository (BDR) is a place to gather, index, store, preserve, and make available digital assets produced via the scholarly, instructional, research, and administrative activities at Brown.
Here you will find data on the estimated stock of social capital in each US county for the years 1990, 1997, 2005, 2009, and 2014. Please note: the data are presented "as is" without any implied guarantee of accuracy, and there are slight differences between 1990/1997 and 2005 as a result of the switch from SIC to NAICS codes.
MTD is focused on mammalian transcriptomes; the current version contains data from humans, mice, rats and pigs. As its core features, MTD allows genes to be browsed by their neighboring genomic coordinates or by a shared KEGG pathway, and provides expression information on exons, transcripts, and genes by integrating them into a genome browser. We developed a novel nomenclature for each transcript that reflects its genomic position and transcriptional features.