Found 13 result(s)
The Infrared Space Observatory (ISO) is designed to provide detailed infrared properties of selected Galactic and extragalactic sources. The sensitivity of the telescope system is about one thousand times greater than that of the Infrared Astronomical Satellite (IRAS), since the ISO telescope can integrate the infrared flux from a source for several hours. Density waves in the interstellar medium and their role in star formation, the giant planets, and the asteroids and comets of the solar system are among the objects of investigation. ISO was operated as an observatory, with the majority of its observing time distributed to the general astronomical community. One consequence of this is that the data set is not homogeneous, as it would be for a survey. The observational data underwent sophisticated processing, including validation and accuracy analysis. In total, the ISO Data Archive contains about 30,000 standard observations; 120,000 parallel, serendipity, and calibration observations; and 17,000 engineering measurements. In addition to the observational data products, the archive also contains satellite data, documentation, data of historical interest, and externally derived products, for a total of more than 400 GB stored on magnetic disks. The ISO Data Archive is constantly being improved in both content and functionality throughout the Active Archive Phase, which ends in December 2006.
FactSage is a fully integrated Canadian thermochemical database system that couples proven software with self-consistent, critically assessed thermodynamic data. It currently contains data on over 5,000 chemical substances as well as solution databases representing over 1,000 non-ideal multicomponent solutions (oxides, salts, sulfides, alloys, aqueous solutions, etc.). FactSage is available for Windows.
SILVA is a comprehensive, quality-controlled web resource for up-to-date aligned ribosomal RNA (rRNA) gene sequences from the Bacteria, Archaea and Eukaryota domains alongside supplementary online services. In addition to data products, SILVA provides various online tools such as alignment and classification, phylogenetic tree calculation and viewer, probe/primer matching, and an amplicon analysis pipeline. With every full release a curated guide tree is provided that contains the latest taxonomy and nomenclature based on multiple references. SILVA is an ELIXIR Core Data Resource.
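SILVA releases are typically distributed as FASTA files whose header lines carry the sequence accession followed by the semicolon-separated taxonomic path. As a minimal sketch only (assuming Biopython is installed, that the header layout matches this pattern, and using a placeholder file name rather than an official SILVA download), the taxonomy could be pulled out of such a file like this:

```python
# Minimal sketch: parse taxonomy paths from a SILVA-style FASTA file.
# Assumptions: Biopython is installed, headers follow the pattern
# ">ACCESSION.start.stop Domain;Phylum;...;Genus;Organism name",
# and the file name below is a placeholder, not an official SILVA URL.
from Bio import SeqIO

def read_silva_taxonomy(fasta_path):
    """Yield (accession, taxonomy list) pairs from a SILVA-style FASTA."""
    for record in SeqIO.parse(fasta_path, "fasta"):
        accession = record.id                     # e.g. "AB001440.1.1538"
        parts = record.description.split(None, 1) # id, then taxonomy path
        taxonomy = parts[1].split(";") if len(parts) > 1 else []
        yield accession, taxonomy

if __name__ == "__main__":
    for acc, tax in read_silva_taxonomy("SILVA_SSURef_subset.fasta"):
        print(acc, " > ".join(tax))
```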
The IGETS database at GFZ Potsdam (http://www.re3data.org/repository/r3d100010300) continues the activities of the International Center for Earth Tides (ICET), in particular collecting, archiving, and distributing Earth tide records from long series of gravimeters, tiltmeters, strainmeters, and other geodynamic sensors. The ICET Data Bank contains results from 360 tidal gravity stations: hourly values, main tidal waves obtained by least-squares analyses, residual vectors, and oceanic attraction and loading vectors. The Data Bank also contains data from tiltmeters and extensometers. ICET is responsible for the Information System and Data Center of the Global Geodynamics Project (GGP). The tasks ascribed to ICET are: to collect all available measurements of Earth tides (its task as World Data Centre C); to evaluate these data by suitable methods of analysis in order to reduce the very large volume of measurements to a limited number of parameters containing all the desired geophysical information; to compare the data from different instruments and different stations distributed all over the world and to evaluate their precision and accuracy in terms of both internal and external errors; to help solve the basic problem of calibration and to organize reference stations or build reference calibration devices; to fill gaps in information or data as far as feasible; to build a data bank allowing immediate and easy comparison of Earth tide parameters with different Earth models and with other geodetic and geophysical parameters such as geographical position, Bouguer anomaly, crustal thickness and age, and heat flow; and to ensure a broad diffusion of the results and information to all interested laboratories and individual scientists.
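The least-squares analyses mentioned above amount to fitting sinusoids at known tidal frequencies to the hourly gravity values and reading off the amplitude and phase of each wave. The NumPy sketch below illustrates the idea on synthetic data; the four constituents and the input series are illustrative only and do not reproduce the actual ICET/IGETS processing chain:

```python
# Minimal sketch of a least-squares tidal analysis on hourly gravity values.
# The constituent set and the synthetic input are illustrative; the real
# processing uses many more waves plus instrumental and loading corrections.
import numpy as np

# Angular frequencies (rad/hour) of four major tidal constituents.
CONSTITUENTS = {
    "M2": 2 * np.pi / 12.4206,   # principal lunar semidiurnal
    "S2": 2 * np.pi / 12.0000,   # principal solar semidiurnal
    "O1": 2 * np.pi / 25.8193,   # principal lunar diurnal
    "K1": 2 * np.pi / 23.9345,   # lunisolar diurnal
}

def fit_tidal_constituents(t_hours, gravity):
    """Fit amplitude and phase of each constituent by linear least squares."""
    columns = [np.ones_like(t_hours)]            # constant offset
    for omega in CONSTITUENTS.values():
        columns.append(np.cos(omega * t_hours))
        columns.append(np.sin(omega * t_hours))
    design = np.column_stack(columns)
    coeffs, *_ = np.linalg.lstsq(design, gravity, rcond=None)
    results = {}
    for i, name in enumerate(CONSTITUENTS):
        a, b = coeffs[1 + 2 * i], coeffs[2 + 2 * i]
        # a*cos + b*sin == A*cos(omega*t - phi) with A = hypot(a, b)
        results[name] = (np.hypot(a, b), np.degrees(np.arctan2(b, a)))
    return results                               # {wave: (amplitude, phase deg)}

if __name__ == "__main__":
    t = np.arange(0, 24 * 365, 1.0)              # one year of hourly samples
    synthetic = 30 * np.cos(CONSTITUENTS["M2"] * t - 0.5) + np.random.normal(0, 2, t.size)
    print(fit_tidal_constituents(t, synthetic))  # recovers M2 ~ 30, phase ~ 28.6 deg
```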
Scans of plates obtained at Landessternwarte Heidelberg-Königstuhl and German-Spanish Astronomical Center (Calar Alto Observatory), Spain, 1900 through 1999.
The CliSAP-Integrated Climate Data Center (ICDC) allows easy access to climate-relevant data from satellite remote sensing, in situ measurements, and other observations in the Earth system sciences. These data are important for determining the status of, and changes in, the climate system. Additionally, some relevant reanalysis data are included, which are modelled on the basis of observational data. ICDC cooperates with the "Zentrum für Nachhaltiges Forschungsdatenmanagement" (https://www.fdr.uni-hamburg.de/) to publish observational data with a DOI.
With the implementation of GlyTouCan (https://glytoucan.org/), the mission of GlycomeDB comes to an end. With GlycomeDB it is possible to get an overview of all carbohydrate structures in the different databases and to crosslink common structures across them. Scientists are able to search for a particular structure in the meta-database and get information about the occurrence of that structure in the five carbohydrate structure databases.
It is the objective of our motion capture database HDM05 to supply free motion capture data for research purposes. HDM05 contains more than three hours of systematically recorded and well-documented motion capture data in the C3D as well as the ASF/AMC data format. Furthermore, HDM05 contains more than 70 motion classes, each in 10 to 50 realizations executed by various actors.
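Because the clips are distributed in C3D (and ASF/AMC) format, marker trajectories can be loaded with a few lines of code. The sketch below is only an illustration: it assumes the third-party `c3d` Python package, whose Reader yields frames as (frame number, points, analog) tuples in recent releases, and it uses a placeholder file name for a downloaded HDM05 clip:

```python
# Minimal sketch: read marker trajectories from an HDM05 C3D clip.
# Assumptions: the third-party "c3d" package (pip install c3d) is available
# and the file name below is a placeholder for any downloaded HDM05 clip.
import c3d

with open("HDM_bd_walkLeft_001.c3d", "rb") as handle:
    reader = c3d.Reader(handle)
    for i, (frame_no, points, analog) in enumerate(reader.read_frames()):
        xyz = points[:, :3]   # marker positions; remaining columns hold residual/camera info
        if i == 0:
            print("markers per frame:", xyz.shape[0])
        # ... process xyz for this frame ...
```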
The Service Centre of the Federal Government for Geo-Information and Geodesy (Dienstleistungszentrum des Bundes für Geoinformation und Geodäsie, DLZ) centrally provides geodetic and geo-topographic reference data of the Federal Government to federal institutions, public administrations, the economy, science, and citizens. The establishment of the Service Centre is based on the Federal Geographic Reference Data Act (Bundesgeoreferenzdatengesetz, BGeoRG), which came into effect on 1 November 2012. This act regulates the use, quality, and technology of the geodetic and geo-topographic reference systems, networks, and data.
When published in 2005, the Millennium Run was the largest ever simulation of the formation of structure within the ΛCDM cosmology. It uses 10¹⁰ particles to follow the dark matter distribution in a cubic region 500 h⁻¹ Mpc on a side, and has a spatial resolution of 5 h⁻¹ kpc. Application of simplified modelling techniques to the stored output of this calculation allows the formation and evolution of the ~10⁷ galaxies more luminous than the Small Magellanic Cloud to be simulated for a variety of assumptions about the detailed physics involved. As part of the activities of the German Astrophysical Virtual Observatory we have created relational databases to store the detailed assembly histories both of all the haloes and subhaloes resolved by the simulation, and of all the galaxies that form within these structures for two independent models of the galaxy formation physics. We have implemented a Structured Query Language (SQL) server on these databases. This allows easy access to many properties of the galaxies and haloes, as well as to the spatial and temporal relations between them. Information is output in table format compatible with standard Virtual Observatory tools. With this announcement (from 1/8/2006) we are making these structures fully accessible to all users. Interested scientists can learn SQL and test queries on a small, openly accessible version of the Millennium Run (with volume 1/512 that of the full simulation). They can then request accounts to run similar queries on the databases for the full simulations. In 2008 and 2012 the simulations were repeated.
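Access to the galaxy and halo catalogues goes through SQL. As an illustration only, the sketch below submits a query over HTTP with Python's requests library; the endpoint URL, authentication, table name, and column names are placeholders and may not match the current service, so the archive documentation should be consulted for the real schema and access details:

```python
# Illustrative sketch: submit an SQL query to a Millennium-style database
# over HTTP and parse the reply as CSV.  The endpoint URL, credentials,
# table name and column names below are placeholders, not the real schema.
import csv
import io
import requests

ENDPOINT = "http://example.gavo-server.org/MyMillennium"   # placeholder URL
QUERY = """
    SELECT TOP 10 galaxyId, snapnum, stellarMass, mag_b
    FROM   DeLucia2006a                                     -- placeholder table
    WHERE  snapnum = 63 AND stellarMass > 1.0
    ORDER  BY stellarMass DESC
"""

response = requests.get(ENDPOINT,
                        params={"action": "doQuery", "SQL": QUERY},
                        auth=("username", "password"),       # placeholder credentials
                        timeout=60)
response.raise_for_status()

# Assuming the service returns plain CSV with a header row.
for row in csv.DictReader(io.StringIO(response.text)):
    print(row)
```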
The mission of the platform is to give academic projects access to high-throughput screening (HTS) experiments without loss of intellectual property, on a cost basis that does not restrict HTS usage. The FMP hosts the Screening Unit, the central open-access technology platform of EU-OPENSCREEN, the ChemBioNet, and the Helmholtz-Initiative für Wirkstoffforschung. The Unit performs systematic screening of large compound or genome-wide RNAi libraries with state-of-the-art equipment such as automated microscopes and microfluidic systems. The Screening Unit is part of the Chemical Biology Platform of the FMP, which is also supported by the MDC. See also: https://www.mdc-berlin.de/screening-unit