Search syntax (a few example queries follow this list):
  • * at the end of a keyword allows wildcard searches
  • " quotes can be used for searching phrases
  • + represents an AND search (default)
  • | represents an OR search
  • - represents a NOT operation
  • ( and ) imply grouping priority
  • ~N after a word specifies the desired edit distance (fuzziness)
  • ~N after a phrase specifies the desired slop amount
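To make these operators concrete, the short sketch below prints a few example query strings; the search terms are invented for illustration and are not tied to the result set that follows.

    # Illustrative query strings for the search syntax listed above.
    # The terms themselves are made-up examples.
    example_queries = [
        'climat*',                        # wildcard: climate, climatology, ...
        '"cloud computing"',              # exact phrase
        'cloud + climate',                # AND (also the default between terms)
        'aerosol | ozone',                # OR
        'cloud -computing',               # NOT: "cloud" but not "computing"
        '(aerosol | ozone) + satellite',  # parentheses set priority
        'astrophisics~2',                 # fuzziness: within edit distance 2 of the word
        '"cloud climate"~3',              # slop: phrase terms within 3 positions
    ]
    for query in example_queries:
        print(query)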
Found 13 result(s)
The International Satellite Cloud Climatology Project (ISCCP) is a database intended for researchers to share information about cloud radiative properties. The data sets focus on the effects of clouds on the climate, the radiation budget, and the long-term hydrologic cycle. Within the data sets, entries are broken down by specific characteristics such as temporal resolution, spatial resolution, or temporal coverage.
Note: the repository is offline. Store.Synchrotron is a fully functional, cloud-computing-based solution to raw X-ray data archival and dissemination at the Australian Synchrotron, the largest stand-alone piece of scientific infrastructure in the Southern Hemisphere. Store.Synchrotron represents the logical extension of a long-standing effort in the macromolecular crystallography community to ensure that satisfactory evidence is provided to support the interpretation of structural experiments.
FAIR and long-term storage of research data from computational materials science, as well as from experimental materials science of relevance to simulations. Complementary tools are available to explore the full provenance of the calculations and to perform simulations or data analytics in the cloud.
NUCASTRODATA.ORG is your WWW resource for utilizing nuclear information in studies of astrophysical systems. This site hyperlinks all online nuclear astrophysics datasets, hosts the Computational Infrastructure for Nuclear Astrophysics, and provides a mechanism for researchers to share files online. We created the first online "cloud computing" system for nuclear astrophysics, a virtual pipeline that enables results from the nuclear laboratory to be rapidly incorporated into astrophysical simulations. This system, the Computational Infrastructure for Nuclear Astrophysics (CINA), came online at nucastrodata.org.
NASA has officially launched a new resource to help the public search and download out-of-this-world images, videos, and audio files by keyword and metadata searches from NASA.gov. The NASA Image and Video Library website consolidates imagery spread across more than 60 collections into one searchable location. It allows users to search, discover, and download a treasure trove of more than 140,000 NASA images, videos, and audio files from across the agency's many missions in aeronautics, astrophysics, Earth science, human spaceflight, and more. Users can browse the agency's most recently uploaded files, as well as discover historic and the most popularly searched images, audio files, and videos. Other features include:
  • Automatically scales the interface for mobile phones and tablets
  • Displays the EXIF/camera data (exposure, lens used, and other information) when available from the original image
  • Allows easy public access to high-resolution files
  • Provides a downloadable caption file with every video
The library's Application Programming Interface (API) allows automation of imagery uploads for NASA and gives members of the public the ability to embed content in their own sites and applications (a search sketch follows this entry). This public site runs on NASA's cloud-native "infrastructure-as-a-code" technology, enabling on-demand use in the cloud.
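As a rough illustration of the public search API mentioned above, the following sketch queries the images-api.nasa.gov search endpoint for image records matching a keyword; the response layout assumed in the comments should be verified against the current API documentation.

    # Minimal sketch: keyword search against the NASA Image and Video Library API.
    import requests

    def search_nasa_images(query, media_type="image"):
        resp = requests.get(
            "https://images-api.nasa.gov/search",
            params={"q": query, "media_type": media_type},
            timeout=30,
        )
        resp.raise_for_status()
        # The API returns a Collection+JSON style document (assumed layout).
        items = resp.json()["collection"]["items"]
        for item in items[:5]:
            meta = item["data"][0]
            print(meta.get("nasa_id"), "-", meta.get("title"))

    search_nasa_images("cloud")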
NCAR is a federally funded research and development center committed to research and education in atmospheric science and related scientific fields. NCAR seeks to support and enhance the scientific community nationally and globally by monitoring and researching the atmosphere and related physical and biological systems. Users can access climate and Earth models created to better understand the atmosphere, the Earth, and the Sun, as well as data from various NCAR research programs and projects. NCAR is sponsored by the National Science Foundation in addition to various other U.S. agencies.
When published in 2005, the Millennium Run was the largest ever simulation of the formation of structure within the ΛCDM cosmology. It uses 10¹⁰ particles to follow the dark matter distribution in a cubic region 500 h⁻¹ Mpc on a side, and has a spatial resolution of 5 h⁻¹ kpc. Application of simplified modelling techniques to the stored output of this calculation allows the formation and evolution of the ~10⁷ galaxies more luminous than the Small Magellanic Cloud to be simulated for a variety of assumptions about the detailed physics involved. As part of the activities of the German Astrophysical Virtual Observatory we have created relational databases to store the detailed assembly histories both of all the haloes and subhaloes resolved by the simulation, and of all the galaxies that form within these structures, for two independent models of the galaxy formation physics. We have implemented a Structured Query Language (SQL) server on these databases. This allows easy access to many properties of the galaxies and haloes, as well as to the spatial and temporal relations between them (an example query is sketched after this entry). Information is output in table format compatible with standard Virtual Observatory tools. With this announcement (from 1/8/2006) we are making these structures fully accessible to all users. Interested scientists can learn SQL and test queries on a small, openly accessible version of the Millennium Run (with volume 1/512 that of the full simulation). They can then request accounts to run similar queries on the databases for the full simulations. In 2008 and 2012 the simulations were repeated.
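As a rough illustration of the kind of SQL query described above, the sketch below submits a query to the openly accessible small-volume database over HTTP. The endpoint URL, the request parameters, and the table and column names are assumptions made for illustration only; the actual interface and schema are documented on the Millennium database pages.

    # Hypothetical sketch of querying the public (small-volume) Millennium database.
    import requests

    SQL = """
    SELECT TOP 10 galaxyId, snapnum, stellarMass, x, y, z
    FROM millimil..DeLucia2006a    -- table name is an assumption
    WHERE snapnum = 63
    ORDER BY stellarMass DESC
    """

    resp = requests.get(
        "http://gavo.mpa-garching.mpg.de/Millennium/",  # assumed public query endpoint
        params={"action": "doQuery", "SQL": SQL},       # parameter names are assumptions
        timeout=60,
    )
    resp.raise_for_status()
    print(resp.text[:500])  # results are returned as delimited text (assumed)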
The Canadian Astronomy Data Centre (CADC) was established in 1986 by the National Research Council of Canada (NRC), through a grant provided by the Canadian Space Agency (CSA). Over the past 30 years the CADC has evolved from an archiving centre, hosting data from the Hubble Space Telescope, the Canada-France-Hawaii Telescope, the Gemini observatories, and the James Clerk Maxwell Telescope, into a Science Platform for data-intensive astronomy. The CADC, in partnership with Shared Services Canada, Compute Canada, CANARIE and the university community (funded through the Canada Foundation for Innovation), offers cloud computing, user-managed storage, group management, and data publication services, in addition to its ongoing mission to provide permanent storage for major data collections (a query sketch follows this entry). Located at the NRC Herzberg Astronomy and Astrophysics Research Centre in Victoria, BC, the CADC staff consists of professional astronomers, software developers, and operations staff who work with the community to develop and deliver leading-edge services to advance Canadian research. The CADC plays a leading role in international efforts to improve the scientific and technical landscape that supports data-intensive science. This includes leadership roles in the International Virtual Observatory Alliance and participation in organizations like the Research Data Alliance, CODATA, and the World Data System. CADC also contributes significantly to future Canadian projects like the Square Kilometre Array and TMT. In 2019, the CADC delivered over 2 petabytes of data (over 200 million individual files) to thousands of astronomers in Canada and in over 80 other countries. The cloud processing system completed over 6 million jobs (over 1,100 core-years) in 2019.
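As an illustration of programmatic access to CADC holdings, the sketch below uses the astroquery package's CADC module to search for observations around a target position; the exact call signature, the radius units, and the collection name are assumptions to be checked against the astroquery documentation.

    # Sketch: search CADC observation metadata around a sky position via astroquery.
    from astropy.coordinates import SkyCoord
    from astroquery.cadc import Cadc

    cadc = Cadc()
    coords = SkyCoord(10.68, 41.27, unit="deg")  # approximate position of M31
    # radius (assumed to be in degrees) and the "CFHT" collection are illustrative
    results = cadc.query_region(coords, radius=0.1, collection="CFHT")
    print(results[:5])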
The Ozone Mapping and Profiler Suite measures the ozone layer in our upper atmosphere, tracking the status of global ozone distributions, including the ‘ozone hole.’ It also monitors ozone levels in the troposphere, the lowest layer of our atmosphere. OMPS extends our 40-year-long record of ozone layer measurements while also providing improved vertical resolution compared to previous operational instruments. Closer to the ground, OMPS's measurements of harmful ozone improve air quality monitoring and, when combined with cloud predictions, help to create the Ultraviolet Index, a guide to safe levels of sunlight exposure. OMPS has two sensors, both new designs, composed of three advanced hyperspectral imaging spectrometers: a downward-looking nadir mapper, a nadir profiler, and a limb profiler. The entire OMPS suite currently flies on board the Suomi NPP spacecraft and is scheduled to fly on the JPSS-2 satellite mission. NASA will provide the OMPS-Limb profiler.
nanoHUB.org is the premier place for computational nanotechnology research, education, and collaboration. Our site hosts a rapidly growing collection of Simulation Programs for nanoscale phenomena that run in the cloud and are accessible through a web browser. In addition to simulation tools, nanoHUB provides Online Presentations, Courses, Learning Modules, Podcasts, Animations, Teaching Materials, and more. These resources help users learn about our simulation programs and about nanotechnology in general. Our site offers researchers a venue to explore, collaborate, and publish content, as well. Many of these collaborative efforts occur via Workspaces and User groups.
The AERONET (AErosol RObotic NETwork) program is a federation of ground-based remote sensing aerosol networks established by NASA and PHOTONS (PHOtométrie pour le Traitement Opérationnel de Normalisation Satellitaire; Univ. of Lille 1, CNES, and CNRS-INSU) and is greatly expanded by networks (e.g., RIMA, AeroSpan, AEROCAN, and CARSNET) and collaborators from national agencies, institutes, universities, individual scientists, and partners. The program provides a long-term, continuous and readily accessible public domain database of aerosol optical, microphysical and radiative properties for aerosol research and characterization, validation of satellite retrievals, and synergism with other databases. The network imposes standardization of instruments, calibration, processing and distribution.
The CALIPSO satellite provides new insight into the role that clouds and atmospheric aerosols play in regulating Earth's weather, climate, and air quality. CALIPSO combines an active lidar instrument with passive infrared and visible imagers to probe the vertical structure and properties of thin clouds and aerosols over the globe. CALIPSO was launched on April 28, 2006, with the CloudSat satellite. CALIPSO and CloudSat are highly complementary and together provide new, never-before-seen 3D perspectives of how clouds and aerosols form, evolve, and affect weather and climate. CALIPSO and CloudSat fly in formation with three other satellites in the A-train constellation to enable an even greater understanding of our climate system.
The Open Science Framework (OSF) is part network of research materials, part version control system, and part collaboration software. The purpose of the software is to support the scientist's workflow and help increase the alignment between scientific values and scientific practices.
  • Document and archive studies. Move the organization and management of study materials from the desktop into the cloud. Labs can organize, share, and archive study materials among team members. Web-based project management reduces the likelihood of losing study materials due to computer malfunction, changing personnel, or just forgetting where you put the damn thing.
  • Share and find materials. With a click, make study materials public so that other researchers can find, use, and cite them. Find materials by other researchers to avoid reinventing something that already exists.
  • Detail individual contribution. Assign citable contributor credit to any research material: tools, analysis scripts, methods, measures, data.
  • Increase transparency. Make as much of the scientific workflow public as desired, as it is developed or after publication of reports.
  • Registration. Registering materials can certify what was done in advance of data analysis, or confirm the exact state of the project at important points of the lifecycle, such as manuscript submission or the onset of data collection.
  • Manage scientific workflow. A structured, flexible system can provide efficiency gains to workflow and clarity to project objectives.
A sketch of finding public OSF projects programmatically follows this entry.
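As an illustration of the public-sharing features described above, the sketch below lists public OSF projects matching a title keyword through the OSF v2 REST API; the filter and paging parameter names are assumptions based on the JSON:API conventions the API follows and may need adjustment.

    # Minimal sketch: list public OSF projects whose title matches a keyword.
    import requests

    resp = requests.get(
        "https://api.osf.io/v2/nodes/",
        params={"filter[title]": "replication", "page[size]": 5},  # assumed parameter names
        timeout=30,
    )
    resp.raise_for_status()
    for node in resp.json()["data"]:
        print(node["id"], "-", node["attributes"]["title"])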