Designed for Synchrotrons
Seamlessly integrate DECTRIS CLOUD into your beamline
Augment your user operation with a virtual workspace accessible from anywhere: data availability within seconds, API-based automation of processing jobs, and instant access to thousands of CPUs ensure you are ready for the most demanding experiments.
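As an illustration of what API-based job automation can look like, the sketch below submits a processing job and polls until it finishes. The endpoint URL, token handling, and payload fields are hypothetical placeholders, not the actual DECTRIS CLOUD API; consult the platform documentation for the real interface.

```python
import os
import time

import requests

# Hypothetical base URL and token; the real DECTRIS CLOUD API may differ.
BASE_URL = "https://cloud.example.dectris.com/api/v1"
TOKEN = os.environ["DECTRIS_CLOUD_TOKEN"]  # assumed personal access token
HEADERS = {"Authorization": f"Bearer {TOKEN}"}


def submit_processing_job(dataset_id: str, workflow: str) -> str:
    """Submit an automated processing job for a freshly acquired dataset."""
    payload = {"dataset": dataset_id, "workflow": workflow}  # assumed schema
    resp = requests.post(f"{BASE_URL}/jobs", json=payload, headers=HEADERS, timeout=30)
    resp.raise_for_status()
    return resp.json()["id"]


def wait_for_job(job_id: str, poll_seconds: int = 10) -> dict:
    """Poll the job until it reaches a terminal state."""
    while True:
        resp = requests.get(f"{BASE_URL}/jobs/{job_id}", headers=HEADERS, timeout=30)
        resp.raise_for_status()
        job = resp.json()
        if job["status"] in ("finished", "failed"):
            return job
        time.sleep(poll_seconds)


if __name__ == "__main__":
    job_id = submit_processing_job("scan_0042", "autoprocessing")
    print(wait_for_job(job_id)["status"])
```

Polling keeps the example self-contained; if the platform offers callbacks or webhooks, those would avoid busy-waiting in a production setup.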
In Sync with Your Users
Turn your beamline into a real-time collaborative hub: share data, code, and environments instantly with users, collaborators, or support staff the moment acquisition ends. Everyone works on the same dataset, mounted live and centrally. No forks, no syncs, no version drift. Shared workspaces, structured projects, and, soon, real-time Jupyter collaboration keep the entire team aligned from beamtime to publication.
- One-click addition of new users
- Shared software workspaces
- Projects with shared data and logbook
Tools, Algorithms & Data Ready
Uplevel your beamline the moment data lands: a public library of community-curated tools, algorithms, and automated jobs is preinstalled and ready - zero setup, zero downtime. Launch high-performance virtual sessions or notebooks on demand, with data auto-mounted at up to 100 Gbit/s per node, so interactive analysis keeps pace with acquisition (see the sketch after this list). When speed and repeatability count, dispatch tasks to our expanding catalogue of one-click automated workflows and free your beamline for the next shot.
- Time-tested and cutting-edge software tools
- Customized, community-curated containers
- User-friendly deployment
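As a small illustration of interactive analysis on auto-mounted data, the notebook-style sketch below opens an HDF5 dataset from a mounted project path and computes a quick per-frame summary. The mount path and NeXus-style file layout are placeholders, not a guaranteed DECTRIS CLOUD directory structure; adjust them to your own project.

```python
import h5py
import numpy as np

# Placeholder path; actual mount locations depend on your project setup.
DATA_FILE = "/data/my_project/scan_0042_master.h5"

with h5py.File(DATA_FILE, "r") as f:
    # NeXus-style layout assumed here; adapt the dataset path to your files.
    frames = f["/entry/data/data"]     # lazily indexed, not loaded all at once
    first_frame = frames[0, :, :]      # read a single frame over the fast mount
    print("frames:", frames.shape, frames.dtype)
    print("first-frame mean counts:", float(np.mean(first_frame)))
```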
Custom, Reproducible Software and Workflows
Customize your own interactive software environments - fully isolated and tailored to your workflow, with no need to ask IT. You get full admin rights and versioning out of the box: install what you want, snapshot your work, and jump back to rerun past analyses on demand - even years later. And if something breaks or gets deleted, don’t worry - automatic version protection means your data and scripts are safe and restorable without redoing the experiment.
- Version-locked, shareable environments
- Custom job templates
- Root privileges in your workspaces
Scale With Your Needs
Scale your compute to match your beamline’s heartbeat: spin up sessions with hundreds of CPUs or multiple GPUs on demand, then resize with a click when needs change. Petabyte-scale storage expands elastically for peak data bursts, so even the biggest runs never bottleneck or blow out local capacity. Forget batch scripts and queuing systems; you focus on science while we handle all the scheduling and orchestration behind the scenes (see the sketch after this list).
- On-demand access to thousands of CPUs + GPUs
- Elastic storage for bursts to petabyte scale
- No queuing system slowing you down
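As a generic illustration of the no-queuing-system point, the sketch below parallelizes a per-frame reduction across all CPU cores allocated to a session using only the Python standard library; no batch scripts or scheduler directives are involved. The reduction function is a synthetic placeholder, standing in for your own analysis code.

```python
import os
from concurrent.futures import ProcessPoolExecutor

import numpy as np


def reduce_frame(seed: int) -> float:
    """Placeholder per-frame reduction; swap in your own analysis."""
    rng = np.random.default_rng(seed)
    frame = rng.poisson(lam=5.0, size=(512, 512))  # synthetic detector frame
    return float(frame.mean())


if __name__ == "__main__":
    n_workers = os.cpu_count() or 1  # use every core allocated to the session
    with ProcessPoolExecutor(max_workers=n_workers) as pool:
        means = list(pool.map(reduce_frame, range(1000), chunksize=16))
    print(f"processed {len(means)} frames on {n_workers} cores")
```

Because the session already owns its cores, scaling up is simply a matter of requesting a larger session and rerunning the same script.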