Computational scientific research involves massive datasets created by today’s cutting-edge instruments and experiments — telescopes, particle accelerators, sensor networks and molecular simulations. The scientific software used to process these datasets and extract discoveries from them is typically made up of tens to thousands of smaller functions: blocks of code that each handle an individual job in the long pipeline of data analysis.
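To make that idea concrete, here is a minimal, hypothetical Python sketch (not drawn from any particular project; all function names are illustrative) of an analysis pipeline built from small, single-purpose functions, each handling one step of the work:

```python
# Hypothetical sketch: a data-analysis pipeline composed of small functions,
# each responsible for one step. Real scientific pipelines chain tens to
# thousands of such functions over far larger datasets.

def load_readings(raw):
    """Parse raw instrument output into floating-point samples."""
    return [float(x) for x in raw]

def remove_outliers(samples, limit=1000.0):
    """Drop obviously bad sensor values."""
    return [s for s in samples if abs(s) < limit]

def calibrate(samples, offset=0.5, gain=1.02):
    """Apply a simple calibration to every sample."""
    return [(s - offset) * gain for s in samples]

def summarize(samples):
    """Reduce the cleaned data to a summary statistic."""
    return sum(samples) / len(samples)

def run_pipeline(raw):
    """Compose the individual steps into one end-to-end analysis."""
    samples = load_readings(raw)
    samples = remove_outliers(samples)
    samples = calibrate(samples)
    return summarize(samples)

if __name__ == "__main__":
    raw_data = ["10.1", "9.8", "10400.0", "10.3"]  # toy stand-in for instrument output
    print(f"mean calibrated reading: {run_pipeline(raw_data):.3f}")
```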
Have questions about Globus? Join us for our office hours session on August 27 at 3 p.m. CDT to learn about Globus Connect Personal and more.
Argonne researchers have developed a pipeline between ALCF supercomputers and Advanced Photon Source experiments to enable on-demand analysis of the crystal structure of COVID-19 proteins.
Long, long ago, in a time when professional travel was still possible, Rachana Ananthakrishnan, executive director & head of products for Globus at the University of Chicago, visited the Indiana University (IU) campus in Bloomington to give the annual IU Women in Cybersecurity talk, sponsored by IU’s Center of Excellence for Women & Technology (CeWIT) and the Center for Applied Cybersecurity Research (CACR).
The highly anticipated v5.4 of Globus Connect Server is now available. This release continues our march towards a unified Globus Connect Server v5 platform that incorporates all relevant features from prior versions.
July 27–31, 2020
Virtual Conference
The Globus team is participating in this year's virtual conference. Please stop by our virtual booth and chat with our team during the event.
The ECSS Symposium gives the more than 70 ECSS staff members a monthly forum to exchange information about successful techniques used to address challenging science problems.
The Virtual Residency Workshop brings together hundreds of cyberinfrastructure and research computing practitioners. Join Rachana as she leads a panel to discuss the state of the art, best practices, and challenges in managing research data. She will be joined on the panel by:
High-Performance Computing (HPC) enables work on some of the world’s most challenging problems, including fighting diseases, increasing food production, battling climate change, and advancing Artificial Intelligence and Machine Learning. A big part of HPC workflows is sharing data with distributed collaborators. Join us for a discussion on the challenges of managing and storing HPC data with our partner Caringo, a leader in software-defined object storage solutions.
Bio-IT World recently convened an all-star panel to discuss infrastructure, standards, privacy issues, and more in the COVID-19 era. Here are some quotable quotes from our friends at ESnet and BioTeam:
“If your workflow was already 100% network-based, and all you’re doing is orchestrating data movement, data placement, data analysis between large-scale infrastructure systems, you can be home and socially distanced—right?—and continue your work. If your work requires you physically carrying around USB hard drives, suddenly you’re stopped.” - Eli Dart, ESnet.