A team led by UChicago CS researchers Ian Foster and Kyle Chard, along with Daniel S. Katz of the National Center for Supercomputing Applications at the University of Illinois, is building funcX, a new distributed "function-as-a-service" (FaaS) platform that makes it easy for researchers to automatically delegate their computational workloads.
By now many of us are working from home, out of labs and offices, and off campus. For some, this necessity came as a surprise: they found out over the weekend that they were now in remote work mode.
It may be surprising that, with a PhD in biochemistry, Brigitte Raumann spends much of her time these days considering data storage and transport in high-energy physics and astronomy. But that experience puts her in the perfect position to recognize lessons from those disciplines that the life sciences can adopt.
Bio-IT World sat down with Raumann to talk about the challenges she sees in data management in the life sciences and the solutions available.
2020 marks the 10-year anniversary of the Globus service as we know it today. It also promises to be a monumental year for the product and team, for many reasons, not least because we’ll likely reach one exabyte (10^18 bytes) transferred as Globus usage continues to grow. I want to take a moment to reflect on how far we’ve come, and where we plan to go in 2020 and beyond.
NIH staff and extramural researchers with an electronic Research Administration (eRA) Commons account can now use those credentials with Globus to access resources and services. This integration is the result of a partnership between the NIH Center for Information Technology and Globus, a division of the University of Chicago that provides data management capabilities—including managed data transfer and sharing—to research organizations.
Research Reality: Research projects can involve multiple collaborators, generate large amounts of data, include hundreds or thousands of participants or samples, and require significant computing resources for data storage, transfer, and analysis. The data must be shared, combined, and manipulated by researchers who may sit on the other side of a wall, across campus, or across an ocean from one another.
An extensive collaboration led by Argonne recently won the inaugural SCinet Technology Challenge at the SC19 conference by demonstrating real-time analysis of light source data streamed from Argonne's Advanced Photon Source (APS) to the Argonne Leadership Computing Facility (ALCF).