AWS Public Sector Blog

5 best practices for accelerating research computing with AWS


The cloud is accelerating research insights by reducing data analysis and processing times, enabling researchers worldwide to collaborate on solving universal problems from drug discovery to climate challenges. Amazon Web Services (AWS) works with higher education institutions, research labs, and researchers around the world to offer cost-effective, scalable, and secure compute, storage, and database capabilities to accelerate time to science.

In our work with research leaders and stakeholders, users often ask us about best practices for leveraging cloud for research. In this post, we dive into five common questions we field from research leaders as they build the academic research innovation centers of the future.

How can we structure our organization to enable and accelerate research innovation?

Research leaders tell us that changing people and processes is often more complex than driving change in technology. They face challenges navigating the technical personas within their own organizations, and in building organizational structures that let research IT differentiate itself from central IT and drive innovation. They also flag a persistent gap in the job market for people who speak both research and technical languages, and can effectively navigate both.

To adapt, some institutions are adopting more agile operating models to support research. For example, the Royal Melbourne Institute of Technology (RMIT) in Australia engaged AWS and AWS Partner The Data Foundry to create the RMIT AWS Cloud Supercomputing Hub (RACE), through which they deliver high performance computing (HPC) services to researchers across the institution. With RACE, RMIT now has the scalability, performance, and automation capability to drive faster research outcomes. This has freed research IT staff from manual tasks, letting them more effectively enable researchers, academics, and students to achieve their goals.

How can we maintain open and collaborative research networks, while also meeting security and compliance regulations?

Historically, universities have operated in an open model with less direct governance than private entities. However, this is changing. Drivers for stricter data governance include: a) an increase in cyberattacks on research data; b) the growth of large datasets and artificial intelligence (AI) in research, which requires more complete datasets; and c) more healthcare, finance, and manufacturing datasets, which introduce domain-specific requirements.

Managing access to resources and data in simple, intuitive, and compliant ways opens the door to greater research collaboration and expanded use of new technologies, such as generative AI. Data spaces offer researchers one way to facilitate organizing, accessing, and sharing data across different organizations, sources, and systems. They are built with interoperability, data governance, and security in mind.

How do we create a consistent and seamless experience for researchers across systems?

There is a tension between making tools easy to use and training researchers to use AWS to build their own. Research IT leaders want to deliver a seamless experience for researchers, meaning the tools researchers need to do a job should be readily available and accessible within their compliant environments. Researchers shouldn't have to be cloud engineers on top of being experts in their own domains. One solution to consider is Research and Engineering Studio on AWS (RES), an open source, easy-to-use, web-based portal that administrators can use to create and manage secure cloud-based research and engineering environments. Using RES, scientists and engineers can visualize data and run interactive applications without needing cloud expertise.

How do we make sure that our cloud adoption strategy is financially sustainable?

Research leaders tell us that they struggle with how to democratize access to the cloud for their researchers in a way that makes it financially sustainable. Chief financial officers (CFOs) stress the importance of measuring the value and return on investment of research conducted, but often lament that the true costs of on-premises resources are distributed across various budgets, or masked. This makes building the business case for operational expenses (such as cloud) as opposed to capital expenses (such as on-premises resources) more challenging. Adopting a Cloud Value Framework can help users understand the business value of moving to and building on AWS, and the Cost Optimization Flywheel can be a useful tool for understanding cost transparency, control, forecasting, and optimization. The Global Data Egress Waiver program can also help researchers more predictably budget their monthly cloud spend, by waiving fees for extracting data back out of AWS after processing and analysis concludes.

Research leaders successfully driving cloud adoption are developing mechanisms to prioritize resources efficiently, and to incorporate the total cost of ownership (TCO) and the value their teams derive from operating in the cloud. A good example is the University of New South Wales (UNSW), whose research computing leadership has adopted several mechanisms to steer the organization toward adopting cloud purposefully and sustainably.

“The researchers in my organization want quick access to a wide range of tools. Imagine a maker-shop with the latest and best equipment. I want to be able to induct them into the cloud version of that space quickly and safely. That’s why we’ve been facilitating researcher access to our E-Research Institutional Cloud Architecture (ERICA) platform for the last six years,” said Luc Betbeder-Matibet, director of research technology services at UNSW.

“ERICA is not just a safe place to do research when handling sensitive data, it also enables us to track project costs that we have used for both charge-back and show-back. Getting the financial operations right is not a trivial matter, but ERICA is a good pattern for us that we apply to other kinds of workloads. My advice is that our kinds of research institutions need FinOps down to the project level and to benchmark regularly.”
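Project-level show-back of the kind described above starts with tagging every resource with a project identifier and aggregating spend by that tag. The following is a minimal sketch of that aggregation step; the `show_back` helper, the record shape, and the `project` tag key are illustrative assumptions, not part of any AWS API — in practice these rows would come from a cost-reporting export with cost allocation tags enabled.

```python
# Sketch: aggregate monthly spend per project tag for show-back reporting.
# Record shape and the "project" tag key are assumptions for illustration.
from collections import defaultdict

def show_back(cost_records):
    """Sum spend per project tag.

    cost_records: iterable of dicts with 'tags' (dict) and 'cost' (float).
    Untagged spend is grouped under 'unallocated' so it stays visible
    rather than silently disappearing from the report.
    """
    totals = defaultdict(float)
    for record in cost_records:
        project = record.get("tags", {}).get("project", "unallocated")
        totals[project] += record["cost"]
    return dict(totals)

records = [
    {"tags": {"project": "genomics"}, "cost": 120.50},
    {"tags": {"project": "climate"}, "cost": 75.25},
    {"tags": {}, "cost": 10.00},
    {"tags": {"project": "genomics"}, "cost": 30.00},
]
print(show_back(records))
# {'genomics': 150.5, 'climate': 75.25, 'unallocated': 10.0}
```

Keeping an explicit "unallocated" bucket is a deliberate choice: it surfaces untagged spend, which is usually the first thing a FinOps review needs to chase down.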

How else can cloud help us accelerate innovation at our institution?

Research institutions such as Emory University are using cloud to democratize researcher access to new technologies and services, such as quantum computing and AI. This gives researchers in more domains access to advanced tooling and HPC, and it is enabling new types of research across disciplines such as the humanities, climate science, and economics.

Investing in researcher training is another way that institutions are accelerating innovation outcomes. For example, the University of Alberta in Canada teamed up with AWS to launch Artificial Intelligence Discovery Place to democratize access to AI education and services. Stanford Data Ocean (SDO) is tackling how to bring more disciplines into the field of precision medicine with their integrated curriculum that combines research basics with cloud computing concepts to set the foundation for using advanced technologies on SDO’s datasets. Moreover, researchers and research IT anywhere can access a no-cost online AWS training pathway, even without formal training programs at their institution.

Conclusion

Researchers seek to push the envelope and see cloud as an instrument to further their research. Institutions strive to develop operating models that enable their researchers to invent within guardrails and with the right training and support. This demands a robust understanding of the different types of researchers that exist within their institution, as well as an understanding of what they need from the cloud.

Visit the AWS Research Computing webpage to learn more about how AWS can help your institution accelerate research.