In the first of a four-part series, Jay Etchings, director of operations for research computing and senior HPC architect at Arizona State University, lays out the concept of the next-generation cyberinfrastructure: a set of integrated technology components working together to support the diverse needs of the research community across disciplines and across scales.

Where does high-performance computing go from here?

Over the past several years, high-performance computing has undergone sweeping change, unbalancing expectations and the resources available to meet them. The probabilistic nuances of linear scaling, measured purely in cores and nodes, have been unkind to their masters and have in many ways contributed to disparate silos vaguely reminiscent of the mainframes we all once knew.

“Little islands are all large prisons: one cannot look at the sea without wishing for the wings of a swallow.”

Richard Burton

In response to a charge from Arizona State University (ASU) senior leadership to articulate the cyberinfrastructure necessary to successfully execute future-looking, large-scale, transdisciplinary research projects, ASU has designed and built a Next Generation Cyber Capability (NGCC). The effort is being spearheaded by Dr. Kenneth Buetow, director of the Computational Sciences and Informatics program for Complex Adaptive Systems at ASU.

Conceptually, the NGCC is composed of standing capabilities plus dynamic, on-demand, virtual resources that grow and contract as efforts require. It is a synergistic, systems “whole” that uniquely synthesizes physical and logical infrastructure through dedicated resources working in unison. The NGCC is executed through a construct that combines locally instantiated, virtual, and “cloud-based” resources; central to the concept is the capacity to leverage a portfolio of virtual capabilities on demand for short-term engagements. In this, ASU’s NGCC adopts a novel approach that extends the norms of traditional high-performance computing: its rich collection of assets represents a “First Generation Data Science Research Instrument,” choreographing a diverse collection of physical and logical capabilities that perform as an integrated whole.

A Whole More Than the Sum of Its Parts

The NGCC physical infrastructure integrates differing hardware platforms configured to support transactional/utility computing, high-performance computing, and big data problems. Each element uses a hardware configuration optimized for the unique characteristics of the computational problems it addresses. The transactional component supports day-to-day utility computing and is composed of a cluster of cost-efficient commodity hardware; its distinguishing characteristic is a very large data storage reservoir. A high-performance computing component supports computationally intensive efforts in two alternative configurations, both with high-speed connectivity to the transactional storage reservoir: the first is a large cluster of fast processors with modest memory, and the second is a smaller cluster of fast processors with access to a common, shared, large memory resource. The final configuration, which supports big data problems, is a cluster of fast processor nodes interconnected by high-speed links, each node having large memory and large local data storage; this cluster is likewise connected at high speed to the transactional data storage reservoir. To support access to outside virtual capacity, all components are connected to the Internet by high-speed connections.
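To make that division of labor concrete, here is a minimal Python sketch that models the four hardware configurations as a data structure and routes a job description to a matching tier. The tier names, specifications, and thresholds are illustrative assumptions for exposition, not the NGCC’s actual configuration or scheduling policy.

    # Hypothetical model of the NGCC's hardware tiers; every spec and
    # threshold below is an illustrative assumption, not ASU's actuals.
    TIERS = {
        "transactional":     {"cores_per_node": 16, "mem_gb": 64,   "storage_tb": 500},
        "hpc_distributed":   {"cores_per_node": 28, "mem_gb": 128,  "storage_tb": 1},
        "hpc_shared_memory": {"cores_per_node": 32, "mem_gb": 2048, "storage_tb": 1},
        "big_data":          {"cores_per_node": 24, "mem_gb": 256,  "storage_tb": 48},
    }

    def route(job):
        """Pick a tier from coarse job characteristics (toy logic only)."""
        if job["mem_gb"] > 512:    # needs one large shared address space
            return "hpc_shared_memory"
        if job["data_tb"] > 10:    # data-local analytics, e.g. a Hadoop job
            return "big_data"
        if job["cores"] > 256:     # scale-out, compute-bound work
            return "hpc_distributed"
        return "transactional"     # day-to-day utility computing

    print(route({"cores": 64, "mem_gb": 32, "data_tb": 40}))  # -> big_data

The point is not the numbers but the shape: each tier trades memory, storage, and node count differently, and workloads are steered to the tier whose balance matches their demands.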

The biomedical research community faces major challenges in integrating and analyzing rapidly growing volumes of highly diverse data, and in applying them to discovery, translation, and improved patient care. The challenge presented by this revolution is the need to develop and implement hardware and software that can store, retrieve, and analyze mountains of complex data and transform them into knowledge that improves our understanding of the human condition. Boundaries in this arena have impacted not only the research sciences but also inter- and intra-university collaboration. Addressing the current challenges required a novel resolution path: a sustainable, collaborative, elastic, distributed model that, although disruptive to the traditional university high-performance computing model, promises to overcome legacy barriers and open new avenues into the research sciences.

This model embraces the following foundational components:

Research centers without walls that are not only NIST-compliant but also collaboration-centric. (Hybrid Cloud)

Open Big Data frameworks that support global metadata management for digital curation of omics data.

Dedicated bandwidth, free of policy or capacity restrictions, supporting research collaboration. (Friction-Free Science DMZ)

OpenFlow-based software-defined networking with dedicated bandwidth supporting research collaboration. (Internet2 Innovation Platform; a minimal controller sketch follows this list.)

Programmable, pluggable architectures for the next generation of research computing. (NGCC Architecture)

Efficient, effective, touchless network management, free from the bottlenecks of legacy hardware-defined networking.

Research-as-a-Service: workload-based provisioning for holistically defined scientific challenges.
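As a concrete illustration of the OpenFlow item above, here is a minimal controller sketch written against the open-source Ryu framework. The choice of Ryu, the subnet, the output port, and the priority are all assumptions for exposition, not the NGCC’s actual controller or policy. When a switch connects, the application installs a high-priority rule that forwards traffic from a hypothetical research transfer subnet straight out a designated high-speed port, bypassing the general-purpose campus path.

    # Minimal OpenFlow 1.3 controller sketch using Ryu. The subnet,
    # output port, and priority are hypothetical placeholders.
    from ryu.base import app_manager
    from ryu.controller import ofp_event
    from ryu.controller.handler import CONFIG_DISPATCHER, set_ev_cls
    from ryu.ofproto import ofproto_v1_3

    class ScienceDMZPolicy(app_manager.RyuApp):
        OFP_VERSIONS = [ofproto_v1_3.OFP_VERSION]

        @set_ev_cls(ofp_event.EventOFPSwitchFeatures, CONFIG_DISPATCHER)
        def on_switch_connect(self, ev):
            dp = ev.msg.datapath
            ofp, parser = dp.ofproto, dp.ofproto_parser
            # Match IPv4 traffic from a hypothetical transfer subnet.
            match = parser.OFPMatch(eth_type=0x0800,
                                    ipv4_src=("10.40.0.0", "255.255.0.0"))
            # Send it straight out port 1, the dedicated research path.
            actions = [parser.OFPActionOutput(1)]
            inst = [parser.OFPInstructionActions(ofp.OFPIT_APPLY_ACTIONS,
                                                 actions)]
            dp.send_msg(parser.OFPFlowMod(datapath=dp, priority=100,
                                          match=match, instructions=inst))

Launched with ryu-manager, the application pushes the rule to every switch that connects. That is the essence of touchless, software-defined management: policy lives in code rather than in per-device configuration.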

The NGCC provides a novel synergism, integrating big data platforms such as Hadoop with traditional supercomputing technologies, software-defined networking, inter- and intra-university high-speed 10G/40G/100G interconnects via the Internet2 Innovation Platform, workload virtualization, coprocessors, and cloud-bursting capacity. Together these empower life sciences research organizations to consume research compute on demand, as a service, enabling the next major wave of analytics innovation.
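To ground the Hadoop side of that integration, here is a minimal Hadoop Streaming job in Python that counts variant records per chromosome across a set of VCF files, a common first step in omics pipelines. The file paths, and the assumption that inputs are standard VCF, are illustrative only.

    #!/usr/bin/env python
    # mapper.py -- emit (chromosome, 1) for every variant record.
    import sys
    for line in sys.stdin:
        if line.startswith("#"):                   # skip VCF header lines
            continue
        print("%s\t1" % line.split("\t", 1)[0])    # CHROM is column 1

    #!/usr/bin/env python
    # reducer.py -- sum counts per chromosome; Hadoop Streaming delivers
    # mapper output sorted by key, so a single pass suffices.
    import sys
    current, count = None, 0
    for line in sys.stdin:
        key, value = line.rstrip("\n").split("\t")
        if key != current:
            if current is not None:
                print("%s\t%d" % (current, count))
            current, count = key, 0
        count += int(value)
    if current is not None:
        print("%s\t%d" % (current, count))

Submitted with the stock streaming jar, for example hadoop jar hadoop-streaming.jar -files mapper.py,reducer.py -mapper mapper.py -reducer reducer.py -input /data/omics/cohort1 -output /results/variant_counts (paths hypothetical), the same scripts run unchanged whether the cluster is the standing big data tier or burst capacity in a public cloud.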

About the Author

Jay Etchings, director of operations for research computing and senior HPC architect at Arizona State University, is a well-known industry professional with 20 years of progressively versatile, cross-platform experience in the management of open-systems architecture. With the bulk of a 10-year technical consulting career spent in gaming and connected lotteries, data relationship analysis has been a longtime passion for Etchings, and he is well versed in all phases of cutting-edge analytics and research computing. His experience as a former recovery audit contractor for the Centers for Medicare & Medicaid Services (CMS RAC) positions him well in the emerging field of precision medicine.

Additional contribution provided by…

Dr. Kenneth Buetow also contributed to this article series. Buetow serves as director of the Computational Sciences and Informatics program for Complex Adaptive Systems at Arizona State University and is a professor in the School of Life Sciences in ASU’s College of Liberal Arts and Sciences. The Complex Adaptive Systems program is creating a Next Generation Cyber Capability (NGCC) to address the challenges and opportunities afforded by “Big Data” and the emergence of Fourth Paradigm data science. This capability brings state-of-the-art computational approaches to the program’s transdisciplinary, use-inspired research efforts.