High Performance Computing Collaboratory

[an NSF Graduated Center] The High Performance Computing Collaboratory (HPC²) at Mississippi State University, an evolution of the MSU NSF Engineering Research Center (ERC) for Computational Field Simulation, is a coalition of member institutes and centers that share a common core objective of advancing the state of the art in computational science and engineering using high performance computing; a common approach to research that embraces a multi-disciplinary, team-oriented concept; and a commitment to a full partnership among education, research, and service. The mission of the HPC² is to serve the University, State, and Nation through excellence in computational science and engineering.

The fields addressed by the Center are simply regions or volumes of space within which physical phenomena vary with position and time. These phenomena affect our lives far more than we generally realize: common examples include compressible fluid flow, such as the air flow around aircraft and automobiles; incompressible fluid flow, such as the flow of water past ships; and electromagnetic (EM) fields, such as microwave signal transmission or the local EM fields around power lines. Historically, extensive experiments and costly equipment, such as wind tunnels, were necessary to study these phenomena; today they can be simulated with powerful computing systems and computational techniques. Field simulation has become a powerful tool to supplement or replace traditional experimental and analytical methods, and in some instances it is the only practical approach for understanding phenomena that cannot be measured directly or for analyzing particular designs under certain conditions. The primary obstacle to widespread industrial use of computational field simulation (CFS) has been that industry's design-related field problems are usually quite complex, requiring simulations that are time-consuming and costly.

The NSF Engineering Research Center (ERC) for Computational Field Simulation at Mississippi State University was created in 1990 with the mission of researching means and methods to reduce the time and cost, while increasing the fidelity and scope, of complex field simulations for engineering analysis and design. When the ERC was created, complete real-world problems were impractical to simulate on that generation of supercomputers. Capturing the complex geometry of a complete aircraft and creating the discrete small volumes needed for field computations could easily take 6-12 months of extensive engineering effort, and the field computations for the full 3-D problem could require another 6-12 months on a $20 million supercomputer, making complete problems too expensive for practical simulation. The Center currently conducts approximately $12 million per year of coordinated cross-disciplinary research, interacting with 16 industrial and 14 government affiliates. The Center's vision is to provide U.S. industry and government agencies with superior capabilities for computational field simulation of large-scale, geometrically complex physical field problems through domain-specific integrated simulation environments for rapid analysis and design, facilitating a shift from physical prototyping toward computational simulation prototyping.

The research in the ERC focuses on the underlying science of CFS and on developing the means and methodologies to achieve the necessary reductions in engineering time, clock time, and overall cost of CFS for application domains relevant to industry, including extensions into diverse, highly complex multidisciplinary applications. A major emphasis of the Center, which employs 70 faculty and staff researchers and approximately 125 students from disciplines across engineering, science, and mathematics, has been computational fluid dynamics (CFD), providing the means to simulate complete real-world applications such as cruise missiles, complete submarines with rotating propulsors, biofluid flow with particulates, rocket exhaust, and weapon or stage separation. However, the Center's strategic research efforts in building computational problem-solving environments encompass all areas of field physics.

The fulfillment of the Center's mission is illustrated by the John Glenn Space Shuttle flight. The Center has contributed significantly to the art and practice of unstructured grid generation, yielding high-quality grids in significantly less time: whereas structured grid generation for a complete aircraft may take several weeks or months, the Center's unstructured grid generation can be accomplished within a day. The Center focused a team on coupling its structured-grid CFD algorithm knowledge, within a portable, scalable computational architecture, to unstructured-grid solver technology, which required substantial research in both boundary-layer gridding and solution algorithms. As it turned out, the parallel solver (research) code had just been assembled for the first time when Space Shuttle mission STS-95 was launched. NASA Johnson Space Center called seeking a simulation-based analysis of the Space Shuttle Orbiter for the return flight after the Orbiter's drag chute door was lost during main engine startup; the NASA engineers wanted to know the dynamic pressure in the region of the missing chute door in order to estimate the aerodynamic loadings during reentry. The ERC group read a previously supplied Space Shuttle Orbiter geometry into the ERC's integrated simulation environment (SOLSTICE) and created the grids within hours. Initial simulation results were computed on a high performance computer within two days. The significance of this endeavor was not that NASA actually needed the results for a successful reentry, but that the ERC was able to take a tough real-world problem and compute solutions within two to three days of receiving the geometry description. The turnaround time could have been reduced to a day had the ERC's main high-performance computer been dedicated solely to this task; only a quarter of the machine was actually used. This achievement was a direct result of the ERC's mission and efforts. Our researchers have demonstrated a superior ability to simulate very complex real-world problems with complex geometries in relative motion. These accomplishments have come from directed cross-disciplinary efforts involving grid generation, field solution algorithms, and scientific visualization, coupled with computer and computational engineering; without the ERC structure, we could not have combined all of the talents and technologies required.

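The contrast between structured and unstructured gridding can be seen in a deliberately small sketch. The Python fragment below is illustrative only and is not one of the ERC's tools; it assumes NumPy and SciPy are available. It scatters points around a notional body and connects them into triangular cells with a Delaunay step, the kind of point-to-cell operation that unstructured grid generation automates for genuinely complex geometries.

    # Illustrative sketch only (assumes NumPy and SciPy); not ERC software.
    # Scatter points in the field region around a notional body and connect
    # them into triangular cells -- a toy version of unstructured gridding.
    import numpy as np
    from scipy.spatial import Delaunay

    rng = np.random.default_rng(0)

    # Sample points in an annulus: the hole stands in for the body surface.
    r = rng.uniform(0.3, 1.0, 500)
    theta = rng.uniform(0.0, 2.0 * np.pi, 500)
    points = np.column_stack((r * np.cos(theta), r * np.sin(theta)))

    tri = Delaunay(points)  # connect the points into triangular cells
    print(f"{len(points)} nodes -> {len(tri.simplices)} triangular cells")

    # A production grid generator would also enforce the body surface and
    # boundary-layer spacing; plain Delaunay simply fills the convex hull.

Nothing about the points needs to follow a block topology, which is why this style of gridding scales to complete-vehicle geometries far faster than hand-built structured blocks.
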
The simulation of field phenomena is historically divided into three phases: grid generation (capturing a representation of the geometry and field regions and then constructing a grid that divides these regions into many discrete small volumes); solution algorithms that solve discrete approximations of the equations governing the physical phenomena (constitutive equation modeling) to obtain values of the physical solution at each point in space and time; and scientific visualization (displaying the geometry and/or solution on a computer screen). The computational capability itself is further enabled by the computing system, including the system software, which creates the application programming support environment, and the computer architecture, which incorporates the hardware features and constraints.

In addition to the achievements in unstructured grid generation, ERC researchers pioneered the development of structured grid generation and of techniques based on the parametric nonuniform rational B-spline (NURBS) representation. These technologies are incorporated into the block-structured and unstructured grid tools: the generalized unstructured multi-block code (GUM-B) and HyperMesh, respectively. The ERC originated, released, and supported a code to simulate turbomachinery-related flows that has become the de facto standard for the turbomachinery manufacturers served by NASA Lewis Research Center (Allied Signal, Allison, General Electric, and Pratt & Whitney). Algorithms enabling simulation of a fully configured submarine or surface ship, including rotating propulsors, have also significantly advanced the state of the art, with the Navy selecting the codes for technology transfer to the shipyards. Other examples of application-enabling research include (1) capabilities for free-surface (air/water) boundaries (e.g., ships and littoral-water oceanography with actual geometry, temperature, and salinity); (2) chemical and thermal nonequilibrium flow (e.g., the Space Shuttle main engine nozzle starting transient; the Titan Centaur booster and separation for Lockheed Martin; an environmental quality network model for underground pollution; a fully two-dimensional radiative heat transfer model; a portable, scalable solver for arbitrary mixtures of thermally perfect gases in local chemical equilibrium; and preconditioning algorithms for low-speed combustion applications); and (3) particle-laden biofluid flows (e.g., the first-ever simulation of oscillating flow in a pulmonary bifurcation section; inhaled aerosols in branching, lung-like tubes; and powder-carrying air jets for industrial coating processes). These efforts demonstrate the ERC's expertise in creating efficient means to simulate fluid flow through moving complex geometries with complex physical phenomena, and the scope of CFS capabilities is being extended across new single-discipline and multidisciplinary domains. In scientific visualization, the ERC has contributed by (1) advancing the technology for visualizing time-varying data through the release of ISTV in February 1996 and (2) demonstrating at Supercomputing '95 the ability to steer and visualize a running ocean model in a multiperson immersive environment. Current research encompasses feature detection, multiresolution visualization, data compression, distributed visualization, flow visualization, and interactive virtual environments, exemplified by the recent installation of a CAVE.

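As a hedged illustration of these three phases (not ERC software; it assumes only NumPy), the sketch below generates a uniform structured grid on the unit square, solves a discrete approximation of the 2-D Laplace equation for steady heat conduction with a Jacobi iteration, and then "visualizes" the result as a coarse text sample of the field.

    # Minimal three-phase sketch on a toy problem (assumes NumPy).
    # Phase 1: grid generation -- a uniform structured grid on the unit square.
    # Phase 2: solution algorithm -- Jacobi iteration for the discrete Laplace
    #          equation (steady heat conduction) with fixed boundary values.
    # Phase 3: visualization -- here just a coarse text sample of the field.
    import numpy as np

    n = 51                      # grid points per direction
    T = np.zeros((n, n))        # temperature field on the grid
    T[0, :] = 1.0               # hot wall; the other walls stay at 0

    for _ in range(5000):       # Jacobi sweeps over interior points
        T_new = T.copy()
        T_new[1:-1, 1:-1] = 0.25 * (T[:-2, 1:-1] + T[2:, 1:-1] +
                                    T[1:-1, :-2] + T[1:-1, 2:])
        if np.max(np.abs(T_new - T)) < 1e-6:   # simple convergence check
            T = T_new
            break
        T = T_new

    for row in T[::10, ::10]:   # crude "visualization" of the solution
        print(" ".join(f"{v:5.2f}" for v in row))

Real CFS codes replace each phase with far more capable machinery (curvilinear or unstructured grids, implicit solvers for the governing flow equations, and interactive 3-D visualization), but the division of labor is the same.
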
A close association between computing and field simulation researchers within the Center has produced efficient parallel CFS algorithms. Simulations of both compressible and incompressible flows have been used to develop effective solution methodologies and programming environments for creating portable parallel simulation programs on evolving computer architectures. The ERC has been a leader in the emergence of the Message Passing Interface (MPI) as the standard paradigm for writing distributed (parallel) applications, enabling programs to be portable across a wide range of distributed computing platforms. (MPI parallel applications can be ported across shared- or distributed-memory architectures "transparently" while still exploiting the low latency of shared memory.) A significant technology-transfer event has been the collaboration with computer companies to expedite deployment of MPI software on various platforms.

The Center is actively evolving an integrated CFS testbed system that incorporates these elements in an effective, user-oriented system focused on next-generation computational simulations. The CFS testbed provides capability demonstrations, a modular framework for technology advancement and maintenance, efficient reuse across physics domains, vehicles for technology transfer, and tools for CFS instruction. Targeted for collaborative use, the integrated system testbed provides the foundation for creating domain-specific versions for affiliates, such as an integrated simulation system for littoral waters. Technology transfer is facilitated through collaborative research activities focused on particular customer and industrial needs. Current ERC research is adding the integration of measurements and of multidisciplinary simulations to the capabilities of the simulation testbed.

As part of its education mission, the ERC has had approximately 700 students directly involved in its research, has developed a cross-disciplinary computational engineering graduate program that allows students to integrate their study with the research of the Center, has developed graduate and undergraduate CFS courses for engineering students and others, and has developed a minor in computational engineering for undergraduate engineering students. Working with the Department of Art and the School of Architecture, the ERC facilitated new graduate degrees in animation and electronic visualization (MS in Art) and in electronic design (MS in Architecture). The ERC also has programs with minority and women's institutions and works with K-12 schools.
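
MPI's practical value is that the same message-passing source code runs unchanged whether the processes share memory or are spread across a cluster. The following sketch is illustrative only (it is not one of the ERC's solvers, and it assumes the mpi4py package and an MPI launcher such as mpiexec are available); it splits a simple numerical integration across ranks and combines the partial sums with a reduction, the same pattern a domain-decomposed field solver uses to assemble global quantities such as residuals.

    # Minimal MPI sketch (assumes mpi4py and an MPI runtime); not ERC code.
    # Each rank integrates part of 4/(1+x^2) on [0,1] by the midpoint rule;
    # the partial sums are combined with an allreduce to approximate pi.
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()
    size = comm.Get_size()

    n = 1_000_000                # total number of intervals
    h = 1.0 / n

    local_sum = 0.0
    for i in range(rank, n, size):   # this rank handles i = rank, rank+size, ...
        x = h * (i + 0.5)            # midpoint of interval i
        local_sum += 4.0 / (1.0 + x * x)

    pi = comm.allreduce(local_sum * h, op=MPI.SUM)

    if rank == 0:
        print(f"pi ~= {pi:.10f} computed on {size} rank(s)")

Launched with, for example, mpiexec -n 4 python pi_mpi.py (the file name is only for illustration), the script uses four processes; run without a launcher it still works on a single rank, which is the portability property described above.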

Research Areas

A common thread of research at the HPC² is computational simulation of physical phenomena, but this exists in a rich fabric of high technology that ranges from software engineering to submarine propulsion, from load balancing in parallel computing to visualization and interpretation of remotely sensed data, from computational geometry to algorithm development, and from materials modeling to crash simulations.
This list indicates general areas of research activity within the HPC². Many of the areas listed here overlap with or are subsets of other areas.
Computational Fluid Dynamics
CFD-Based Design Optimization
Multidisciplinary Analysis and Design Optimization
Computational Design
Computational Modeling
Geometry Modeling
Structured Grid Technology
Unstructured Grid Technology
Dynamically-Adaptive Grids and Meshes
Generalized Mesh Generation
Incompressible Flow Methodology
Compressible Flow Methodology
Turbulence Modeling
Large Eddy Simulation
Hybrid LES/RANS Simulation
Generalized Flow Solver
Hypersonic Flows
Non-Equilibrium Flow Simulations
Aircraft Icing
Code Verification
Enterprise Computational Services
Software Synthesis
Materials Modeling
Parallel Algorithms
Computational Heat Transfer
Viscous Ship Hydrodynamics
Computational Solid Mechanics
Numerical Fracture Mechanics
Damage Tolerance of Composite Materials
Scientific Visualization
Information Visualization
Visual Analysis
Image Processing
Feature Detection
Sensitivity Analysis
Computational Systems for Vehicle Analysis
Crash-worthiness Simulations
Education and Training
Pollutant Transport in Urban Areas
Biomedical Device Simulation
Simulation of Biomedical Fluid Flows
Aerosol Deposition in Lung Airways
Geospatial Technologies
Natural Resource Management
Decision Support Systems
Software and User Interfaces
Research Centers, Institutes, and Initiatives
ASSURE - The Alliance for System Safety of UAS through Research Excellence comprises twenty-one of the world's leading research universities and more than a hundred leading industry and government partners. The alliance features expertise across a broad spectrum of research areas, including air traffic control interoperability, UAS airport ground operations, control and communications, detect and avoid, human factors, UAS noise reduction, UAS wake signatures, unmanned aircraft pilot training and certification, low-altitude operations safety, spectrum management, and UAS traffic management. ASSURE possesses the expertise, infrastructure, and outstanding track record of success that the FAA Center of Excellence for Unmanned Aircraft Systems demands.
CCI - The Center for Cyber Innovation is a sustainable solution provider for complex cyber problems. CCI leverages interdisciplinary cyber expertise and maintains relationships within the federal government, the Department of Defense, and the Intelligence Community that allow it to be a solution provider of choice. As a facilitator, CCI engages select industry and academic partners to establish results-oriented teams with cyber capabilities.
CCS - The mission of the Center for Computational Sciences is to enhance the applicability and usability of simulations involving interacting physical, chemical, biological, and engineering phenomena by developing integrated computational environments and crosscutting tools that synergistically couple information technology with computational science and engineering.
GRI - The mission of the Geosystems Research Institute is to understand Earth's natural and managed systems and provide comprehensive solutions for socioeconomic and environmental requirements, leading to an improved quality of life. The computational engineering goals of the Geosystems Research Institute are to derive geoinformation from geospatial data, develop a fuller understanding of that information through mesoscale modeling and visualization, and provide information services to the geospatial science and user community through computational libraries. Using the infrastructure and expertise of one of the largest computing centers (HPC²) in the United States, the GRI provides capabilities in remote sensing computational technologies, visualization techniques, and natural resource management, and transitions these into operational agency research, planning, and decision-support programs. With its multi-disciplinary team of researchers and educators, the GRI has developed nationally recognized research strengths, with strong relationships and respect from state, regional, and national agencies and business entities.
ICRES - The Institute for Computational Research in Engineering and Science strives to be a world-class center of excellence for research, technology and education equipped to address engineering challenges facing the nation's industrial base. Utilizing high performance computational resources and state-of-the-art analytical tools for modeling, simulation, and experimentation, ICRES will provide a distinctive, interdisciplinary environment that will support economic development and outreach activities throughout the State of Mississippi and beyond.
CAVS - The Center for Advanced Vehicular Systems mission is to research and develop manufacturing and design means and methods for producing vehicles of superior quality with advanced features and functions at superior costs, focusing on computational tools and exploiting the underlying technologies for broader industrial use. The mission also includes engineering extension, education, and advanced technical training outreach for industry. In Starkville, CAVS research and development activities are based on three factors: industrial needs and priorities; opportunities for CAVS to provide added value; and opportunities to build on the state's investment by securing external funding related to broadening the reach of technologies. The CAVS Extension Center at Canton, MS provides direct engineering support for Nissan and its major suppliers; engineering extension work for Mississippi's manufacturers; workforce development, education, and training; and business systems and information technologies.
CAVS-E - Mississippi State University CAVS Extension's capabilities are driven by the needs of the automotive industry, and the resulting capabilities are leveraged to benefit other manufacturing industries throughout the state. This is accomplished in a variety of ways, including on-site engineering support and professional development training. Our goal is to introduce best practices and emerging technologies that enable Mississippi manufacturers to compete successfully in global markets.
I²AT - The Institute for Imaging & Analytical Technologies houses pre-eminent research instrumentation that is available to faculty, staff, students, and outside users. Instrumentation includes technologies for diverse microscopy (light, confocal, atomic force and electron) and microanalysis (e.g., X-ray diffraction) applications, in addition to magnetic resonance imaging used in areas of veterinary medicine, cognitive science and medical systems. These technologies provide MSU, the State of Mississippi and the local community with state-of-the-art resources that facilitate scholarly research, spawn competitive funding, foster project completion, enable high-quality undergraduate and graduate education, enhance impact of outreach, and promote economic development in the state of Mississippi.
ISER - The Institute for Systems Engineering Research is a collaborative effort between the U.S. Army Engineer Research and Development Center and Mississippi State University. The goal of ISER's efforts and products is to mitigate risk, reduce cost and improve efficiency in Department of Defense (DoD) acquisition programs, serve as an additional asset for the state's industrial base for systems engineering related tasks, and create an environment that draws DoD and civilian industry development to the state of Mississippi.
IGBB - The mission of the Institute for Genomics, Biocomputing & Biotechnology is to increase the ability of Mississippi scientists to lead high-throughput, multi-disciplinary projects focused on understanding the biomolecular interactions underlying the diversity, value, and sustainability of species of agricultural, medical, bioenergy, and/or ecological importance. The IGBB provides researchers access to a team of highly-skilled professionals trained in cutting edge genomics, proteomics, and high performance computing principles and techniques. The IGBB team not only generates molecular data using state-of-the-art equipment, but works with investigators to efficiently derive biological knowledge from that data. Specific goals of the IGBB include:
-- Conducting scientific research that meets the needs of society and further enhances the unique strengths of MS State;
-- Attracting and retaining outstanding faculty and students at MS State;
-- Offering researchers at MS State and elsewhere the opportunity to collaborate with the IGBB's genomics, proteomics, and computational biology experts;
-- Helping principal investigators leverage the experience and expertise of the IGBB to make their research programs more productive, increase the number and scientific impact of their publications, and enhance their ability to procure extramural funding;
-- Supporting educational activities that enhance the abilities of students and faculty to succeed in the multi-disciplinary fields of computational biology, genomics, and biotechnology.
NGI - The mission of the Northern Gulf Institute is to conduct high-impact research and education programs in the Northern Gulf of Mexico region focused on integration - integration of the land-coast-ocean-atmosphere continuum; integration of research to operations; and integration of individual organizational strengths into a holistic program. The program shall measurably contribute to the recovery and future health, safety, resilience and productivity of the region, through sustained research and applications in a geospatial and ecosystem context.
---
The HPC² was home to PET, the Center for DoD User Productivity Enhancement and Technology Transfer, whose mission was to bring university research results and expertise to bear in collaborative assistance and training for DoD users as part of the DoD High Performance Computing Modernization Program (HPCMP). The HPCMP included four Major Shared Resource Centers (MSRCs), or "supercomputing" centers: the U.S. Army Research Laboratory (ARL) at Aberdeen, MD; the U.S. Air Force Aeronautical Systems Center (ASC) at Dayton, OH; the U.S. Army Engineer Research & Development Center (ERDC) at Vicksburg, MS; and the Naval Oceanographic Office (NAVO) at Stennis Space Center, MS. The four MSRCs were used by researchers at DoD centers and at universities and companies with DoD contracts across the country. The PET program brought top-level HPC expertise from universities to bear in support of these DoD researchers. The team led by Mississippi State University consisted of 12 universities and 4 industrial partners.

Facilities & Resources

The HPC² facilities include two buildings, the Malcolm A. Portera High Performance Computing Center (HPCC) and the Center for Advanced Vehicular Systems (CAVS) building, within the Thad Cochran Research, Technology, and Economic Development Park adjacent to the Mississippi State University campus in Starkville, Mississippi, as well as the MSU Science and Technology Center (STC) building at the NASA John C. Stennis Space Center (SSC) near Bay St. Louis, Mississippi. The HPCC building is a 71,000-square-foot facility designed in an open manner to facilitate multi-disciplinary interactions; it houses the organization's primary data center. The CAVS building is a 57,000-square-foot facility consisting of numerous office suites, experimental laboratories housing an extensive array of equipment supporting materials, advanced power systems, and human factors research, and a small data center. The STC building at NASA SSC is a 38,000-square-foot facility consisting of office space, classroom space, and a data center. Additionally, the CAVS organization occupies a 24,000-square-foot building near the Nissan manufacturing facility in Canton, MS to support Mississippi and national industries.

Partner Organizations

Abbreviation: HPC
Country: United States
Region: Americas
Primary Language: English
Evidence of Intl Collaboration?:
Industry engagement required?:
Associated Funding Agencies:
Contact Name: William B. (Trey) Breckenridge III
Contact Title: Director, High Performance Computing
Contact E-Mail: trey@hpc.msstate.edu
Website:
General E-mail:
Phone: (662) 325-8278
Address: Mississippi State University, HPC Building, Box 9627, Mississippi State, MS 39762-9627
