Numerical Investigation of Flame Propagation for Explosion Risk Modeling Development
---
Understanding the risk associated with uncontained rocket engine failures is critical to ensuring the safety of crew, personnel, and equipment. While there are several approaches to conducting probabilistic risk analyses for these scenarios, developing and using an engineering-level risk assessment model informed by numerical simulations and/or experimental data is the most efficient approach. The current engineering-level model used in support of NASA’s Space Launch System (SLS) program requires the flame speed as an input. Understanding the flame speed under different conditions is therefore key to determining appropriate blast overpressures and drives the need to further study flame propagation characteristics. This numerical study addresses the need for further flame speed characterization by simulating flame propagation through a hydrogen-oxygen mixture and comparing simulation results with experimental data and engineering model results. The simulations consider variations in initial pressure and velocity distribution to identify the parameters that influence flame speed and to investigate how these parameters may change the way results are used to inform risk assessment models. Pressure and velocity variations are chosen here because they can differ across full-scale scenarios, and because in many experimental setups an underlying velocity distribution results from the induced recirculation used to ensure a homogeneous mixture prior to ignition. Numerical results for the pressure variation cases show that flame speed increases as pressure increases, likely a result of increasing density as well as pressure effects on flame instabilities and the buildup of pressure waves in the confined domain. Adding a velocity distribution prior to ignition significantly increased the flame speed throughout the propagation and resulted in non-constant flame speeds as the flame interacted with underlying flow features. The specific fill and recirculation parameters used in the experiment were unknown, so no firm conclusions are drawn from comparisons with the simulations that included an underlying velocity field. However, it is clear that the presence and character of a non-uniform velocity field at ignition can significantly affect the subsequent behavior and should be characterized. Comparisons with the engineering model show that overpressure is well captured by the model for a given flame speed, but that additional flame speed inputs are needed for non-quiescent scenarios. We conclude that the initial conditions have a non-negligible effect on the flame speed and should be carefully reported for all numerical and experimental studies so that the results can be applied appropriately in risk assessment models. Because the model depends on flame speed as an input, it is critical that these initial conditions are accounted for when modeling different scenarios. Knowledge of these effects ultimately improves not only rocket engine bay risk analyses but also risk assessments for any relatively confined industrial setting where explosion hazards exist. The final paper will include details regarding the CFD simulation process and results, comparisons of the CFD results with a simplified 1-D model being used for vapor cloud explosions for the SLS program, and a summary of conclusions based on the findings.
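The qualitative trends described above can be illustrated with a back-of-the-envelope scaling sketch. The Python snippet below is not the engineering-level model or the CFD setup from the study; the power-law pressure exponent, reference flame speed, and quadratic overpressure scaling are all illustrative assumptions, chosen only to show how a flame speed input propagates into an overpressure estimate.

```python
import numpy as np

# Minimal sketch of the kind of scaling relations at play; the exponents
# and coefficients below are illustrative placeholders, not values from
# the study or from its engineering-level model.

GAMMA = 1.4           # assumed effective ratio of specific heats
A0 = 340.0            # m/s, assumed ambient sound speed
S_REF = 10.0          # m/s, hypothetical reference flame speed at P_REF
P_REF = 101_325.0     # Pa, reference pressure
BETA = 0.3            # hypothetical pressure exponent for flame speed

def flame_speed(p):
    """Power-law pressure scaling, S_f = S_REF * (p / P_REF)**BETA.
    A common functional form; BETA here is a placeholder value."""
    return S_REF * (p / P_REF) ** BETA

def overpressure(p):
    """Order-of-magnitude deflagration overpressure, taken to scale
    quadratically with flame Mach number (weak-source limit)."""
    mach = flame_speed(p) / A0
    return 2.0 * GAMMA * p * mach ** 2

for p in (0.5 * P_REF, P_REF, 2.0 * P_REF):
    print(f"p0 = {p / 1000:7.1f} kPa -> S_f = {flame_speed(p):5.1f} m/s, "
          f"dP ~ {overpressure(p) / 1000:6.2f} kPa")
```

Under these assumed scalings, raising the initial pressure increases both the flame speed and the resulting overpressure, consistent with the trend reported for the pressure variation cases.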
Performance of Shrinkage Estimators for Bioburden Density Calculations in Planetary Protection Probabilistic Risk Assessment
---
Planetary protection (PP) is a discipline that focuses on minimizing the biological contamination of spacecraft to ensure compliance with international policy. The National Aeronautics and Space Administration (NASA) has developed a set of requirements (NPR 8715.24), based on recommendations from the Committee on Space Research (COSPAR), with which each mission must comply regarding both forward and backward planetary protection. Biological cleanliness requirements for missions to target bodies such as Mars include spacecraft assembly controls and direct testing of the microbial bioburden. Constraints for Mars missions include a limit of 5×10⁵ spores at launch and a maximum bioburden density of 300 spores/m² on flight hardware surfaces, while recontamination is prevented by using International Organization for Standardization (ISO) 8 or better cleanroom environments. The data for each component are collected using either swabs or wipes. For each component, a number of samples are collected on one given date or on several dates over the course of the part’s assembly. A single swab covers an area of 0.0025 m², while the area covered by a wipe typically varies between 0.1 and 1.0 m², depending on the geometric complexity and size of the sampled component. After processing in the microbiology laboratory, the samples are deposited in Petri dishes, covered with tryptic soy agar, and incubated for 72 hours to observe growth. Given the cleanliness of the spacecraft, on average 93% of the swabs and 63% of the wipes have a count of 0 colony-forming units (CFU) at 72 hours, resulting in ~85% of the 39,379 Petri dishes yielding 0 CFU. Due to the low CFU counts and small sampled areas, under a Poisson distributional model the bioburden density estimates have inflated variance and wide confidence intervals. Shrinkage estimators are standard tools for dealing with large variance and inconsistency of estimates. This study presents results on the performance of different James-Stein-type (JS) estimators for evaluating the microbial bioburden using data collected for the InSight mission. Different types of shrinkage estimators are considered, and their performance is compared and contrasted with the maximum likelihood estimator (MLE) and with each other. The regions of domination for the different shrinkage estimators are analyzed, and the estimators’ admissibility is considered. The results show that the shrinkage estimators dominate the MLE uniformly and that the JS estimator has the lowest risk among the shrinkage estimators considered.
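To make the estimator comparison concrete, the following Python sketch implements one common James-Stein-type construction for a vector of Poisson means, shrinking variance-stabilized counts toward their grand mean, and compares its squared-error risk against the MLE in a small Monte Carlo. The sample size, mean density range, and the Anscombe-transform construction are illustrative assumptions; they are not the estimators or the data from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

def js_poisson(counts):
    """One James-Stein-type estimator for a vector of Poisson means:
    apply the Anscombe variance-stabilizing transform, shrink toward
    the grand mean with a positive-part JS factor, and back-transform.
    Illustrative construction; the study's estimators may differ."""
    y = 2.0 * np.sqrt(counts + 0.375)      # approx N(2*sqrt(lambda), 1)
    p = y.size
    ybar = y.mean()
    s = max(np.sum((y - ybar) ** 2), 1e-12)
    shrink = max(0.0, 1.0 - (p - 3) / s)   # positive-part JS shrinkage
    y_js = ybar + shrink * (y - ybar)
    return np.maximum((y_js / 2.0) ** 2 - 0.375, 0.0)

# Small Monte Carlo: total squared-error risk of the MLE (raw counts)
# versus the JS-type estimator, for hypothetical low mean densities.
lam = rng.uniform(0.0, 2.0, size=50)
reps = 2000
mle_risk = js_risk = 0.0
for _ in range(reps):
    x = rng.poisson(lam)
    mle_risk += np.sum((x - lam) ** 2)
    js_risk += np.sum((js_poisson(x) - lam) ** 2)
print(f"avg risk  MLE: {mle_risk / reps:.2f}   JS-type: {js_risk / reps:.2f}")
```

Because most PP counts are zero or near zero, simultaneous estimation across many components is exactly the regime where this kind of shrinkage toward a common mean reduces total risk relative to the per-component MLE.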
Probabilistic Modeling of Recovery Efficiency of Sampling Devices used in Planetary Protection Bioburden Estimation
---
Microbial contamination has been of concern to the planetary protection (PP) discipline since the Viking missions in the 1970s. To mitigate this risk and ensure compliance with international policy, the PP discipline continually monitors the microbial bioburden present on spacecraft and associated surfaces. Spacecraft missions destined for other planetary bodies must abide by a set of requirements put forth by NASA based on recommendations from the Committee on Space Research (COSPAR). Compliance with these biological cleanliness requirements is demonstrated by direct sampling of spacecraft hardware and associated surfaces to enumerate the microorganisms present. The PP discipline has employed a variety of tools to perform direct sampling, including four different types of swabs (Cotton Puritan, Cotton Copan, Polyester, and Nylon Flocked) and two different types of wipes (TX3211 and TX3224), which are typically used to sample surfaces no larger than 25 cm² and 1 m², respectively. The sampling efficiency of these devices is a critical parameter used to generate spacecraft-level cleanliness estimates. In this study, we investigate how recovery efficiency differs by inoculum amount and species. This is analyzed across different sampling devices using a set of microbial organisms applied to stainless steel surfaces. Two different recovery techniques were employed, the NASA standard assay and the European Space Agency (ESA) standard assay, along with two different plating techniques: the Milliflex filtration method and direct plating. Data were analyzed by first developing a probabilistic model of the end-to-end experimental process, capturing uncertainty from the inoculation of species onto the coupon through recovery and growth. The model was designed to quantify the mean recovery efficiency, a key metric for understanding the probability that an individual microorganism is recovered and for predicting the number of microorganisms present on a surface. A cost function was developed to compare the recovery efficiency of the various sampling devices and processes and to identify those that provide optimal bioburden estimation capability. The results suggest that the nylon flocked swab and the TX3211 wipe yielded the highest recovery efficiency and optimal bioburden estimation capability. Results from this study will be integrated into the Bayesian statistical framework used to perform bioburden calculations for demonstrating PP requirements compliance.
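A minimal version of such an end-to-end model can be sketched as a two-stage generative process: a Poisson inoculation step followed by a Binomial recovery step, with the per-organism recovery probability estimated from the recovered counts. The inoculum level, efficiency value, replicate count, and flat Beta prior in the Python sketch below are hypothetical placeholders, not the study's actual design or prior.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical values; the study's actual inoculum levels, replicate
# counts, and priors are not given in the abstract.
mean_inoculum = 100.0       # mean organisms deposited per coupon
true_efficiency = 0.35      # per-organism recovery probability
n_coupons = 30

# End-to-end generative model: inoculation -> recovery -> counting.
deposited = rng.poisson(mean_inoculum, size=n_coupons)   # N_i ~ Poisson
recovered = rng.binomial(deposited, true_efficiency)     # R_i ~ Binomial(N_i, p)

# With a flat Beta(1, 1) prior on efficiency p, the Binomial likelihood
# is conjugate given the deposited counts, so the posterior is Beta.
a = 1 + recovered.sum()
b = 1 + (deposited - recovered).sum()
post = rng.beta(a, b, size=10_000)
lo, hi = np.quantile(post, [0.025, 0.975])
print(f"posterior mean efficiency: {a / (a + b):.3f}, "
      f"95% interval: ({lo:.3f}, {hi:.3f})")
```

In this simplified sketch the deposited counts are treated as known; in practice the inoculum itself is uncertain, which is precisely why the study models the full end-to-end process rather than the recovery step alone.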
Transformative benefits emerging from NASA OSMA's evolving Data-Centric and Model-Based Policy framework
---
The evolution from “document-centric” to “data-centric” information, leveraging structured data and model-based approaches, is at the heart of the digital engineering transformation efforts underway across industry and government. These approaches pave the way for data lakes, Authoritative Sources of Truth (ASOTs), and systems-of-systems interoperability, along with the corresponding transformational benefits. Such benefits include increased data availability, data access equity, data traceability, real-time analytics, batch analytics, and (most importantly) acceleration of the time-to-value and time-to-insight associated with engineering products and analyses. For Safety and Mission Assurance (SMA) and Safety and Mission Success (SMS) activities, realizing these benefits is essential if engineers and analysts are to provide vital information when needed to support critical decision making across the entire life cycle. Far too often, such information lags these decision points and/or lacks the robust, integrated knowledge needed, given the inherent barriers that traditional document-centric approaches impose on data, analysis, and reporting. This paper gives an overview of how NASA’s Office of Safety and Mission Assurance (OSMA) is evolving its policies, standards, guidance, and training to eliminate such barriers and realize the benefits emerging in this new digital era. The use and implementation of the concepts driving this transformation, such as Objectives-Driven Requirements, Accepted Standards, Safety and Assurance Cases, data digitization (i.e., ontologies, structured data, and model-centric data), FAIR (Findable, Accessible, Interoperable, and Reusable), and FAIRUST (Findable, Accessible, Interoperable, Reusable, Understandable, Secure, and Trusted), are described in detail herein.

Keywords: Accepted Standards (STDs), Assurance Implementation Matrix (AIM), Automated Program Plan Generator (APPG), Safety and Assurance Cases, Objectives-Driven Requirements, Safety & Mission Assurance (SMA), SMA Plan (SMAP), Safety & Mission Success (SMS) Framework
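As a concrete, and entirely hypothetical, illustration of the document-centric to data-centric shift described above, the Python sketch below recasts an assurance objective as a structured, queryable record rather than prose buried in a document. The field names, identifiers, and `asot://` link scheme are invented for illustration and do not reflect any actual OSMA schema or ontology.

```python
from dataclasses import dataclass, field

# Hypothetical illustration only: instead of a requirement living as
# prose in a PDF, it becomes a structured record that tooling can
# query, trace, and analyze. No actual OSMA schema is implied.

@dataclass
class AssuranceObjective:
    objective_id: str                 # stable identifier (findable)
    statement: str                    # the objective itself
    source_standard: str              # traceability to an accepted standard
    evidence_refs: list[str] = field(default_factory=list)  # links, not copies

obj = AssuranceObjective(
    objective_id="SMA-OBJ-0001",
    statement="Hazard analysis is complete and independently reviewed.",
    source_standard="Hypothetical-STD-1234",
    evidence_refs=["asot://hazard-analysis/rev-C"],
)

# Because the record is structured, a query replaces manual document
# review, e.g., listing evidence that resolves to an ASOT.
print([e for e in obj.evidence_refs if e.startswith("asot://")])
```

Linking to evidence in an authoritative source, rather than copying it into a document, is what enables the traceability and real-time analytics benefits the abstract describes.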