Panelists: Robyn Wilson, Magda Osman, Lesley Walls, Roger Cooke and Anca Hanea
09:00 | Mini Risk Analysis – The first step in learning about Risk and Vulnerability Analysis PRESENTER: Aud Nilsen ABSTRACT. Students attending university courses on Risk and Vulnerability Analysis (RAV) often come from different backgrounds, and many lack understanding of qualitative and quantitative risk analysis. A Mini Risk Analysis (MRA) is an easy method which can be used to overcome the first learning barrier to risk topics. Using MRA can enhance active learning pedagogy. In our examples we use playfulness and creativity as ways to learn about risk analysis. MRA could serve as a risk assessment in a more limited field or as a starting point to indicate where to drill deeper in more complex situations. The students choose one activity and then divide this into separate tasks. This method concretizes the actual situation. Conducting MRA will help students to more easily understand the process of conducting risk analyses. MRA’s simplicity also has the advantage of raising awareness of uncertainty. Using a standard risk matrix is useful for simplification but often leads to the perception of risks as “fixed entities” that appear more controllable. Uncertainty is inherent in every prediction of the future, and hence also in a risk analysis. The uncertainty is often connected to a lack of contextual knowledge. MRA examples from different student groups and experience from municipalities give the students examples of using MRA in different contexts. MRA has the advantage of being simple and not time-consuming. The students get an introduction to the main phases of an ordinary RAV, which makes using MRA a helpful foundation for learning about RAV. |
09:15 | Benefits and challenges of retelling accident case studies in safety education: Insights from a scoping review PRESENTER: Floris Goerlandt ABSTRACT. The importance of safety is widely recognized across industrial and societal domains. With this increased importance of safety, there is a growing need for safety education in university degree programs and in industrial settings. One promising approach to frame the importance of industrial safety in educational settings is accident storytelling. In such lessons, individuals learn about real-world accidents and their underlying causes and consequences through case studies, narrated as stories. Based on a scoping review of the academic literature, this paper provides insights into the benefits and challenges of accident storytelling as a method of learning about industrial safety. It is highlighted that accident storytelling can be an effective tool for promoting safety awareness and improving safety behavior, but also that the approach has limitations which need careful reflection and consideration by educators. This paper has implications for safety education in various domains and provides broad insights especially for safety and risk educators. |
09:30 | Preparedness in school – lessons learned and the way forward PRESENTER: Bjørn Ivar Kruke ABSTRACT. In recent years, there has been a significant increase in violent incidents in Norwegian schools. At the same time, many threats of school shootings and bombings have been posted in the digital domain, targeting various schools in Norway. Norwegian schools and police authorities have issued guidance on contingency planning for severe incidents in kindergartens and educational institutions. The guidance specifies that institutions must plan their preparedness and exercises based on a risk and vulnerability analysis. However, although responsibilities are specified, the guidance does not stress the need to include teachers and pupils in preparedness planning. Thus, this paper aims to study how preparedness planning incorporates teachers and pupils at schools and the impact on the school climate of inclusion in preparedness planning. Data stems from three studies examining the degree to which school staff and pupils are confident in handling unforeseen incidents, focusing on incidents involving ongoing life-threatening violence (Norwegian abbreviation: PLIVO). All studies conclude that the preparedness work is limited to the school's management staff. Teachers and pupils are not engaged in preparedness work. These findings show a need for further development work within relevant pedagogy and curriculum development in schools, where preparedness is included in a new bow-tie diagram. Good emergency preparedness in schools can positively influence the school climate and promote health, inclusion, well-being, learning, and the capacity to respond to an incident/accident. Norwegian schools are required by law to facilitate a safe and sound school climate, making preparedness essential for building and maintaining a good school climate. Our studies indicate that pupils want to be part of the preparedness work but are not involved in it. Involving pupils in emergency planning may broaden their perspective, positively impact emergency planning, and prepare them for responding to an incident/accident. |
09:45 | Fostering Risk Management Skills for Future Sustainability Leaders: An Exploration of Project Management Training in Higher Education PRESENTER: Perseta Grabova ABSTRACT. In an era of rapid change and global challenges, sustainability citizens require transversal competencies to navigate complexity and maintain employability in a competitive job market. Risk management, a critical skill in project management, emerges as a vital competence for addressing uncertainty and adapting to evolving work environments. This study aimed to explore how project management training can develop risk management competencies in higher education students, focusing on its potential to enhance employability skills for sustainability citizens. The research was conducted through a 15-hour project management course within the Master of Science in Risk Management at the University of Tirana, covering topics such as risk management and project organization. A self-evaluation questionnaire assessed students' acquisition of transferable skills before and after the course, framing outcomes in clear, self-reflective statements. Analysis showed considerable progress in students' perceived risk management abilities, with risk management emerging as one of the top competencies students felt they improved, alongside analytical thinking and research skills. Risk management, as developed through these training programs, embodies complex problem-solving, critical thinking, and systems analysis—key components identified by the World Economic Forum (WEF) as crucial for future employability. The study found strong connections between risk management and other key project competencies, such as research abilities, presentation skills, and planning. These findings suggest the potential of project management training, particularly its embedded approach emphasizing experiential and reflective learning, in developing risk management as a crucial employability skill for sustainability citizens. The integration of risk management with other key project competencies highlights its broader applicability in preparing students for complex decision-making in uncertain environments. Future research could explore how these approaches translate to long-term professional outcomes and their applicability across diverse educational contexts. |
10:00 | GAP-analysis: Integral Safety Education at Technical Universities in The Netherlands for the Construction Process Industry ABSTRACT. In this paper, on the one hand we investigated to what extent the field of integrated safety is part of technical studies at the Dutch Technical Universities (e.g. Delft University of Technology, Eindhoven University of Technology and Technical University of Twente). On the other hand, we focused on the requirements of key figures and project team members in the construction process. A GAP-analysis is made between those two perspectives. Currently, integrated safety is not included in studies such as Civil Engineering or Architecture at the Dutch Technical Universities. Often, students from these technical studies become future key figures and project team members in the construction process, who will later be responsible for safety in practice. A basic level of knowledge will help them fulfill their role in terms of safety. This applies to project team members in the infrastructure and construction sectors. Every engineer (and architect) should have knowledge of integrated safety in construction to incorporate this aspect into all phases of the construction process. We investigated safety in technical studies not only on the basis of construction safety. A broader look is taken in the analysis: multidisciplinary and multidimensional. Structural safety, traffic safety, machine safety, electrical safety, and external safety are examples of other safety domains which are a part of this research. Knowledge and attention to these other safety domains, beyond construction safety, are limited among engineers. Finally, in this paper we advise how future engineers can be equipped with integrated safety, making it a part of their daily work. This is done from an ethical point of view. |
09:00 | Safety and Security Risk Mitigation in Satellite Missions via Attack-Fault-Defense Trees PRESENTER: Reza Soltani ABSTRACT. Cyber-physical systems, such as self-driving cars or digitized electrical grids, often involve complex interactions between security, safety, and defense. Proper risk management strategies must account for these three critical domains and their interaction because the failure to address one domain can exacerbate risks in the others, leading to cascading effects that compromise the overall system resilience. This work presents a case study from Ascentio Technologies, a mission-critical system company in Argentina specializing in aerospace, where the interplay between safety, security, and defenses is critical for ensuring the resilience and reliability of their systems. The main focus will be on the Ground Segment for the satellite project currently developed by the company. Analyzing safety, security, and defense mechanisms together in the Ground Segment of a satellite project is crucial because these domains are deeply interconnected—for instance, a security breach could disable critical safety functions, or a safety failure could create opportunities for attackers to exploit vulnerabilities, amplifying the risks to the entire system. This paper showcases the application of the Attack-Fault-Defense Tree (AFDT) framework, which integrates attack trees, fault trees, and defense mechanisms into a unified model. AFDT provides an intuitive visual language that facilitates interdisciplinary collaboration, enabling experts from various fields to better assess system vulnerabilities and defenses. By applying AFDT to the Ground Segment of the satellite project, we demonstrate how qualitative analyses can be performed to identify weaknesses and enhance the overall system’s security and safety. This case highlights the importance of jointly analyzing attacks, faults, and defenses to improve resilience in complex cyber-physical environments. |
09:15 | A Security Twin to Defeat Intrusions in Cyber Physical Systems PRESENTER: Vincenzo Sammartino ABSTRACT. Cyber risk assessment and management have to face a dynamic risk landscape so that probabilities of interest cannot be estimated using historical data. This paper advocates the adoption of synthetic data generated by combining adversary simulation with digital twin technology. A security twin of a cyber physical system (CPS) extends an inventory of the system with information on current vulnerabilities and attacks. By describing threat agents through other twins, we can supply the twins with a platform that simulates the strategies of threat agents to discover how they exploit vulnerabilities and implement their intrusions. To analyze alternative scenarios, a Monte Carlo approach is adopted that runs multiple independent simulations. This produces an intrusion graph that can faithfully describe rapidly evolving environments and results in more accurate risk management and better resilience of the system in spite of data shift. Initial experimental results support the effectiveness of security twins in accurately modeling intrusions. The synthetic data produced by the simulations can also be used to train AI tools to defend a CPS. |
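A minimal sketch of the Monte Carlo idea outlined in this abstract, under assumed inputs: the reachability map, node names, and run count below are hypothetical illustrations, not the paper's security-twin platform. Many independent adversary simulations are aggregated into an edge-weighted intrusion graph.

```python
import random
from collections import defaultdict

# Hypothetical reachability map: node -> exploitable next steps in the CPS inventory.
reachable = {
    "internet": ["vpn_gateway", "phishing_workstation"],
    "vpn_gateway": ["historian"],
    "phishing_workstation": ["historian", "engineering_ws"],
    "engineering_ws": ["plc"],
    "historian": ["plc"],
    "plc": [],
}

def simulate_intrusion(start="internet", target="plc", max_steps=10):
    """One adversary run: a random walk over exploitable links until the target or a dead end."""
    path, node = [], start
    for _ in range(max_steps):
        candidates = reachable.get(node, [])
        if not candidates:
            break
        nxt = random.choice(candidates)
        path.append((node, nxt))
        node = nxt
        if node == target:
            break
    return path

# Monte Carlo: aggregate independent runs into an edge-weighted intrusion graph.
edge_counts = defaultdict(int)
runs = 10_000
for _ in range(runs):
    for edge in simulate_intrusion():
        edge_counts[edge] += 1

# Edge frequencies approximate how often each exploit step appears in intrusions.
for edge, count in sorted(edge_counts.items(), key=lambda kv: -kv[1]):
    print(edge, round(count / runs, 3))
```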
09:30 | Game theory-based defense strategies against coordinated attacks on multi-state interdependent critical infrastructures PRESENTER: Maria Valentina Clavijo Mesa ABSTRACT. In an increasingly interconnected world, Critical Infrastructures (CIs) delivering essential services, like energy and water, share vulnerabilities through their interdependencies of many kinds, including physical, operational and others. These interdependencies generate systems of systems exposed to coordinated attacks that can lead to cascading failures across infrastructures. Recognizing the need to protect CIs, this work presents an attack-defense game model to determine optimal defense strategies for multi-state interdependent CIs. The model combines game theory and network theory to assess the topological and operational features of the interdependent infrastructures considered. The Dynamic Input-Output Inoperability Model (DIIM) is used to estimate the operational impact of disruptions on the CIs, and critical nodes are identified based on their topological importance, their operational role and their influence on the interdependencies in the system of systems. Metaheuristics are applied to effectively identify the CIs' pivotal nodes as the most likely targets of coordinated attacks. The model considers the risk attitude of both attackers and defenders by evaluating their respective game payoffs with Cumulative Prospect Theory (CPT). A case study regarding a system of systems made of a power grid and a water network is used to illustrate the application of the proposed model with the aim of determining optimal defense strategies to maximize the operability of the two interconnected CIs. The proposed attack-defense game model offers decision-making support for infrastructure owners to prioritize investments in protecting critical nodes and for system-of-systems planners to assess potential service losses in the event of coordinated attacks in a community, guiding effective resource allocation and protection strategies. |
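To illustrate how risk attitudes can enter the game payoffs, the sketch below evaluates a simple binary attack/defense prospect with Cumulative Prospect Theory. The parameter values are the commonly cited Tversky-Kahneman estimates and the payoff numbers are hypothetical; neither is taken from the paper.

```python
def value(x, alpha=0.88, beta=0.88, lam=2.25):
    """CPT value function: diminishing sensitivity over gains, loss aversion over losses."""
    return x ** alpha if x >= 0 else -lam * (-x) ** beta

def weight(p, gamma=0.61):
    """Inverse-S probability weighting function."""
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

def cpt_binary(gain, loss, p_gain, gamma_gain=0.61, gamma_loss=0.69):
    """CPT value of a prospect with one gain (probability p_gain) and one loss."""
    return (weight(p_gain, gamma_gain) * value(gain)
            + weight(1 - p_gain, gamma_loss) * value(loss))

# Hypothetical defender payoff for protecting a node: keep 100 units of operability
# with probability 0.7, otherwise lose them.
print(cpt_binary(gain=100, loss=-100, p_gain=0.7))
```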
09:45 | A stochastic defender-attacker-defender model for smart system resilience enhancement against hybrid cyber-physical risks ABSTRACT. With the advancement of digital and communication technology, modern critical infrastructure systems, e.g., power grids, tend to be controlled automatically and remotely through cyber systems. Such cyber-physical systems (CPS) promote more efficient operations but induce synergetic physical and cyber vulnerabilities in coupling network structures. These hybrid threats necessitate the risk analysis and resilience enhancement of CPS from the cyber-physical perspective. In this paper, we propose a stochastic defender-attacker-defender model with the consideration of both uncertain physical disruptions and malicious cyber attacks to secure CPS against hybrid risks. The developed model provides decision-makers with resilient allocation and operation strategies for distributed energy resources and intelligent firewalls in CPS. The most threatening cyber attack scenario, which targets the availability and integrity of information systems simultaneously, is formulated to realistically simulate cyber risks. An efficient solution algorithm is developed to address the multi-level stochastic optimization challenge. Detailed case studies are conducted in the modified IEEE 13-node and 123-node systems with their communication systems to showcase the effectiveness of the proposed approach. The analytical results reveal the interactions and characteristics of concurrent cyber-physical threats and offer valuable managerial insights for decision-makers to deploy defense strategies. |
10:00 | Cyber-physical Studies for Smart Grid Sustainability and Resilience PRESENTER: Denys Mishchenko ABSTRACT. With the integration of renewables, power system infrastructure is becoming more digitally connected to ensure a safer, more efficient, and decarbonized future. The challenge is that the more connected the infrastructure becomes, the more vulnerable it becomes. Geopolitical tensions, with increased risks of cyber-attacks on critical infrastructure, and the importance of security of energy supply are shaping the power system this decade. To form and provide a solid foundation, and to exploit the full range of system benefits from consumer engagement strategies to the use of flexible mechanisms in an efficient energy system, it is necessary to advance digitalization through an energy system integration strategy, including data exchange. From the perspective of control and stability of converter-dominated systems and the introduction of microgrids into the ever-expanding grid, the development of scalable and reliable control schemes is an urgent need. From the overall smart grid perspective, energy professionals are ready to offer different solutions, for different parts of the grid and voltage levels, to keep the lights on towards a reliable and resilient future for society. Different aspects of cyber-physical mechanisms for flexible, reliable and resilient smart grid utilization are discussed in the paper. The key factors and barriers for critical infrastructure resilience and sustainability are summarised from the perspective of power system performance, behaviour and processes. |
09:00 | Introduction to the history of ESRA Norway ABSTRACT. Introduction to the history of ESRA Norway |
09:15 | Applicability of performance measures for production performance analysis in oil and gas industries including lower carbon activities PRESENTER: Stefan Landsverk Isaksen ABSTRACT. Production performance analysis includes methods for predicting production availability, to identify bottlenecks and assess different alternatives in design. Such methods have traditionally been used in the oil and gas industry and are recently also considered for use in renewable energy industries. However, despite principles generally being transferable across different industries, some methods might need adjustment to be applicable for the range of renewable energy. The need for adjustment is also reflected in the extended scope of ISO 20815, where the next revision aims at giving guidance on production assurance and reliability management in the oil and gas industry including lower carbon activities. Some of the relevant performance measures for various lower carbon activities are addressed in this paper. The objective is to consider the applicability of existing measures outlined in ISO 20815, by discussing practical implications and attempting to establish definitions that can be applied across all industries, while addressing nuances needed to reflect unique aspects of each industry. For example, production availability is generally defined with reference to planned or potential production. In the oil and gas industry, the reference has typically been given by production profiles. Production profiles as a reference would also be meaningful in industries like wind and solar, but they are more volatile and must be handled somewhat differently, indicating a need to calculate production availability differently from the current standard. Selected industry cases are used as examples to demonstrate the feasibility and coherence of performance measures in production assurance analyses within various energy industries. |
09:30 | How do differing risk perceptions between public and private sectors in Norway influence the security of critical maritime infrastructure? PRESENTER: Richard Utne ABSTRACT. After the suspected sabotage of the gas pipelines Nord Stream 1 and 2 in the Baltic Sea in September 2022, Norwegian authorities were criticized for deficient protection of Norwegian oil and gas facilities. Private sector experts argued that critical maritime infrastructure, vital for upholding societal security, was vulnerable to sabotage. Private companies in the oil and gas sector on the Norwegian continental shelf asked for measures against this perceived threat. Although the government emphasized that there was no direct security threat to Norway, it nevertheless decided to strengthen security measures to protect critical maritime infrastructure. This raises the question: How do divergent risk perceptions between private and public sectors shape security strategies for critical maritime infrastructure? This article examines how differences in risk perception between private and public stakeholders influence the securing of critical maritime infrastructure. |
09:45 | Optimizing risk communication to ease risk management PRESENTER: Coralie Esnoul ABSTRACT. Risk management can be difficult for partners and customers to grasp when defining tasks and responsibilities in development projects. What are the risks related to the product? What are the concerns regarding the risks related to the process? The overall project? Or the risks that the end-users will have to overcome when integrating, operating and maintaining solutions? It becomes even more complex when stakeholders have different experiences with risk management, different existing infrastructures, different ways of dealing with needs for competences or technologies, or are not even prepared to take ownership of their own risks. With the goal of addressing such challenges, a method for simplifying risk collection and making it practical was developed for the European project E-LAND (EU Horizon 2020) in 2019. The aim is to increase risk understanding by focusing on risk communication to enable the stakeholders to take ownership of their own risks. The method has already been applied in different European projects, in internal company assessment activities and in different application domains (energy, digitization, AI and data management). This article presents an update on the method, and what difficulties and changes were necessary to adapt the communication to various parameters (application domain, type and knowledge of the partners, ways of working, etc.). The paper describes the main challenges encountered when performing risk management (boundaries of the study, difficulties in convincing the project management, improvements in templates for the application, etc.). This evolution is meant to optimize the overall process of risk identification, highlight the main risks and help the decision makers to reach agreements and mitigate their risks with an adapted solution. In the end, the feedback received from the risk manager and the partners shows that risk collection is seen as becoming easier and that risk ownership and understanding of risks have improved. |
09:00 | Constructing binary decision diagrams using machine learning ABSTRACT. Binary decision diagrams are a highly popular method for calculating system reliability. By representing the structure function of a binary monotonic system as a binary decision diagram, the calculation of the system's reliability can, in principle, be performed efficiently. However, constructing such diagrams can still be challenging. To ensure that calculations are done quickly, it is important that the diagrams are as compact as possible. In this article, we will show how binary decision diagrams can be constructed using machine learning. The method assumes the existence of a dataset with corresponding values of component states and system states. Such a dataset can easily be generated at any size if the structure function is known and can be calculated efficiently. However, the method can also be used to approximate an unknown structure function based on similar experimental data. The number of possible component states naturally grows exponentially with the number of components in the system. Consequently, if the number of components in the system is high, it will, in practice, not be possible to obtain sufficient data to perfectly describe the system's structure. In the article, we will compare different strategies for handling this problem. Specifically, it is of interest to compare methods that aim to approximate the structure function as accurately as possible with methods that instead focus on estimating system reliability as accurately as possible. The methods will be illustrated with a range of examples. |
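The data-driven idea in this abstract can be sketched under simplifying assumptions: a known bridge-system structure function generates the dataset of component and system states, a decision-tree classifier stands in for a learned binary decision diagram, and the surrogate is then used to estimate system reliability. The component reliabilities and sample sizes below are hypothetical, not from the paper.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

def structure(x):
    """Bridge system: minimal paths {1,2}, {4,5}, {1,3,5}, {2,3,4} (0-indexed columns)."""
    return ((x[:, 0] & x[:, 1]) | (x[:, 3] & x[:, 4])
            | (x[:, 0] & x[:, 2] & x[:, 4]) | (x[:, 1] & x[:, 2] & x[:, 3]))

# Dataset of component states and the resulting system states.
X = rng.integers(0, 2, size=(20_000, 5))
y = structure(X)

# A binary decision tree over component states plays the role of the learned diagram.
tree = DecisionTreeClassifier().fit(X, y)

# Reliability estimate: sample component states from their (hypothetical) reliabilities
# and average the surrogate's predicted system state.
p = np.array([0.95, 0.90, 0.85, 0.90, 0.95])
samples = (rng.random((50_000, 5)) < p).astype(int)
print("estimated system reliability:", tree.predict(samples).mean())
```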
09:15 | Machine Learning-Driven Prediction of Consumers' Pre-purchase Safety Behaviors in Online Shopping Malls PRESENTER: Kenichi Miura ABSTRACT. As online shopping malls play an increasingly crucial role in consumers’ daily lives, large amounts of consumer and transactional data have become available. While current machine learning applications in e-commerce focus primarily on enhancing customer experience, increasing sales, and providing personalized recommendations, the analysis of consumer risk behaviors remains underexplored. This study addresses that gap by predicting consumers’ pre-purchase safety behaviors to enable the development of personalized safety education programs, ultimately helping prevent unsafe or non-compliant product purchases. We utilize an online survey dataset, which includes consumer demographics, newly defined safety knowledge levels, and reported safety practices—such as checking reviews, monitoring public alerts, and verifying sellers. Five machine learning models were compared: Linear Regression, Random Forest, Neural Network, XGBoost, and SVM. Results from the model comparison indicate that SVM outperforms the other methods, achieving the lowest mean absolute error in numerical predictions and the highest accuracy and AUC in binary classifications of safety behaviors. These findings highlight the influential role of consumer safety knowledge and demographics in shaping pre-purchase risk decisions. Based on the SVM model’s predictions, we propose personalized consumer safety education initiatives, such as pre-purchase pop-ups or e-mails, that online mall operators can implement to promote safer purchasing decisions. The study demonstrates the feasibility and effectiveness of machine learning in identifying high-risk consumers, offering valuable insights for enhancing product safety awareness and fostering safer e-commerce environments. |
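A minimal sketch of the kind of model comparison described, run on synthetic data rather than the survey dataset: the generated features stand in for demographics and safety-knowledge scores, and AUC is estimated by cross-validation.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

# Synthetic stand-in for consumer features and a binary safety-behavior label.
X, y = make_classification(n_samples=1000, n_features=10, random_state=0)

models = {
    "logistic_regression": LogisticRegression(max_iter=1000),
    "random_forest": RandomForestClassifier(n_estimators=200, random_state=0),
    "svm": SVC(kernel="rbf", probability=True, random_state=0),
}

# Compare models on cross-validated AUC.
for name, model in models.items():
    auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean()
    print(f"{name}: mean AUC = {auc:.3f}")
```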
09:30 | Making Sense of Dynamic PSA Results: A Hybrid Approach PRESENTER: Ines Mateos Canals ABSTRACT. An integrated dynamic probabilistic safety analysis (IDPSA) combines the enhanced realism of a deterministic safety analysis (DSA) with the advantages of a probabilistic safety analysis (PSA). The GRS software tool MCDET (Monte Carlo Dynamic Event Tree) for dynamic PSA makes it possible to analyze and quantify the influence of (aleatory and epistemic) uncertainties on the behavior of dynamic systems over time. It can be used both to detect unforeseen accident sequences and to quantify the dependency of certain end state scenarios on the respective uncertain input parameter(s). The effects of the high-dimensional parameter space induced by state and time variations of events are simulated and represented using a Monte Carlo approach in combination with the dynamic event tree simulation. This leads to large samples of event trees and time-dependent scenarios, requiring state-of-the-art data analysis methods to handle the amount of data generated. This paper introduces how data analysis and machine learning can be used together with system knowledge to extract and condense the relevant information, estimate safety margins, and determine the most influential discrete and continuous parameters. Furthermore, it is outlined how various techniques can be combined using an example application for an accident during mid-loop operation. It will also be illustrated how interactive data visualization can be applied in the process of understanding the system processes and component interactions leading to the time series derived in an IDPSA. |
09:45 | Physics-informed Neural Networks used for Structural Health Monitoring in Civil Infrastructures: State of Art and Current Challenges PRESENTER: Vaibhav Gupta ABSTRACT. Structural Health Monitoring (SHM) is a fundamental task in the life-cycle assessment and management of civil infrastructures, specifically dams, locks, bridges, and roads. It aids in cost reduction, facilitates the early detection of degradation processes, damages, and structural deficiencies, ensures timely maintenance, and provides early risk warnings. SHM is directly related to the concept of Digital Twin, which is usually defined as a virtual replica of the physical asset. On the one hand, SHM provides the data for the implementation of digital twins, while on the other hand, digital twins can improve the effectiveness of SHM and support data analysis. Together, they represent a powerful combination for managing and maintaining critical infrastructure. A hybrid approach has become increasingly established in recent years, which comprises a combination of physics-based models and data-driven techniques. This approach mitigates the constraints of both models to align the digital twin's behavior more closely with that of the corresponding physical asset. This study explores the hybrid modeling framework known as physics-enhanced machine learning for forecasting potential structural damage. Among the various hybrid modeling approaches, we focus on physics-informed neural networks (PINNs) and their applications in SHM of civil infrastructures. This study provides a comprehensive classification of research employing the PINNs architecture and critically evaluates its associated limitations. Additionally, we explore advanced deep learning architectures that can integrate PINNs within their computational frameworks to enhance SHM performance by addressing its limitations. This work is a foundational reference for understanding state-of-the-art advancements in PINNs for SHM applications. |
09:00 | Functional failure mode, effects and resilience analysis (FMERA) to determine functionality gaps of today’s rotor blade on-site inspection and repair PRESENTER: Julia Rosin ABSTRACT. Already today, the average rotor blade length of newly installed wind turbines exceeds 80 m in industrialized countries. Geometrical challenges furthermore comprise large circumferences close to the nacelle, large differences in circumference towards the rotor blade tip, and nacelle heights reaching and far exceeding 150 m. Thus, the rated power generated per average operation day increases further, reaching the order of 4 MW even for slow-wind turbines with their comparatively large rotor blades. However, along with generated electric power revenues of the order of, e.g., 10k€ a day in Europe, ever-increasing standstill costs arise. Thus, the ambition is to strongly minimize and control inspection and maintenance downtimes and costs. The present article presents a novel functional failure mode, effects and resilience analysis (FMERA) approach on an extended system level that includes on-site inspection and repair capabilities, to identify key missing functional capabilities. Missing capabilities are additionally assessed using the resilience dimensions preparation, detection and prevention, absorption, response and recovery, and adaptation and learning, to analytically assess the criticality of the identified capability gaps. The article identifies the main physical access and inspection capability gaps of today’s solutions, as well as potential future technological solutions that are expected to close these gaps. The solution space includes operational technician teams, main operation and inspection times of wind turbines, as well as site access considerations including offshore. Furthermore, technologies much beyond the state of best practice are assessed, e.g. glass fiber reinforced plastics, carbon fiber materials, and alternatives to traditional steel ropes. It is discussed how the proposed FMERA system-analytical assessment and optimization process could be further improved, both for the present sample domain of rotor blade inspection and repair capabilities and for further application options. |
09:15 | Application of innovative methods and tools in experiential training of workers in confined spaces PRESENTER: Loriana Ricciardi ABSTRACT. An examination of accidents involving workers in confined spaces has shown that for each accident that occurs there are often multiple injuries or victims. Furthermore, if the injured person has been in contact with toxic products or has remained too long without oxygen, the resulting damage usually remains permanent. Education and training can reduce the negative impact of risk factors present in these kinds of environments through experience and knowledge of the operating modes, to manage not only the ordinary activity in these spaces, but also high-risk emergency situations for the individual. It is now established that experiential training, based on learning by doing and taking into account the psychophysical aptitudes required of workers, is essential for the proper management of their own safety and that of their colleagues. Therefore, INAIL researchers, in a logic of prevention through innovation, designed and built a physical simulator to replicate in an effective and protected way all types of risk and consequent danger to workers operating in these environments. The simulator is built to alter the cognitive conditions of the subjects who use it, making them experience extremely realistic situations of risk and consequent danger typical of confined environments (poor visibility, cramped spaces, communication difficulties, poor ventilation and emergency rescue). Additionally, to make training more realistic and immersive, INAIL researchers, in collaboration with professors of the Engineering faculty of the University of Naples, are working on the reconstruction of confined spaces in virtual and augmented reality, to train the operator to deal with potentially dangerous situations in an environment which, although virtually reconstructed, is as similar as possible to the one in which he will actually operate. A concrete example of the use of these techniques is the subject of this work. |
09:30 | Exploring systemic safety in a port community PRESENTER: Maija Nikkanen ABSTRACT. Maritime transport and ports are vital to the global economy and essential for the security of supply. A wide range of actors and organizations collaborate to ensure the smooth and safe operation of ports. In addition to interacting with each other, the human actors engage with various machinery and port infrastructure. The increasingly extreme weather conditions also impact the operational environment. This systemic interconnectedness of people, technology, and environment is key to understanding how ports function, particularly when faced with potential disruptions that may be caused by extreme weather, pandemics, or technological and organizational challenges. We address this issue using a systemic safety perspective, grounded in systems theory. The aim of the paper is to 1) ask what systemic safety could mean in a port environment and community, 2) explore what kind of systemic links are found between the safety themes in our data, and 3) consider how these findings should be taken into account when developing safety in ports. To achieve this, we conducted 14 interviews with experts in organizations operating as part of one Finnish port community. We summarize the interview data into statements and group them into eight safety themes: 1) digitalization & technology, 2) development, monitoring & reporting, 3) COVID-19 pandemic, 4) education, 5) cargo safety, 6) weather, 7) safety culture, and 8) communication & cooperation. We categorize each statement as a perceived strength, problem area, or something in between, and identify the secondary themes they address. Finally, we quantitatively analyze the strengths and types of links between the themes, hypothesizing that the analysis can help us understand what constitutes the collective systemic safety of the port community. Based on the results, we aim to reflect on how port safety could be improved by addressing these systemic interactions. |
09:45 | RAMS is not enough! The design of a software integration risk analysis matrix (SIRAM) for assessing the impact of software integration on physical system performance PRESENTER: Arno Kok ABSTRACT. Physical assets are increasingly digitally enhanced using software and associated information technologies. These added functionalities make them more complex to engineer and maintain as they depend on additional software and information technology components. However, the available RAMS analysis methodologies do not explicitly include software application and integration. We propose a software integration risk analysis matrix (SIRAM) as an extension of the current RAMS methodology, to assess the effect of software on overall system performance. This extension can aid decision-making by indicating the expected impact of software integration on system performance and maintenance needs, and is developed using the design science research methodology (DSRM). A case study within the Dutch railways served as the basis for the design and testing of the proposed matrix. The testing shows that the proposed software integration risk analysis matrix can add value by ensuring that critical software impacts become part of the system integration process. |
10:00 | Comparison Between Baseline and Recovery ECG Data in an Experiment with an Unmanned Aerial Vehicle Human-Machine Interface PRESENTER: Andrew Sarmento ABSTRACT. This research undertakes a comparative analysis of baseline and recovery Electrocardiogram (ECG) data to assess the cardiovascular effects of cognitive stress on Unmanned Aerial Vehicle (UAV) operators. It involves 24 subjects from diverse aviation backgrounds who participated in simulated UAV operations. The research aims to evaluate physiological responses by analyzing ECG data collected before and after stress-inducing activities, utilizing statistical methods such as the Z-transform for data normalization and the Student’s T-test to analyze differences between pre- and post-simulation heart rate variability (HRV). The findings indicate significant changes in HRV as a result of stress during the simulations, underscoring the importance of monitoring physiological responses to better understand cognitive workload. The experiment involved two flights with different Human-Machine Interface (HMI) configurations, such as voice command use and multi-operator settings, to simulate stress conditions. Baseline ECG signals were recorded before the simulation, and recovery signals were recorded afterward, with heart rate data analyzed through statistical methods. The results demonstrated consistent differences between the baseline and recovery states for most participants, indicating that cognitive stress substantially affected cardiovascular metrics. As future work, investigations could expand to include other physiological indicators, such as galvanic skin response and eye-tracking data, for a more holistic view of cognitive stress in UAV operations. Additionally, artificial intelligence models could be developed to better interpret these physiological responses, ultimately enhancing the design of human-machine interfaces in high-stress environments. |
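A minimal sketch of the statistical treatment mentioned in the abstract, using simulated numbers in place of the experiment's recordings: HRV values for the 24 subjects are z-normalised and the baseline and recovery states are compared with a paired Student's t-test.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical RMSSD-like HRV values (ms) for 24 subjects, baseline vs. recovery;
# the simulated shift assumes stress lowers HRV on average.
baseline = rng.normal(45, 8, 24)
recovery = baseline - rng.normal(6, 4, 24)

# Z-transform across both recordings to put them on a common scale.
pooled = np.concatenate([baseline, recovery])
z_baseline = (baseline - pooled.mean()) / pooled.std(ddof=1)
z_recovery = (recovery - pooled.mean()) / pooled.std(ddof=1)

# Paired Student's t-test between the baseline and recovery states.
t_stat, p_value = stats.ttest_rel(z_baseline, z_recovery)
print(f"paired t-test: t = {t_stat:.2f}, p = {p_value:.4f}")
```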
09:00 | Expanding Theorizing about Organizations’ Systems: Conceptualizing a Risk Position as a System Element ABSTRACT. Risk influences organizational structures and processes in ways that have important consequences. Prior research suggests actors engage in strategic risk positioning that orients organizations in relation to uncertainty to increase possibilities for firm survival. Although strategic risk positioning takes place within organizational social and structural contexts that are established elements of organizations’ systems, the theorizing is not integrated. Rather, theorizing about organizations’ systems characterizes them as comprising strategy, structure, culture, leadership, and high-performance work practices. By taking a typological approach, and drawing on studies of risk, positioning, and organizational systems, I develop the concept of a risk position that is an element of an organization’s system. I define a risk position as a unit or organizational orientation toward the potential for loss resulting from risk exposure and the significance of the objective that creates exposure to loss due to the pursuit of opportunity. I develop a typology of risk positions that describes four types in relation to these two dimensions. I explain how the risk position concept has interdependence with other system elements. In addition, I show how considerations of risk positions provide managers with language and meaning that enables intentional risk positioning and contributes to organizational performance. My modeling contributes to studies of risk, organizations’ systems, and performance. |
09:15 | Managing Human Factors: An Innovative Framework for Human-Robot Collaboration: Insights into Interaction Modalities Causing Workload PRESENTER: Xiranai Dai ABSTRACT. This study presents a systematic literature review examining the impact of interaction modalities on workload in human-robot collaboration within Industry 5.0. Using a framework inspired by Wickens' Multiple Resource Theory, it highlights the negative impact of linguistic tasks and the positive effects of tactile feedback on workload. Input-assist interactions, particularly in visual-spatial contexts, show potential for optimization, while autonomy and environmental factors like robot size and speed further influence workload. The findings suggest practical strategies for enhancing collaboration and call for exploring underutilized modalities and complementary metrics to improve workload assessments in the future. |
09:30 | Hand Gesture Identification for Soft Controller regarding Human Behaviours PRESENTER: Zhengji Wu ABSTRACT. In the context of the Internet of Things (IoT), gesture control offers a natural and convenient interaction method. Compared to traditional physical buttons or touchscreens, gesture control is more intuitive and flexible, making it highly suitable for smart clothing applications. This study focuses on analyzing the reliability of PET (Polyethylene terephthalate)-based piezoresistive thin-film sensors for hand gesture recognition, demonstrating their potential for smart clothing. A survey identified 13 common hand gestures frequently used in daily activities. Experimental results showed that the sensors effectively recognized these gestures, achieving a recognition accuracy of 99.4% through neural network modeling with the original design of 25 sensors. To simplify the design and improve reliability, a greedy algorithm was used to find a locally optimal solution, reducing the number of sensors from 25 to 6 while maintaining a recognition accuracy of 97%. This optimization significantly reduced system complexity, lowered costs, and made the product more environmentally friendly. In reliability testing, the original 25-sensor design had a failure rate of 7.69×10⁻⁵, with the first failure occurring after 13,000 uses. The optimized design with 6 sensors exhibited a slightly improved failure rate of 7.64×10⁻⁵, with the first error appearing after 13,082 uses. As testing continued, error fluctuations increased in both layouts, indicating that long-term sensor performance degrades over time. However, the optimized design notably enhanced the system's durability and reliability. In conclusion, this study confirms that PET-based sensors are not only reliable but also benefit from sensor quantity optimization, improving system durability and cost-effectiveness. These qualities make them highly suitable for integration into smart clothing’s remote-control systems, offering a lightweight, durable, and efficient solution for future innovations. |
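The greedy reduction idea can be sketched as backward elimination: at each step the sensor whose removal costs the least cross-validated accuracy is dropped, until six remain. The data, classifier, and binary (rather than 13-gesture) target below are stand-ins, not the study's setup.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for 25 sensor channels and a binary gesture label.
X, y = make_classification(n_samples=600, n_features=25, n_informative=8, random_state=0)

def accuracy(columns):
    """Cross-validated accuracy using only the given sensor columns."""
    model = LogisticRegression(max_iter=1000)
    return cross_val_score(model, X[:, columns], y, cv=3).mean()

selected = list(range(25))
while len(selected) > 6:                      # target sensor count from the abstract
    scores = {c: accuracy([s for s in selected if s != c]) for c in selected}
    drop = max(scores, key=scores.get)        # sensor whose removal hurts accuracy least
    selected.remove(drop)
    print(f"kept {len(selected)} sensors, accuracy = {scores[drop]:.3f}")

print("selected sensors:", sorted(selected))
```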
09:45 | Human performance and reliability in a complex seismic accident scenario at a nuclear power plant PRESENTER: Ilkka Karanta ABSTRACT. A scenario where a nuclear power plant has experienced an earthquake and lost its automation systems and main control room is improbable but exceptionally severe. In such a scenario, the mission of the operators is to steer the plant to a safe and as stable a state as possible, and to secure residual heat removal. In this paper, the scenario is described. Factors affecting operator performance in the scenario are considered. A human reliability analysis model for the scenario is proposed, and its main features outlined. The model consists of the treatment of timings and of human errors. Concerning timings, an activity network model is proposed. Concerning human errors, special features of the scenario are treated through performance shaping factors (PSF); central PSFs are identified and factors affecting them are described. |
10:00 | Human Factors and Design Principles for Safe Remote Operations in the Petroleum Sector PRESENTER: Mina Saghafian ABSTRACT. The petroleum sector is a safety-critical sector that is increasingly investing in automation and remote operation technologies to increase the safety and efficiency of its processes. To ensure the safest and most resilient transformation to these technological developments, the petroleum sector should base the process on human factors approaches for an optimal system design that can deal with the increased complexities of autonomous systems. While most research focuses on the investigation of failures and accidents in complex systems, our aim in this paper is to explore the design strategies that enhance overall system performance. We consider this a successful design. The research question that we posed was: what human factors principles can be deployed in interactive automation and remote operation systems design in the petroleum sector? This paper is part of the Meaningful Human Control (MAS) project, where a larger literature review was conducted to identify successful design principles across various sectors. This article zooms in on the petroleum sector literature. Following a set of inclusion criteria, we selected a representative sample of articles to be included in the analysis. A total of nine articles were selected for full-text analysis, using the thematic analysis method. Interventions deployed and recommended in the petroleum sector were derived from the articles. The results showed the importance of the pre-design stage and of including the right expertise early on. Furthermore, Human Centred Design guidelines should be applied. All stages must be peer-reviewed, and verification and validation tests must be conducted throughout the design process. In addition, organizational human factors principles were found to be an important part of advancements in successful automation and remote operation in the petroleum sector. |
09:00 | How crises solve organisations: a case study from the Covid-19 pandemic PRESENTER: Torgeir Kolstø Haavik ABSTRACT. How organisations work is not so easy to grasp under normal conditions, when formalised work processes, routines and division of labour tend to overshadow and black-box the adaptations and sensemaking taking place in organisational life. During crises, however, black boxes in organisations tend to be opened, allowing for reflexive inspection. This paper is based on a study of a municipal organisation during the Covid-19 pandemic. The objective of the paper is to investigate how crises can change organisations beyond the limited realm of preparedness and crisis management, and to introduce a niche across interrelated research literatures where this topic can be further pursued. Theoretically, we draw on safety and crisis literature, in combination with research literature on organisational innovation. For around 18 months, we performed in-depth studies of a Norwegian urban municipality’s adaptation to continuity and leadership challenges. Results from the study shed light on two central aspects of how crises affect public governance organisations. A central finding is that long-standing emergency management principles and their preconditions are put to test. Not only do organisations solve crises, but crises also solve organisations; during crises, organisational potentials become visible and give rise to adaptation of organisational structures and work forms, some of which may outlast the crises. |
09:15 | Mitigating disruptions: Assessing the role of regulatory measures in fuel supply chain resilience during floods in Brazil PRESENTER: Jardel F. Duque ABSTRACT. Climate change has emerged as one of the most pressing global challenges, with extreme weather events increasingly disrupting supply chains worldwide. In April 2024, catastrophic floods in Rio Grande do Sul, Brazil, severely affected the fuel distribution infrastructure, prompting regulatory intervention through temporary relief measures. This paper presents a spatial competition model to assess the resilience of the fuel supply chain and to evaluate some of the regulatory responses. We develop a three-echelon supply chain model that incorporates diesel and biodiesel suppliers, distributors, and retailers, and simulate market dynamics through iterative price updates while considering transportation costs, capacity constraints, and mandatory biofuel blend requirements. Our analysis reveals that supply overcapacity significantly influences price stability, with tighter capacity leading to higher prices. When simulating the removal of a major biodiesel supplier - mirroring real events that occurred - our results suggest that reducing mandatory biodiesel content may have had unintended consequences, potentially increasing overall fuel costs to retailers. These findings demonstrate the complex interplay between regulatory interventions and market dynamics during supply chain disruptions, offering insights for policymakers and industry stakeholders in developing more effective resilience strategies. |
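A minimal sketch of iterative price updates in a spatial competition setting, reduced to a single echelon with hypothetical numbers: two retailers on a line market best-respond over a price grid, and consumers buy from whichever offers the lower delivered price (mill price plus transport cost). This is an illustration of the general mechanism, not the paper's three-echelon diesel/biodiesel model.

```python
import numpy as np

locations = np.array([0.2, 0.8])      # retailer positions on a unit road
cost = np.array([1.00, 1.05])         # hypothetical unit procurement costs
t = 0.5                               # transport cost per unit distance
consumers = np.linspace(0, 1, 1001)   # uniformly spread demand points
grid = np.linspace(1.0, 2.5, 301)     # candidate mill prices

def profit(i, p_i, p_j):
    """Profit of retailer i when charging p_i against the rival's price p_j."""
    delivered_i = p_i + t * np.abs(consumers - locations[i])
    delivered_j = p_j + t * np.abs(consumers - locations[1 - i])
    share = (delivered_i < delivered_j).mean()
    return (p_i - cost[i]) * share

prices = np.array([1.5, 1.5])
for _ in range(50):                   # iterative price updates until an approximate rest point
    new = prices.copy()
    for i in (0, 1):
        new[i] = grid[np.argmax([profit(i, p, prices[1 - i]) for p in grid])]
    if np.allclose(new, prices):
        break
    prices = new

print("resulting mill prices:", prices.round(3))
```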
09:30 | Value personas based quantitative decision support: Augmenting multi-facetted decision problems with empirical societal value – behavior correlations PRESENTER: Ingo Schönwandt ABSTRACT. Effective resilience management of critical infrastructures is complicated by their integration within evolving societal contexts, where shifting public perspectives and needs may impact the perceived legitimacy of infrastructure systems. This study addresses this challenge by incorporating Schwartz’s value theory into decision analysis using the Lake Model, an experimental simulation of a coupled human-nature system. Previous approaches primarily employed synthetic societal value – parameter links to integrate societal perspectives within decision frameworks. Here, we augment the decision framework by integrating empirical societal value – behavior correlations to enhance the model's capacity to capture diverse societal dynamics. By employing distinct value personas—representations of individuals' core motivations for behavior—the study demonstrates the sensitivity of decision outcomes to variations in societal value-based parameters. This method enhances the ability to identify a broad spectrum of policy strategies, thereby improving the interpretability and applicability of the model for complex decision-making. The findings suggest that integrating empirical societal value – behavior correlations into decision-support tools offers a more nuanced understanding of stakeholder preferences and decision strategies, facilitating more informed decision support. This approach contributes to improved resilience management for critical infrastructure systems by bridging abstract societal value theories and practical decision-making frameworks. |
09:45 | Civilian Drones as an Exogenous Player on the Search and Rescue Teams Process. Case study analysis of the Rio Grande do Sul State flood using FRAM PRESENTER: Henri F. von Buren ABSTRACT. Following the 2024 floods in Rio Grande do Sul, Brazil, civilian drone interference significantly hampered a critical search and rescue (SAR) operation despite a flight restriction zone established by the Department of Airspace Control (DECEA) and its regional body for the Southern region (CINDACTA II), ultimately requiring a complete flight ban. This study employs the Functional Resonance Analysis Method (FRAM) to model the regional SAR activation process and analyze the far-reaching consequences of this external disruption beyond the activities of advanced emergency teams on the ground. The analysis reveals impacts on planning, logistics, ground teams, and the entire chain of command. This research contributes to the development of disaster risk management and accident analysis, focusing on the need for firm control of civilian drone activity in emergencies. The research provides relevant lessons for high-performance organizations like the Brazilian Air Force (FAB), Navy, Army, Civil Defense, and Fire Department, integral to national disaster responses. |
10:00 | The implementation of ecosystem accounting in Norwegian zonal planning using theories from risk science ABSTRACT. Ecosystem accounting data covering the extent of ecosystems, their condition and the ecosystem services they provide will be produced annually across Europe in the coming years. This data will sit alongside other national-level accounts, and provide measures equivalent to GDP, such as Gross Ecosystem Production, to guide policy making. However, there is still a significant gap in research around how this data can be used at lower levels of governance, particularly spatial planning. Spatial planners will have access to data which links the socio-economic and ecological worlds through the ecosystem services framing. Ecosystem services are the variety of benefits that society gains from nature, such as clean air, clean water, thermal regulation, food, etc., but also protection from hazards such as floods and landslides. Spatial planning across many parts of the world follows a communicative rationality theory, whereby participation is key for the inclusion of knowledge in planning decisions. At present, ecological spatial data, such as red-list ecosystems and species, is treated as socially constructed knowledge alongside stakeholder inputs into the decision. The relational theory of risk tells us that an object has to be valued in order to be at risk; in the case of species and ecosystems within communicative planning, we are reliant on participants within the planning system valuing those species or ecosystems. Ecosystem accounting provides us with additional value frames: for example, an ecosystem can be framed as providing downstream communities with flood protection, or a woodland between a community and industry as maintaining good air quality. Moreover, ecosystem accounting reduces uncertainties around the risk involved in destroying ecosystems by providing a quantitative measure of the biophysical and monetary values of those ecosystems. We use theories and methods from risk science to propose how ecosystem accounting can be implemented within spatial planning decisions. |
09:00 | Iterative approach to Safety and Reliability in the context of military aircraft design organization PRESENTER: Milan Pšenička ABSTRACT. The military aircraft development process requires a comprehensive approach including, among other things, safety and reliability analyses. With the use of effective functional system models, conventional or systemic, it is nowadays possible to analyze the selected systems quite thoroughly during the whole design phase of an aircraft. However, it is important to remember that safety and reliability work is a continuous and therefore iterative process that does not end with the development and certification of the aircraft. It continues with its production and subsequent operation. It is the operation of the aircraft that is the source of important safety and reliability related data (primarily records of defects and failures), which it is essential to collect and use for continual modification of, and addition to, the safety and reliability analyses. At the same time, the outputs from the safety and reliability analyses are used in other areas, such as spare parts and overall maintenance planning. We present a comprehensive approach to handling the whole process in the context of a military aircraft design organization, showing how to effectively connect operational data directly to safety analyses such as Functional Hazard Analyses (FHA) and how this iterative process fits into the regulatory scheme of the Safety Management System (SMS) for design organizations. |
09:15 | Uncertainty Modeling in Aeronautical System Testing Campaigns: An approach using Fuzzy Logic and Monte Carlo Simulation PRESENTER: Daniella Castro Fernandes ABSTRACT. During aircraft development, several tests are performed to assure flight safety and compliance with certification requirements. These testing campaigns require significant resources, such as prototype availability, technical staff and material resources. Effective planning is essential to avoid impacts on the prototype schedule and subsequent phases, including marketing and certification campaigns. This work proposes combining Fuzzy Theory and Monte Carlo Simulation to evaluate testing campaign schedules and the risks associated with compressing the plan, by modeling the uncertainties. Fuzzy Theory handles the subjective data and adverse conditions, while Monte Carlo Simulation estimates temporal uncertainties based on probabilistic models and data elicited from specialists. |
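As a minimal sketch of how such a hybrid fuzzy/Monte Carlo propagation could look (the task names, triangular durations, α-levels and the 35-day slot below are hypothetical illustrations, not taken from the paper), expert-elicited task durations can be treated as triangular fuzzy numbers and the campaign duration propagated by sampling within each α-cut:

```python
import numpy as np

# Hypothetical test-campaign tasks with triangular fuzzy durations in days,
# elicited from specialists as (minimum, most likely, maximum).
tasks = {
    "ground_vibration_test": (5.0, 8.0, 14.0),
    "flutter_flight_block":  (10.0, 15.0, 25.0),
    "systems_integration":   (7.0, 9.0, 16.0),
}
SLOT = 35.0  # hypothetical available schedule slot in days

def alpha_cut(tri, alpha):
    """Interval of a triangular fuzzy number at membership level alpha."""
    a, m, b = tri
    return a + alpha * (m - a), b - alpha * (b - m)

rng = np.random.default_rng(1)
n_samples = 10_000

for alpha in (0.0, 0.5, 1.0):
    totals = np.zeros(n_samples)
    for tri in tasks.values():
        lo, hi = alpha_cut(tri, alpha)
        # Monte Carlo step: sample the alpha-cut interval (uniform sampling is a
        # deliberately simple choice for this hybrid fuzzy/probabilistic sketch).
        totals += rng.uniform(lo, hi, n_samples)
    p_overrun = np.mean(totals > SLOT)
    print(f"alpha={alpha:.1f}: mean duration {totals.mean():.1f} d, "
          f"P(duration > {SLOT:.0f} d) = {p_overrun:.2f}")
```

Lower α-levels carry more epistemic spread, so the estimated overrun probability can be read per membership level rather than as a single number.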
09:30 | Applying the Perceptual Cycle Model (PCM) over an Aircraft Approach and Landing Procedure PRESENTER: Christianne Reiser ABSTRACT. In December 2005, Southwest Airlines Flight 1248 overran the runway during landing at Chicago Midway International Airport, colliding with an automobile. The aircraft was substantially damaged, and one person was killed. The aircraft landed in adverse weather conditions and on a contaminated runway, and the pilots failed to use available reverse thrust in a timely manner. Events like this are known as runway overruns. Most of them occur in the landing phase and are associated with adverse weather, unstable approaches, long touchdowns, runway surface conditions, deficiencies in aerodrome facilities and inadequate use of deceleration devices. The risk of an overrun increases when more than one of these precursors is present, as multiple hazards create a synergistic effect. Following unfavorable weather or runway surface conditions, another airport or runway might be requested. Following an unstable approach, a go-around might be performed. Following a long touchdown, the deceleration devices might be applied more aggressively. The pilots are the last barrier of the complex aviation system, but their situational awareness or over-reliance may activate an inappropriate schema. In high-workload circumstances, it is not uncommon for problems in decision-making to emerge that potentially lead to an incident or accident. This paper proposes the application of the Perceptual Cycle Model (PCM) to the analysis of naturalistic decision making during the approach and landing procedure. The PCM draws on Schema Theory to demonstrate how the environment and context surrounding the decision interact with the cognitive structures and actions of the decision maker. Schemata are mental templates of knowledge clusters that are structured upon experiences similar in nature, driving future behaviors and being updated upon exposure to new experiences. The paper explores, through the PCM, how the flight crew's decision-making processes during approach and landing precede an overrun and/or a normal landing. |
09:45 | Impact of communication delays on pilot workload and performance in RPAS landings: A study of HUD interfaces and workload metrics PRESENTER: Andrew Gomes Pereira Sarmento ABSTRACT. Studies on managing critical failures during the flight of Remotely Piloted Aircraft Systems (RPAS) must be conducted, especially as long-distance operations between different cities become increasingly feasible. This study analyzes a simulated scenario in which, during a journey between cities, a critical failure occurs that does not affect flight dynamics but requires landing at an unmapped airport, without the ability to remotely adjust the landing settings. In this context, a pilot takes control of the aircraft and performs the landing at an airport along the route. Given that communication between the ground control station and the RPAS is conducted via satellite, a delay of approximately two seconds was observed between the pilot's command and the aircraft's execution. The experiment consisted of three flights, each utilizing a different Head-Up Display (HUD). During the flights, the Instantaneous Self Assessment (ISA) evaluation was applied, while the NASA-TLX was used after each flight to measure workload. After the three flights, the pilots completed the SWORD questionnaire to assess cumulative workload. The objective of this paper is to compare the different subjective evaluations (ISA, NASA-TLX, and SWORD) and investigate their correlation with the pilot's actual performance, aiming to understand the validity of these metrics in the context of RPAS operations under delay conditions. |
09:00 | An integrative perspective on frameworks for multi-risk research PRESENTER: Malvina Ongaro ABSTRACT. Multi-risk research aims to provide the instruments to describe and manage pluralities of interconnected risks. As it is a comparatively recent and strongly multi-disciplinary field, there exists a multitude of alternative approaches providing instruments to guide risk assessment and risk management. However, a systematic framework to address multi-risk is still missing. In this work, we propose a unitary perspective that integrates different approaches, identifying the different and complementary purposes they serve in decision-making related to natural hazards. In doing so, we define the conditions under which it is justified (and sometimes recommended) to adopt each of them. In most cases, the objective of a multi-risk framework is to identify the expected impacts and allow the calculation of variations in these impacts as a consequence of potential interventions. This can be done at varying levels of completeness and resource demand. A comprehensive assessment of multi-risk impacts involves several levels of analysis and is particularly demanding in terms of data, models, and calculations. These are required to obtain precise forecasts of the expected impacts considering all relevant hazards. However, depending on the needs and the resources available, it is also possible to stop the analysis at earlier stages or to adopt a perspective that focuses on selected risks that are particularly salient for scientific or political reasons. Scenario-based approaches propose to pick a specific chain of possible events, rather than modelling all possible interactions, and construct a timeline describing how exposure, vulnerabilities, and impacts evolve after each event. While only a comprehensive approach can provide a full assessment of probabilistic risk in the area, which is often required for all-things-considered decision procedures, scenario-based approaches can support decision-making whenever precautionary attitudes are justified. Understanding the conditions for the application of different approaches permits the selection of the most effective tools in each decision context. |
09:15 | Third-Party Risk in Research: A Literature Review ABSTRACT. Third-party risk management is crucial for organisations to manage potential risks associated with outsourcing and supply chain operations. While the topic of third-party risk management has been widely discussed in professional literature and regulatory papers, it remains an emerging area of research in scientific literature. The purpose of this theoretical paper is to examine the overall landscape of research devoted to the third-party risk topic and to answer key questions: 1. What is the definition of third-party risk in scientific research? 2. What types of risks are managed within third-party risk management frameworks, and 3. What is the essential difference between third-party risk, vendor risk, and supplier risk concepts? To achieve this goal, a systematic literature review using the PRISMA 2020 methodology was conducted. A total of 107 unique publications were identified in the Scopus and Web of Science databases using the keyword “third-party risk” and analysed using a two-stage approach: first through an abstract review, followed by a full-text analysis. The papers included in the final set were further analysed using bibliometric and content analysis methods. From a theoretical perspective, the research findings provide a comprehensive overview of previous work on the topic of third-party risk, highlighting future research opportunities. From a practitioner's perspective, this research helps clarify the conceptual differences between vendor risk, supplier risk, and third-party risk, supporting the development of a more effective organisational risk management programme. |
09:30 | Permanently exceptional: The paradox of contemporary crisis PRESENTER: Kristin Scharffscher ABSTRACT. Theoretically speaking, a crisis represents a fundamental deviation from the ordinary or the normal. That said, a central problem in crisis research is the “sustained lack of consensus around the definition of crisis and disaster” (Wolbers et al. 2021, p. 375). In 2004, Mitroff et al. called crisis “an ill-structured mess” and lamented that “the field of crisis management has been seriously impeded by its failure to develop appropriate frameworks for the study of crises” (p. 175). Symptomatic of the general disagreement pertaining to crisis conceptualisation, Ulrich Beck argued that he preferred the term ‘risk’ over ‘crisis’, because “risk is – unlike crisis – not an exception but rather the normal state of affairs and hence will become the engine of a great transformation of society and politics” (Beck 2013, 6). Beck was nevertheless preoccupied with what he referred to as “this turmoil”, in that the world undergoes “a much more radical transformation in which the old certainties of modern society are falling away and something quite new is emerging” (Beck 2015, 3). At the intersection of routine and exception, normal and extreme, history repeated and the unprecedented, we intend to explore the paradoxical hallmarks of contemporary crisis. With a view to empirical studies from recent crises, including the COVID-19 pandemic, we discuss whether an encompassing conceptualisation of crisis is to be found at the very heart of the proverbial mess. |
09:45 | Are special sessions in ESREL impacting the cross-sector learning? ABSTRACT. Many researchers, especially of the safety dimension of risk, advocate that lessons learned from accident investigations should be transferred between different industry sectors – a concept usually known as ‘cross-sector learning’. Expanding this idea to conferences, many attendees expect to experience a ‘cross-discipline knowledge transfer’. This means that an attendee interested in, for example, reliability modelling could benefit from seeing how different sectors such as aviation, maritime and nuclear are investing in new technology in the same session. However, in the last five years, there has been a significant increase in the number of special sessions at ESREL – the European Safety and Reliability Conference. There may be benefits, such as deeper exploration of an emerging topic, but there is also a risk that, with the main risk disciplines scattered across many special sessions, their most experienced specialists will not be present in the same room to contribute, criticise and compare how a discipline is being used across domains. This paper analyses how special sessions have grown at ESREL conferences through the years, and whether the papers presented could have been distributed differently across the most traditional disciplines and domains. |
10:00 | A classification system for different types of research methods in risk science PRESENTER: Kjartan Bjørnsen ABSTRACT. Different types of research methods are used in risk science. Examples include the “scientific method” (“the hypothetico-deductive method”), interviews, surveys, experimentation, analysis, simulation and statistics. It is also common to distinguish between more high-level method characteristics or categories, such as being quantitative, qualitative, descriptive, analytical, theoretical, normative, applied, fundamental, conceptual and empirical. This paper presents and discusses a classification system for risk research methods and categories of such methods. A main logic of the classification system is the distinction between research aiming at i) describing and understanding aspects of the world, and ii) research aiming at enhancing the instruments used to obtain i). For both i) and ii), the importance and role of rationalism – reasoning and argumentation – is highlighted. The system is constructed to help users to properly design their research, by pointing to relevant types of research methods. |
09:00 | The Zero Waste Paste project: Development of an energy-efficient and automated process for the gentle recovery of raw materials from solder paste waste PRESENTER: Philipp Heß ABSTRACT. Solder pastes are employed in the mass production of printed circuit boards. However, they possess a restricted shelf life. The solder paste containers and the residues of the (expired) solder paste are classified as hazardous waste. There is no automated process to remove the solder paste from the containers without leaving any residue, thus allowing for the targeted recycling of all components. Instead, the containers are incinerated as hazardous waste, which results in inefficient recovery and high environmental pollution. This paper outlines a research project that aims to develop an environmentally friendly circular economy, with the objective of recovering or recycling all components of solder paste waste. The initial phase of the project will entail the development of a process and construction of an associated system prototype to facilitate the automated, efficient, and gentle recovery of raw materials from solder paste components (solder metal and flux) and the container material (polypropylene and polyethylene). Concurrently, a logistics system is being developed to ensure the efficient collection and pre-sorting of containers according to the paste components they contain. The incorporation of recovered components into the manufacturing process of new solder paste is also being considered, with the development of new recycling options to realize the implementation of a circular economy with the objective of zero waste. The design of the washing process represents the initial stage of the concept phase and serves as a fundamental element. Cleaning and separation tests were conducted to evaluate the suitability of various processes and solvents. The tests encompass a range of technical processes, solvents and liquid nitrogen. The objective is to assess the repeatability and reliability of the cleaning and separation processes, while minimizing the impact on the environment and human health. The initial findings of the project’s analyses are presented and discussed in detail. |
09:15 | Optimal SIL Allocation to the Safety Functions implemented over Layers of Protection – Design Sensitivity due to Dependent Failures ABSTRACT. Safety Instrumented Systems (SIS) based on E/E/PE technology have become a standard for managing risks in complex technical enterprises. These systems typically use multiple layers of protection to mitigate risks to acceptable levels while ensuring high system availability. Compliance with functional safety standards like IEC 61508/61511 or ANSI ISA-84.01 requires assigning risk reduction factors (RRF) to each safety function and protection layer. IEC 61511 offers guidance on failure detection and prevention but lacks provisions for automatic mitigation systems, such as fire & gas systems. This gap, acknowledged in IEC 61511-4:2020, can lead to unnecessary design costs. In this paper we address this gap by providing a cost-effective SIS design for mitigation layers without compromising safety. By introducing RRFs as proxies for implementation costs, we use optimization to calculate these factors while adhering to the risk equation. Cost-optimal RRFs are calculated for each protection layer, considering an overall risk reduction target and a given loss distribution profile associated with a single hazard category. The model accounts for dependent failures across two successive layers. We demonstrate the algorithm's effectiveness through practical examples involving various loss distribution profiles. |
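A toy version of such a cost-optimal allocation might look as follows, assuming (purely for illustration) that cost grows with the decades of risk reduction per layer, that dependence between the two layers is captured by a smooth beta-factor-style term, and that the overall target RRF is 10^4; none of these figures come from the paper:

```python
import numpy as np
from scipy.optimize import minimize

TARGET_RRF = 1.0e4          # overall risk-reduction target (hypothetical)
BETA = 0.05                 # illustrative common-cause fraction between the two layers
COST_WEIGHTS = (1.0, 2.5)   # relative cost per decade of RRF for layers 1 and 2

def joint_pfd(x):
    """Crude joint probability of failure on demand of two successive layers,
    with a smooth beta-factor-style common-cause term (illustrative only)."""
    pfd1, pfd2 = 1.0 / x[0], 1.0 / x[1]
    return (1.0 - BETA) * pfd1 * pfd2 + BETA * np.sqrt(pfd1 * pfd2)

def cost(x):
    # Implementation cost assumed proportional to the decades of risk reduction.
    return sum(w * np.log10(r) for w, r in zip(COST_WEIGHTS, x))

constraints = [{"type": "ineq",
                "fun": lambda x: 1.0 / joint_pfd(x) - TARGET_RRF}]
res = minimize(cost, x0=[100.0, 100.0], bounds=[(10.0, 1e5), (10.0, 1e5)],
               constraints=constraints, method="SLSQP")
print(f"RRF layer 1: {res.x[0]:.0f}, RRF layer 2: {res.x[1]:.0f}, "
      f"achieved overall RRF: {1.0 / joint_pfd(res.x):.0f}")
```

The inequality constraint encodes the risk equation: the combined probability of failure on demand of the two layers must not exceed the reciprocal of the overall risk reduction target.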
09:30 | Partition ratio evaluation for safety risk management of liquid ammonia spills to seawater PRESENTER: Marta Bucelli ABSTRACT. Refrigerated liquid ammonia at atmospheric pressure is considered the safest transport and storage mode. In recent years, ammonia has gained attention as an alternative fuel to abate carbon emissions from the shipping fleet. Its implementation and use on the waterways imply careful adjustments of technical systems and handling procedures on board to ensure, first and foremost, the safety of the public and the crew in case of an accidental spill. Ammonia is toxic to humans, but it is also a threat to the marine environment, potentially killing fish by direct contact and by stimulating algae production leading to localized oxygen scarcity. On-board systems and procedures should therefore be thoughtfully engineered to minimize the risks to humans and the environment due to accidental ammonia releases. A complete and detailed understanding of the physical behavior of ammonia releases is therefore of paramount importance to ensure adequate design. This work presents a case study considering an emergency situation on a vessel caused by a large release of liquid ammonia to the sea. Accurate thermodynamics of the ammonia-water system is used to calculate the partition ratio of ammonia to water and to air. The partition ratio calculation is based on the best available models in the literature. The work discusses the sources of uncertainty in the modelling of the thermophysical behavior of ammonia-water interactions. Finally, we discuss how accurate thermodynamic models might serve the design of safety systems and procedures to safeguard both crews and marine life and prevent unnecessary over-conservatism. |
09:45 | Towards dynamic safety control structures in STAMP to manage safety-critical industrial establishments PRESENTER: Francesco Simone ABSTRACT. Managing safety in industrial establishments dealing with dangerous substances is paramount, as recognized by Directive 2012/18/EU (also known as the Seveso III Directive), which is in force in Europe. In particular, the establishments falling under the Seveso III Directive can be recognized as Socio-Technical Systems (STSs), due to the presence of tightly interacting technical, social, and organizational entities. The need to adopt a systemic perspective for the safety management of STSs has been acknowledged in recent safety science. For this reason, this paper relies on the System-Theoretic Accident Model and Processes (STAMP) principles to let critical interactions among the system's elements emerge by constructing dedicated Safety Control Structures (SCSs). In Seveso-related processes, such interactions are highly variable because of their interplays and their temporal and causal dependencies, and the static nature of the SCS limits the analysts' perspective on the problem. In traditional STAMP models, such interaction is implicit, requiring ad-hoc solutions to be interpreted. This paper discusses the benefit of adding a dynamic dimension to the SCS of the Seveso III Directive in order to properly consider temporal and causal developments, as well as dependencies within the various processes. This dynamicity relates to the definition of triggers able to activate (or deactivate) the mutual dependencies among agents, based upon their correlation and their presence in specific circumstances. Results show how such a dynamic dimension is a core feature for operationalizing any SCS in real case scenarios. |
09:00 | The effect of speculative fiction on climate risk perceptions PRESENTER: Anna Kosovac ABSTRACT. This presentation considers the extent to which fiction plays a role in amplifying or attenuating risk perceptions related to climate change. We survey 63 people who are provided with a speculative fiction text to read. We analyse the findings of risk perceptions both before and after reading the text to determine whether the text sways their perceptions of climate change. This study is unique in the way that it considers fiction as a tool to help amplify risk perceptions in communities, and potentially promote climate action. This has not been undertaken in this way before and represents a gap in the research on risk perceptions. |
09:15 | Perceived Risks, Benefits, and Sustainability Considerations of Novel AgriFoods PRESENTER: Khara Grieger ABSTRACT. The development and use of novel food and agricultural technologies, including genetic engineering (GE) and nanotechnology (nano), may contribute to more sustainable agricultural systems. At the same time, past experiences with novel technologies have demonstrated the importance of understanding various risks and benefits and incorporating stakeholder perspectives during innovation and development phases. Through a USDA-funded grant, we explored how various stakeholders in the U.S. perceive the risks, benefits, and sustainability of GE and nano-agrifood products. Stakeholders also provided recommendations for promoting more sustainable agrifood systems that incorporate novel technologies. Among other findings, participants identified environmental and human health risks, as well as whether a product addresses a societal need, as key parameters for evaluating the risks and benefits of GE and nano-agrifoods. Overall, stakeholders perceived nano-agrifood products as more sustainable than GE-agrifoods. They also suggested several strategies to enhance the sustainability of agrifood technologies within industry and regulatory agencies, including increasing transparency and engagement in research and innovation. This work is significant because understanding risk-benefit evaluations and stakeholder perceptions of GE and nano-agrifoods is essential for the sustainable development of agrifood systems that increasingly rely on novel technologies. This information is also critical for fostering responsible innovation practices that help shape the future of these technologies in ways that mitigate potential risks and ultimately meet societal needs. |
09:30 | A Serious Game Approach to the Challenges of Scientific Uncertainty in Risk Communication PRESENTER: Christoph Boehmert ABSTRACT. Scientific uncertainty is an inherent part of risk assessments. It is impossible to empirically prove the harmlessness of any agent. While this can be considered a "problem" for risk assessment itself, it is also a problem for risk communication. For agents that are generally considered rather well-researched and safe by scientific bodies, such as electromagnetic fields (EMFs) used in mobile communication, risk communicators have to decide how much uncertainty they want to communicate. This is especially challenging as their audience, often laypeople, strives for the absence of risk, seeking “absolute” safety if the benefits do not outweigh any potential drawbacks. Hence, risk communicators face a dilemma – it would be scientifically sound to mention uncertainties, but their lay audiences often want clear answers. As a tool to remediate this dilemma, we are developing a serious game. The game targets laypeople interested in, and potentially also worried about, EMFs in mobile communication and human health. In the game, players take the role of a risk communicator and experience the dilemma posed by uncertainty firsthand. The main goal of the game is to increase openness towards risk communicators and risk communication in general. Further desired outcomes are an increase in trust in science communicators and an improved understanding of scientific uncertainty. In the presentation, key features of the game will be highlighted, its co-creative development process described, and the results of a first evaluation of the game's effects presented. |
09:45 | The Ecosystem of Trust (EoT): Enabling effective deployment of autonomous systems through collaborative and trusted ecosystems PRESENTER: Jon Arne Glomsrud ABSTRACT. Ecosystems are ubiquitous, but trust within them is not guaranteed. Trust is paramount because stakeholders within an ecosystem must collaborate to achieve their objectives. With the twin transitions (digital transformation to go in parallel with green transition) accelerating the deployment of autonomous systems, trust has become even more critical to ensure that the deployed technology creates value. To address this need, we propose an ecosystem of trust (EoT) approach to support deployment of technology by enabling trust among and between stakeholders, technologies and infrastructures, institutions and governance, and the artificial and natural environments in an ecosystem. The approach can help the ecosystem’s stakeholders to create, deliver, and receive value by addressing their concerns and aligning their objectives. We present an autonomous, zero-emission ferry as a real-world use case to demonstrate the approach from a stakeholder perspective. We argue that assurance (grounds for justified confidence originated from evidence and knowledge) is a prerequisite to enable the approach. Assurance provides evidence and knowledge that are collected, analysed, and communicated in a systematic, targeted, and meaningful way. Assurance can enable the approach to help successfully deploy technology by ensuring that risk is managed, trust is shared, and value is created. |
10:00 | Agents of Action? Youth Climate Perceptions: A Literature Review PRESENTER: Una Aarsheim Milje ABSTRACT. Understanding youth perspectives is an important part of climate risk research, as young people are both the group most vulnerable to climate change risks and the decision-makers of tomorrow. Some research also indicates that youth might experience and respond to climate risks in qualitatively different ways compared to adults. This paper presents a literature review on youth climate perception, mapping out the theoretical and methodological contributions present in the literature. As such, the paper provides a thematic overview of the literature and discusses the main findings in light of relevant perspectives on risk perception, emotions and pro-environmental behavior. Contrary to previous accounts, the review revealed an even balance between qualitative and quantitative studies on youth perception and climate change risks. Qualitative studies largely focus on single-event collective behavior, commonly strikes and protests, whereas the quantitative research mainly explores individual perceptions and behavior. The literature identifies several factors important for generating and sustaining action, such as climate knowledge, social networks, risk perception, identity, and emotions. Although much of the literature acknowledges the importance of risk perception for youth climate action, the review finds few works concerning youth in risk-related research. |
09:00 | Monitoring Confounder-adjusted Principal Component Scores with an Application to Load Test Data ABSTRACT. In structural health monitoring (SHM), measurements from various sensors are collected and reduced to damage-sensitive features. Diagnostic values for damage detection are then obtained through statistical analysis of these features. However, the system outputs, i.e., sensor measurements or extracted features, depend not only on damage but also on confounding factors (environmental or operational variables). These factors affect not only the mean but also the covariance. This is particularly significant because the covariance is often used as an essential building block in damage detection tools. This talk will present a method for calculating confounder-adjusted scores utilizing conditional principal component analysis, which entails estimating a confounder-adjusted covariance matrix. The technique will be applied to monitor real-world data from the Vahrendorfer Stadtweg bridge in Hamburg, Germany. |
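A simplified stand-in for the described approach (adjusting only the mean effect of the confounder by linear regression, whereas the talk's conditional PCA also adjusts the covariance; the data, sensor count and temperature confounder below are synthetic) could look like this:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic SHM training data: n daily feature vectors (e.g. identified natural
# frequencies) whose mean drifts with a confounder (here: temperature).
n, p = 500, 4
temp = rng.uniform(-5.0, 30.0, size=n)
X = 5.0 + 0.02 * np.outer(temp, rng.uniform(0.5, 1.5, p)) + rng.normal(0.0, 0.05, (n, p))

# 1) Remove the confounder's influence on the mean by least-squares regression.
Z = np.column_stack([np.ones(n), temp])        # design matrix [1, temperature]
B, *_ = np.linalg.lstsq(Z, X, rcond=None)
R = X - Z @ B                                  # confounder-adjusted residuals

# 2) PCA on the covariance of the adjusted residuals.
eigval, eigvec = np.linalg.eigh(np.cov(R, rowvar=False))
order = np.argsort(eigval)[::-1]
eigval, eigvec = eigval[order], eigvec[:, order]

# 3) Hotelling-type monitoring statistic for a new measurement at a known temperature.
def t2_score(x_new, temp_new, k=2):
    r = x_new - np.array([1.0, temp_new]) @ B  # adjust the new sample as well
    scores = r @ eigvec[:, :k]
    return float(np.sum(scores**2 / eigval[:k]))

print("T2 of an in-control sample:", round(t2_score(X[0], temp[0]), 2))
```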
09:15 | An improved subset sampling method and application in high cycle fatigue reliability assessment of a mistuned composite fan bladed disk assembly PRESENTER: Xu Tang ABSTRACT. Carbon fiber reinforced plastics are characterized by outstanding mechanical properties. Many corporations are developing composite fan blades of this kind and applying them to next-generation high-bypass-ratio turbofan engines. Stochastic sources always cause discrepancies between realizations and the nominal design, e.g. raw material strength, manufacturing tolerances, defects and damage (cracks), assembly, and service environment. The current focus of probabilistic safe design for gas turbine components has been on efficiently evaluating the reliability index. The objective of this study is to derive an efficient approach to evaluate the risk of a structurally mistuned fan stage subject to vibration-induced high cycle fatigue. The stochastic variable is the failure probability of a single composite fan blade, based on the corresponding probabilistic design curve in one of the typical vibration modes. Since crude Monte Carlo simulation (MCS) depends strongly on the probability of the rare event, subset sampling is a remedy for this limitation, separating the failure domain into a series of intermediate regions. The target probability is a product of auxiliary conditional failure probabilities with intermediate thresholds. However, the low acceptance rate of the classical Metropolis-Hastings Markov chain Monte Carlo (MCMC) simulation leads to erroneous estimates of the conditional probabilities. In the modified subset sampling scheme, samples of the Markov chain are generated by an affine invariant ensemble algorithm, in which the acceptance ratio of candidate points is increased by generating proposal samples using the stretch move. Simple validation cases comparing performance with an adaptive Kriging MCS algorithm show that the proposed method reduces the number of limit state function evaluations and increases numerical accuracy. It is then fully integrated into the composite fan blade-disk finite element model. Intermediate conditional failure probabilities are calculated by sampling from distributed surrogate models of the vibratory stresses. The reliabilities computed for typical mistuning patterns also demonstrate the good performance of the method when applied to a complex composite structure. |
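For readers unfamiliar with subset sampling, a bare-bones sketch on a toy analytical limit state is given below; it uses a plain joint Metropolis step rather than the affine invariant ensemble sampler proposed in the paper, and the limit state, sample size and conditional level p0 = 0.1 are illustrative only:

```python
import numpy as np

rng = np.random.default_rng(42)

def g(x):
    """Toy limit-state function (failure when g <= 0); in the paper's setting the
    response would come from surrogate models of the blade vibratory stresses."""
    return 3.5 - (x[..., 0] + x[..., 1]) / np.sqrt(2.0)

def subset_simulation(n=2000, p0=0.1, dim=2, max_levels=10):
    x = rng.standard_normal((n, dim))
    gval = g(x)
    pf = 1.0
    for _ in range(max_levels):
        idx = np.argsort(gval)
        n_seed = int(p0 * n)
        threshold = gval[idx[n_seed - 1]]      # p0-quantile as intermediate threshold
        if threshold <= 0.0:
            return pf * np.mean(gval <= 0.0)   # final level reached
        pf *= p0
        seeds = x[idx[:n_seed]]
        chains = [seeds.copy()]
        # Metropolis chains restricted to the intermediate failure domain.
        while sum(len(c) for c in chains) < n:
            cur = chains[-1]
            cand = cur + 0.8 * rng.standard_normal(cur.shape)
            log_ratio = 0.5 * (np.sum(cur**2, axis=1) - np.sum(cand**2, axis=1))
            accept = rng.random(len(cur)) < np.exp(np.minimum(0.0, log_ratio))
            new = np.where(accept[:, None], cand, cur)
            stay = g(new) <= threshold          # reject moves that leave the domain
            new = np.where(stay[:, None], new, cur)
            chains.append(new)
        x = np.vstack(chains)[:n]
        gval = g(x)
    return pf

print(f"estimated failure probability: {subset_simulation():.2e}")
```

For this toy limit state the exact failure probability is 1 − Φ(3.5) ≈ 2.3e-4, so the estimate can be checked directly against crude Monte Carlo.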
09:30 | Probabilistic design approaches of centrifugally cast concrete structures in Telecommunications infrastructure PRESENTER: Ona Lukoševičienė ABSTRACT. The increasing need for telecommunications infrastructure has driven demand for tall, durable and sustainable structures such as masts and towers. Centrifugally cast concrete, known for its high strength and density and its excellent performance under dynamic loads, is becoming an important material for these applications. This study focuses on the probabilistic assessment of centrifugally cast concrete structures to evaluate their reliability and performance under real environmental conditions. The research employs probabilistic models, including Monte Carlo simulations, to analyse the safety and behaviour of the structures. Taking into account the model uncertainties of material properties, load variations, and environmental factors, the assessment provides a comprehensive understanding of the behaviour of centrifugally cast concrete cores. The main factors include second-order effects, geometric and material nonlinearities, creep, and cracking, which are critical for tall, slender structures subjected to complex loading conditions. A comparative analysis of towers and masts with circular cross-sections made from centrifugally cast concrete cores highlights significant advantages of these systems. These include enhanced reliability indices, reduced material consumption, and lower environmental impact. The innovative mast design, with pre-tensioned guy wires and strut elements, improves stability by reducing bending moments and effective buckling lengths, thus optimizing structural behaviour. The probabilistic approach allows the main and additional design parameters to be determined and contributes to the development of economical, environmentally friendly, and reliable infrastructure. This study emphasizes the importance of integrating probabilistic methods into structural design processes to address uncertainties and improve the sustainability and resilience of telecommunications structures. By combining advanced materials with probabilistic engineering principles, this research advances the evolution of telecommunications infrastructure, ensuring its adaptability to modern requirements and future challenges. |
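As a generic illustration of the kind of Monte Carlo reliability estimate mentioned here (the lognormal resistance, Gumbel load effect and model-uncertainty factor below are hypothetical placeholders, not the paper's variables):

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(7)
n = 2_000_000

# Hypothetical basic variables for a spun-concrete section: bending resistance R
# (lognormal) and wind-induced load effect E (Gumbel), both in kNm, plus a
# model-uncertainty factor on the resistance side.
R = rng.lognormal(mean=np.log(900.0), sigma=0.12, size=n)
E = rng.gumbel(loc=450.0, scale=60.0, size=n)
model_unc = rng.normal(1.0, 0.05, size=n)

g = model_unc * R - E            # limit state: failure when g < 0
pf = np.mean(g < 0.0)
beta = -norm.ppf(pf)
print(f"estimated failure probability pf = {pf:.2e}, reliability index beta = {beta:.2f}")
```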
10:45 | The benefits of using the IDEAcology platform for SEJ elicitations with IDEA ABSTRACT. The IDEA protocol (Investigate, Discuss, Estimate and Aggregate) for structured expert judgement is gaining popularity in ecology and biosecurity applications. The IDEAcology interface was designed to help plan and facilitate the implementation of the IDEA protocol. Several recent elicitations using IDEAcology demonstrated that the platform substantially reduces logistical difficulties and provides an efficient way for experts to input, visualise and cross-examine the estimates of their peers. This in turn simplifies all stages of the elicitation process, from streamlined pre-elicitation project management all the way through to post-elicitation data management. During the elicitation it supports fast and efficient reporting back to experts, and it enables cost-effective gathering of quantitative assessments in both face-to-face and remote formats with comparable efficacy. The IDEAcology interface, including training materials, can be accessed through www.ideacology.com. In this talk, I will present the platform and its benefits, along with a couple of recent applications of SEJ with IDEA on IDEAcology. |
11:00 | Risk, uncertainty, decision and stakeholders - best practice for sustainable outcomes ABSTRACT. Quantifying risk and uncertainty and decision modelling have a great deal in common, and in particular that they are focussed on a main stakeholder or decision-making panel who stands to benefit from the utility they have defined. In every case, to achieve the objectives of the exercise, an essential starting point with any problem is to interact with problem-owners, their advisers, experts and close stakeholders to understand their perspectives, views, values, uncertainties, worldview, etc. In this paper we demonstrate the vital steps to be taken in achieving a high-value outcome, from discovering the purpose of the modelling, through engaging relevant stakeholders to delivery of a sustainable and requisite model. |
11:15 | Modeling and diagnostics of biased judgments ABSTRACT. The paper presents a heuristic modeling and resolution of cognitive biases in an ambiguous context resulting from the biased subjective understanding of the conflict situation. The paper applies an original symptom-based context evaluation procedure for a clear and rational interpretation of situational awareness in the decision-making process. By sequentially and simultaneously adopting and using two overlapping types of cognitive process, additive and subtractive, judgmental unreliability is assessed for conflicting and violating contexts. The main idea to overcome these biases is to use the assumption of the dual involvement of the symptom as delay waves in the intuitive conscious processing of the information. To account for the causes of misjudgments in different erroneous actions (e.g., errors of omission and errors of commission) one must dynamically track and identify the possible trajectories of the actual recognized context. Judgment in an unambiguous context is a monotonic process with decreasing information entropy and contextual probability of erroneous action, i.e. increasing probability of successful cognition as a sum of increasing discrete probabilistic amplitudes of contextual symptom recognition. But judgment in an ambiguous context is a wave-like alternating process with successively alternating decreasing-increasing (concave) context probability and information entropy, i.e. alternating increasing-decreasing (convex) cognitive probability as the corresponding sum of varying discrete probability amplitudes. These probability amplitudes result from a combination of changes between the number and weights of objectively existing and subjectively imagined context stimuli/symptoms in a given situation. By distinguishing these at least two overlapping additive and subtractive types of cognition consisting of different numbers of stages, we can resolve some of the paradoxes of expected utility theory by predicting possible alternatives/trajectories of the context development and probabilistically comparing them for some of the known cognitive biases. |
11:30 | Estimating current and future climate risk for indigenous cultures via structured elicitation: The test case of Ovalau Island, Fiji PRESENTER: Nicholas Moran ABSTRACT. Many of Fiji’s indigenous iTaukei communities maintain traditional lifestyles, where village cultures are supported by small-scale agriculture and fisheries. These communities may be at risk due to climate change, particularly given their close connection to the environment. Damage to reef and mangrove ecosystems, and changes in seasonal weather patterns, rainfall and drought, may directly impact fisheries and agricultural production and have secondary impacts on cultural values and practices. This study develops a methodology to measure changes in traditional subsistence lifestyles and the associated cultural values linked to recent changes in climate, and to estimate the risk of future climate impacts. There is often a lack of historical data quantifying past productivity and cultural values in the Pacific; therefore we use a structured elicitation approach to estimate historical changes in subsistence and cultural values. Using the expert local knowledge of indigenous communities of six villages on the island of Ovalau, we quantify the recent changes to agricultural production, fisheries, water security, and 10 measures relating to cultural values and practices (e.g., social cohesion and wellbeing, food sharing/Veiwasei within villages, connection to land/Vanua, etc.). A large majority of participants identified reductions across all measures elicited, often reporting losses of over 50% for each measure within just the last 20 years. These perceived changes to subsistence production and cultural values are combined with historical climate data for Ovalau in a Bayesian Network modelling framework. Estimates of future changes may then be produced using projected future climate scenarios. This study highlights a strong perception of recent climate loss and damage among Ovalau’s communities, and also demonstrates an innovative method for estimating historical climate impacts and predicting future climate risk based on the experiences, knowledge and expertise of indigenous communities. |
10:45 | Preparing as caring: how gendered practices and temporalities of care interact in discourses of preparedness, 1939-2023 PRESENTER: Johanna Overud ABSTRACT. This project aims to analyze how preparedness, as a gendered practice, is represented, and “done”, across different historical time periods. By focusing on individuals’ engagement in building preparedness in relation to state information, the study examines these activities as practices of care. Historically, care and care work have been strongly associated with the feminine, linked primarily to the private or domestic sphere and women’s central role in reproduction. This project explores how civilian duties and interpersonal relations for preparedness can be seen as a historically specific form of governance. We adopt an integrated approach to understand the gendered aspects of national preparedness and civil responsibilities in the past, present, and future. By focusing on how preparedness is placed in the home (through information campaigns etc.), and by examining these practices as forms of care, we provide a conceptual framework for understanding how care, through the doing of preparedness, extends from the family to society. This framework allows for a discussion of who is cared for and the implications of these care practices. The study also delves into popular culture, including housekeeping descriptions and representations of home-preppers. A particular focus is placed on 1940s women’s weekly magazines in Sweden, such as Husmodern and Veckojournalen, which featured articles on national preparedness for housewives, but also on state guidelines on housekeeping, including food preservation and resource management. This analysis is extended to how contemporary media, including podcasts and other media formats, focus on preparedness in the home and how it should be built up, in order to draw parallels and highlight ongoing practices of care and preparedness. |
11:00 | Physical Disability Inclusive Crisis Management PRESENTER: Ida Joao-Hussar ABSTRACT. Disaster risks are increasing, necessitating the strengthening of disaster preparedness, the integration of risk reduction, and effective response and recovery. It is important to include vulnerable groups and their representatives in preparedness processes, ultimately promoting greater resilience among individuals and communities. Previous research has explored factors influencing the effectiveness of disability inclusion activities, but a comprehensive overview of these barriers and facilitators is lacking. Consultation with, and participation of, people with disabilities (PWD) and their organizations is therefore important to truly adopt the envisioned whole-of-society approach. The author's doctoral research focuses on the inclusion of people with physical disabilities (PPD) in crisis preparedness and on reducing their vulnerability. The author aims to share her research on inclusive crisis management practices for PPD at the European Safety and Reliability (ESREL) and Society for Risk Analysis Europe (SRA-E) conference. A scoping review based on 32 papers was conducted to identify the factors that hinder and facilitate the participation of PPD in crisis preparedness and response. This study examines the individual, institutional, contextual, and situational barriers and facilitators to the inclusion of PPD in crisis management. The findings reveal that potential barriers to PPD inclusion in crisis management include a lack of awareness and understanding among responders, accessibility issues, and negative attitudes that lead to exclusion. However, the growing recognition of inclusive practices, advances in assistive technologies, and the development of inclusive policies and guidelines can facilitate their involvement in crisis preparedness and response. Overall, addressing these barriers and promoting inclusive practices can ensure equal opportunities for all individuals and communities to prepare for and recover from disasters. |
11:15 | Vulnerability in crisis – lessons from the experiences of the most marginalised during the COVID-19 pandemic in Europe PRESENTER: Kristi Nero ABSTRACT. Building resilience in individuals and societies has become a focal point of scientific research and practical innovation in the modern field of crisis management. For this, understanding vulnerability and how it is shaped is vital. In recent years the static perspective of vulnerability as an intrinsic characteristic of certain groups has increasingly been challenged, giving way to recognition of the multiplicity of drivers, and their interplay, that render individuals vulnerable. While acknowledging the dynamic and intersectional nature of vulnerability, it is evident that certain segments of society, such as marginalised groups, are burdened by multiple factors even prior to crises. Previous crises have demonstrated that insufficient awareness among decision makers about the needs and capabilities of a diverse society exacerbates pre-existing vulnerabilities, creates new disadvantages and hinders access to services during disasters. However, very little empirical attention has been paid to the pathways of vulnerability during disasters. The global COVID-19 pandemic brought higher mortality rates and an increased risk of infection or hospitalisation among individuals with socio-economic disadvantages. This article combines empirical data from a cross-sectional questionnaire survey among 313 clients of care organisations, such as soup kitchens, day centres and living facilities, in 13 European countries with 32 expert interviews, 5 workshops and an international colloquium with managers and staff of these organisations. The results show that the situation of vulnerable people was worsened by decision-makers’ lack of awareness of their situation when implementing crisis measures. Similarly, crisis communication did not consider their needs and left the burden of reaching those who could not receive or understand official information to social workers. Insights from this study deepen our understanding of the factors shaping vulnerability and their interaction. Leveraging this knowledge, social and institutional structures can be built to act as safety nets for anyone who becomes vulnerable when disasters hit. |
11:30 | Lost in Translation: Operationalizing Justice in Heat Resilience Building Projects PRESENTER: Amir Hossein Pakizeh ABSTRACT. Urban heat resilience strategies are crucial for enhancing societal safety and reliability, but they also involve the distribution of benefits and burdens, often intersecting with justice-related concerns. In this study, we adopt a pluralistic perception of justice to explore how various interpretations of distributional justice theories influence the practical implementation of urban heat resilience strategies. Our analysis focuses on the urban microclimate of Greater Sydney, Australia, where we examine the distribution of tree canopies. We analyze how varying interpretations of justice theories, the granularity of the data, and the spatial scale result in different assessments of the resource needs. Our findings show that increasing tree canopy coverage can counterintuitively exacerbate injustices in how heat resilience is distributed across populations. Specifically, areas already disadvantaged may continue to face higher heat risks due to unjust resource distribution. Additionally, our analysis demonstrates that greater granularity in data uncovers further deviations from justice, while adjusting the spatial scale to more localized levels does not necessarily alleviate these deviations. This study highlights the importance of integrating justice considerations into the design of urban resilience strategies to ensure that they effectively reduce vulnerability and enhance resilience for all communities. By critically examining how justice theories are operationalized within resilience efforts, we emphasize the need for policies that address both the structural factors driving vulnerability and the diverse needs of local contexts. |
10:45 | Towards Operationalizing Ecological Resilience for Infrastructure PRESENTER: Yamil Essus ABSTRACT. Engineering methods for enhancing infrastructure resilience have long been criticized by ecologists, yet the lack of quantitative tools to operationalize ecological resilience is stopping its adoption. Enhancing resilience from an engineering perspective is an attempt to increase the local stability of the system around an optimal operation point, reducing the magnitude of and recovery time from a failure, should one occur. However, this definition presumes that all possible equilibrium states of the system are known in advance and often assumes the existence of a unique optimal state. We propose to incorporate ecological resilience ideas into infrastructure projects by modeling system behaviors and identifying emerging equilibria rather than assuming them. We model a hypothetical decision problem where a government agency has to decide whether or not to install hard-adaptive measures to coastal flooding, such as a levee, and each individual in the community has to decide whether to stay or retreat. We model how population changes in the levee-buyout problem and find that, under certain conditions, two stable equilibrium points can be found. The benefits the government agency expects from a levee depend on the number of people. Thus, the levee is only economically feasible if the population exceeds a threshold determined by the cost-benefit analysis. An operational definition of ecological resilience can provide a more holistic analysis, leading to versatile systems better suited to withstand climate change. The levee-buyout problem illustrates how this modeling approach can extend engineering resilience by incorporating the possibility of multiple equilibria and numerical tools to quantify stability between them. |
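A deliberately stylized numerical sketch of the levee-buyout feedback (all parameters invented for illustration: a levee is assumed feasible only above a population threshold, and flood losses drive out-migration) shows how two stable equilibria can emerge from different starting populations:

```python
# Illustrative parameters, not from the paper: a coastal community of up to
# 1000 households; the agency builds/maintains a levee only when the population
# exceeds a cost-benefit threshold; flood losses drive households to retreat.
CAPACITY = 1000.0
LEVEE_THRESHOLD = 450.0      # levee economically feasible above this population
RISK_WITH_LEVEE = 0.02       # annual fraction of households leaving, levee in place
RISK_WITHOUT_LEVEE = 0.15    # annual fraction leaving without protection
ATTRACTION = 0.10            # annual in-migration toward capacity

def step(pop):
    levee_built = pop > LEVEE_THRESHOLD
    leave_rate = RISK_WITH_LEVEE if levee_built else RISK_WITHOUT_LEVEE
    inflow = ATTRACTION * (CAPACITY - pop)
    outflow = leave_rate * pop
    return max(0.0, pop + inflow - outflow)

for pop0 in (200.0, 800.0):
    pop = pop0
    for _ in range(200):     # iterate the yearly dynamics toward equilibrium
        pop = step(pop)
    status = "built" if pop > LEVEE_THRESHOLD else "not built"
    print(f"initial population {pop0:4.0f} -> long-run population {pop:6.1f} (levee {status})")
```

Starting below the threshold, the community settles at a smaller, levee-less equilibrium; starting above it, the levee is justified and a larger equilibrium is sustained, which is the kind of multi-equilibrium behavior the abstract describes.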
11:00 | Unveiling the Intersection of Crisis Management and Resilience in Tackling Real-world Uncertainty in Infrastructure's Emergency Events PRESENTER: Ali Nouri ABSTRACT. The management of uncertainty in critical infrastructure during emergency events is recognized as a significant challenge. In this paper, the synergy between crisis management and resilience strategies is explored as complementary approaches for addressing real-world uncertainties in emergency event management. Crisis management is centered on rapid response and recovery, while resilience is focused on system adaptability and long-term sustainability. In the context of complex, dynamic emergencies—such as infrastructure failures caused by extreme weather events—both approaches are seen as essential for ensuring operational continuity and minimizing damage. Various forms of uncertainty, including aleatoric (inherent variability), epistemic (knowledge gaps), stochastic (event-driven risks), and ontological (black swan events), are discussed to highlight their influence on decision-making in emergency management. A framework is proposed that integrates crisis management and resilience strategies, enabling better anticipation, resistance, and recovery from disruptions in infrastructure systems. Case studies on critical infrastructure, such as power distribution networks affected by ice storms, demonstrate how these strategies can improve both immediate crisis response and long-term resilience. By uniting these approaches, greater infrastructure robustness and reduced impact from future emergencies can be achieved. The comprehensive framework provided offers insights into developing more adaptive, robust, and resilient systems in the face of increasing uncertainties driven by global risks such as climate change. |
11:15 | A framework to study the lifespan resilience of critical infrastructure in the face of the evolution of disaster risks PRESENTER: Alejandra Cue Gonzalez ABSTRACT. Risk analysis of critical infrastructure across short, middle, and long-term horizons is challenging yet crucial for decision-making aimed at minimizing disturbances, damages, and destruction. Climate change, among other global trends, significantly impacts the exposure and vulnerability of critical infrastructures while influencing their resilience capacities. This complex interplay of factors challenges organizations to proactively identify and prepare for potential future events, challenges, or changes before they occur. This capacity for anticipation is crucial to maintain system resilience, as it allows organizations to detect early warning signs, allocate resources efficiently, and respond more effectively to expected and unexpected disturbances. It is about foreseeing threats and recognizing opportunities for improvement and innovation. It requires continuous learning, flexible planning, and a systemic understanding of complex interactions within and outside the organization. This paper presents the Disaster Risk-gUided Scenario Definition (DRUID) framework, which aims to guide the definition of scenarios that support the study of the resilience of a critical infrastructure over its lifespan. It is an iterative four-step approach. The first step consists of posing the problem to address using a canonical problem formulation and then collecting the necessary data. The second step elaborates representative prospective scenarios using the General Morphological Analysis (GMA) approach. For each representative scenario, the third step aims to study the resilience of the critical infrastructure when faced with specific types of disaster risks. It consists of an iterative model that aims to identify the potential consequences of disaster risks on the resilience of the critical infrastructure throughout its lifespan, given plausible decision-making alternatives. The fourth step involves discussing the results obtained regarding the formulated problem. The paper describes and partially illustrates the DRUID framework through a pedagogical case study on the development of a PV power plant in a mountainous region with a Mediterranean climate in the southeast of France. |
11:30 | Resilience quantification for critical infrastructure in urban areas including nature-based solutions PRESENTER: Mirjam Fehling-Kaschek ABSTRACT. In the face of increasing urbanization and climate change, the resilience of critical infrastructure is essential for maintaining the functionality and safety of urban areas. This study presents a framework for quantifying the resilience of critical infrastructure systems, with a focus on integrating nature-based solutions (NBS) to mitigate climate-related risks. NBS have attracted broad attention recently since they promise not only to increase the resilience of human-built infrastructure but also to support environmental protection. However, given their limited implementation and the lack of practical experience regarding their effectiveness against, e.g., natural disasters, modelling approaches like the one presented in this study are essential to analyse the effects of NBS on the protection of CIs against a broad set of disruptions. To estimate resilience, we construct a network-based representation of the critical infrastructures within the area, encompassing central supply components like power supply systems and essential facilities such as hospitals. The links of the network correspond to physical or logical connections between the components, enabling simulations of cascading failures triggered by extreme events. The methodology combines a comprehensive assessment of infrastructure vulnerabilities with the potential benefits of NBS such as natural flood management strategies. Focusing on climate-related threats, particularly flooding events, we simulate the time evolution of damage cascades to estimate the performance loss over time due to adverse events. Case studies from urban areas at risk of flooding demonstrate the effectiveness of this approach, highlighting how NBS can enhance the resilience of the area by mitigating performance losses during adverse events. These findings contribute to a deeper understanding of the interplay between infrastructure resilience and ecological strategies, offering valuable insights for urban planners and policymakers aiming to foster resilient and sustainable urban environments. |
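A minimal sketch of such a cascade calculation, assuming a small hypothetical dependency graph, a binary failure rule (a component fails if flooded or if anything it depends on fails, so redundancy is not modelled) and an NBS that simply keeps one substation dry, might look like this:

```python
import networkx as nx

# Hypothetical urban dependency graph: an edge u -> v means v depends on u.
G = nx.DiGraph()
G.add_edges_from([
    ("substation_A", "pumping_station"), ("substation_A", "hospital"),
    ("substation_B", "water_plant"),     ("water_plant", "hospital"),
    ("pumping_station", "district_1"),   ("water_plant", "district_2"),
    ("substation_B", "telecom_hub"),     ("telecom_hub", "hospital"),
])
DEMAND_NODES = ["hospital", "district_1", "district_2"]

def cascade(initially_flooded):
    """Nodes fail if flooded or if any node they depend on has failed."""
    failed = set(initially_flooded)
    for node in initially_flooded:
        failed |= nx.descendants(G, node)          # downstream dependents fail
    served = [d for d in DEMAND_NODES if d not in failed]
    return failed, len(served) / len(DEMAND_NODES)

# Baseline flood scenario vs. the same scenario with a nature-based solution
# (e.g. a retention basin) assumed to keep substation_A dry.
for label, flooded in [("no NBS", {"substation_A", "water_plant"}),
                       ("with NBS", {"water_plant"})]:
    failed, performance = cascade(flooded)
    print(f"{label:8s}: failed={sorted(failed)}, performance={performance:.2f}")
```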
11:45 | Enhancing Community Resilience: A Simulation-Based Framework for Post-Hazard Access to Essential Services at the Building Level. PRESENTER: Zaira Pagan-Cajigas ABSTRACT. Assessing access to critical services is crucial for evaluating resilience, as the literature suggests that individuals with better access to critical services tend to be more resilient. In this study, we build upon the Equitable Access to Essential Services (EAE) method, which redefines resilience as “access to essential services”. Our framework measures access by assessing building-level proximity to discrete point services (DPS), such as hospitals, grocery stores, banks, and gas stations, before and after a hazard. This approach acknowledges the importance of considering the interconnections between services while integrating community insights into the analysis. To understand the functionality of each DPS post-hazard, we simulate the effects of a tropical cyclone on an interconnected system, including power, water, and communications networks. The results of the simulation provide status updates on power, water, and communication at a building level, which are then used to determine whether a DPS remains operational post-hazard. Subsequently, we assessed building-level access to the nearest operational service. This framework helps identify areas with access disparities within the community, guiding decision-makers and emergency management officials to allocate resources more equitably. To demonstrate the application of our framework, we selected Cayey, Puerto Rico, as our case study community. Data for the interconnected system was obtained from publicly available data and fieldwork. The DPS included in this analysis were identified through interviews with community leaders and a survey conducted in Puerto Rico. Furthermore, we evaluated access outcomes for this community by comparing different resilience strategies, such as the impact of integrating backup power generators or water storage systems into DPS. Ultimately, this research establishes a forward-looking simulation model that not only measures community resilience and inequities in accessing essential services at the building level but also enhances our broader understanding of community resilience dynamics. |
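As a simplified illustration of the access metric (coordinates, service names and the post-hazard functionality flags below are invented; in the study these flags would come from the interconnected power/water/communication simulation):

```python
import math

# Hypothetical buildings and discrete point services (DPS) with coordinates in km.
buildings = {"B1": (0.0, 0.0), "B2": (3.0, 1.0), "B3": (5.5, 4.0)}
services = {
    "hospital_1": {"type": "hospital", "xy": (1.0, 0.5)},
    "hospital_2": {"type": "hospital", "xy": (6.0, 3.5)},
    "grocery_1":  {"type": "grocery",  "xy": (2.5, 0.5)},
    "grocery_2":  {"type": "grocery",  "xy": (5.0, 5.0)},
}
# Post-hazard functionality, e.g. derived from power/water/communication status.
operational = {"hospital_1": False, "hospital_2": True,
               "grocery_1": True,  "grocery_2": False}

def nearest_operational(bxy, service_type):
    """Distance to the nearest operational service of the given type."""
    candidates = [(math.dist(bxy, s["xy"]), name)
                  for name, s in services.items()
                  if s["type"] == service_type and operational[name]]
    return min(candidates) if candidates else (float("inf"), None)

for b, bxy in buildings.items():
    for stype in ("hospital", "grocery"):
        dist, name = nearest_operational(bxy, stype)
        print(f"{b}: nearest operational {stype}: {name} at {dist:.1f} km")
```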
10:45 | Multivariate Simulation of Product Fleets based on Usage Data: Case Study on Light Electric Vehicles PRESENTER: Georgios Ioannou ABSTRACT. As the requirements for technically complex products and their functionality increase, product complexity continues to rise. At the same time, development times and costs must be reduced to ensure that technical products remain marketable. This leads to an increase in possible causes of damage and potential field failures. This applies in particular to the development of new markets, such as electromobility in the light vehicle sector, known as light electric vehicles (LEVs). The battery systems installed in these vehicles harbor a comparatively high risk of function- and safety-critical failures. These present companies with new challenges when operating product fleets in the private and commercial sectors due to the high level of safety and reliability required. To reduce the risk and increase the reliability of LEVs in the field, analysing product data from the field is highly relevant for predicting the remaining useful life. One way of supporting the reliability analysis process during the utilization phase of the product life cycle is to simulate and forecast the further use of a product or a product fleet. The existing and simulated usage data can then be used for forecasts regarding the remaining useful life of products. This paper presents the results of a feasibility study in which a concept for the multivariate simulation of product fleets based on usage data from the field is applied in the context of LEVs. Field data from a Kumpan electric 54 e-moped, recorded over a period of several days, serves as the data basis. The available data were analysed and used for the multivariate simulation. Finally, the simulation results were compared with the original data and a conclusion was drawn. |
11:00 | Enhancing reliability in hybrid systems: design, performance, and O&M factors. PRESENTER: Luciana Yamada ABSTRACT. The integration of different renewable energy sources into hybrid systems has garnered significant attention for its potential to enhance system reliability. By combining complementary sources, these hybrid systems provide a more stable energy supply, reduce the impact of source variability, and improve overall performance. High reliability requires a realistic representation of the system, which must consider factors such as equipment availability, accessibility, failure and repair rates, different energy profiles, and both investment and maintenance costs. This work presents an optimization framework for hybrid renewable energy system design that incorporates O&M factors to estimate system performance accurately while identifying the most cost-effective configurations to enhance investor returns when hybridizing or expanding energy parks. The availability is modeled through a stochastic process influenced by the selected number of components and the frequency with which they can fail (failure rate) or be repaired (repair rate). Moreover, the effects of maintenance schedules and weather delays on system performance are incorporated through an accessibility factor integrated into the availability model. This approach allows a more accurate system performance estimation considering downtime and associated financial losses while the configuration is analyzed. This study provides a more accurate system performance prediction and enhances investor returns by incorporating the availability factor into the configuration analysis. It also offers insights into when it is critical to model O&M factors. |
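The abstract above models availability through failure and repair rates plus an accessibility factor for weather and scheduling delays. As a hedged illustration (not the authors' formulation), the sketch below uses the textbook two-state steady-state availability A = mu / (lambda + mu) and lets limited accessibility stretch the effective repair time; how accessibility enters the model is an assumption made here for illustration.

```python
def steady_state_availability(failure_rate, repair_rate, accessibility=1.0):
    """Two-state (up/down) component availability.

    failure_rate: failures per hour (lambda)
    repair_rate: repairs per hour (mu)
    accessibility: fraction of time the site is reachable for repair (0-1];
                   modelled here by reducing the effective repair rate.
    """
    effective_repair_rate = repair_rate * accessibility
    return effective_repair_rate / (failure_rate + effective_repair_rate)

# Example: MTTF = 2000 h, MTTR = 50 h, offshore site reachable 70% of the time.
print(round(steady_state_availability(1 / 2000, 1 / 50, accessibility=0.7), 4))
```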
10:45 | Integrating Dependency Analysis Through Structural Equation Modeling and Artificial Neural Networks: A Case Study in the Mining Industry PRESENTER: Pablo Viveros ABSTRACT. Industry 4.0 technologies are revolutionizing industrial maintenance management, highlighting Machine Learning (ML) techniques as key tools to anticipate failures more efficiently [1]. This study analyzes the dependencies between components of a crushing line of a mining company in Chile using ML models, with the objective of predicting unplanned events (maintenance and operational events). The importance of a comprehensive approach to the analysis of failure types is now recognized, as it allows maintenance management to be improved by offering an alternative for reducing costs associated with maintenance and downtime [2]. The main motivation is to increase the accuracy of early warning systems, supporting more informed decisions. ML models such as Random Forest (RF) and Artificial Neural Networks (ANN) are employed, whose dependency analyses have shown positive results in previous studies [3]. In addition, Structural Equation Modeling (SEM) is integrated, which allows exploring the complex interrelationships between the system variables and the different types of unplanned events (TUEs) [4]. The models were evaluated using the confusion matrix, accuracy, precision, recall and F1 score, complemented by SEM-derived indicators that strengthen the validity of the results. The ANN showed outstanding performance, with an accuracy of 0.9926 and significant relationships according to SEM, while the RF suffered from overfitting, limiting its applicability in SEM. The findings from the ANN are therefore extended to another prediction model, also based on ANN, given its success in failure prediction [5]. This dependency analysis provides a novel approach that strengthens the understanding and improves the results of fault prediction models, contributing significantly to the existing research in this area. |
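The abstract above evaluates the RF and ANN models with a confusion matrix, accuracy, precision, recall and F1 score. The short sketch below shows that evaluation step generically with scikit-learn; the labels are dummy data and nothing here reproduces the authors' models or results.

```python
from sklearn.metrics import (confusion_matrix, accuracy_score,
                             precision_score, recall_score, f1_score)

# Dummy ground truth and predictions (1 = unplanned event, 0 = normal operation).
y_true = [0, 0, 1, 1, 0, 1, 0, 1, 1, 0]
y_pred = [0, 0, 1, 0, 0, 1, 1, 1, 1, 0]

print(confusion_matrix(y_true, y_pred))
print("accuracy :", accuracy_score(y_true, y_pred))
print("precision:", precision_score(y_true, y_pred))
print("recall   :", recall_score(y_true, y_pred))
print("f1       :", f1_score(y_true, y_pred))
```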
11:00 | Rotating machinery reliability assessment study via Deep Learning models: new bearing vibration bench's simulated data. PRESENTER: Leonardo Raupp ABSTRACT. Over recent years, interest in data-driven approaches, especially Deep Learning (DL), for reliability assessment has grown rapidly, as reliability engineering is crucial for maintaining the success and competitiveness of industries and can help improve the Reliability, Availability, Maintainability and Safety (RAMS) of systems. These models, however, require large amounts of data to be trained, which is not widely available, as only a few publicly accessible datasets exist to evaluate them. On top of that, there are multiple alternative ways a problem can be approached. Therefore, this work provides a comparative study of multiple Deep Learning models, including Multi-Layer Perceptron (MLP), Convolutional Neural Networks (CNN), Recurrent Neural Networks (RNN) and Auto-Encoders (AE), for reliability assessment of bearing faults on a new bearing vibration dataset obtained by operating an experimental vibration bench. The bench operates with two 1205K bearings simultaneously, their housings interconnected by a synchronizer pulley with a guide. In this dataset, four bearing conditions are considered: healthy, inner ring damage, light outer ring damage and heavy outer ring damage, as well as combinations of them obtained by coupling different damage types to different housings in the bench, with each case ranging from 5 Hz to 30 Hz rotational speed. The sampling rate is 2048 Hz, with each acquisition lasting 1 second. This lower sampling rate presents a challenge in capturing high-frequency vibration features, but also reflects more practical or resource-constrained data collection scenarios, where high-frequency sampling is not always feasible. In addition, the new dataset includes four distinct bearing conditions, with the added complexity of combining different types of damage across bearings. The application of the aforementioned models should serve as a basis for understanding the vibration patterns associated with combined fault modes under more constrained industrial scenarios. |
11:15 | Condition monitoring approach based on unsupervised anomaly detection for pumps regulating groundwater level under a coastal infrastructure PRESENTER: Arto Niemi ABSTRACT. This paper describes a condition monitoring approach for pumps regulating groundwater level under a port infrastructure. We focus on the Bremerhaven container terminal located in northwest Germany at the mouth of the river Weser. Our aim was to construct a strategy to detect potential pump failure indications that could inform conditional maintenance actions. Two signals were available for us: the groundwater level, measured with a radar, and the binary pump on/off operation signal. For this purpose, we tested four unsupervised machine learning-based anomaly detection algorithms, in combination with multiple post-processing methods for anomaly scoring and thresholding. Additionally, we developed a model to simulate the groundwater level signal, enabling the test of failure modes that were not present in measured data. We found that the appropriate selection of model and post-processing method was critical for obtaining satisfactory results in both measured and simulated signals. |
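The paper above tests four unsupervised anomaly detection algorithms with several post-processing methods for scoring and thresholding, but does not name them in the abstract. The sketch below therefore uses IsolationForest on sliding windows of a synthetic groundwater-level signal purely to illustrate such a pipeline (windowing, scoring, quantile-based thresholding); the window length, threshold and injected fault are assumptions.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
level = np.sin(np.linspace(0, 60, 3000)) + 0.05 * rng.normal(size=3000)  # synthetic groundwater level
level[2000:2050] += 0.8  # injected deviation, e.g. a pump failing to switch on

# Sliding windows of the signal serve as feature vectors.
win = 50
X = np.lib.stride_tricks.sliding_window_view(level, win)

model = IsolationForest(random_state=0).fit(X)
scores = -model.score_samples(X)            # higher score = more anomalous
threshold = np.quantile(scores, 0.99)       # simple quantile-based thresholding
anomalous_windows = np.where(scores > threshold)[0]
print(len(anomalous_windows), "windows flagged, first indices:", anomalous_windows[:5])
```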
11:30 | Physics Informed Deep Learning for Flow Measurement Error Correction PRESENTER: Stephan Wernli ABSTRACT. High precision devices for flow measurement are used in various industries, including pharmaceuticals, chemicals, food and beverages, cosmetics, gas and oil. Independent of the underlying physical principle of the devices, they suffer systematic measurement errors due to various factors that cannot always be directly measured and compensated. Correcting such systematic measurement errors is crucial in achieving a yet higher measurement accuracy. In this work we introduce a hybrid method for automatic error correction of vortex-based flow measurement devices. The method combines computational fluid dynamic (CFD) simulation results together with data driven deep learning in order to estimate the measurement errors in different regimes. We demonstrate the effectiveness of the method using experimental data and show that the hybrid approach outperforms a pure data-driven approach, even in cases of unknown measurement configurations. |
10:45 | New standards for industrial robots PRESENTER: Alessandra Ferraro ABSTRACT. The new standards ISO 10218-1 Robotics — Safety requirements — Part 1: Industrial robots, and ISO 10218-2 Robotics — Safety requirements — Part 2: Industrial robot applications and robot cells will be updated in a few months. The main changes aim to provide measures useful for the risk assessment carried out by manufacturers and integrators. This activity is particularly demanding and may not always be carried out as effectively as necessary for the safety and well-being of operators, especially for collaborative applications. Among the changes, ISO 10218-1 classifies robots into two classes, named Class I and Class II. Classification as a Class I robot shall be determined by the maximum capability of the manipulator without being limited by robot or safety functions, based merely on these values: the mass per manipulator (M) is 10 kg or less; the maximum achievable speed of the tool center point is 250 mm/s or less; and the maximum achievable force per manipulator (FMPM) is 50 N or less. The latter must be tested in accordance with the test methodology proposed by the standard. In this paper we analyze this methodology and compare it with the previous one. |
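Since the abstract above states the three limit values for Class I classification explicitly, they can be encoded directly. The trivial sketch below only restates those thresholds as given in the abstract; it is not a substitute for the force-test methodology the paper analyzes.

```python
from dataclasses import dataclass

@dataclass
class Manipulator:
    mass_kg: float              # mass per manipulator (M)
    max_tcp_speed_mm_s: float   # maximum achievable speed of the tool center point
    max_force_n: float          # maximum achievable force per manipulator (FMPM), per the test method

def is_class_one(m: Manipulator) -> bool:
    """Class I if all three limits quoted in the abstract are met; otherwise Class II."""
    return m.mass_kg <= 10 and m.max_tcp_speed_mm_s <= 250 and m.max_force_n <= 50

print(is_class_one(Manipulator(mass_kg=8, max_tcp_speed_mm_s=200, max_force_n=45)))   # True
print(is_class_one(Manipulator(mass_kg=12, max_tcp_speed_mm_s=200, max_force_n=45)))  # False
```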
11:00 | A safety model for industrial environment based on the Bayesian network paradigm ABSTRACT. A safety model for identifying hazards and predicting their possible consequences in an industrial facility is proposed. Complex cause-effect relations between safety-related events, starting from the initial ones, through the intermediate, to the final, are represented in the form of a Bayesian network. The input data originate from sensors and meters installed in safety-critical locations. Using the Bayesian network methodology, the impending hazards, accidents, or machinery breakdowns can be predicted from symptoms indicated by the monitoring devices. Also, the "reverse analysis" of the network can establish the root causes of these undesired events so that preemptive maintenance can be carried out to avoid them. For illustration, a simplified safety model of a biogas plant is presented along with its basic analysis. Although there is abundant literature on Bayesian networks in the safety and reliability context, much of it is limited to theoretical considerations or provides only general guidelines for the construction of such networks. Thus, publications reporting specific applications of this methodology are rather rare. The current paper aims to contribute to filling this gap. |
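The abstract above predicts impending hazards from monitored symptoms and runs a "reverse analysis" back to root causes. A minimal Bayesian-network sketch with the pgmpy library is given below; the biogas-plant variables, states and probabilities are invented for illustration and are not taken from the paper.

```python
from pgmpy.models import BayesianNetwork
from pgmpy.factors.discrete import TabularCPD
from pgmpy.inference import VariableElimination

# Illustrative structure: a valve fault raises tank pressure, which triggers a sensor alarm.
model = BayesianNetwork([("ValveFault", "HighPressure"), ("HighPressure", "AlarmOn")])

model.add_cpds(
    TabularCPD("ValveFault", 2, [[0.98], [0.02]]),                      # prior P(fault) = 0.02
    TabularCPD("HighPressure", 2, [[0.95, 0.20], [0.05, 0.80]],
               evidence=["ValveFault"], evidence_card=[2]),
    TabularCPD("AlarmOn", 2, [[0.99, 0.10], [0.01, 0.90]],
               evidence=["HighPressure"], evidence_card=[2]),
)
model.check_model()

infer = VariableElimination(model)
# Forward prediction: probability of an impending high-pressure event given a valve fault.
print(infer.query(["HighPressure"], evidence={"ValveFault": 1}))
# Reverse analysis: most likely root cause given that the alarm is on.
print(infer.query(["ValveFault"], evidence={"AlarmOn": 1}))
```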
11:15 | Industrial warning system with active devices for signal reception and dynamic noise attenuation using artificial intelligence algorithms PRESENTER: Federico Paolucci ABSTRACT. The protection of workers' hearing in industrial environments is essential to ensure their safety and health. This article presents a general architecture and the essential components of a distributed, integrated industrial system capable of receiving notifications from machinery and transmitting voice messages in real time to workers. Utilizing a localization system based on Bluetooth Low Energy (BLE) technology, the system identifies the real-time position of workers. The one-way communication between machines and the notification server relies on HTTP POST requests, allowing customized alerts to be sent to specific groups of workers. This improves communication effectiveness, ensuring that every potentially affected worker receives information critical for their safety. The system consists of two key components. The first is a personal protective equipment (PPE) device with adaptive electronic filters supported by artificial intelligence, designed to dynamically filter harmful noise while allowing the transmission of essential alerts, alarms, and voice communications. The second is a user localization system together with the transmission/reception link between machines and the worn PPE devices, capable of generating safety alerts and operational instructions, thereby enhancing workers' situational awareness and protection. The article explores the system's architecture and highlights its potential benefits in terms of risk reduction in industrial environments, contributing to the creation of a safer work environment. By emphasizing the importance of technology in safeguarding workers' health, this system represents a significant step toward a safer and more responsible industry. |
10:45 | The Application of GFMAM's Influential Asset Management Subject Matters in the South African Steel Industry PRESENTER: Lethabo Sethole ABSTRACT. The Global Forum for Maintenance and Asset Management (GFMAM) was established by industry professionals and organizations to promote asset management as a strategic activity for achieving organizational goals. In the 21st century, there are two main types of technology-based businesses: rapid-improvement industries and slow-improvement industries such as steel, mining, and metallurgy. Efficient decision-making linked to asset and maintenance management is crucial for long-term operations. Without a structured approach to asset management, organizations in slow-improvement industries risk inefficiencies, higher operational costs, and reduced competitiveness. The steel market is expected to decline over the next five years, with traditional methodologies negatively impacting the industry. Despite this, the steel sector remains significant to South Africa's GDP and employs thousands. The study aims to determine the rankings, critical success factors, and limitations of GFMAM's most influential subject matters in the South African (RSA) steel industry. A quantitative method was used, with 22 RSA steel organizations invited to participate in an online survey. The survey ranked the most influential subject matters as: (i) strategic planning, (ii) asset management policy, (iii) asset management leadership, (iv) asset management strategy and objectives, and (v) asset management planning. The study also identified critical success factors and limitations for each subject matter. It is recommended that RSA steel companies use these findings to gain a competitive edge in the market. |
11:00 | A qualitative model of the reliability-maintenance cost relation of critical hydraulic structures in support of complex modeling and communication of model results PRESENTER: Alexander Bakker ABSTRACT. The flood protection of the Netherlands critically hinges on the full readiness of their storm surge barriers. A strict form of risk based asset management is applied to ensure a good condition. This method relies on detailed reliability models that explicitly link maintenance to performance. In this way, safety standards should be met in an efficient way and without unnecessary investments in maintenance. Nevertheless, the necessity of investments appears often hard to explain. This may be caused by the fact that the reliability models rely on strong, but implicit assumptions about how the maintenance is performed. Here, we introduce a simple, S-shaped model that transparently explains the impacts of a wide range of maintenance strategies on the storm surge barrier performance. This model can be used for the communication with decision makers and to qualitatively assess on what aspects the underlying reliability model could be refined to optimally support the asset management. This added value of the model is illustrated on the basis of a simple storm surge barrier. |
11:15 | Characterization, dynamic modeling, and monitoring of the degradation of hydroelectric production infrastructures PRESENTER: Jack Lally ABSTRACT. Global hydropower infrastructure, particularly in France, faces three primary challenges: aging infrastructure, climate change, and hydropeaking. These issues result in increased rates of degradation, with a higher degree of associated unpredictability. The objective of this research is to identify a model that would inform an optimised maintenance plan within a host organisation, to aid in ensuring the availability and good operation of hydropower production assets while balancing strategic production objectives with risks. The chosen methodology for predictively modeling the asset degradation phenomena must leverage the knowledge of how degradation mechanisms evolved historically for particular critical assets, taking into account condition monitoring data over time, to recognise trends in their health state and thus enabling the optimisation of maintenance interventions, scheduling them to low-output periods, minimising production losses. The work presented in this communication involved an investigation of the existing data and its sources, including experts’ feedback, within the host enterprise, a review of the relevant literature on dynamic modeling and the monitoring of degradation mechanisms, and an evaluation of potential degradation modeling methods that could be applied to two distinct assets that were selected as case studies: the spillway gate and the alternator. Ultimately it was determined that a model based on a Bayesian Stochastic Petri Net (BSPN) would meet the desired criteria for a degradation model for asset management, while also allowing for refinement over time as more data becomes available and adapts as variable degradation drivers continue to evolve. |
11:30 | An MCDA Framework for Asset Management of Medical Equipment in Crisis Contexts PRESENTER: Molk Souguir ABSTRACT. Investment and maintenance policies for hospital medical equipment require a well-informed decision-making process to ensure patient safety, service quality, and strict adherence to standards. Multicriteria Decision Analysis (MCDA) methods, such as ELECTRE, PROMETHEE, and AHP, play a crucial role in assisting managers responsible for investments and maintenance in selecting optimal asset management policies. These decisions pertain to renewal rates, equipment reserve levels, and preventive maintenance intervals, which define various management alternatives. In normal situations, criteria such as patient safety, total cost, equipment availability, and environmental sustainability are typically assessed through reliability, maintenance, obsolescence, and cost models. Monte Carlo simulations are frequently used to objectively estimate these criteria to obtain objective values, limiting the expert judgment efforts. In a crisis context —where demand for services exceeds capacity— asset management alternatives must be adapted to simplified crisis scenarios, considering both normal and crisis periods. Criteria should be evaluated under different assumptions for each period, with possibly specific factors (e.g., mortality rate) introduced for the crisis context. This research aims to develop a multicriteria decision model that supports decision-makers in selecting the optimal investment policy and maintenance plan for medical ventilators in both normal and crisis situations. It also explores the impact of crisis-specific criteria on decision-making, with the ultimate goal of enhancing resilience during crises. The proposed MCDA framework, validated by a hospital in Belgium, enables continuous refinement of the definition of management alternatives which appears as the critical step in the process to manage investments and maintenance policies of medical equipment under both normal and crisis conditions. |
11:45 | Failure Mode and Effects Analysis for a Battery Storage System Using Second Life Lithium-Ion Batteries PRESENTER: Hector Hernandez ABSTRACT. It is well known that Lithium-Ion (Li-ion) batteries are one of the most common technologies for storing energy due to their versatility and scalability. With the growth of the electric vehicle market, the relatively short life of Li-ion batteries in vehicle service could lead to significant battery waste. To address this issue, methods have been developed to recycle these components and give them a second use in complex electrical systems, contributing to the fight against climate change. However, if not properly treated, failures in Li-ion batteries can present risks to human health and the environment. Therefore, reliable systems are needed for the use of Li-ion batteries, especially in critical energy storage applications. Several aspects must be considered when assessing the reliability of a system, including evaluating the failure modes of system components and determining how these failures may impact the entire system. A specific method used throughout all stages of this process is Failure Mode and Effects Analysis (FMEA). Although methodical and time-consuming, FMEA helps identify the causes of events that lead to system failure, determine their consequences, and ultimately minimize both the occurrence and recurrence of such events. For this work, a system based on recycled Li-ion batteries for energy storage purposes was evaluated using FMEA. This reliability analysis comprehensively assessed the risks represented by each of the components, leading to the identification of dependencies between sensors, temperature regulation tools, and control methodologies (voltage, current, and cycles), which contribute to creating a suitable environment for the use of batteries in energy storage. Failures related to these components can lead to capacity and power fade issues, which, if they progress, can result in total system loss or may pose serious threats to human health and the environment. |
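The abstract above does not state how the identified failure modes were prioritized, so the sketch below uses the conventional FMEA Risk Priority Number (severity x occurrence x detection) purely as a generic illustration of the ranking step; the failure modes and 1-10 scores are made up and are not results from the study.

```python
# Generic FMEA prioritization by Risk Priority Number (RPN = S * O * D).
failure_modes = [
    {"mode": "Temperature sensor drift",       "S": 7, "O": 4, "D": 5},
    {"mode": "Cell overvoltage missed by BMS", "S": 9, "O": 3, "D": 4},
    {"mode": "Cooling fan failure",            "S": 6, "O": 5, "D": 3},
]

for fm in failure_modes:
    fm["RPN"] = fm["S"] * fm["O"] * fm["D"]

# Rank failure modes from highest to lowest RPN.
for fm in sorted(failure_modes, key=lambda x: x["RPN"], reverse=True):
    print(f'{fm["mode"]:<32} RPN={fm["RPN"]}')
```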
10:45 | Probabilistic Modeling of Riverine Flooding in Urban Areas: Addressing Uncertainty and Spatial Resolution Challenges PRESENTER: Lucas da Silva ABSTRACT. Riverine flooding poses significant challenges for urban risk assessment due to issues of spatial granularity and uncertainty modeling. Effective flood hazard evaluation requires high-resolution geospatial data that capture local geomorphological features. However, global climate models often lack the resolution needed to predict localized, high-intensity events, such as floods triggered by heavy rainfall or sudden river level changes. Uncertainty in riverine flood events adds complexity due to high spatial and temporal variability. To address this, we propose a probabilistic riverine flood risk assessment model, integrating local variables which might involve topography, watercourses, and drainage systems. Statistical downscaling improves the spatial resolution of climate and hydrological data, enabling finer-scale predictions. However, this approach relies on historical data, which may not fully reflect emerging patterns or rapid environmental shifts. Our framework enhances urban adaptation strategies globally by tailoring models to the particular characteristics of each area. It addresses shortcomings of existing models, which often lack resolution for effective flood planning, and balances the tradeoff between accuracy and computational complexity. This dual focus ensures scalability while considering local needs. By focusing on riverine flooding, the model bridges gaps in current assessments, helping urban planners incorporate localized factors into mitigation strategies and adaptive policies. The approach provides actionable information to strengthen resilience against climate-induced floods and minimize socioeconomic impacts. Presenting a globally adaptable framework, this model improves understanding of riverine flood dynamics in diverse urban contexts. It supports informed decisionmaking for sustainable urban development, addressing the growing risks of extreme flood events and their impacts on urban functionality. |
11:00 | Coupling risk assessment framework of urban lifelines PRESENTER: Yiping Bai ABSTRACT. In the era of worldwide urbanization, networked urban lifelines (natural gas, power, water, drainage, etc.) facilitate the operation and development of cities. However, increasingly complicated and interdependent lifelines may cause higher risks or even the system-wide breakdown of urban lifelines. The leakage and explosion of underground natural gas pipelines in adjacent structures is a typical coupling accident, causing 26 deaths in Shiyan in 2021 and 32 deaths in Gaoxiong in 2014. Likewise, natural disaster-triggered power outages and the shutdown of multiple lifelines affected millions of people in Texas in 2021 and London in 2019. However, existing work has focused more on single-lifeline risks than on their coupling effects from a comprehensive perspective. There is therefore an essential need for quantitative coupling risk assessment of urban lifelines in cities. In this work, energy transfer theory, knowledge graphs, natural language processing, and expert elicitation are first introduced to identify hazards of urban lifelines comprehensively and to illustrate potential causations of coupling accidents in a data- and expert-driven way. Then, Bayesian networks and numerical simulation are integrated to perform dynamic and quantitative risk assessment of typical (single) accidents of urban lifelines. Meanwhile, the coupling effect and similar terms are discussed and identified to propose a coupling risk assessment framework. Finally, the proposed framework is used to elaborate coupling risk assessments of "gas leakage-dispersion-explosion-fatality" and "rain-flood-underground space ponding-evacuation-drowning" scenarios. It can be concluded that current emergency systems are not sufficient for risk mitigation of lifelines because of the coupling effect; significant delays and risk rebounds occur after emergency systems are activated. Based on the proposed work, the coupling risks of urban lifelines can be systematically identified and measured, and optimal risk mitigation approaches can be put forward to facilitate the design and maintenance of urban lifelines for a more resilient city. |
11:15 | Contextual conditions for the application of urban heat mitigation measures: A review of reviews ABSTRACT. Climate change and urban heat islands are contributing to the warming of urban areas worldwide. Previous research has made it clear that the implementation and success of urban heat mitigation measures depend on context, but which contextual conditions matter is less clear. This study addresses the state of knowledge regarding contextual conditions for the application of physical urban heat mitigation measures, and gauges the transferability of measures to Nordic cities. A scoping review of literature reviews was conducted. Results show that contextual conditions are not systematically reported in the literature, and are often mentioned implicitly rather than explicitly. Relevant contextual conditions for physical mitigation measures include: climate and prominent wind patterns; water availability; soil perviousness; the population's thermal comfort and tolerance; site geometry; surrounding surfaces; space availability and site adaptability; budget; maintenance; information availability; proximity to the sea; site function; sun path; risks or co-occurring societal challenges; and social norms, cultural values and aesthetics. Contextual conditions for which commonalities were found between Nordic cities allowed for the creation of an overview of considerations for the transferability of measures to this region, which should inform design criteria for future planning. Knowing which contextual conditions are relevant to heat mitigation measures can guide the analysis of the transferability of measures in the future, and clusters of cities with similar contexts can be formed, between which the transferability of solutions would be high. |
11:30 | Unthoughtful industrialization: Unseen and emergent risks ABSTRACT. Industrialization has given us many things in the modern age: an easier life, modern equipment for daily use, and modern, convenient transportation. It has caused a revolution. Yet although we have built modern cities and transportation systems, we have also damaged our ecosystems and increased natural disasters. Between the eighteenth and nineteenth centuries, with the advancement of science, methods were found to produce many chemicals commercially. That created a wave, and new industries were set up one after another. The questions are: were we aware of the consequences that industrialization created, and how have we mitigated those risks? This paper gives one example of how a full industrial lifecycle was created to produce a harmful product, synthetic fertilizer, and discusses the risks that evolved from that lifecycle. Although it is not a complete picture, it is still an instructive example. In the same way, new inventions arrive every year and may create opportunities for new types of industrialization. Have we thought enough about the emergent risks from those new inventions and the commercialization of those products? This paper tries to identify some emergent risks, using the commercialization of robots as an example, and should prompt other researchers to focus on further emergent risks in the future. It is expected that thinking about these emergent risks can help prepare us for unwanted incidents in the future. |
11:45 | Conceptual framework to study the politics of climate risk management in local government ABSTRACT. Contemporary climate risk management in local governments tends to fortify existing risk governance, normalize an extraordinary situation, and depoliticize the issue of climate change by favoring technical approaches. This paper introduces a novel conceptual framework based on securitization and riskification to empirically investigate the politics of climate risk management in local government. Despite the increasing importance of local action in handling climate change, this level of government has received limited attention in the riskification literature. By applying risk logic as an explanatory approach, this framework identifies the effects of climate risk management on climate change policy and action. Together with the discourse used, actors involved, and tools employed, this constitutes the politics of climate risk management. The findings contribute to ongoing debates on how increasingly threatening climate change futures are translated into a bureaucratic system characterized by a prevalence of risk governance. The paper concludes by suggesting reconsideration of referent objects as a future research avenue to advance climate risk management in local government. |
10:45 | APPLICATION OF DATA-DRIVEN BAYESIAN BELIEF NETWORK FOR CAUSAL ANALYSIS OF FACTORS CONTRIBUTING TO GROUND DAMAGE AVIATION SAFETY INCIDENTS PRESENTER: Stanislav Bukhman ABSTRACT. Aviation remains one of the safest means of transportation. A recent safety report of the International Civil Aviation Organisation (ICAO) states that the rate of fatal aviation accidents has steadily decreased over the last five years and in 2023 equalled 17 fatalities per one billion transported passengers. From this perspective, 2023 stands out as one of the safest years on record in aviation history (ICAO, 2024). This improvement was achieved through continuous advancements in technology, stricter regulations, and enhanced operational monitoring (EASA, 2024; FAA, 2024; ICAO, 2022). Despite these improvements, the aviation industry continues to face challenges in other areas of safety, particularly non-fatal accidents and incidents (ICAO, 2024). One area of concern is damage to the aircraft, which includes incidents and accidents during its ground service. Although these incidents are usually not life-threatening for passengers, crews and ground staff, they impose a significant financial burden on the industry, including repair costs, operational disruptions, delays and insurance claims. According to the International Air Transport Association (IATA), ground damage costs the industry 5 billion USD a year, and this cost is estimated to double by 2035 (Bailey, 2022). Effectively addressing ground damage requires comprehensive investigation and analysis of the common triggers and contributing factors behind such incidents. Our study applies data-driven Bayesian belief networks to model and evaluate these factors, identifying key causes leading to ground damage (Meng, An, & Xing, 2022; Qazi & Khan, 2021). As the aviation sector continues to expand, understanding the root causes of accidents and incidents is essential for identifying targeted preventive measures. Such an approach will enhance the mitigation of these risks, ensuring both operational efficiency and passenger trust. |
11:00 | CRITICALITY MANAGEMENT OF AIRPORTS PRESENTER: Linda Martens Pedersen ABSTRACT. Presentation of a project Safetec has carried out for Avinor on the development of a methodological approach for setting requirements for the uptime and availability of infrastructure at individual airports. Avinor's 44 airports in Norway are divided into 5 categories by airport size. The project proposes definitions and KPIs that can form the basis for targeting availability. It describes a methodology for identifying critical functions and analyses the current situation regarding the performance of the various categories, airports and functions. This provides a basis that can help Avinor set appropriate requirements for the airports and distribute these requirements to individual functions. The project defines passenger delay hours as an overall KPI for airports. This KPI is the total delay summed over all passengers affected by a failure event. The KPI directly addresses Avinor's operational goals and takes into account frequency, duration and the number of people affected. To evaluate which functions are critical to airport availability, a mapping of all main functions and systems at an airport was performed. Aircraft, passengers, luggage and cargo were followed from start to finish of their journey through an airport. Analyses of the current situation of the airports have been carried out to estimate the delay hours for an average airport in each category. Passenger delay hours, baggage delay hours, flight delay hours, consequences for emergency preparedness and passenger experience have all been considered. To present proposed models for availability requirements, the current situation is used together with some general principles. The principles discussed are the principle of equality (the idea that the likelihood of a passenger experiencing delay should be fairly equal regardless of airport size) and the optimization principle (the idea that the total sum of delay hours across all Avinor airports should be minimized). |
11:15 | An Easily Understood GOF Test for Life Data with Censored Observations ABSTRACT. Numerous measures exist to assess goodness of fit (GOF) for life data. These assessments range from graphical inspection, such as a probability plot, to analytical tests such as the Kolmogorov-Smirnov test. Both the way many analysts visually scrutinize a probability plot and the actual calculation used in some analytical tests are more akin to tests for outliers. With life data, a Monte Carlo simulation or parametric bootstrap test can be conducted to assess goodness of fit for the assumed distribution. Using standard methodology, it is possible that simulated failure times occur prior to the actual observed times. This makes such simulated data unrealistic when the modeled data include current or active fleet times. The method employed here corrects this potential error. A synopsis of typical goodness of fit tests will be shown against sample data, along with the shortcomings of existing common tests. An improved assessment will be demonstrated that determines goodness of fit relative to an observable test statistic, whether the failure data are complete or include censored observations. A p-value can be generated from this assessment, or the user can prescribe simple common-sense rules for comparing the data to the assumed distribution. Finally, once the distribution is deemed a good fit to the data, the variability in the model can be shown with a plot of an exact confidence interval for the number of failures that could have occurred to date, as well as confidence bands on the probability plot. These confidence bands are based on exact realizations of data that cannot produce more total time on units than has been observed to date. This is extremely important when modeling an appropriate life distribution for active fleet data that will continue in operation. |
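The correction for censored, active fleet times is the paper's contribution and is not reproduced here. For orientation only, the sketch below shows the standard parametric-bootstrap GOF test that the paper improves upon, applied to a Weibull fit of complete (uncensored) failure times; the distribution choice and the use of the Kolmogorov-Smirnov statistic are assumptions.

```python
import numpy as np
from scipy import stats

def bootstrap_gof_weibull(times, n_boot=500, seed=0):
    """Parametric-bootstrap GOF p-value for a Weibull fit (complete data only)."""
    rng = np.random.default_rng(seed)
    times = np.asarray(times, dtype=float)
    # Maximum-likelihood Weibull fit with the location fixed at zero.
    c, loc, scale = stats.weibull_min.fit(times, floc=0)
    d_obs = stats.kstest(times, stats.weibull_min(c, loc, scale).cdf).statistic
    d_boot = np.empty(n_boot)
    for i in range(n_boot):
        sim = stats.weibull_min(c, loc, scale).rvs(size=times.size, random_state=rng)
        c_b, loc_b, scale_b = stats.weibull_min.fit(sim, floc=0)
        d_boot[i] = stats.kstest(sim, stats.weibull_min(c_b, loc_b, scale_b).cdf).statistic
    # p-value: fraction of refitted simulated samples at least as discrepant as the observed one.
    return float(np.mean(d_boot >= d_obs))

# Dummy failure times (hours); a true Weibull sample should give a p-value well above 0.05.
sample = stats.weibull_min(1.8, scale=500).rvs(size=40, random_state=np.random.default_rng(7))
print(bootstrap_gof_weibull(sample))
```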
11:30 | From rigid orderliness to barely controlled chaos: sociotechnical risk and AI in aviation PRESENTER: Jan Hayes ABSTRACT. Given the potentially catastrophic consequences of errors, faults and poor decisions in aviation, to date artificial intelligence (AI) applications are permitted only in non-safety related activities and tasks, and machine learning is banned during in flight operations. By restricting the possible adverse consequences of using AI technologies, this approach also severely restricts the possible benefits and so there are broad plans from regulatory bodies to allow further integration of AI into the sector. Based on interviews with aviation sector safety and AI experts, this study aims to understand the strengths and vulnerabilities in current aviation safety processes and how processes and practices may need to be adapted to address safety in AI. Drawing on Macrae’s SOTEC (Structural, Organizational, Technological, Epistemic, and Cultural) framework for sociotechnical risk in autonomous and intelligent systems, we develop a preliminary set of risks posed by use of AI in aviation across these five domains. One of the significant challenges is the fact that different parts of the sector have different safety management approaches and so may be impacted by AI in different ways. Safety in aircraft manufacturing and flight operations is certification and compliance based. When it comes to safety in air traffic management, with multiple actors making judgment-based time pressured decisions, one interviewee described the environment as ‘barely controlled chaos’. Uncertainty is high and risk-based processes prevail. This paper unpacks these issues and looks at the implications for identification and evaluation of novel risks linked to new AI applications. |
11:45 | Development of high-performance composite materials for aerospace applications PRESENTER: Mohammad Almajali ABSTRACT. Introduction: This study explores the development and performance evaluation of high-performance composite materials, particularly carbon fiber-reinforced polymers (CFRPs), for aerospace applications. CFRP materials enhance structural integrity, fatigue performance, and damage tolerance compared with conventional materials for aerospace engineering applications. Methods: A structured methodology was followed, involving various tests to select materials, refine fabrication techniques, and evaluate performance in developing high-performance composite materials for aerospace applications. Results: CFRPs achieved a fatigue life of approximately 10^7 cycles, outperforming GFRPs, which failed at 10^5 cycles. Impact tests revealed that CFRPs had better impact resistance, averaging 250 J/m compared with the GFRPs' 100 J/m. Structural integrity tests showed no severe damage after loading for many cycles, demonstrating the strength of the CFRP prototypes. CFRPs also provided a weight reduction of 30% and a superior strength-to-weight ratio of 1.5, compared with titanium (10% weight reduction, ratio 1.2) and aluminum (0%, ratio 1.0). Conclusion: CFRPs are advanced aerospace materials, and their ability to provide substantial weight reduction and high strength-to-weight ratios makes them ideal for enhancing aerospace efficiency and reliability. |
Participants: Jérôme Lacaille (SAFRAN), Robert Meissner (DLR), Gabriel Michau (STADLER) and Vincent Chérière (AIRBUS)
10:45 | Developing Resilience-Oriented Indicators for Integrated Process Safety and Process Security Risk Management PRESENTER: Muhammad Shah Bin Ab Rahim ABSTRACT. This study develops resilience-oriented performance indicators (PIs) that integrate process safety and process security risk management in the chemical process industry. Drawing on the resilience engineering principle, we propose a unified framework addressing both safety and security concerns through four system capabilities contributing to its resilience: Anticipation, Absorption, Adaptation, and Ascension. These capabilities provide a systematic structure for categorizing the PIs, which are further classified into Management, Process, and Result indicators. Insights from nine expert interviews spanning academia, consultancy, government, and industry helped refine the indicators and assess their practical relevance. The experts discussed eight hypothetical disruption scenarios, ranging from technical failures to supply chain disruptions and terrorism, offering valuable perspectives on implementation challenges and opportunities. The findings emphasize the importance of aligning safety and security measures while tackling systemic barriers such as resource constraints and procedural resistance. This research contributes a novel framework for integrated risk management, supported by actionable PIs that bridge theoretical resilience concepts with practical application. It also lays the groundwork for further validation in real-world settings and broader adoption across diverse industrial contexts. |
11:00 | Exploring Experiences with the Regulatory Toxicology System – Promoters and Inhibitors of New Approach Methods PRESENTER: Angela Bearth ABSTRACT. The transition from animal-based toxicological assessments to New Approach Methods (NAMs) marks a paradigm shift in regulatory toxicology, holding the promise of improvements of the understanding of biological mechanisms and higher human relevance. NAMs comprise new approaches to generating data to support risk assessment in toxicology without the use of conventional and arguably unethical animal testing. Nevertheless, the effective use of NAMs in regulatory toxicology has proven challenging. The interdisciplinary CHANGE project (Collaboration to Harmonise the Assessment of Next Generation Evidence) takes a system-based perspective to these challenges, focusing on the views of the global regulatory toxicology system as expressed by the people working in it. This talk will present the applied qualitative methodology and the outcomes of the initial ‘Explore Phase’ with data collection in a three-day interactive workshop in Oslo and several follow-up online workshops. In a series of collaborative sessions, the international workshop participants were invited to share anecdotes about the regulatory toxicology system, and every session was recorded and transcribed. The resulting 48 transcripts were coded to systematically analyse the observations about the regulatory toxicology system. Taking a unique approach, we applied system thinking to the data to reveal a network of promoting and inhibiting effects embedded in the global regulatory toxicology system and abstracted causal loops, which will be utilised for the ‘Reflect Phase.’ CHANGE is a unique project, built upon the kind of interdisciplinary collaboration that risk and safety research relies on. |
10:45 | Towards robust deep reinforcement learning agent for the path following of autonomous ships amid perception sensor noise PRESENTER: Paul Lee ABSTRACT. In the era of maritime autonomous surface ships (MASSs), intelligent agents are projected to make safety-critical decisions without human intervention. Considering the various disturbances associated with the maritime environment, enhancing their robustness during safety-critical operations is pivotal, including those related to path following. The aim of this study is to propose a methodology that enhances the robustness of path following for a MASS amid perception sensor noise by controlling the state space parameter of a deep reinforcement learning agent. The agent is trained to follow a predefined path at various noise levels between a minimum and maximum value, and a robustness metric based on the cross-track error is defined. The case study considers a container vessel that uses light detection and ranging for the situation awareness of its surrounding environment. Simulation results suggest that when the state space parameter related to the value of the noise level is controlled, the robustness is enhanced up to 5,668% from its maximum trained value by not violating the cross-track error threshold. When the state space parameter is not controlled, an enhancement of up to 112% is noted, highlighting the effectiveness of the proposed methodology. This study contributes towards the development of agents capable of making robust decisions during safety-critical operations under uncertainty. |
11:00 | Comparative Neuroergonomic Analysis of Mental Workload in Industrial Human-Robot Interaction Assembly Task PRESENTER: Carlo Caiazzo ABSTRACT. The fifth industrial revolution, or Industry 5.0 (IR5.0), is on the verge of setting humans at the center of production systems, designing innovative industrial human-robot interactive (HRI) workplaces where psychophysiological measures can be deployed to evaluate the human's mental workload while interacting with the robot. The goal of this research study is to present a comparative evaluation of three laboratory experimental conditions: in the first, the participant assembles a component without the intervention of the robot (Standard Scenario); in the second, the participant performs the same activity in collaboration with the robot (Collaborative Scenario); in the third, the participant performs a fully guided task in collaboration with the robot (Collaborative Guided Scenario) through poka-yoke or lean principles. The purpose of this analysis is to demonstrate the different responses of participants in terms of mental workload, efficiency, and productivity in the three settings. Furthermore, the research used observational measurements to calculate the productivity index in terms of accurately assembled components across the three scenarios. EEG sensors are placed on the participant to collect quantitative data for the comparative analysis and to assess the operator's mental workload during the task in the three different scenarios. The quantitative and objective EEG analysis of mental effort is backed up by observational measurements of correctly assembled components to correlate mental workload with production rate. Following these measurements, a qualitative analysis using questionnaires is used to assess the user experience while working with the robot in the different scenarios. Statistical analysis shows that mental workload is significantly decreased in the activity with the robot. |
11:15 | A fast resilience evaluation method for multistate networks based on state space reconstruction PRESENTER: Tao Liu ABSTRACT. Resilience evaluation is the basis for analyzing and enhancing a multistate network's ability to withstand and bounce back from disruptive events. Because the resilience evaluation of multistate networks is an NP-hard problem, it is important to develop efficient methods for resilience evaluation of multistate networks with less computational effort. This research focuses on improving the efficiency and accuracy of resilience evaluation methods for multistate networks based on state space reconstruction. First, to cope with different situations during the resilience process, a multistate network resilience evaluation framework is proposed. Second, an improved state space decomposition (SSD) method and a deeper SSD method are proposed to obtain the sets of state space of a multistate network with the required accuracy. Third, we develop a state space reconstruction (SSR) method to update the sets of state space during the resilience process. Efficiency investigations are conducted to demonstrate the performance and efficiency of the proposed method. An application is provided to illustrate how the proposed method can be used to improve the resilience of a real-world transportation network. |
11:30 | A large language models-integrated method for risk analysis of battery energy storage systems PRESENTER: Huixing Meng ABSTRACT. Battery energy storage systems (BESS) are increasingly widely used in industrial and civil fields. With the rise of related accidents, the safety of BESS is becoming increasingly prominent. To improve the safety level of BESS, risk analysis is beneficial for undertaking risk prevention and control measures. Nevertheless, risk analysis usually relies heavily on subjectivity-prone expert knowledge. In this work, we propose a risk analysis method integrating large language models (LLM), the functional resonance analysis method (FRAM), and Bayesian networks (BN). The method is validated on practical BESS accidents. The results provide valuable references for the safe operation of BESS. |
13:00 | Measuring the impacts of human and organizational factors on human errors in the Dutch construction industry using structured expert judgement PRESENTER: Xin Ren ABSTRACT. It is acknowledged by numerous studies that the leading cause of structural failures is unintended human error. Under the new system approach towards human error, error is perceived as a symptom of improper system design, organization, or other troublesome issues embedded inside the system, rather than the cause of accidents and failures. These system contexts and the underlying factors, which can shape the performance of people at task and potentially lead to the occurrence of human errors and system failures, are the Human and Organizational Factors (HOFs). Yet little is known regarding how much influence HOFs have on human errors in the construction sector. Therefore, this study measured the impacts of the critical HOFs on human error occurrence in structural design and construction tasks within the context of the Dutch construction industry. The primary research question addressed in this study concerns the extent of HOFs' contribution to human error occurrence. To answer this question, the Classical Model for Structured Expert Judgement (SEJ) is employed, enabling experts to provide their judgments on task Human Error Probability (HEP) influenced by different HOFs, which are subsequently aggregated mathematically. SEJ is chosen as a suitable approach due to the limited availability of applicable data in the construction sector. As a result, the impacts of HOFs are quantified as multipliers, representing the ratio between the evaluated task HEP and its baseline value. These multipliers are then compared with corresponding multipliers from existing Human Reliability Analysis methods and studies. The findings reveal that fitness-for-duty, organizational characteristics and fragmentation exhibit the most pronounced negative effects, whereas complexity, attitude, and fitness-for-duty demonstrate the most significant positive impacts on task performance. These results offer valuable insights that can be applied to enhance structural safety assurance practices. |
13:15 | Results from a WHO Expert Elicitation Study for Attribution of Global Burden of Disease to Foodborne Transmission and Specific Foods PRESENTER: Tina Nane ABSTRACT. Foodborne disease poses a major world safety concern, with 1 in 10 people getting ill from food contaminated with microbial or chemical agents. The proportions of the burden of food-related disease attributable to transmission routes and to specific foods are crucial for formulating policy to improve public health safety. Unfortunately, data on outbreaks are not sufficient to support such decisions and surveillance or monitoring data are missing in most countries. Consequently, the World Health Organization (WHO) requested an expert judgment study for the global attribution of burden of disease to foodborne transmission and to specific foods. The chosen expert judgment method is the Classical Model for Structured Expert Judgment. The WHO expert judgment study has been a major endeavor. Experts were identified and selected using a complex procedure coordinated by the WHO Foodborne Disease Burden Epidemiology Reference Group and its Source Attribution Task Force. More than 800 experts were identified initially and more than 150 of these took part in the study. Training materials consisting of recorded (interactive) videos, training questions and practice exercises were developed for this study in Qualtrics. The online, one-to-one elicitations were carried out by trained elicitors. The experts provided judgments on source attribution for 41 hazards over 17 clusters of countries. To obtain specific hazard/regional insights required tailoring the questions of interest, which focus on six main transmission pathways and 14 specific food groups. Together with these questions of interest, related calibration questions were also customized with respect to hazard expertise and to regional experience. Ultimately, for each broad expertise class and region, thirteen calibration questions were identified. Expert uncertainty assessments were aggregated using the Classical Model, and the resulting estimates were normalized with respect to the main pathways and food groups. Finally, experts and elicitors provided feedback on the elicitation process. |
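One ingredient of the Classical Model used above is the calibration score, which measures how often realizations of the calibration questions fall inside each expert's elicited quantile intervals. The hedged sketch below computes only that score, with expected interval probabilities (0.05, 0.45, 0.45, 0.05) for 5%, 50% and 95% quantiles and the usual chi-square p-value of the 2N*KL statistic; the information score and the final performance-based weights are omitted, and the numbers are dummy data, not WHO study data.

```python
import numpy as np
from scipy import stats

EXPECTED = np.array([0.05, 0.45, 0.45, 0.05])  # mass below q05, q05-q50, q50-q95, above q95

def calibration_score(quantiles, realizations):
    """Cooke-style calibration score: p-value of the 2*N*KL statistic (3 degrees of freedom).

    quantiles: (N, 3) array with each calibration question's elicited 5%, 50%, 95% quantiles.
    realizations: (N,) array of true values for the calibration questions.
    """
    q = np.asarray(quantiles, dtype=float)
    r = np.asarray(realizations, dtype=float)
    bins = np.array([np.searchsorted(q[i], r[i]) for i in range(len(r))])  # bin index 0..3
    observed = np.bincount(bins, minlength=4) / len(r)
    mask = observed > 0
    kl = np.sum(observed[mask] * np.log(observed[mask] / EXPECTED[mask]))
    return float(1 - stats.chi2.cdf(2 * len(r) * kl, df=3))

# Dummy example: one expert, 13 calibration questions, medians systematically biased high,
# so every realization lands in the same interval and the calibration score is near zero.
rng = np.random.default_rng(1)
truths = rng.uniform(10, 100, size=13)
expert_q = np.column_stack([truths * 0.7, truths * 1.05, truths * 1.6])
print(round(calibration_score(expert_q, truths), 3))
```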
13:30 | Reliability Assessment of Fairleads in FPSO Units Based on Expert Elicitation PRESENTER: Gilberto Francisco Martha de Souza ABSTRACT. Fairleads are critical components of the mooring systems in Floating Production Storage and Offloading (FPSO) vessels, designed to guide anchor lines and withstand significant operational stresses. The reliability of these components is essential to ensure the safety and integrity of offshore operations, particularly given the harsh environments and extended service lives of FPSO units. However, the lack of historical failure data for fairleads poses significant challenges to reliability assessments, as their robust designs and high safety factors result in minimally recorded failures. Traditional inspection intervals, dictated by classification society rules, may not fully account for the specific operational conditions and reliability requirements of fairleads. Additionally, the application of rigorous theoretical models is often impractical due to the absence of sufficient data and the complexity of these offshore systems. In this context, expert elicitation emerges as a practical solution to address these limitations and to support reliability assessments and inspection planning. This paper presents a structured methodology for eliciting expert judgments, including the selection of experts, questionnaire design, and the integration of elicited data into reliability models. The proposed approach is demonstrated through its application to fairleads in a Brazilian FPSO fleet, showcasing its effectiveness in deriving probabilistic reliability estimates and determining optimal inspection intervals. The results highlight the potential of expert elicitation to enhance maintenance strategies, ensuring the long-term operational soundness of critical offshore equipment. |
13:45 | Wisdom or Madness: Expert Data on Wisdom of Crowds ABSTRACT. This talk reviews some recent popular literature on the wisdom/madness of crowds. The expert judgment database assembled by TU Delft and GW is then used to compare popular notions with a large set of expert forecasts for which realizations are known. There are several surprises. (1) Expert absolute percentage errors are very fat tailed; sample averages do not converge. The mean absolute percentage error (MAPE) gets larger as we average the forecasts of larger sets of experts. (2) Because of this, the advantage of averaging forecasts is lost – the average tends to the max. (3) Expert dependence is higher regarding the placement of medians than the 90% confidence bands. (4) For point forecasts, 40% of the variance is explained by variables and 8% by experts; for confidence bands, 17% is explained by variables and 54% by experts. (5) Larger pools of experts are beneficial for mean statistical accuracy but detrimental for MAPE. (6) The value of vetting and weighting experts is underscored. |
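The mechanism behind points (1), (2) and (5) can be illustrated with a toy simulation (not the TU Delft/GW data): when the forecast-to-truth ratio has a Pareto tail with index below one, the percentage error has no finite mean, the pooled sum is dominated by its largest term, and the sample MAPE of the equal-weight average grows with pool size. All distributional choices below are assumptions made purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
truth = 1.0
alpha = 0.8          # Pareto tail index < 1: percentage errors have no finite mean (assumption)
n_trials = 50_000

for pool in (1, 5, 20, 100):
    # Each trial: equal-weight average of `pool` expert point forecasts.
    forecasts = truth * (1.0 + rng.pareto(alpha, size=(n_trials, pool)))
    pooled = forecasts.mean(axis=1)
    ape = np.abs(pooled - truth) / truth * 100
    max_share = np.median(forecasts.max(axis=1) / forecasts.sum(axis=1))
    print(f"pool {pool:4d}: median APE {np.median(ape):9.0f}%   "
          f"sample MAPE {ape.mean():14.0f}%   "
          f"median share of the max in the pooled sum {max_share:.2f}")
```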
14:00 | Underwater Inspection Planning Based on Reliability and Decision-Making Techniques: An FPSO Platform Case Study PRESENTER: Renan Favarão da Silva ABSTRACT. Ensuring the integrity and safety of Floating Production Storage and Offloading (FPSO) platforms is crucial for the oil and gas industry, particularly in offshore environments. A critical component of this is the underwater inspection process, commonly referred to as Underwater Inspection in Lieu of Drydocking (UWILD). Despite its importance, planning such inspections is not trivial and requires a systematic approach that balances risk, reliability, and resource optimization. In this context, this paper proposes a method for planning underwater inspections based on reliability and decision-making techniques. It comprises three main processes: identification of what needs to be inspected, determination of when to inspect, and selection of which inspection method to apply. Each process integrates specific techniques to support the application of the proposed method. First, to prioritize inspection items, potential failure modes are identified, and their effects and criticality are assessed. For determining recommended inspection intervals, life data analysis and degradation analysis are applied to derive reliability functions and data-supported decisions. Finally, for the selection of inspection methods, a Multicriteria Decision-Making (MCDM) approach is used to prioritize inspection techniques based on the specific requirements of each maintenance scope. The proposed method is demonstrated through a case study based on the operational context of a Brazilian FPSO platform. The results show that the proposed method can support maintenance planning, as it provides structured guidance to systematically define and review the scope of underwater inspections, contributing to ensuring reliability and integrity. Accordingly, this study is expected to contribute to the integration of reliability and decision-making techniques in the field of physical asset management research and the oil and gas industry. |
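As an illustration of how a reliability function derived from life data analysis can translate into a recommended inspection interval, the sketch below uses an assumed Weibull model and an assumed reliability target; neither the parameters nor the target come from the paper.

```python
import numpy as np

# Hypothetical Weibull reliability model for one inspection item,
# e.g. obtained from life data analysis (all parameters are assumptions).
beta, eta = 2.4, 15.0          # shape, scale in years

def reliability(t):
    return np.exp(-(t / eta) ** beta)

# Recommended interval: the latest time by which reliability still exceeds a target.
r_target = 0.95
interval = eta * (-np.log(r_target)) ** (1.0 / beta)
print(f"R({interval:.1f} y) = {reliability(interval):.3f} "
      f"-> inspect at least every {interval:.1f} years")
```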
13:00 | A relational approach to characterising householder risk perceptions of disruption in heat transitions PRESENTER: Nick Pidgeon ABSTRACT. Domestic heat decarbonisation threatens substantial disruptions to households in temperate countries currently reliant upon natural gas for heating. However, the concept of disruption carries diverse meanings, potentially relating to cost, material space and everyday heating practices. Here, using interpretive risk theory, we elucidate how Boholm and Corvellec's concept of relational risk allows an understanding of how the risk of disruption is experienced and takes on meaning in everyday life. We use this framework to examine perceptions of four low-carbon heating technologies: heat pumps, hydrogen, hybrid heating and heat networks—alongside associated upgrades to distribution network infrastructure. Data from six one-day qualitative workshops (total n=49 participants), representing a diversity of geographic and housing contexts across the United Kingdom, show how existing relationships shape hopes, fears and expectations for heat decarbonisation. Our findings help clarify the role of affective relationships, feelings of precarity, security and pressure in distinguishing material inconveniences from more fundamental disruptions to valued ways of life. While conducted in the UK, the results are also relevant in other gas-dependent European countries and regions. |
13:15 | Towards a multidimensional measure of privacy as a value PRESENTER: Nicole Huijts ABSTRACT. Privacy is important in all aspects of our daily lives, whether we are using the internet, meeting with friends, working, or seeking solitude. Privacy has often been argued to consist of multiple dimensions; however, no study has thoroughly investigated which dimensions can be found in how people value privacy. The main aim of this research, consisting of an exploratory and a main study conducted with UK participants, was to develop a multidimensional measurement of privacy as a value and to examine its fit within the most comprehensive value scale available based on Schwartz' Value Theory, consisting of 19 values. In the exploratory study, after developing 26 items that aimed to measure the various aspects of privacy, we found that privacy as a value may consist of four factors: online privacy, observational privacy, solitude and interactional privacy. In the main study, we measured each of the four factors with three items each and found that the latter two factors are so closely related that they should be collapsed into one factor. Furthermore, observational and interactional privacy were more strongly correlated with each other than each of them with online privacy. We further found that the factor online privacy correlated positively with the value of self-direction thought, while both observational and interactional privacy positively correlated with self-direction action, power-dominance, and face. Additionally, interactional privacy correlated with power-resources. The developed measurement needs further validation and can be used as an extension of the 19-value scale to understand value-based decision-making in privacy-relevant contexts. |
13:30 | Risk perception of unwanted incidents: comparison of how lay people and experts evaluate societal risk PRESENTER: Morten Sommer ABSTRACT. The gap between lay people's and experts' risk assessment has been extensively studied. Lay people's perception of risk reflects their real concerns, and instead of focusing on the challenges of including risk perception in risk management, studies should focus on how risk perception can contribute to improving risk policies [1]. During the last decades, however, lay people's level of education and general knowledge have increased [2]. The growth in social media and the acceleration of traditional media are relevant factors in this regard, as information becomes more easily available to lay people. Nowadays, expert knowledge is only a keystroke away and the newsfeed constantly informs about risks, crises, and disasters. In this paper, we explore the connection between lay people's risk perception and experts' risk assessment, to see if the existing theory on risk perception still stands. To study this, we asked a sample of 562 citizens how worried they are about unwanted incidents identified through comprehensive risk and vulnerability analyses from various municipalities in Norway. We further map the extent to which participants have received information about these risks, and whether information is received via the media or the municipality. The aim is, first, to assess the concordance in perceived risk between citizens and the municipality and, second, to see whether citizens' risk perception is affected by the source of relevant risk information. Finally, based on an interview with risk managers in the municipality, we investigate how they communicate risk to the citizens, and what they focus on in this regard. [1] Aven, T., & Renn, O. (2010). Risk management and governance: Concepts, guidelines and applications. Springer. [2] Teichler, U. (2020). Higher education in economically advanced countries: Changes within recent decades. Higher Education Governance and Policy, 1(1), 1-17. |
13:45 | Mind the knowledge gap: hormonal birth control and risk perception among young Norwegian women PRESENTER: Laura Andrea Johnsen Ødegården ABSTRACT. Background: The study is motivated by a noticeable decline in the sales of hormonal contraception coupled with an increase in the abortion rate and sales of emergency contraception among young Norwegian women. This trend coincides with a growing negative framing surrounding the use of hormonal contraception on social media platforms and in social circles. This study aims to explore young Norwegian women's risk perception associated with hormonal contraception, the information sources they use and trust, and how these factors influence their contraceptive choices. Methods: Our theoretical perspective is anchored in risk and societal security, and the collected data comprise qualitative interviews, a survey, and a document analysis. Results: Our findings suggest that young Norwegian women have a high perceived risk linked to the use of hormonal contraception, particularly with regard to common side effects such as depression/low mood and hormonal imbalances. Medical research and general practitioners do not adequately address young women's need for information on these side effects. With limited input from healthcare providers, young women turn to friends and social media for guidance - often without openly admitting to it. In informal settings, advice to stop using hormonal contraception is common. Conclusions: This study indicates a communication gap between healthcare providers, mainly general practitioners, and young Norwegian women regarding hormonal contraception. Closing the information gap is vital to enhance informed decision-making and bodily autonomy. |
14:00 | Risk perception of air pollution and policy acceptance: an exploratory study on a representative Italian sample PRESENTER: Maria Stocco ABSTRACT. Air pollution is one of the most significant health risks in Europe, responsible for over 300,000 deaths annually and ranking as the second leading cause of death on the continent (HEI, 2024; EEA, 2023), but there is low awareness of the problem among the population (Oltra & Sala, 2015). While policymakers are implementing various initiatives to curb emissions and improve air quality, these measures are often perceived by the public as either too costly or ineffective. In some cases, a lack of awareness about pollution levels and contributing factors further complicates the issue (Pignocchino et al., 2023; Maione et al., 2021; Oltra & Sala, 2016). In this pre-registered and correlational study, we investigate the variables associated with the acceptance of policies to reduce air pollution, focusing on subjective air quality assessment, risk perception, self-efficacy, and perceived costs of the policies with a representative Italian sample (N=1008). Results showed that subjective air quality assessment was positively associated with policy acceptance and that this association was mediated by the perception of the risk of air pollution. Moreover, our findings revealed the importance of individual differences, such as self-efficacy and political orientation. Specifically, we found an interaction between perceived costs and self-efficacy with policy acceptance as the dependent variable. We also found an interaction between subjective assessment of air pollution and political orientation on risk perception. To the best of our knowledge, there is scarce research on air pollution risk perception and policy acceptance in Europe; therefore, our results offer preliminary yet significant insights to design more effective and accepted air quality policies. Raising people's awareness of the issue and the risks they are facing is fundamental to convincing them to take appropriate measures against air pollution and to accept effortful policies to mitigate it (Noel et al., 2021). |
13:00 | Improvement of mountain natural risks analysis: assessment of reach, seasonal exposure and presence probabilities PRESENTER: Jean-Marc Tacnet ABSTRACT. Mountain natural phenomena such as torrential floods, snow avalanches and rockfalls threaten people and infrastructures. To reduce risk, local authorities and infrastructure managers search for the best strategies and actions at local and also territorial scales. Risk-informed decision making always starts with risk analysis, which remains the first essential step before considering and selecting those risk reduction measures. Natural risks are assessed using a classical combination of hazard, exposure and vulnerability (equivalent to the combination of severity and probability in industrial technological contexts) that are assumed to be well known and easy to determine. In practice, in the context of mountain natural risks, characterizing the exposure is not that easy. For a given magnitude, a phenomenon can have several possible trajectories, each of them corresponding to a sub-scenario with a given conditional probability. Mountain phenomena may occur all year round or, on the contrary, only at certain specific periods (e.g. summer, winter), with either mutual exclusions or cases of simultaneous occurrences. Because of the seasonal nature of tourism, human occupation in mountain areas is highly variable, with peaks in occupancy rates for permanent and temporary accommodation and road traffic. This paper addresses the issue of practical and operational assessment of asset exposure, and especially of reach and presence probabilities for different phenomenon sub-scenarios in a multi-risk hazard context. It proposes simplified and practical methodologies combining probabilistic approaches and dependability analysis to 1) formalize possible sub-scenarios, 2) assess the conditional probabilities of these sub-scenarios and 3) calculate the reach probabilities of their spatial extent as well as seasonal presence and occurrence probabilities. Examples are given for a first single phenomenon (torrential flood) before showing how the method can be extended to deal with multiple phenomena (for instance torrential flooding and rockfall events) and the influence of seasonal occurrence and presence hypotheses on calculated risks. |
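A minimal sketch of the kind of combination described above, with entirely hypothetical numbers: annual expected consequence for one asset obtained by summing, over phenomenon sub-scenarios, the event probability, the conditional trajectory probability, the reach probability, a seasonal presence probability and a vulnerability factor.

```python
# All names and numbers are hypothetical; this is only an illustration of the
# probability combination, not the paper's methodology.
sub_scenarios = [
    # annual event probability, conditional trajectory probability,
    # probability the asset is reached, season of occurrence
    {"name": "torrential flood, left bank",  "p_event": 0.02, "p_traj": 0.6, "p_reach": 0.8, "season": "summer"},
    {"name": "torrential flood, right bank", "p_event": 0.02, "p_traj": 0.4, "p_reach": 0.3, "season": "summer"},
    {"name": "rockfall, corridor A",         "p_event": 0.05, "p_traj": 1.0, "p_reach": 0.2, "season": "all_year"},
]
# Seasonal presence probability of people at the asset (tourism-driven occupancy).
p_presence = {"summer": 0.7, "winter": 0.9, "all_year": 0.5}
vulnerability = 0.4   # expected fraction of the consequence given reach and presence

risk = sum(s["p_event"] * s["p_traj"] * s["p_reach"] * p_presence[s["season"]] * vulnerability
           for s in sub_scenarios)
print(f"annual expected consequence (relative units): {risk:.4f}")
```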
13:15 | Building Heatwave Resilience in Rotterdam: Insights from a Stress-Test Model PRESENTER: Arka Bhattacharyya ABSTRACT. In this presentation, we would like to share a stress-test model which focuses on building heatwave resilience in Rotterdam. Building resilience requires understanding vulnerabilities, and to do that we plan on simulating an extreme heatwave scenario to test whether the city can cope with it. The existing National Heatwave Plan, which serves as a warning system to inform stakeholders about an impending heatwave, might not be adequate to deal with extreme heatwave scenarios like those of 2003 or 2006 in the absence of a robust heat action plan. Hence, we are interested in finding out how an extreme heatwave of a magnitude similar to the ones in 2003 or 2006 would unfold in present and future scenarios in Rotterdam, to understand what a robust heat action plan should entail. We want to identify the vulnerabilities in the existing heat plans during such an extreme situation. More importantly, we want to simulate the extreme condition in future scenarios, while considering deep uncertainties like the extent of the heatwave, its duration, climate change, population growth, economic growth, anthropogenic activities, etc. We have adopted the Decision Making under Deep Uncertainty (DMDU) framework, which goes beyond finding optimal solutions and instead focuses on finding solutions that are robust against deeply uncertain future conditions, to stress-test the existing heat action plan. The DMDU framework facilitates transdisciplinary knowledge production, which lies at the intersection of co-creation of knowledge, interdisciplinary or multidisciplinary research, and participatory design. The stress-test model will subsequently be used in developing a pathway for heatwave resilience in Rotterdam. |
13:30 | Comparative Analysis of STPA and FRAM: The Effect of Process Delays for Enhanced Safety and Resilience PRESENTER: Tobias Demmer ABSTRACT. Understanding process delays can be important for implementing effective long-term resilience-enhancing measures, and looking into those delays also helps in understanding the impact of unintended side-effects arising from short-term safety measures. To do that, this work compares the methods System Theoretic Process Analysis (STPA) and Functional Resonance Analysis Method (FRAM) with regard to the implementation and representation of process delays. STPA is a well-known example of the combination of control engineering and safety science. It models system failures and successes involving complex dynamic processes. FRAM tackles the same problem from the opposite direction: it was created by the resilience engineering community to analyse system processes. We apply the two methods to the same infrastructure model from the literature and focus especially on how the methods handle delays within processes. |
13:45 | A Modelling and Computational Framework for the Assessment of the Supply Resilience of Gas Transmission Pipeline Networks PRESENTER: Masoud Naseri ABSTRACT. Natural gas transmission pipeline (NGTP) networks are infrastructures whose operation is critical in view of the growing global energy demand. Unexpected natural gas supply interruptions have occurred in the last decade, highlighting the critical role of the reliability and resilience of NGTP networks. Graph theory, complex network analysis, thermal-hydraulic and transient/steady-state gas flow models have been applied to assess the capacity and flow of gas pipeline networks. NGTP networks are typically considered as multi-component systems subject to single failure modes with a rate often assumed constant. The flow analyses are performed for the whole network without considering the priority of gas-receiving terminals and the associated different penalty schemes for gas not supplied. The present work proposes a framework for modelling and analysing the capacity and flow of NGTP networks subject to multiple failure causes and considering repairs. Graph theory and Markov chains are used for modelling, and Monte Carlo Simulation is used for quantification. A numerical example is used to illustrate the overall modelling and computational framework. The results of the application of the modelling and computational framework proposed here can inform operational management strategies for reducing the risk of service disruption and improving NGTP resilience. |
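The sketch below illustrates the general idea of combining a graph model, two-state Markov components and Monte Carlo sampling to estimate supply metrics. The four-node network, all rates and capacities, and the use of a simple max-flow computation are assumptions for illustration only and do not reproduce the paper's multi-failure-mode, thermal-hydraulic or penalty-scheme modelling.

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(7)

# Hypothetical 4-node NGTP network: each edge is a pipeline with a capacity and
# failure/repair rates (per year). A two-state Markov component has steady-state
# availability mu / (lambda + mu).
edges = {
    ("src", "A"): dict(cap=100, lam=0.5, mu=20.0),
    ("src", "B"): dict(cap=60,  lam=0.8, mu=15.0),
    ("A", "dem"): dict(cap=80,  lam=0.4, mu=25.0),
    ("B", "dem"): dict(cap=60,  lam=0.6, mu=18.0),
    ("A", "B"):   dict(cap=30,  lam=0.3, mu=30.0),
}
demand = 120.0
n_samples = 5_000
supplied = np.empty(n_samples)

for k in range(n_samples):
    g = nx.DiGraph()
    for (u, v), e in edges.items():
        avail = e["mu"] / (e["lam"] + e["mu"])       # steady-state availability
        up = rng.random() < avail                    # sample the component state
        g.add_edge(u, v, capacity=e["cap"] if up else 0.0)
    flow, _ = nx.maximum_flow(g, "src", "dem")
    supplied[k] = min(flow, demand)

print(f"expected supplied fraction of demand: {supplied.mean() / demand:.3f}")
print(f"probability of any shortfall:         {np.mean(supplied < demand):.3f}")
```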
13:00 | RUL prediction using Bayesian polynomial regression PRESENTER: Kirill Ivanov ABSTRACT. Maintaining the reliability of complex systems is crucial in today's technological landscape. Maintenance strategies have evolved from corrective and time-based maintenance to condition-based maintenance and prognostics and health management. Typical remaining useful lifetime (RUL) prediction methods require substantial historical data, posing challenges in data-limited scenarios. To address this, we propose an efficient Bayesian polynomial regression approach with informative priors that predicts RUL even with sparse data. Regression parameters are continuously updated as new data are collected, ensuring accuracy and responsiveness. We validate our algorithm on simulated power module run-to-failure degradation data. |
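A minimal sketch of the general approach described above, assuming a degree-2 polynomial degradation model with a known noise level and a conjugate Gaussian prior on the coefficients; the prior values, failure threshold and data are hypothetical, and the authors' exact model and priors are not reproduced.

```python
import numpy as np

degree, sigma = 2, 0.05
m0 = np.array([0.0, 0.01, 0.002])        # informative prior mean on [1, t, t^2]
S0 = np.diag([0.1, 0.01, 0.001]) ** 2    # prior covariance
threshold = 1.0                           # failure level of the health indicator

def phi(t):
    # Polynomial design matrix [1, t, t^2].
    return np.vander(np.atleast_1d(t), degree + 1, increasing=True)

def posterior(t_obs, y_obs):
    # Standard conjugate update for Bayesian linear regression with known noise.
    X = phi(t_obs)
    S0_inv = np.linalg.inv(S0)
    Sn = np.linalg.inv(S0_inv + X.T @ X / sigma**2)
    mn = Sn @ (S0_inv @ m0 + X.T @ y_obs / sigma**2)
    return mn, Sn

def predict_rul(mn, t_now, horizon=200.0):
    # Extrapolate the posterior-mean degradation path until it crosses the threshold.
    ts = np.linspace(t_now, t_now + horizon, 2001)
    mean_path = phi(ts) @ mn
    crossed = np.where(mean_path >= threshold)[0]
    return ts[crossed[0]] - t_now if crossed.size else np.inf

# Sparse run-to-failure style data (hypothetical power-module degradation).
t_obs = np.array([0.0, 5.0, 10.0, 15.0, 20.0])
y_obs = 0.01 * t_obs + 0.0015 * t_obs**2 + np.random.default_rng(3).normal(0, sigma, t_obs.size)

mn, Sn = posterior(t_obs, y_obs)
print(f"posterior mean coefficients: {np.round(mn, 4)}")
print(f"predicted RUL from t = {t_obs[-1]:.0f}: {predict_rul(mn, t_obs[-1]):.1f} time units")
```

The posterior can be re-computed (or updated recursively) each time a new measurement arrives, which is what makes the approach usable with sparse, sequentially collected data.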
13:15 | Decomposing a storm surge barrier to determine its probability of a failure to close PRESENTER: Leslie Mooyaart ABSTRACT. Storm surge barriers are large and important coastal defence structures which are open in normal conditions but need to close when a storm surge is expected. To protect against sea level rise, these structures need to be enhanced. Failure to close is the dominant failure mechanism at the Maeslant barrier (Rotterdam, the Netherlands) and, hence, the most relevant to improve. The probability of a failure to close of the Maeslant barrier is determined with a reliability and availability analysis (RA analysis). In practice, this RA analysis is hard to access and apply as a result of its very high and unbalanced level of detail. Literature on how to decompose a technological object for an RA analysis is scarce, often using arguments which can easily be challenged. Therefore, this study has two aims: first, to order literature and current practices into a defendable approach for arriving at a suitable decomposition of a technological object; second, to apply this method to a part of the Maeslant barrier. This case study reveals that decompositions grow rapidly with a higher level of detail, which can easily result in a loss of transparency and credibility of the analysis. Two main arguments for choosing a higher level of detail are identified: lowering the scale of performance improvements and increasing the accuracy of the result. This study stresses the importance of object decompositions for the outcome of RA analyses. Moreover, this study can assist in deriving more credible results for the probability of a failure to close of a storm surge barrier and, thus, better substantiate investments in coastal flood protection against sea level rise. |
13:30 | RAM analysis in propulsion system in submarines ABSTRACT. This paper presents a comprehensive analysis of the reliability, availability, and maintainability (RAM) of a submarine propulsion system. The study aims to enhance understanding of the critical factors influencing the performance and operational readiness of these systems. A detailed case study is conducted, employing advanced RAM methodologies to evaluate failure rates, mean down time (MDT), and reliability calculations. The analysis is supported by theoretical frameworks and empirical data, highlighting the importance of robust design and maintenance strategies. Key findings indicate that optimizing component reliability and implementing predictive maintenance can significantly improve system availability. The insights gained from this research provide valuable guidance for engineers and decision-makers in the defense sector, seeking to improve the efficiency and effectiveness of submarine propulsion systems. |
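As a back-of-the-envelope illustration of how MTBF and MDT figures feed into an availability estimate, the sketch below combines hypothetical subsystem values into a series-system steady-state availability; the figures and the series/independence assumptions are not taken from the paper.

```python
# Steady-state availability per subsystem: A = MTBF / (MTBF + MDT).
# All figures are hypothetical; independence and a pure series structure are assumed.
subsystems = {"diesel generator": (4000.0, 24.0),    # (MTBF h, MDT h)
              "main motor":       (12000.0, 72.0),
              "shaft line":       (30000.0, 120.0)}

availability = 1.0
for name, (mtbf, mdt) in subsystems.items():
    a = mtbf / (mtbf + mdt)
    availability *= a            # series system: all subsystems must be up
    print(f"{name:16s} A = {a:.4f}")
print(f"propulsion system availability (series) = {availability:.4f}")
```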
13:45 | Obsolescence vs Reliability - Availability - Maintainability: Petri nets modelling for mission-critical systems PRESENTER: Sahar Karaani ABSTRACT. Mission-critical systems like telecommunications networks and defence infrastructures are essential due to their significant role in ensuring security and safety. Maintaining their operational availability is crucial, as any downtime can have serious consequences. While inherent system availability is determined during the design phase, operational availability is influenced by reliability and maintainability, impacted by environmental factors, spare parts, and maintenance resources. Rapid obsolescence of electronic components poses a major risk to the operational availability of all systems, but most critically for mission-critical systems. This necessitates strategic management to mitigate these risks despite the need for substantial initial investments. This research paper explores the challenge of integrating various types of obsolescence into operational availability using Petri Nets (PN). A physical component within a system's structure is categorised into one of four generically defined classes, based on two criteria that directly affect operational availability: ''Make-Buy'' and ''Repairable-Consumable''. Each of these four classes is linked to a specific PN model, called a PN brick, which includes (i) an operational PN, noted lambda-mu, and (ii) a set of five PNs that represent potential obsolescence scenarios. The complete model of a mission-critical system is constructed through a top-down analysis of its structure, assigning an appropriate PN brick to each component deemed at risk of obsolescence. This approach allows obsolescence interactions with the operational dynamics of (mission-critical) systems to be modelled by synchronising all the individual PNs. The article closes with conclusions and perspectives. |
13:00 | A Hellinger distance-based stochastic model updating framework for the accreditation validation of a material thermal property under limited data PRESENTER: Adolphus Lye ABSTRACT. The paper presents a distance-based Approximate Bayesian Computation framework involving the use of the Hellinger distance to perform stochastic model updating for the subsequent accreditation validation procedure. In computing the Hellinger distance, the adaptive-binning algorithm is implemented to adaptively select an appropriate bin number to approximate the probability density of the experimental data and the model prediction. The distance function subsequently quantifies the difference in the distribution between the two statistical objects. To verify the proposed stochastic model updating framework, the approach is implemented to perform a model calibration, under polymorphic uncertainty, on a dynamic temperature model of a slab material based on experimental data. This involves the use of the Staircase Density Function to calibrate and characterise the distribution over the input variables based on limited data, thereby eliminating the element of model uncertainty. A stochastic validation of the calibrated model is then performed against a set of accreditation validation experiment data. The results showed that using the mean estimates on the inferred shape parameters of the Staircase Density Function yields a better validation performance by the resulting calibrated model, in contrast to the case where the maximum a posteriori estimate on the inferred parameters is used. |
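A minimal sketch of the distance computation at the core of the framework: the binned Hellinger distance between an experimental sample and a model-prediction sample. The Freedman-Diaconis rule on the pooled data is used here as a simple stand-in for the paper's adaptive-binning algorithm, and all data are hypothetical.

```python
import numpy as np

def hellinger_distance(x, y, bins="fd"):
    """Binned Hellinger distance between two samples (value in [0, 1])."""
    # Shared bin edges over the pooled data; 'fd' is a stand-in for the
    # adaptive-binning step described in the paper.
    edges = np.histogram_bin_edges(np.concatenate([x, y]), bins=bins)
    p, _ = np.histogram(x, bins=edges)
    q, _ = np.histogram(y, bins=edges)
    p = p / p.sum()
    q = q / q.sum()
    return np.sqrt(0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2))

# Hypothetical example: experimental data vs. one model-prediction sample.
rng = np.random.default_rng(0)
experiment = rng.normal(21.0, 0.8, 200)       # e.g. measured slab temperatures
prediction = rng.normal(21.5, 1.1, 200)       # model output for one parameter set
print(f"Hellinger distance: {hellinger_distance(experiment, prediction):.3f}")
```

In an Approximate Bayesian Computation setting, parameter samples whose predictions yield a distance below a chosen tolerance would be retained as approximate posterior samples.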
13:15 | First Excursion Probability Estimation of a Bilinear Conservative Oscillator Subject to Gaussian Loading PRESENTER: Mauricio Misraji ABSTRACT. The study of oscillators under stochastic loading is essential in fields such as mechanical, ocean, wind, and earthquake engineering. The uncertainty arising from the intrinsic nature of the loading can be quantified using the first excursion probability, which measures the likelihood that an oscillator’s response will exceed a specified threshold during stochastic excitation. However, estimating this probability is challenging due to the need to manage a large number of random variables to represent the load, as well as potential nonlinearities in the restoring force and the non-stationary nature of both the loading and the oscillator’s response. This work presents an efficient method for estimating first excursion probabilities for nonlinear oscillators subjected to Gaussian loading, particularly for systems with a bilinear, conservative restoring force. The technique focuses on exceedance probabilities within the nonlinear response range by dividing the calculation into two components: first, estimating the probability of failure in the elastic range, and then assessing the probability of failure in the inelastic range. To estimate the probability within the elastic range, an advanced simulation-based variance reduction method is employed. This method effectively addresses scenarios where the oscillator's response remains linear. This technique also generates samples of the oscillator's maximum response in the inelastic range, which are subsequently used to construct an extreme value distribution for the inelastic maximum response. By synthesizing the response data, this distribution facilitates a more accurate and efficient estimation of first excursion probabilities across both response ranges. In this context, the variance reduction sampling method is optimally utilized to explore both elastic and inelastic ranges, while the extreme value distribution leverages this information to enhance the estimation of the first excursion probability. A numerical example is provided to illustrate the application of the proposed approach. |
13:30 | Optimized Variational Mode Decomposition Algorithm Based on Pre-Processing Algorithm PRESENTER: Jun Yao ABSTRACT. Existing optimization algorithms for Variational Mode Decomposition (VMD) are all based on multiple iterations, which pose a problem due to their large computational load, making them unsuitable for real-time fault diagnosis of engineering signals. This paper proposes an optimized VMD algorithm based on a preprocessing algorithm, which avoids iterative computation and thus improves computational efficiency. Firstly, the signal is Fourier transformed, and the spectrum is segmented based on the peak frequency of the signal, resulting in the number of decomposition layers after optimization. Secondly, based on the correlation between the quadratic penalty term and the signal noise, the optimal solution under different signal noises and different penalty terms is calculated, and the signal root mean square is used to describe the noise situation of the signal, constructing a penalty term optimization function based on signal root mean square. Finally, the optimization algorithm is validated using a diesel engine cylinder head simulation signal. The results show that the optimization algorithm can significantly reduce the computational load while ensuring the decomposition accuracy of the sine signal and avoiding mode mixing, with the computation time accounting for only 0.01% of the total time, providing a feasible optimization strategy for engineering applications. |
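The spectrum-segmentation idea can be illustrated as follows (this is not the authors' exact algorithm): count the dominant spectral peaks of the signal to suggest the number of VMD modes K, and compute the signal root mean square as the input to a penalty-term rule. The test signal, prominence threshold and all numbers are assumptions.

```python
import numpy as np
from scipy.signal import find_peaks

fs = 2000.0
t = np.arange(0, 1.0, 1.0 / fs)
# Hypothetical multi-component signal with additive noise.
x = (np.sin(2 * np.pi * 50 * t) + 0.6 * np.sin(2 * np.pi * 180 * t)
     + 0.4 * np.sin(2 * np.pi * 400 * t)
     + 0.2 * np.random.default_rng(0).normal(size=t.size))

spectrum = np.abs(np.fft.rfft(x)) / x.size
freqs = np.fft.rfftfreq(x.size, 1.0 / fs)
# Dominant peaks: prominence relative to the largest spectral line (assumption).
peaks, _ = find_peaks(spectrum, prominence=0.1 * spectrum.max())
K = len(peaks)
rms = np.sqrt(np.mean(x ** 2))

print(f"detected peak frequencies: {np.round(freqs[peaks], 1)} Hz")
print(f"suggested number of VMD modes K = {K}, signal RMS = {rms:.3f}")
```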
13:45 | Polynomial Approximations to the Sliced-Normal Density PRESENTER: Luis Crespo ABSTRACT. Sliced-Normal (SN) distributions allow for the characterization of uncertain parameters having multiple modes and strong dependencies. However, the need to accurately estimate the normalization constant has limited their applicability to problems having a moderately large number of uncertain parameters. This note develops lower polynomial bounds and polynomial approximations to the SN joint density that enable computing such a constant analytically. Furthermore, we propose a polynomial class of distributions that exhibit not only the same versatility of the SNs but also the integrability benefits of the proposed approximations. This paper presents the mathematical framework supporting such developments along with an easily reproducible density estimation example. |
13:00 | Occupational safety and health in small construction enterprises PRESENTER: Stine S. Kilskar ABSTRACT. This study was initiated as a response to the need for more knowledge about occupational safety and health (OSH) in small enterprises in the Norwegian construction industry, and the purpose was to identify whether there needs to be a particular focus on these. The aim was approached by examining whether small enterprises differ from larger ones in terms of e.g. sick leave, health issues, working environment, and safety aspects; and what specific challenges small enterprises experience that should be prioritised regarding OSH. The findings are based on a combination of different data sources, including statistics on sick leave, injuries, exposures, and health outcomes, interviews with key personnel in eight small enterprises and one larger enterprise, a workshop with central actors in the industry, and a review of existing research literature. It was found that small enterprises often experience flexibility, closeness, and effective internal communication. On the downside, however, challenges include the wide range of customers and project types; inadequacy of resources and competence on OSH; differing experiences with coordination, follow-up and involvement in large projects; and the importance of a common safety culture – both in projects and in the industry. There are various reasons to prioritise smaller enterprises when it comes to OSH, i.e. their specific challenges related to injuries, exposure, and health outcomes, as well as challenging framework conditions. The competence on OSH in small enterprises should be improved, and it is important that leaders have the necessary OSH competence and that they are good role models. Clients, main contractors, and the authorities are important actors in "lifting" the smaller enterprises, e.g. through ensuring good framework conditions in projects, more similar framework conditions in the industry, as well as better involvement, inclusion and coordination. |
13:15 | Navigating Compliance Behavior: The Impact of Leadership, Job Resources, and Job Demands in Offshore Work PRESENTER: Espen Olsen ABSTRACT. The study explores how leadership, job resources, and job demands affect compliance with procedures in an Oil and Gas organization on the Norwegian Continental Shelf. The topics were examined through a semi-structured qualitative study, where eight informants from the Oil and Gas industry were interviewed. One offshore organization was chosen, and all informants were skilled workers with relevant competence and experience to answer the pre-defined questions. The job demands-resources theory was used as the theoretical basis for the study to connect leadership with compliance. Several job resources and demands were specifically focused on. A research model connecting leadership, job resources, job demands, and compliance was created based on available literature and used as the basis for interviews and subsequent results. Based on the obtained results, the following conclusions can be drawn: 1) Results generally support the validity of the research model and pre-defined assumptions; 2) Performance feedback, involvement, and workload seem to be the most influential job resources and demands on compliance; 3) Production pressure and role ambiguity are not prominent influencers of compliance; 4) Systems, continuity, availability, and work arrangement have been identified as additional important resources and demands, which should be studied further; 5) Job satisfaction seems to affect compliance, and job satisfaction can potentially mediate the influence job resources and job demands have on compliance. The results provide valuable insights into important factors to improve compliance in Norwegian Oil and Gas organizations. |
13:30 | Enhancing Safety Performance through Cultural Maturity: A Comprehensive Framework for Multinational Oil and Gas Operations PRESENTER: Christian Foussard ABSTRACT. This article presents the results of a major HSE review conducted across four subsidiaries of an oil & gas company to provide insights and recommendations to enhance the current HSE system. The analysis was carried out for subsidiaries in North America, the Middle East and Western Europe. It required about 600 documents reviewed, more than 100 interviews conducted, about 130 hours of field visits to evaluate HSE system implementation effectiveness, and more than 1500 safety climate survey participants. The methodology relies, on the one hand, on a Safety Culture Maturity framework to analyze perceptions at different levels of the organization and, on the other hand, on expert judgment following field observations and interactions with the workforce and management across all subsidiaries. The overall results highlight perceived strengths and gaps in the different elements of the current safety culture at each subsidiary, as well as specific perceived strengths and shortcomings in safety culture. A second section of the article discusses how maturity models can be a legitimate tool despite some inherent limitations underlined by part of the safety science research community, alongside a consideration of some common critiques of the safety culture construct. The article gives an illustration of how practitioners can use maturity models and how the results support the setting of improvement plans that integrate key risk themes to drive effective and sustainable enhancement. We also argue, from a safety practitioner's point of view, that using maturity models gives the opportunity to drive ownership and commitment processes that may eventually be more important than the deliverables. The paper provides elements for a better understanding of how the underpinnings of maturity models articulate with the assessment of safety culture and explores their methodological properties to show how some theoretical weaknesses can paradoxically become competitive advantages in the practical implementation of safety improvement approaches. |
13:45 | Simulation of Short-Range Field of View for Agricultural Machinery Based on Standards Requirements PRESENTER: Lorenzo Landi ABSTRACT. The increasing demand for large and tall self-propelled agricultural machinery has raised important safety concerns. This development has significantly reduced the operators' field of view, increasing the risk of serious or fatal accidents for both them and nearby workers. It has therefore become essential to address visibility and safety issues related to the use of such machines. In response to these challenges, this article proposes an innovative method to virtually analyze and verify the operator's field of view. By using a system based on ray tracing rendering, it is possible to assess the operator's visibility in accordance with both current ISO standards and upcoming ones. In addition to ensuring compliance with these standards, the system also allows for the simulation of realistic scenarios involving interactions between agricultural machinery and nearby workers, thereby evaluating the operator's visibility in real tasks. The proposed method enables the direct identification of issues during the 3D design phase, allowing for targeted interventions on components that cause masking effects. In cases where direct visibility is deemed insufficient, it also offers the possibility of virtually implementing indirect vision systems and evaluating their effectiveness in improving visibility. Furthermore, the limitations of the current standard regarding the field of view near the tractor with rectangular boundaries will be discussed. In this regard, the virtual system can serve as a useful tool in defining criteria and limits to be adopted in future standards. The use of simulation and virtual prototyping of the cabin to assure the correct field of view from the driver's position can effectively shorten the early design stage of new tractors. |
13:00 | A Practical Approach to Service-Oriented Life Cycle Cost considering Asset Health Index and Digitalization in Power Converters PRESENTER: Vicente Gonzalez-Prida ABSTRACT. The practical application of a Life Cycle Cost (LCC) analysis on power converters for energy storage systems is described in this study. The main goal is to suggest a cost-effective structure for allocating converter expenses, where factors such as the cost of electric energy and annual production must be taken into account. The approach includes a number of key phases, such as calculating CAPEX and OPEX as well as classifying and estimating total costs using the Woodward model. The study also stresses the need for updating cost values regularly, linking LCC with business decisions such as equipment depreciation and reinvestment. This contribution is significant since it includes the use of cloud computing for maintenance improvements together with the adoption of IoT technology solutions for data collecting, as well as the application of Asset Health Index (AHI) for failure rate update and monitoring. Consequently, it has been positively proven that the multi-detail decision-making asset management model covers the search for a sustainable alternative in the domains of general asset, sustainability, and economic value. The findings indicate that the application of this LCC methodology can significantly improve power converter management by utilizing digital technologies and assisting in the development of service-oriented business models that maximize efficiency. |
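A minimal sketch of a discounted LCC structure of the kind described above: CAPEX plus yearly OPEX composed of fixed maintenance, energy losses and an expected corrective-maintenance term whose failure rate is scaled by an assumed Asset Health Index trajectory. All figures and the AHI-to-failure-rate rule are hypothetical and do not reproduce the Woodward model.

```python
# Hypothetical discounted LCC of a power converter (all figures are assumptions).
capex = 120_000.0          # EUR
years = 15
rate = 0.07                # discount rate
energy_price = 0.09        # EUR/kWh
losses_kwh = 35_000.0      # annual converter losses
opex_fixed = 3_000.0       # planned maintenance per year
failure_cost = 25_000.0    # cost per major failure event

# Failure rate per year, scaled up as the Asset Health Index degrades (assumption).
ahi = [1.0 - 0.04 * y for y in range(years)]      # 1.0 = as new
lam = [0.02 / max(h, 0.05) for h in ahi]

lcc = capex
for y in range(1, years + 1):
    opex_y = opex_fixed + losses_kwh * energy_price + lam[y - 1] * failure_cost
    lcc += opex_y / (1.0 + rate) ** y             # discount each year's OPEX
print(f"discounted LCC over {years} years: {lcc:,.0f} EUR")
```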
13:15 | Innovative Approach for Asset Renewal in Electricity Transmission Networks: Navigating Aging Infrastructure and Energy Transitions PRESENTER: Yasmine Debbagh Boutarbouch ABSTRACT. Electricity transmission system operators (TSOs), such as RTE, face growing challenges in asset management. These challenges are worsened by aging infrastructure, rapid technological advancements, and increasing demand driven by the transition to a low-carbon economy. Many European networks, established after World War II, urgently require equipment renewal, while the availability of materials is limited. In this context, TSOs must balance their limited resources for upgrading aging infrastructure with the development of new connections, such as offshore wind farms. The energy transition, which leverages existing infrastructures, requires strategic planning to optimize investments in a continuously evolving electrical system. To manage high failure consequences and recurring preventive replacement costs, TSOs often use age-based preventive replacement models. These models aim to determine the optimal replacement age that minimizes the expected equivalent annual cost (EEAC). However, these models rely on simplified assumptions that overlook the complex challenges of today. Traditional replacement policies assume identical replacements, which becomes problematic when long-lived materials are no longer available. This raises crucial questions: How can one determine the optimal replacement age of an asset that must be replaced with equipment of different specifications? Can a replacement age be established for an initial asset whose lifespan follows an exponential distribution? To address these issues, this presentation proposes an innovative modeling approach for EEAC. It incorporates crucial neglected factors: replacement with different specification assets and discounting reflecting time preference. Our study leads to several conclusions, including a new optimal replacement age for initially installed assets that depends not only on their characteristics but also on those of the replacement asset. We also find a significant impact of discounting on determining this optimal age, a notable gain in terms of EEAC, which can exceed 30% with our method, and a binary conclusion in the case of an exponentially distributed initial asset. |
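For context, the sketch below shows the classic age-based preventive replacement baseline that the presentation extends: the expected cost per unit time of a replace-at-age-T policy under a Weibull lifetime, minimized over T. The extension to non-identical replacement assets and to discounting is not reproduced, and all figures are hypothetical.

```python
import numpy as np
from scipy.integrate import quad

# Classic age-replacement cost rate: C(T) = [c_p R(T) + c_f (1 - R(T))] / int_0^T R(t) dt
beta, eta = 3.0, 40.0        # Weibull lifetime of the installed asset (years), assumed
c_p, c_f = 1.0, 6.0          # preventive vs. corrective (failure) replacement cost, assumed

def R(t):                    # survival (reliability) function
    return np.exp(-(t / eta) ** beta)

def cost_rate(T):
    expected_cycle_cost = c_p * R(T) + c_f * (1.0 - R(T))
    expected_cycle_length, _ = quad(R, 0.0, T)
    return expected_cycle_cost / expected_cycle_length

ages = np.linspace(5.0, 80.0, 400)
rates = [cost_rate(T) for T in ages]
best = ages[int(np.argmin(rates))]
print(f"optimal preventive replacement age ~ {best:.1f} years "
      f"(cost rate {min(rates):.4f} per year)")
```

With an exponential lifetime the hazard rate is constant, so this baseline cost rate is monotone in T and no finite optimal preventive age exists, which is consistent with the binary conclusion mentioned in the abstract.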
13:30 | Optimizing Asset Replacement Time of Railway Infrastructure, Considering CO2 Emission PRESENTER: Ahmad Kasraei ABSTRACT. Climate change is becoming a more severe global issue that affects every aspect of our communities. One primary driver of climate change is rising greenhouse gas (GHG) emissions, mainly CO2. A key contributor to these emissions is the linear economy model. This model contributes to increased production and extraction of raw materials, generating a considerable amount of waste that does not return to the production cycle. For instance, railways are proven sustainable modes of transport, yet environmental concerns arise due to material impacts from steel, concrete, and asphalt. These materials account for 80% of the material-related climate impact. Therefore, reducing the use of these materials in constructing, operating, and maintaining railway networks is critical to minimizing adverse environmental effects. One effective strategy is extending the useful life of railway assets, which is one step toward circular economy implementation as a solution to these issues. The aim is to develop a new optimal replacement policy by integrating RAM parameters, life cycle cost (LCC), and sustainability key performance indicators (KPIs). This study proposes a multi-objective optimization approach to determine the optimal asset replacement time (ORT), balancing these three pillars to enhance sustainability in railway systems. The outcome of this study can be integrated into decision-making models that combine LCC and climate impact to optimize railway asset replacement strategies. |
13:45 | Trustworthy Anomaly Detection for Industrial Control Systems via Conformal Deep Autoencoder PRESENTER: Shuaiqi Yuan ABSTRACT. Industrial control systems (ICSs) are critical infrastructures that remain highly vulnerable to both accidental and intentional anomalies, potentially leading to dangerous scenarios. While machine learning (ML) models are increasingly used for anomaly detection in ICSs, concerns about their trustworthiness persist due to their "black-box" nature, lack of effective uncertainty treatment, and absence of prediction guarantees. A key challenge is the high rate of false alarms, which can overwhelm operators and lead to unnecessary shutdowns. To address this, we propose a novel approach integrating deep autoencoders with conformal predictions to achieve high anomaly detection performance while providing statistical guarantees on false alarm rates. Our method uses conformal prediction as a post-hoc technique to enhance uncertainty treatment in a CNN-LSTM autoencoder, yielding trustworthy anomaly detection results with guaranteed false alarm rates. Recognizing temporal distribution shifts in time-series data, we incorporate temporal quantile adjustment to dynamically adapt the anomaly detection threshold, further improving temporal false alarm rate guarantees empirically. We validate the proposed model's ability to detect both accidental and attack-induced anomalies while maintaining a controlled false alarm rate using a publicly available dataset. |
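The conformal step itself can be sketched independently of the autoencoder: given a calibration set of anomaly scores from normal operation, the threshold is the ⌈(n+1)(1−α)⌉-th smallest calibration score, which bounds the marginal false alarm rate by α under exchangeability. The scores below are synthetic stand-ins for CNN-LSTM reconstruction errors, and the temporal quantile adjustment described in the paper is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(42)
alpha = 0.01                                  # target false alarm rate

# Calibration scores: reconstruction errors under normal operation (synthetic stand-in).
calib_scores = rng.gamma(2.0, 1.0, 2000)
n = calib_scores.size
# Finite-sample split-conformal threshold: ceil((n+1)(1-alpha))-th order statistic.
k = int(np.ceil((n + 1) * (1 - alpha)))
threshold = np.sort(calib_scores)[min(k, n) - 1]

test_normal = rng.gamma(2.0, 1.0, 10_000)           # exchangeable with calibration
test_anomaly = rng.gamma(2.0, 1.0, 10_000) + 6.0    # shifted scores under an attack

print(f"threshold = {threshold:.2f}")
print(f"empirical false alarm rate: {np.mean(test_normal > threshold):.4f} (target {alpha})")
print(f"detection rate on anomalies: {np.mean(test_anomaly > threshold):.3f}")
```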
13:00 | Sustainable Asset Management: Introducing a Framework for Lifetime Extension PRESENTER: Luc Hossenlopp ABSTRACT. Asset lifetime extension is becoming a regular sustainability expectation, saving material and energy resources. General business interest in sustainability is thus creating new traction on this topic, supported by emerging experiences, technologies, definitions, standards, and regulations. The purpose of this paper is to propose a framework to strengthen the consistency of the approach. This will be illustrated with real-world circuit breaker products used in safety applications. The first step is to split the analysis of possible savings between ex-ante and ex-post situations, i.e. over the product's entire lifecycle. Ex-ante is mostly about design actions, adding life cycle analysis tools to the traditional simulators and using Product Specific Rules definitions regarding the Reference Service Life. Ex-post leverages increasingly connected products (Internet of Things), AI and the circular economy to extend the lifetime of installed products. Ex-post complements the generic what-if simulation with specific asset instances, improving the confidence level of real-time health estimation and of remaining lifetime estimation. More specifically, the impact on maintenance service policy (corrective, preventive, predictive) facilitated by the new architectures, the capability to extend spare part manufacturing, and product upgrades will be discussed. The framework then integrates this into sustainability considerations at a system level, ensuring that the overall system saves resources. The baseline depends on the product life cycle situation. For the ex-post case, changes in maintenance and the extra resources needed will also be integrated into the carbon equation. |
13:15 | Reliability and Environmental Sustainability Relationships: A Stochastic Analysis of Product Lifespan and Global Warming Potential PRESENTER: Dshamil Efinger ABSTRACT. The ecological footprint of a product is largely determined by its duration of use and the environmental impacts that arise throughout its entire life cycle. The Global Warming Potential (GWP) is an established method for quantifying the environmental impact across the entire life cycle of a product – from development, through manufacturing and use, to disposal. To assess a product's contribution to sustainability, the GWP must be considered in relation to the actual duration of use, which is expressed as GWP per unit of time (GWP/t). The duration of use of a product is determined by both its technical reliability, i.e., its lifespan, as well as external factors such as user preferences. This paper investigates the relationships and dependencies between product reliability and environmental sustainability. By analyzing stochastic effects, it is demonstrated how reliability and usage parameters influence the GWP/t. Based on this analysis, generalizable insights and design principles for the development of reliable and sustainable products are derived. |
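A stochastic toy example of the GWP-per-unit-of-use-time quantity discussed above: fixed production and end-of-life burdens, a use-phase burden proportional to the duration of use, and a duration of use taken as the minimum of a Weibull technical lifespan and a lognormal user-driven replacement time. All figures are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
gwp_production = 150.0       # kg CO2-eq, fixed burden
gwp_eol = 10.0               # kg CO2-eq, end-of-life burden
gwp_use_per_year = 20.0      # kg CO2-eq per year of use

technical_life = rng.weibull(2.5, n) * 12.0             # years, reliability-driven
user_replacement = rng.lognormal(np.log(8.0), 0.4, n)   # years, preference-driven
duration = np.minimum(technical_life, user_replacement) # actual duration of use

gwp_per_year = (gwp_production + gwp_eol + gwp_use_per_year * duration) / duration
print(f"mean duration of use: {duration.mean():.1f} years")
print(f"mean GWP per year of use: {gwp_per_year.mean():.1f} kg CO2-eq/yr "
      f"(95th percentile {np.percentile(gwp_per_year, 95):.1f})")
```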
13:30 | Empowering Aging Populations: Social Resilience and Sustainability through 'Meet Me Life' ABSTRACT. In an era where aging populations pose significant societal challenges, the "Meet Me Life" project merges fashion, design, and social interaction to address the vulnerability of older adults through creative, intergenerational collaborations. This project emphasizes the reorganization of wardrobes, creating opportunities for meaningful exchanges between different age groups while promoting social resilience and sustainability. By exploring personal stories and clothing styles, the project aims to strengthen the older population's connection with society, fostering well-being through design-thinking practices. In addition to facilitating cross-generational dialogue, this initiative incorporates the concept of a sustainable wardrobe, where garments are repurposed to reduce textile waste and extend the lifecycle of clothing. This approach aligns with the broader goals of sustainable fashion, which emphasizes reducing the environmental impact of the fashion industry while promoting mindful consumption. This study not only addresses the pressing environmental concerns associated with fast fashion but also highlights the emotional and cultural significance of clothing in older adults' lives. The "Meet Me Life" project offers practical applications of design pedagogy, encouraging younger generations to develop empathy and deeper understanding through their collaboration with older participants. By actively engaging in the wardrobe reconstruction process, both age groups experience the value of sustainability and the importance of fostering intergenerational relationships, which are critical for social cohesion. This paper discusses the theoretical framework of sustainable fashion, design pedagogy, and the practical execution of the project, highlighting its capacity to mitigate the risks associated with social isolation, ageism, and the loss of personal identity in modern urban contexts. The project introduces an innovative model for inclusive fashion-based interventions that enhance the quality of life and safety for aging populations, contributing to the discourse on resilience and sustainability in design and societal governance. |
13:45 | Sustainability-focused Generative AI Risk Mitigation Strategies PRESENTER: Lin Shi ABSTRACT. The rapid rise of generative AI (GenAI) has sparked the sustainability community to explore its potential applications, such as climate impact modeling and renewable energy optimization. However, deploying these GenAI-powered solutions in enterprise environments raises risk concerns. In particular, chatbots and similar GenAI applications face risks of misinformation and disinformation stemming from knowledge sources, user prompts, and the response generation process. While traditional probabilistic analysis methods often struggle to effectively assess risks in GenAI applications, the Risk-Reducing Design and Operations Toolkit (RDOT) provides a qualitative complement for addressing these challenges. In this study, we propose a framework that applies the RDOT methodology specifically to GenAI applications in the sustainability domain, drawing lessons learned from an internal enterprise GenAI application development. We outline mechanisms for structured risk identification, testing, evaluation, and specific risk mitigation techniques. By embedding these techniques in the development and testing process, we enhance the reliability of sustainability-focused GenAI solutions. We found that 34 (out of 111 or 31%) of the RDOT strategies have already been utilized in the internal GenAI application with 10 of them showing particular value in sustainability-focused GenAI application development. Another 17 (15%) were not utilized but are highly promising. Our finding addresses a gap in current practices, providing sustainability practitioners with a systematic way to navigate the challenges of deploying GenAI technologies in real-world settings. |
14:00 | Exploring sustainability practices in the Albanian business environment ABSTRACT. The aim of this paper is to explore sustainability in business operations, with a particular emphasis on the Albanian context. The paper provides risk-based insights into the environmental, economic, and social dimensions of sustainability and addresses how businesses can integrate sustainable practices into their operations. Key topics include circular economy activities, sustainability reporting (non-financial reports), risk management and the alignment of business practices with environmental standards. The paper is based on a mixed-methods approach, combining qualitative data gathered through group interviews and quantitative data gathered through a questionnaire. The study emphasizes the need for developing knowledge to support sustainable entrepreneurship through innovation and upskilling, with participants—ranging from environmental experts to entrepreneurs—sharing best practices and identifying areas for improvement. Economic benefits of sustainable development are analyzed, and the social aspects of sustainability are also considered, with participants discussing current practices and potential improvements within the Albanian business environment. This paper addresses sustainability challenges and explores opportunities for Albanian businesses to embrace circular economy principles and improve sustainability reporting practices. Insights from the paper will inform the design of future risk management strategies within the business sector, promoting sustainable entrepreneurship and fostering positive change in Albanian business operations. |
13:00 | Driving instructor's communication with students PRESENTER: Audun Stiansen ABSTRACT. Driving instructors have different ways of communicating with their students, but there is little research on which communication styles best facilitate higher learning (De Stefani & Gazin, 2019; Watson-Brown et al., 2020). The governing documentation for driving instruction in Norway offers no guidelines for how the teacher should communicate with the student (Glein & Lødemel, 2017). The research question for this study is: How should driving instructors communicate with their students during driving lessons to facilitate higher-order driving skills? The analysis is based on constructionist learning theory, where learning is a mental process building on the trainee's existing knowledge (Davis et al., 2017; Laurillard, 2009; Schunk, 2014). Humans are seen as active participants in the learning process, and social interaction is important for learning to happen (Laurillard, 2009; Schunk, 2014). This makes communication and discussion important pedagogical tools. The data material consists of seven in-depth interviews with driving instructors in Norway. Analytically, we divide the driving hour into three parts: before, during, and after driving. The driving instructors have a short chat with the students before driving, where they talk about the previous lesson, whether the student has done the homework, and the plan for today's lesson. How much the instructor talks during the driving varies, depending on how well they know the student and the student's preferred way of learning. Some students can get stressed if the instructor talks a lot while the student is focusing on driving, and these students learn more from guidance before and after driving. The lesson concludes with a short conversation. According to constructionist learning theory, this conversation should be longer to achieve reflection on the student's learning, which is needed to achieve higher-order driving skills. We argue that both pre-driving and post-driving communication are important to achieve higher learning. |
13:15 | In-Vehicle Infotainment System and driver distraction PRESENTER: Isabelle Roche Cerasi ABSTRACT. The objective was to understand the safety concerns related to the use of In-Vehicle Infotainment Systems (IVIS) and the future road safety policy implications. A total of 40 students studying the driving instructor program in Norway were asked to use four IVIS functions: 1. Select a radio channel, 2. Change the car interior temperature, 3. Choose music on a streaming platform, and 4. Enter an address on the navigation system. They drove a car with a double pedal set on a specific route with a safety instructor on board. The rides were recorded using eye-tracking. The results show that touchscreen use requires considerable attention and distracts drivers away from the road and the traffic situation. The NASA Raw Task Load Index and the evaluation of drivers' attention show that entering an address on the navigation system led to longer and more frequent fixations and increased the workload and the demands on coordination between brain, eyes and hand. The average total time to solve the task was 44 s; of the 33 s of fixation time (excluding saccades), attention averaged 17 s on the road and traffic and 16 s on the touchscreen. The average durations of the fixation points were 0.3 s on the road and 0.4 s on the touchscreen. The driving process is affected, and drivers fall into a "mode" that deviates from the normal driving style. Safe driving requires that drivers continuously orient themselves and distribute their attention so that they are up to date on what is happening in the traffic. The longer the fixations on the touchscreen, the more the traffic context may have changed. The risk incurred and the drivers' ability to react to risky situations are discussed, as well as the cognitive load associated with switching sequences between road and touchscreen. |
13:30 | Pedagogical observation (peer learning) as a learning activity in driving teacher education in Norway ABSTRACT. The purpose of this article-based PhD thesis was to explore the learning activity of pedagogical observation (peer learning) in driving teacher education in Norway. Driving teachers are an important part of the road traffic system and of Vision Zero thinking, and they are key people in communicating traffic safety strategies and attitudes to new drivers. The thesis illuminates and discusses three research questions: How do students perceive pedagogical observation? What do they learn from this form of activity, and which conditions are important for their learning outcomes? How do student driving teachers use pedagogical observation in their teaching practice? The findings are discussed in the light of the theory of practice architectures, and the thesis relies on a sociocultural view of learning. Data were collected through 10 individual interviews, 9 field observations and 2 focus group interviews with student driving teachers. The transcribed data were analysed using thematic analysis. Student driving teachers learn to cooperate with others, and they become reflective and critical regarding the execution of driving lessons. The learning activity can result in five shared learning outcomes: collaborating with others; engaging in critical inquiry and reflection; conveying and articulating knowledge, understanding and skills; managing learning and how to learn; and conducting self and peer assessments. Findings show that there must be a plan for the driving lesson and a working agreement. Students must be engaged, and the feedback must be constructive and grounded in subject knowledge. In addition, good cooperation and guidance skills are required. The thesis proposes a description of pedagogical observation as a mutual learning activity in which students acquire knowledge and skills through active help and support from one another, understood in light of their sayings, doings and relatings. Sustainability is emphasized through the focus on long-term learning outcomes and the continuous improvement of teaching practices. |
13:45 | Enhancing pedestrian safety at track crossing: a motion analysis study PRESENTER: Ahmed Yacine Lardjane ABSTRACT. This study introduces motion analysis as an innovative method for evaluating pedestrian safety at pedestrian track crossings (PTC), addressing the limitations of traditional assessment techniques. The research employed markerless video analysis to examine the movement patterns of 22 healthy participants (mean age: 22.8±2.8) under various conditions, including standard slopes, tactile pavements, and safety markings. This approach allowed precise measurement of “classic” crossing parameters such as walking speed, as well as new ones such as gait variability, stopping times, and postural stability, providing unprecedented insight into pedestrian behavior in response to safety features. The methodology involved a series of controlled experiments in which participants performed specific locomotor tasks in simulated PTC environments. The markerless video analysis technique enabled non-invasive, detailed tracking of body movements, offering a comprehensive view of how individuals interact with different safety measures. This approach overcomes the subjective nature of questionnaires, the limited scope of observational studies, and the experimental shortcomings of traditional invasive motion capture, providing quantifiable data on subtle behavioral changes. Results revealed significant alterations in movement patterns across the different conditions. Notably, tactile pavements reduced walking speeds and improved emergency stopping times, with participants halting 100 ms faster on textured surfaces. These results highlight the potential of specific safety features to enhance situational awareness and response capabilities, effects that may be imperceptible with conventional evaluation methods. The application of motion analysis to PTC safety represents a significant advancement in risk assessment. By offering objective, detailed data on human-environment interactions, this methodology paves the way for more effective design and implementation of safety measures. Furthermore, it demonstrates the potential of cross-disciplinary approaches in the safety sciences, combining elements of biomechanics and environmental psychology to address complex safety challenges. |
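As an illustration of the kind of quantities such motion analysis yields, the following sketch derives mean walking speed and an emergency stopping time from a markerless position track sampled at a fixed frame rate. This is an assumed reading, not the authors' pipeline; the stop-speed threshold, frame rate, and toy trial are illustrative only.

```python
# Illustrative sketch (assumptions, not the study's method): walking speed and
# stopping time from an (N, 2) track of horizontal positions in metres.
import numpy as np

def walking_speed(xy, fps):
    """Mean horizontal speed (m/s) over the whole track."""
    step = np.linalg.norm(np.diff(xy, axis=0), axis=1)  # per-frame displacement
    return step.sum() / ((len(xy) - 1) / fps)

def stopping_time(xy, fps, cue_frame, stop_speed=0.05):
    """Seconds from a stop cue until instantaneous speed falls below stop_speed."""
    speed = np.linalg.norm(np.diff(xy, axis=0), axis=1) * fps
    for i in range(cue_frame, len(speed)):
        if speed[i] < stop_speed:
            return (i - cue_frame) / fps
    return None  # participant never stopped within the trial

# Toy trial: 2 s of walking at ~1.3 m/s, then deceleration after a cue at frame 120
fps = 60
walk = np.cumsum(np.full((120, 2), [1.3 / fps, 0.0]), axis=0)
decel = walk[-1] + np.cumsum(np.outer(np.linspace(1.3, 0.0, 30) / fps, [1.0, 0.0]), axis=0)
xy = np.vstack([walk, decel])
print(round(walking_speed(xy[:120], fps), 2), "m/s;",
      stopping_time(xy, fps, cue_frame=120), "s to stop")
```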
13:00 | Outlier fault diagnosis via deep fence learning for industrial equipment PRESENTER: Shixin Jiang ABSTRACT. To address the difficulty of diagnosing outlier faults that deviate from the sample distribution, a fault diagnosis method based on deep fence learning is proposed. A novel fence loss function is applied to supervise deep models during feature learning. Fence boundaries are used to identify outlier samples during feature learning, and high penalties are imposed on these samples to increase the deep model's attention to them. By shrinking the fence boundaries, intra-class samples are encouraged to cluster as tightly as possible around the class center, avoiding misdiagnoses caused by intra-class samples deviating from the sample distribution. Compared to traditional deep fault diagnosis methods, the proposed method achieves a clear improvement in average diagnostic performance, effectively increasing the fault diagnosis accuracy of industrial equipment. |
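The abstract does not give the fence loss in closed form, so the sketch below shows one plausible reading of the idea: learnable class centers, a fence radius that flags outliers, a heavier penalty outside the fence, and a shrink step between epochs. Every design choice here (the quadratic penalties, the outlier weight, the shrink factor) is an assumption for illustration, not the authors' implementation.

```python
# Illustrative sketch only, NOT the paper's fence loss: a center-style penalty
# with a "fence" radius, a heavier penalty for samples outside it, and a
# shrinking radius to tighten intra-class clusters over training.
import torch
import torch.nn as nn

class FenceLoss(nn.Module):
    def __init__(self, num_classes, feat_dim, radius=2.0, outlier_weight=5.0):
        super().__init__()
        self.centers = nn.Parameter(torch.randn(num_classes, feat_dim))
        self.radius = radius                  # current fence radius (assumed)
        self.outlier_weight = outlier_weight  # extra penalty outside the fence

    def forward(self, features, labels):
        # Distance of each sample to its own class center
        dist = (features - self.centers[labels]).norm(dim=1)
        inside = torch.clamp(dist, max=self.radius) ** 2         # ordinary pull
        outside = torch.clamp(dist - self.radius, min=0.0) ** 2  # fence violation
        return (inside + self.outlier_weight * outside).mean()

    def shrink(self, factor=0.95):
        """Tighten the fence between epochs so intra-class samples cluster."""
        self.radius *= factor

# Usage: add the fence term to the usual classification loss
feat_dim, num_classes = 64, 5
fence = FenceLoss(num_classes, feat_dim)
features = torch.randn(32, feat_dim)
labels = torch.randint(0, num_classes, (32,))
loss = fence(features, labels)   # combine with cross-entropy in practice
loss.backward()
fence.shrink()
```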
13:15 | A Topology-Based Method for Quantifying and Evaluating the System Structural Fault Tolerance Capability PRESENTER: Yi-Yang Shangguan ABSTRACT. In modern complex systems, the capability to sustain operations despite component failures is defined as the fault tolerance capability. To ensure system reliability, it is essential to quantify and evaluate this capability systematically. However, existing research focuses primarily on failure propagation and lacks consideration of critical thresholds for fault tolerance, leading to inadequate guidance for fault-tolerant system design. To tackle these challenges, this paper proposes a topology-based method for quantifying and evaluating system structural fault tolerance. By modelling systems as directed graphs based on their functional logic, we introduce the all functional paths (AFP) metric to measure redundancy through valid input-to-output paths. The structural fault tolerance index is derived by normalizing AFP values under fault conditions against a fault-free baseline, and it is applicable to both single-input single-output (SISO) and multi-input multi-output (MIMO) systems. Additionally, we define the fault tolerance threshold based on the minimum redundancy requirement and introduce a margin to assess compliance. A case study on a redundant flywheel system (RFS) demonstrates the method’s feasibility: we calculate the structural fault tolerance index and the corresponding fault tolerance margin of the RFS in each state. The proposed framework bridges component-level faults and system-level functionality, offering actionable insights for evaluating and improving redundant systems. |
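A minimal sketch of the AFP idea as described in the abstract, under the assumption that AFP counts simple input-to-output paths in the functional graph and that the index is the ratio of the faulty path count to the fault-free baseline. The toy graph and node names are illustrative, not the paper's redundant flywheel model.

```python
# Minimal sketch (assumed reading of the abstract): AFP as a count of valid
# input-to-output paths in a directed functional graph, and a structural fault
# tolerance index as the faulty-to-baseline ratio. Example graph is illustrative.
from itertools import product

def count_paths(edges, source, target, failed=frozenset()):
    """Number of simple directed paths from source to target avoiding failed nodes."""
    if source in failed or target in failed:
        return 0
    def dfs(node, visited):
        if node == target:
            return 1
        total = 0
        for nxt in edges.get(node, ()):
            if nxt not in failed and nxt not in visited:
                total += dfs(nxt, visited | {nxt})
        return total
    return dfs(source, {source})

def afp(edges, inputs, outputs, failed=frozenset()):
    """All functional paths: sum of path counts over every input/output pair."""
    return sum(count_paths(edges, i, o, failed) for i, o in product(inputs, outputs))

# Toy redundant system: input feeds two parallel units A and B, both feed the output
edges = {"in": ["A", "B"], "A": ["out"], "B": ["out"]}
baseline = afp(edges, ["in"], ["out"])              # fault-free AFP
faulty = afp(edges, ["in"], ["out"], failed={"A"})  # AFP with unit A failed
index = faulty / baseline                           # structural fault tolerance index
print(baseline, faulty, round(index, 2))            # 2 1 0.5
```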