SIMS 2020: 61ST INTERNATIONAL CONFERENCE OF SCANDINAVIAN SIMULATION SOCIETY
PROGRAM FOR THURSDAY, SEPTEMBER 24TH

09:30-10:15 Session 10: Keynote 3: Dr. Tuula Ruokonen
09:30
Digital Twins utilization throughout the Life Cycle of Industrial Processes

ABSTRACT. In this presentation, Digital Twins for industrial processes are considered from a historical and future point of view. What actually is a Digital Twin – is there only one, or are there several for different purposes? What enables the development of Digital Twins just now? Which benefits and challenges are there in their development and implementation?

There are many definitions for Digital Twins. Most state that a Digital Twin is a virtual representation of a physical product or process, used to understand and predict the physical counterpart’s performance characteristics. Digital Twins are used throughout the product life cycle to simulate, predict, and optimize the product and production system before investing in physical prototypes and assets.

Already 30 years ago, systems and simulators were developed and utilized that would surely nowadays be called Digital Twins; concrete examples include Computer Aided Design, Process Modelling and Dynamic Simulators, Advanced Process Control, Condition Monitoring, Expert and Knowledge-Based Systems, and even Remote Expert Services.

Digital Twins are presently at the peak of their hype curve, and their enabling technologies are developing strongly and rapidly. We are facing partly evolution, partly revolution in their development. For example, increased computing power enables real-time analytics, cloud-based computing enables flexible calculation capacity, mobile technology enables mobile and remote applications, wireless sensors enable additional measurements, and Artificial Intelligence and Machine Learning tools enable advanced analytics. Equally important is the connection to the development of Internet of Things and Industrial Internet applications.

The potential to utilize Digital Twins in industrial processes and equipment is wide; ideally, they form a digital thread throughout the whole life cycle. The goal is efficient information management, utilization and updating in all phases of the life cycle: product development, production planning, sales, project implementation, operations optimization, personnel training, process operation and maintenance.

Why is the utilization of Digital Twins still so difficult or even impossible? Challenges are created by separate functional processes and IT systems, and especially by organizational silos and suboptimization of goals, in different phases of the life cycle. Open questions still exist related to common data models and standards, and to model updates. Further challenges come from data ownership and the principles of sharing data between the actors involved (equipment manufacturers, end users and service providers), related to design data and operation-time data management.

The hope is that these challenges and open questions will be solved, so that the vision of up-to-date Digital Twins, utilized over the whole life cycle, comes true and enables the performance optimization of processes and equipment in the future autonomous mills and plants.

10:30-11:50 Session 11A: Digital Twins
10:30
Adaptation framework for an industrial digital twin
PRESENTER: Antti Koistinen

ABSTRACT. Digital twins are the latest hot topic in simulation tools, with foundations in manufacturing processes and Industrie 4.0 with their discrete time-dependency. In continuous processes, digital models have long been important as design tools or as open-loop decision support tools (scenario simulation, prediction). Digital twins can be classified in terms of their utilization in different tasks and areas, e.g. Design twin, Performance twin and Product twin. A more comprehensive classification can be made in terms of real-time integration between the twin and its physical counterpart: a digital twin needs to have a closed-loop, automatic integration with the real process. Otherwise the correct term is a digital model (no integration) or a digital shadow (open-loop integration). Foreseen possibilities of digital twins in continuous processes have been discussed e.g. in [1]. For digital twins and shadows in these kinds of performance-oriented applications, it is crucial that the simulation model represents the real system continuously. Therefore, the performance of the digital model needs to be evaluated, and the model needs to be updated to cope with unseen or unmodeled changes. This continuous updating, or model adaptation, requires efficient data-analysis and optimization tools. In general, the adaptation can be based on several techniques [2]. However, both the real-time requirements and the model complexity (interconnected measurements and parameters) make the adaptation problem challenging. Several methods have been presented to match the physical process with the digital model [3-6], but with limited insight into the whole problem.

10:50
Sensor fault detection with Bayesian networks
PRESENTER: Esin Iplik

ABSTRACT. Several sensors are installed in the majority of chemical reactors and storage tanks to monitor temperature profiles for safety and for decision-making processes such as heat demand or flow rate calculations. These sensors fail occasionally and generate erroneous measurement data that need to be detected and excluded from the calculations. However, due to the high number of process variables displayed in chemical plants, this task is not trivial. In this work, a Bayesian network approach to detecting faulty temperature sensors is proposed. By comparing the sensor measurements with each other, the faulty sensor is detected. A modular approach is preferred, and networks are created for 10 K temperature intervals to increase flexibility and sensitivity. The created networks can be adjusted to the operating temperature ranges; hence, they can be used for any catalyst and over its entire life cycle. The developed method is demonstrated on an industrial-scale hydrocracker unit with 92 sensor couples installed in a series of reactors. Of the investigated sensors, 16 showed a difference greater than the 2 K threshold chosen for a fault. In addition, 13 sensors showed an increasing temperature difference that may lead to a fault. Two scenarios were created to calculate the energy loss due to a faulty measurement, and a 5.5 K offset error was found to cause a 5.79 TJ energy loss every year for a small-scale hydrocracker.
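The paper's Bayesian network itself is not reproduced here; as a loose illustration of the underlying pairwise-comparison idea, the following sketch (with hypothetical couple names and readings) simply flags sensor couples whose mean absolute temperature difference exceeds the 2 K threshold mentioned in the abstract:

```python
# Illustrative sketch only: flags sensor couples whose mean absolute
# temperature difference exceeds a fault threshold (2 K in the abstract).
# The paper uses a Bayesian network; this is a plain threshold check.

def flag_faulty_couples(readings, threshold=2.0):
    """readings: dict mapping couple id -> list of (t_a, t_b) pairs in K."""
    flagged = []
    for couple, pairs in readings.items():
        mean_diff = sum(abs(a - b) for a, b in pairs) / len(pairs)
        if mean_diff > threshold:
            flagged.append((couple, round(mean_diff, 2)))
    return flagged

readings = {
    "R1-T01": [(523.1, 523.4), (524.0, 524.2)],   # healthy couple
    "R1-T02": [(530.0, 535.5), (531.2, 536.9)],   # offset of about 5.5 K
}
print(flag_faulty_couples(readings))   # [('R1-T02', 5.6)]
```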

11:10
A lightweight, agile and modular method for producing digital twins using game engines
PRESENTER: Tauno Tepsa

ABSTRACT. In virtual reality (VR), a visual model of an item of equipment is usually produced using 2D and 3D models or laser-scanned point cloud data with images. Modelling the equipment's functionality, i.e. creating a digital twin (DT), is a demanding task if the model is expected to accurately represent all system features. The complexity of modelling can be an obstacle for small and medium-sized enterprises, discouraging the utilization of digital twins.

11:30
Testing ERP and MES with Digital Twins
PRESENTER: Juha Hirvonen

ABSTRACT. Enterprise resource planning (ERP) systems and manufacturing execution systems (MES) are becoming more and more important also for small and medium-sized enterprises (SMEs). Even though the failure rates of ERP projects seem to be exaggerated, failures and problems in the integration cost a great deal of time and money. Therefore, there is clearly a need for test environments for ERP and MES systems. This paper presents an approach to testing these information systems by connecting them to simulation models of the production, thus generating a digital twin of the production and the ERP/MES systems. As a proof of concept, two different twins are constructed with two different software packages.

10:30-11:50 Session 11B: Process Development and Scaling
10:30
Heat and Mass Transfer Model for Droplets with Internal Circulation
PRESENTER: Mathias Poulsen

ABSTRACT. In large droplets, the internal resistance to heat or mass transfer has to be accounted for. Two models, for low and high Reynolds numbers respectively, are investigated. The models are solved numerically for a range of modified Peclet numbers, and the corresponding transfer numbers, representing either the Nusselt or Sherwood numbers, are determined. The results for each model are fitted to produce an expression that can be easily evaluated for use in a CFD code. The fits have mean deviations of 0.63% and 0.035% for the low and high Reynolds number models, respectively. A proposed switching Reynolds number is used to combine the models, and the combined model is compared to temperature measurements of free-falling water droplets. The model is in good agreement with the data for the smallest droplets, whereas it deviates by as much as 40% for the larger droplets in the data set.
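The combination strategy described in the abstract can be sketched in code. The correlations and the switching Reynolds number below are placeholders, not the paper's actual fits; the sketch only shows the structure of selecting between a low-Re and a high-Re transfer correlation:

```python
# Hypothetical sketch of combining low- and high-Reynolds-number internal
# transfer models via a switching Reynolds number. The fitted expressions
# and RE_SWITCH are placeholders, not the paper's actual values.

RE_SWITCH = 50.0   # placeholder switching Reynolds number

def nu_low_re(pe):
    return 6.6 + 0.02 * pe      # placeholder fit for the low-Re model

def nu_high_re(pe):
    return 17.7 + 0.05 * pe     # placeholder fit for the high-Re model

def transfer_number(re, pe):
    """Select the internal-transfer correlation based on Reynolds number."""
    return nu_low_re(pe) if re < RE_SWITCH else nu_high_re(pe)

print(transfer_number(10, 100), transfer_number(200, 100))
```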

10:50
Improving gas distribution and evaporation in gas cooling tower at Norcem Brevik
PRESENTER: Rajan K. Thapa

ABSTRACT. During cement production at Norcem, flue gas from the clinker process is cooled from 400°C to 180-280°C before cleaning. The hot flue gases are cooled by evaporation of sprayed droplets, which relies on an even flow through the gas distributor. Today's design makes use of only one gas distribution screen, which results in insufficient distribution and thus incomplete evaporation of the cooling water. This creates spillage at the bottom of the cooling tower.

In this work, the effect of the geometry of the GCT (Gas Cooling Tower) on the flue gas distribution is investigated using CPFD (Computational Particle Fluid Dynamics) modeling and simulation. Barracuda Virtual Reactor is used to perform the simulations of a baseline model, representative of the existing cooling tower. Results from the simulations show poor distribution of flue gas and recirculation zones occurring on both sides of the distribution screen. The cause of this uneven distribution is attributed to the challenging geometry and poor screen performance. To counteract this problem, a new CPFD model is developed with a second screen and guide vanes. The screens are placed in the lower part of the GCT's diffusor, whilst the guide vanes are in the diffusor's inlet duct. The screens and guide vanes are modelled as baffle computational cells with zero thickness. For the screens, a pressure drop is applied by using a blockage factor. This represents the gas passing through the distribution screen. The implementation of a second screen eliminates the recirculation zone below the screens and improves the distribution. Gas velocity is measured at 119 evenly spaced transient data points at the level of water injection. The standard deviation of all these measurements is used as an improvement indicator.

Water is injected through 16 nozzles placed below the distribution screen. In the baseline model, water particles move with the recirculation zone, and radial velocities transport them, along with dust, to the tower wall. This problem is reduced with two screens, but some wall separation occurs above the lower screen, which causes backflow along the walls in the evaporation zone.

11:10
A new Approach for Scaling Up from a Small Cold to a Large Hot Bed for Biomass Conversion in a Bubbling Fluidized Bed Reactor

ABSTRACT. Bubbling fluidized beds are simple and attractive means of achieving efficient conversion of biomass if particle segregation and the associated effects are minimized. To improve the knowledge of fluidized bed reactor design, this paper compares the behavior of a hot bed containing a certain amount of biomass with the behavior of a cold bed having the same biomass loads and particle properties. An approach for scaling up a cold bed to a large hot bed for the same volume fraction of biomass is introduced. The proposed scheme uses the bed expansion ratio as an output from the cold bed. This approach provides an accurate means of attaining dynamic similarity in bubbling behavior between two different beds without constraining the fluid and particle properties or the bed height.

11:30
Challenges of CAD conversion to 3D visualisation environments with respect to kinematic dependencies
PRESENTER: Philipp Braun

ABSTRACT. Due to the increasing utilization of 3D development environments for industrial use cases, driven by emerging topics like virtual reality, new challenges in the conversion of computer-aided-design (CAD) models to 3D models have become visible. In particular, the conversion of the kinematic constraints turns out to be complex and requires extensive manual effort. This paper discusses these challenges with a focus on the problems in the conversion of kinematic constraints. Further, a new approach to the automated conversion of the kinematic constraints of an exemplary CAD model is proposed and validated. Thereby, this work contributes to an accelerated and simplified development process for industrial applications within 3D development environments.

12:30-13:50 Session 12A: Process Simulation
12:30
Predicting centrifugal compressor off-design operation in an operator training simulator

ABSTRACT. Pumps, compressors and turbines are vital parts of any process plant. Thus, accurate modeling of these machines is crucial for a dynamic process simulator, especially if the simulator is used for operator training. The operation of the machines is usually described by operating curves provided by the manufacturer, but these are strictly valid only for the set of process conditions they were created for. Accurately predicting the performance of the machines outside of the design conditions can be a challenging endeavor.

ProsDS is a software tool used by NAPCON to construct operator training simulators. In order to improve the performance of the compressors in these simulators, a new dynamic model for a centrifugal compressor was developed in ProsDS. The developed model is based on dimensionless operating curves defined by the head coefficient and the exit flow coefficient. A simple calculation of the outlet temperature and power consumption was also included.

The stability of the developed model was verified by dynamic ramp tests of the operating conditions. Furthermore, the accuracy of the developed model was determined by performing process data tests. The results were promising and, in most cases, an improvement over the performance of the currently implemented model, although some further development of the new model is needed to reach fully acceptable accuracy.
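The dimensionless-curve approach can be sketched generically. The following is a hypothetical illustration only (the actual ProsDS correlations are not public): it computes a head coefficient and applies the textbook similarity ("fan") laws, under which volumetric flow scales linearly with rotational speed and head with its square, to shift an operating point off-design:

```python
# Hypothetical sketch of dimensionless operating-curve scaling for a
# centrifugal compressor. The actual ProsDS model is not public; this
# only applies the standard similarity ("fan") laws: Q ~ N, H ~ N**2.

G = 9.81  # gravitational acceleration, m/s^2

def head_coefficient(head_m, tip_speed):
    """Dimensionless head, psi = g*H / u^2."""
    return G * head_m / tip_speed**2

def rescale_operating_point(q_design, h_design, n_design, n_new):
    """Predict off-design flow and head at a new rotational speed."""
    ratio = n_new / n_design
    return q_design * ratio, h_design * ratio**2

q, h = rescale_operating_point(q_design=2.0, h_design=120.0,
                               n_design=10_000.0, n_new=9_000.0)
print(round(q, 3), round(h, 3))
```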

12:50
Application of population balance equation for continuous granulation process in spherodizers and rotary drums

ABSTRACT. In this paper, a dynamic model for a granulation process is developed. A population balance is used to capture the dynamic particle size distribution in the spherodizer model and in the rotary drum granulator model. Particle growth due to layering is assumed in the spherodizer simulation model, while binary particle agglomeration is taken as the main granulation mechanism in the rotary drum simulation model. The developed models are 2-dimensional (2D) models that are discretized in terms of their internal coordinate (particle diameter) and external coordinate (axial length of the granulator). Simulations using the developed models provide valuable data on dynamic fluctuations in the outlet particle size distribution of the granulators. In addition, the simulation results provide valuable information for control studies of the granulation process. The simulation results showed that extending the model from 1D to 2D, by discretizing the external coordinate, introduces a transport delay that is important in control studies.

13:10
Solving the population balance equation for granulation processes: particle layering and agglomeration

ABSTRACT. Granulation processes are frequently used in the fertilizer industry to produce different grades of mineral fertilizers. Large recycle ratios and poor product quality control are some of the problems faced by these industries. Thus, for real-time model-based process control and optimization, it is necessary to find an appropriate numerical scheme that solves the model sufficiently accurately and fast. In this study, population balance principles were used to model particle granulation processes. Different numerical schemes were tested to find simple yet sufficiently accurate solution schemes for the population balance equation. The numerical schemes were applied to both the layering term and the agglomeration term that appear in the population balance equation. The accuracy of the numerical schemes was assessed by comparing the numerical results with analytically tractable solutions. The comparison showed that a high-resolution scheme with the Koren flux limiter function may be a good choice for discretizing the layering term, while a cell-averaging technique produces a sufficiently accurate solution for discretizing the agglomeration term.
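The Koren limiter mentioned above is a standard formula, phi(r) = max(0, min(2r, (1+2r)/3, 2)). As an illustration, the sketch below evaluates it and uses it in one common upwind-biased formulation of limited cell-face values for a 1D profile; this is not necessarily the exact scheme of the paper:

```python
# Sketch of the Koren flux limiter used in high-resolution schemes for the
# layering (growth) term of a population balance equation. The limiter is
# standard; the face-value formula is one common upwind-biased variant.

EPS = 1e-12  # guards against division by zero in smooth regions

def koren_limiter(r):
    """phi(r) = max(0, min(2r, (1 + 2r)/3, 2))."""
    return max(0.0, min(2.0 * r, (1.0 + 2.0 * r) / 3.0, 2.0))

def face_values(n):
    """Limited cell-face values n_{i+1/2} for a 1D number density profile."""
    faces = []
    for i in range(1, len(n) - 1):
        r = (n[i] - n[i - 1] + EPS) / (n[i + 1] - n[i] + EPS)
        faces.append(n[i] + 0.5 * koren_limiter(r) * (n[i + 1] - n[i]))
    return faces

print(koren_limiter(1.0))        # 1.0: second-order consistency at r = 1
print(face_values([0.0, 1.0, 2.0, 3.0]))
```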

13:30
Fertilizer Granulation: Comparison of Modeling Languages
PRESENTER: Bernt Lie

ABSTRACT. Population balances describe dynamic systems with both external and internal coordinates, leading to highly distributed models which are time consuming to solve (Wang and Cameron, 2007; Litster and Ennis, 2004; Iveson et al., 2001; Ramkrishna, 2000). Vesjolaja et al. (2018) describe a population balance model for granulation of fertilizers, including both growth by layering and growth by agglomeration. The model is relatively simple and homogeneous in the drum axial position as the external coordinate, with particle size as the internal coordinate. The two growth mechanisms require different types of discretization algorithms. In this simple implementation, the particle size is discretized into 80 different sizes. With 80 states in the model, the key output is the particle size median. The resulting model is relatively complex as a dynamic model for control, but still simple as population balance models go. In a future stage, the model will be extended with distribution in the external coordinate. It is of interest to compare different modelling languages with respect to solution efficiency.

The model is designed for control synthesis. Standard controllers include proportional (P) and proportional + integral (PI) controllers, which are often tuned based on some tuning rule. Controllers of mid-level complexity are based on linear approximations of the model, e.g., root locus methods, synthesis based on Nyquist, Nichols, or Bode diagrams, as well as linear quadratic regulators (LQR) and linear Model Predictive Control (MPC). Thus, it is also of interest to consider modelling languages with respect to how they can aid in controller synthesis, e.g., by providing linearized model approximations.

Currently, the model has been implemented in MATLAB. It is of interest to compare solution speed using different languages. Here, C++ is chosen as a benchmark language for fast solution; Fortran would perhaps provide an even faster solution, but C++ is more readily available. Julia is a recent, free computer language (Bezanson et al., 2017) with an extensive package for solving differential equations (Rackauckas and Nie, 2017). Julia uses Just-in-Time (JIT) compilation with strong typing, and thus provides a bridge between easy-to-use scripting languages and compiled languages. Julia also offers simple-to-use, free packages for Automatic Differentiation (AD) and linearization (Revels et al., 2016) and for computing with distributions (Besançon et al., 2019), such as the particle size distribution.

To this end, in this paper we compare the performance, code complexity, variety of features/libraries, and simplicity of coding in these three programming languages. We show that Julia is not only comparable with C++ in speed and with MATLAB in code simplicity, but also offers very good packages for linearization and for measures of distributions.

References

S.M. Iveson, J.D. Litster, K. Hapgood, and B.J. Ennis (2001). "Nucleation, growth and breakage phenomena in agitated wet granulation processes: a review". Powder Technology, Vol. 117, nos. 1-2, pp. 3–39.

J. Litster and B. Ennis (2004). The science and engineering of granulation processes, volume 15. Springer Science & Business Media.

D. Ramkrishna (2000). Population balances: Theory and applications to particulate systems in engineering. Academic Press.

L. Vesjolaja, B. Glemmestad, and B. Lie (2018). “Population balance modelling for fertilizer granulation process”, In (editors: Øi, Lars Erik, Komulainen, Tiina, Bye, Robin T., and Nord, Lars O.) Proceedings of the 59th Conference on Simulation and Modelling (SIMS 59), OsloMet, Oslo, Norway, September 26th – 27th, 2018, pp. 95–102. Published by Linköping University Electronic Press, ISBN: 978-91-7685-417-4, ISSN (on-line): 1650-3740, doi: http://doi.org/10.3384/ecp181531.

F.Y. Wang, and I.T. Cameron (2007). “A multi-form modelling approach to the dynamics and control of drum granulation processes”. Powder Technology, Vol. 179, nos. 1-2, pp. 2–11.

Jeff Bezanson, Alan Edelman, Stefan Karpinski, and Viral B. Shah (2017). “Julia: A Fresh Approach to Numerical Computing”, SIAM REVIEW, Vol. 59, No. 1, pp. 65–98.

Christopher Rackauckas, and Qing Nie (2017). “DifferentialEquations.jl – A Performant and Feature-Rich Ecosystem for Solving Differential Equations in Julia”. Journal of Open Research Software, Vol. 5, no. 15, DOI: https://doi.org/10.5334/jors.151.

Jarrett Revels, Miles Lubin, and Theodore Papamarkou (2016). "Forward-Mode Automatic Differentiation in Julia", arXiv:1607.07892 [cs.MS], https://arxiv.org/abs/1607.07892.

Mathieu Besançon, David Anthoff, Alex Arslan, Simon Byrne, Dahua Lin, Theodore Papamarkou, and John Pearson (2019). "Distributions.jl: Definition and Modeling of Probability Distributions in the JuliaStats Ecosystem", arXiv:1907.08611v1 [stat.CO], https://arxiv.org/abs/1907.08611v1.

12:30-13:50 Session 12B: Monitoring and Control
12:30
The effect of input distribution skewness on the output distribution for project schedule simulation

ABSTRACT. Monte Carlo simulation is increasingly used to determine a probability distribution for project cost and duration. The cumulative distribution enables the project manager to determine the probability of finishing a project within budget or on time at various stages of the project. A schedule risk simulation for a project can be updated as the project is being executed with actual values for activity duration. The probability of finishing the project by the due date can thus be captured at various stages or phases of the project and plotted on a timeline. This plot can be useful to estimate the final duration of the project. Two notable projects in which schedule simulation was used extensively are the Øresund bridge project in Denmark and Sweden and the Gotthard base tunnel in Switzerland. In the former, the bridge and tunnel were completed 5 months ahead of schedule, and in the latter, the tunnel was completed a year earlier than initially planned. Various distributions have been proposed to model the uncertainty in the duration of project activities, e.g. the BetaPert, triangular, normal and lognormal distributions. The normal distribution is symmetric and the lognormal distribution is right-skewed. The BetaPert and triangular distributions can be left-skewed, symmetric or right-skewed. If all the activity durations of a project are modelled with the normal distribution, one would expect the total project duration to be symmetric as well. If all the activities are modelled with a right-skewed distribution, one would expect the total project duration distribution to also be right-skewed. The skewness of the input distributions would therefore determine the skewness of the output distribution. Schedule simulation is mostly used to determine the 90% (P90) or 95% (P95) certainty duration of the project, and the skewness of the output distribution will determine the P90 and P95 values.
The choice of input distribution is therefore important to obtain a good approximation of the P90 and P95 values. An arbitrary project network with 14 activities in series and parallel was chosen to study the effect of the skewness of the input distributions on the skewness of the total project duration and on the P90 and P95 values of the duration. Duration values were selected for a triangular distribution in such a way that there are multiple paths through the network. The parameters of the triangular distributions for each activity were chosen such that any of the paths could provide the maximum total duration for a simulation run. Project managers seem to agree that project activity durations have right-skewed distributions; therefore the parameters of the triangular distributions were also chosen to provide positive skewness, i.e. right-skewed. The mean and standard deviation values of the input distributions were used to determine the parameters of sixteen other distributions. Only distributions with two parameters were considered for this study. A Monte Carlo simulation was performed using the project network and 100,000 trials. The output of the simulation was the cumulative distribution for the total project duration. The duration at increments of 5% probability was recorded, and the 90% and 95% values were compared for the sixteen input distributions. The skewness of the project duration distribution correlated very well with the mean skewness of the 14 input distributions; the correlation coefficient was 0.97. The P90 and P95 values for the project duration also correlated well with the mean skewness of the input distributions, with correlation coefficients of 0.82 and 0.91 respectively. A strong positive correlation was also found between the mean of the standard deviations of the input distributions and the skewness of the project duration distribution; the correlation coefficient was 0.96.
The differences in P90 and P95 values for the different input distributions were small for all symmetric distributions, i.e. the normal, logistic, Laplace and uniform distributions. The differences for these four distributions could be related to differences in the kurtosis of the input distributions. Larger differences in P90 and P95 values were found for the extreme value distributions, i.e. the Gumbel, Fréchet and Weibull distributions. The results of this study provide useful insight into the choice of input distributions for project schedule simulation. Distributions with negative (e.g. Weibull) or zero skewness (e.g. normal, logistic) would underestimate the P90 and P95 durations for the total project. Distributions with positive skewness would lead to larger values for P90 and P95. A conservative approach for project and risk managers could therefore be to estimate the parameters of the triangular distribution for the duration of each activity of the project network, since this is simpler than estimating the parameters of distributions like the Fréchet or Gumbel distributions. The mean and standard deviation values can then be used to determine the parameters of any other two-parameter distribution. These right-skewed distributions can then be used in the schedule simulation to obtain the cumulative distribution for the total project duration.
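The simulation procedure described above can be sketched on a toy network. The four-activity network and triangular parameters below are made up for illustration (the paper's 14-activity network is not reproduced); the sketch draws triangular durations, takes the longest of two parallel paths, and reads off P90 and P95 from the sorted trials:

```python
import random
random.seed(1)

# Hypothetical sketch of a schedule risk simulation: activity A, then B
# and C in parallel, then D, each duration drawn from a triangular
# distribution. random.triangular takes (low, high, mode).

def project_duration():
    a = random.triangular(2, 6, 3)
    b = random.triangular(4, 10, 5)    # in parallel with c
    c = random.triangular(3, 12, 4)
    d = random.triangular(1, 4, 2)
    return a + max(b, c) + d

trials = sorted(project_duration() for _ in range(100_000))
p90 = trials[int(0.90 * len(trials))]
p95 = trials[int(0.95 * len(trials))]
print(round(p90, 2), round(p95, 2))
```

Swapping the triangular draws for draws from another two-parameter distribution with matched mean and standard deviation reproduces the comparison the abstract describes.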

12:50
Development of a model to estimate parameters in a snowpack based on capacitive measurements

ABSTRACT. In the optimization of hydropower production, it is relevant to consider information about the snowpack's parameters. Today, several techniques and devices exist to measure density, height, and snow water equivalent (SWE) in a snowpack. This paper discusses the development of linear regression models, based on voltage measurements collected in a field test of a new measuring device that uses a vertical arrangement of capacitive sensors, to predict density, height, and SWE in a snowpack. The collected data were grouped into six data sets and analyzed using the multivariate analysis software Unscrambler X. From the results, three models were selected, one for each parameter. The models have good prediction performance within the collection of samples. However, the data sets used in the modelling process do not represent other sampling conditions well.

13:10
Real-time monitoring of wood cladding spray painting properties and nozzle condition using acoustic chemometrics

ABSTRACT. An experimental setup simulating real-time wood cladding coating monitoring of nozzle conditions and spray paint properties has been investigated. This approach is based on accelerometer sensors affixed to the nozzle, with appropriate signal conditioning and chemometric data analysis (PCA). The aim was to develop effective visualization of different process states using PCA score plots. The present feasibility study shows that this approach can be used as a basis for further development towards a Process Analytical Technology (PAT) spray monitoring system able to work in the harsh environment of an industrial wood cladding paint box. However, a significant amount of on-site industrial calibration and R&D is still necessary before a final method validation can be executed. The present results rely on permanently affixed PAT sensors. Further studies will, among other things, focus on the degree to which replacement of the acoustic accelerometer sensors necessitates recalibration of the multivariate data models employed, which is a critical success factor in industrial implementations.

13:30
Notes on the Floquet-Lyapunov Theory of Linear Differential Systems

ABSTRACT. This short paper describes some of the key points of the Floquet-Lyapunov theory of linear periodic differential systems. The basic transformation is presented, and its properties are discussed from the point of view of the source and target matrices and of stability. A method for designing a stabilizing state feedback control law is proposed. The methodology can be used in the analysis, control design and simulation of systems with time-periodic characteristics, for example in processes involving rotating machines and engines.
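The stability side of Floquet theory can be illustrated numerically. The 2x2 periodic system below is a made-up example (a damped oscillator with periodically modulated stiffness, not taken from the paper); the sketch builds the monodromy matrix by integrating over one period and checks that the Floquet multipliers, the eigenvalues of that matrix, lie inside the unit circle:

```python
import math, cmath

# Sketch of Floquet analysis for x'(t) = A(t) x(t) with period T. The
# monodromy matrix M maps x(0) to x(T); the system is asymptotically
# stable iff all Floquet multipliers (eigenvalues of M) satisfy |m| < 1.

T = 2 * math.pi

def A(t):
    # example: damped oscillator with periodically modulated stiffness
    k = 1.0 + 0.3 * math.cos(t)
    return [[0.0, 1.0], [-k, -0.5]]

def rk4_step(x, t, h):
    def f(t, x):
        a = A(t)
        return [a[0][0]*x[0] + a[0][1]*x[1], a[1][0]*x[0] + a[1][1]*x[1]]
    k1 = f(t, x)
    k2 = f(t + h/2, [x[i] + h/2*k1[i] for i in range(2)])
    k3 = f(t + h/2, [x[i] + h/2*k2[i] for i in range(2)])
    k4 = f(t + h, [x[i] + h*k3[i] for i in range(2)])
    return [x[i] + h/6*(k1[i] + 2*k2[i] + 2*k3[i] + k4[i]) for i in range(2)]

def propagate(x0, steps=2000):
    h, t, x = T/steps, 0.0, list(x0)
    for _ in range(steps):
        x = rk4_step(x, t, h)
        t += h
    return x

# columns of the monodromy matrix: propagate the two basis vectors
c1, c2 = propagate([1.0, 0.0]), propagate([0.0, 1.0])
M = [[c1[0], c2[0]], [c1[1], c2[1]]]
tr = M[0][0] + M[1][1]
det = M[0][0]*M[1][1] - M[0][1]*M[1][0]
disc = cmath.sqrt(tr*tr - 4*det)
multipliers = [(tr + disc)/2, (tr - disc)/2]
print([round(abs(m), 3) for m in multipliers])
```

By Liouville's formula, the product of the multipliers equals exp of the integral of the trace of A over one period, here exp(-pi), which gives a useful numerical cross-check on the integration.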

14:05-14:30 Session 13: SIMS 60+
14:05
Scandinavian Simulation Society 60+ ready for future challenges

ABSTRACT. The Scandinavian Simulation Society (SIMS) started over 60 years ago as a society for analog simulation: Scandinaviska Analogmaskinsällskabet (SAMS) was founded in 1959 in Västerås. The current name was adopted in 1968, when SIMS moved successfully to digital simulation. SIMS is currently a society of societies which operates in five countries: Denmark, Finland, Iceland, Norway and Sweden. SAMS started in Sweden, Denmark and Norway. Finland joined in the 1960s, and in 1972 the SIMS conference was organized in Finland for the first time. In 2012, the SIMS conference went to Iceland. SIMS is the oldest active simulation society in the world. After 60 annual conferences, the 61st conference is the first virtual one.

Applications have continuously been an important part of the conferences. The steel industry, flight simulators and atomic energy were active areas already in the beginning. Industry, energy and environment are important areas for SIMS. Applications in the energy field have extended from power plants to sustainable energy: solar, wind and geothermal, especially in Iceland. Process industries, including the forest, steel and chemical industries, as well as oil & gas production, have kept an important role. Increasing interest has been seen in water and wastewater treatment, biogas production and bioprocesses. The annual conferences circulate sequentially among the SIMS countries, and the topics adapt to local interests. SIMS has stimulated automation in these versatile fields, also through representative organizations like Finnish Automation, Automation Region in Sweden and Norwegian Automation.

Numerical methods and combined differential and algebraic equations have formed the basis for process models, first in Fortran, then in Matlab and Simulink. Computational intelligence, AI and various model builders, e.g. gPROMS, became active in the 90s. The full range from PCs to supercomputers is used. The development of new simulation tools has been important over the years: Apros was introduced in 1986 and Modelica in 1996. As a spinoff, we have the active Modelica Association. Graphics is becoming more advanced, open-source codes are becoming more popular, and the integration of methodologies and tools developed for different application areas extends the application areas.

SIMS provides strong support for PhD students by bringing together different methodologies, applications, software tools and people. Efficient simulation tools help in bringing different ideas into education.

International cooperation has been essential throughout the years. An agreement with the International Association for Mathematics and Computers in Simulation (IMACS) dates back to 1976, resulting in the IMACS World Congress in Oslo in 1985. The European Simulation Multiconference was organized in 1991 in Copenhagen. SIMS is an active member society of the Federation of European Simulation Societies (EUROSIM), founded in 1992. The EUROSIM congress has been organized twice by SIMS: in Helsinki in 1998 and in Oulu in 2016. SIMS EUROSIM 2021 starts a new EUROSIM conference series to be held every third year.

The active use of new simulation tools and a wide range of applications in industry, energy and environment, stimulated by automation and a young generation of researchers, have kept SIMS going for decades.

14:30-15:30 Session 14: Panel discussion: Future challenges and possibilities for simulation

Chair: Adj. prof. Jari Ruuska, Control Engineering, Environmental and Chemical Engineering, Faculty of Technology, University of Oulu, Finland

Panelists:

  • Prof. Peter Fritzson, Department of Computer and Information Science, Linköping University, Sweden
  • Prof. Sebastian Engell, Process Dynamics and Operations Group, Department of Biochemical and Chemical Engineering, TU Dortmund, Germany
  • Dr. Tuula Ruokonen, Director, Digital Services Solutions, Valmet Technologies Oy, Finland
  • Prof. Bernt Lie, SIMS President, Department of Electrical Engineering, Information Technology and Cybernetics, Faculty of Technology, Natural Sciences and Maritime Sciences, Porsgrunn, Norway
  • Senior prof. Erik Dahlquist, Past SIMS President, School of Business, Society and Engineering, Division of Automation in Energy and Environmental Engineering, Västerås, Sweden
  • Adj. prof. Esko Juuso, Conference chair, Past EUROSIM President, Control Engineering, Environmental and Chemical Engineering, Faculty of Technology, University of Oulu, Finland

Abstract

The circular economy aims to close the loop to make the economy more sustainable and competitive. We have a broad range of technologies related to recycling, renewable energy, information technology, green transportation, electric motors, green chemistry, lighting, grey water, and more. The environment is restored through pollution removal and avoidance. What can we do in practice? Air has been a focus area in industry, energy and traffic. Water treatment has been developed to remove undesirable chemicals, biological contaminants, suspended solids and gases from contaminated water. In industrial processes, closed water circulation is a goal which is beneficial for the environment. Wastewater treatment is needed to purify contaminated water before returning it to nature. Is the climate change discussion sufficient? Should we take a wide view on the ecosystem?

Thermal power plants are by far the most conventional method of generating electric power with reasonably high efficiency. Bioenergy takes an increasing portion of the production. Oil and gas hold a strong position in overall energy usage. Biofuels provide new competing alternatives. CO2 capture has taken a prominent role in research. Are we moving towards a bioeconomy? Is thermal power a necessity in our energy balance? Sustainable or renewable energy is considered a future source of energy: water power is well integrated in the energy system; solar and wind are getting more popular; geothermal, wave and tidal energy can be locally very important. Electricity is increasingly popular, both from solar and wind power. To what level is it sufficient? Where do we use energy? Industry needs highly reliable supply. Is nuclear power a solution? Adaptation is easier in domestic use, but how to do it? Heating and cooling take the largest part. Solar energy can help but needs storage. Geothermal can be used as storage. What is the potential of buildings as storage? Do we need small-scale CHP? District heating systems are good solutions for bringing thermal energy to buildings.

In industry, intelligent systems have been developed for integrating data and expertise to develop smart adaptive applications. Recently, big data, cloud computing and data analysis have been presented as a solution for all kinds of problems. Can we take this as a general solution for automation? Wireless solutions are improving fast: 3G, 4G, 5G. But can we transfer signals to clouds and store the data? Is this too much? Where is the expertise? Obviously, local calculations are needed. Are they based on intelligent systems? Transport systems are analyzed as discrete event systems to find bottlenecks and avoid risks. Urban traffic is becoming an important area. Autonomous driving is a hot topic. What is needed to embed it in urban traffic? Are there analogies with industrial systems? What are the main differences between industrial systems and transport systems? Can we use similar control solutions? What can we learn from other areas? Can we find analogies? What is common? Where do we have differences? What kind of models do we need?

Highly complicated systems with various interactions are at hand. What can researchers within the SIMS community do? Do we have the tools and methodologies to help in solving these problems?
