Process Systems Engineering Team (ProSET)

Research Areas

Process Systems Engineering (PSE) is the scientific discipline of integrating scales and components describing the behavior of a physicochemical system, via mathematical modeling, data analytics, design, optimization and control. PSE provides the ‘glue’ within scientific chemical engineering, and offers a scientific basis and computational tools towards addressing contemporary and future challenges such as in energy, environment, the ‘industry of tomorrow’ and sustainability. This discipline is concerned with methods and tools to support decision-making for the creation and operation of chemical supply chains, including the discovery, design, manufacturing, processing, and distribution of chemical products. PSE deals with decision-making, at all levels and scales, by understanding complex process systems using a holistic view.

At ProSET, our objective is to meet the grand challenges of engineering using the systems approaches, methods, and tools of PSE.

In the following, the research areas we focus on are introduced.

Multi-scale Modeling and Simulation

A process system can generally be decomposed into hierarchical levels or scales at which different physical and/or chemical phenomena take place. The first step of multiscale process modeling is to connect the molecular level with the phase level, where the main task is to model and predict the properties of fluid mixtures based on atomic- or molecular-level information. Typically, quantum chemical (QC) computation, molecular simulation, and equations of state are used to provide such predictions. Recently, owing to the ever-increasing amount of available data and the rapid development of cheminformatics and machine learning tools, data-driven descriptor models have been developed and are widely used for property prediction. With the properties of the system in hand, it is then possible to derive the constitutive relations (e.g., kinetics and phase equilibria) and implement them in the mass, energy, and momentum conservation equations of each process unit. Taking into account the connections between different units, one can finally scale the system up to the process level, where process modeling, intensification, and optimization are performed to maximize the economic and environmental performance of the process. At a higher scale, supply chain design is of great importance, in which the different elements of a supply chain (raw materials, processing, storage, and transportation) are considered and analyzed simultaneously.
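
As a minimal illustration of how a molecular-level property model feeds a constitutive relation at the phase level, the sketch below uses the Antoine equation to predict pure-component vapor pressures and combines them through Raoult's law into an ideal vapor-liquid equilibrium for a benzene/toluene mixture. The Antoine constants are common textbook values and should be verified before any real use.

```python
# Minimal sketch: a property model (Antoine vapor pressure) feeding a
# constitutive relation (Raoult's law VLE) for an ideal benzene/toluene mixture.
# Antoine constants are illustrative textbook values (P in mmHg, T in degrees C).

ANTOINE = {
    "benzene": (6.90565, 1211.033, 220.790),
    "toluene": (6.95464, 1344.800, 219.480),
}

def psat_mmHg(component: str, T_C: float) -> float:
    """Pure-component vapor pressure from the Antoine equation."""
    A, B, C = ANTOINE[component]
    return 10 ** (A - B / (T_C + C))

def bubble_point_pressure(x: dict, T_C: float):
    """Raoult's law bubble pressure and vapor composition for an ideal liquid."""
    P = sum(xi * psat_mmHg(i, T_C) for i, xi in x.items())
    y = {i: xi * psat_mmHg(i, T_C) / P for i, xi in x.items()}
    return P, y

if __name__ == "__main__":
    P, y = bubble_point_pressure({"benzene": 0.4, "toluene": 0.6}, T_C=90.0)
    print(f"Bubble pressure: {P:.1f} mmHg, vapor composition: {y}")
```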

Design and Synthesis

In a chemical process, the transformation of raw materials into desired chemical products usually cannot be achieved in a single step. Instead, the overall transformation is broken down into a number of steps that provide intermediate transformations. These are carried out through reaction, separation, mixing, heating, cooling, pressure change, particle size reduction or enlargement for solids. Once individual steps have been selected, they must be interconnected to carry out the overall transformation. Thus, the synthesis of a chemical process involves two broad activities. First, individual transformation steps are selected. Second, these individual transformations are interconnected to form a complete process that achieves the required overall transformation. A flowsheet or process flow diagram is a diagrammatic representation of the process steps with their interconnections.

Our main purpose is to develop superstructure representations at various levels of abstraction (from aggregated to detailed), to model the corresponding optimization problems, and to develop effective solution techniques and strategies for these problems. Areas of application include the synthesis of energy systems, integrated process water systems, complex distillation systems, process flowsheets, and metabolic networks.
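
As a hedged illustration of superstructure-based synthesis, the sketch below formulates a very small selection problem as a mixed-integer linear program: two candidate units with hypothetical fixed and variable costs compete to meet a product demand. It assumes the open-source PuLP package; all numbers are purely illustrative.

```python
# A minimal superstructure-selection sketch (hypothetical costs and demand):
# pick which of two candidate units to build to meet a product demand at
# minimum fixed-plus-variable cost.

import pulp

demand = 100.0                            # required product flow (illustrative units)
fixed_cost = {"A": 500.0, "B": 300.0}     # investment cost if a unit is built
var_cost = {"A": 2.0, "B": 4.0}           # operating cost per unit of product
capacity = {"A": 150.0, "B": 90.0}        # maximum throughput of each unit

prob = pulp.LpProblem("superstructure_selection", pulp.LpMinimize)
y = {u: pulp.LpVariable(f"build_{u}", cat="Binary") for u in fixed_cost}
x = {u: pulp.LpVariable(f"prod_{u}", lowBound=0) for u in fixed_cost}

# Objective: fixed charges for selected units plus variable production cost.
prob += pulp.lpSum(fixed_cost[u] * y[u] + var_cost[u] * x[u] for u in fixed_cost)

# A unit can only produce if it is built (big-M link via its capacity).
for u in fixed_cost:
    prob += x[u] <= capacity[u] * y[u]

# Meet the overall demand.
prob += pulp.lpSum(x[u] for u in fixed_cost) >= demand

prob.solve(pulp.PULP_CBC_CMD(msg=False))
for u in fixed_cost:
    print(u, "built:", int(y[u].value()), "production:", x[u].value())
```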

Optimization

Once the basic performance of the design has been evaluated, changes can be made to improve it; that is, the process is optimized. These changes might involve the synthesis of alternative structures, i.e., structural optimization: the process is simulated and evaluated again, and so on, to optimize the structure. Each structure can also be subjected to parameter optimization by changing the operating conditions within that structure. From the project definition an initial design is synthesized, which can then be simulated and evaluated. Once evaluated, the design can be improved through parameter optimization, by changing the continuous parameters of flowrate, composition, temperature, and pressure. However, this parameter optimization only optimizes the initial design configuration, which might not be the optimal configuration, so the design team might return to the synthesis stage to explore other configurations in a structural optimization. Likewise, if the parameter optimization moves the operating conditions significantly away from the original assumptions, the design team might return to the synthesis stage to consider other configurations. In the context of process design and optimization, the problems can be multi-objective (multi-criteria), and they can be implemented in either a sequential-modular or an equation-oriented manner.
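
The following sketch illustrates parameter optimization for a fixed structure: two operating conditions (temperature and pressure) are tuned to minimize a hypothetical operating-cost surrogate using SciPy. In practice the cost function would wrap a rigorous flowsheet simulation rather than the made-up expression used here.

```python
# A minimal parameter-optimization sketch: tune two operating conditions
# (temperature in K, pressure in bar, illustrative units) to minimize a
# hypothetical operating-cost surrogate for a fixed flowsheet structure.

from scipy.optimize import minimize

def operating_cost(z):
    """Hypothetical cost surrogate; in practice this would wrap a flowsheet simulation."""
    T, P = z
    energy = 0.02 * (T - 350.0) ** 2                  # utility penalty away from a sweet spot
    compression = 1.5 * P                             # compression cost grows with pressure
    conversion_loss = 400.0 / (1.0 + 0.01 * T * P)    # poor conversion at low T*P
    return energy + compression + conversion_loss

result = minimize(
    operating_cost,
    x0=[330.0, 5.0],                        # initial guess for T and P
    bounds=[(300.0, 450.0), (1.0, 20.0)],   # operating envelope
    method="L-BFGS-B",
)
print("Optimal T, P:", result.x, "cost:", result.fun)
```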

Flexibility analysis

By flexibility we mean the capability of a design to maintain feasible steady-state operation over the range of uncertain conditions that may be encountered during plant operation. Clearly, there are other aspects of plant operability, such as controllability, safety, and reliability, which are equally important. However, flexibility is the first aspect that must be considered in assessing the operability of a design.

Flexibility analysis can be used to quantify the extent to which uncertainty or changes in parameters are tolerated by a particular process design. In other words, if a process remains feasible over a predefined range of potential deviations in process parameters, it can be concluded that the process is flexible within that range. For feasible operation, all constraints (heat and material balances, capacities, and equipment specifications) must be satisfied.
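
A simple way to convey the idea (not the formal flexibility-index formulation) is a brute-force check: sample the uncertain parameter box on a grid and verify that all design constraints remain satisfied at every point. The constraints and parameter ranges below are hypothetical.

```python
# A brute-force sketch of a flexibility check: sample an uncertain parameter
# range on a grid and verify that hypothetical design constraints g(theta) <= 0
# remain satisfied at every sampled point.

import numpy as np

def constraints_satisfied(theta, design_capacity=120.0):
    """Illustrative constraints for a fixed design (made-up model)."""
    feed, temperature = theta
    duty = 0.9 * feed * (temperature - 300.0) / 50.0   # required heat duty
    g1 = duty - design_capacity                        # exchanger capacity limit
    g2 = 310.0 - temperature                           # minimum temperature spec
    return g1 <= 0.0 and g2 <= 0.0

# Uncertain parameters: feed flow in [80, 110], inlet temperature in [320, 360].
feeds = np.linspace(80.0, 110.0, 31)
temps = np.linspace(320.0, 360.0, 41)

flexible = all(constraints_satisfied((f, t)) for f in feeds for t in temps)
print("Design feasible over the whole uncertainty box:", flexible)
```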

Planning and Scheduling

Production planning and scheduling constitute a crucial part of the overall supply chain decision pyramid. Planning and scheduling activities are concerned with the allocation of scarce resources over time between competing activities, so as to meet production requirements in an efficient fashion. More specifically, the planning function aims to optimize the economic performance of the enterprise, as it matches production to demand in the best possible way. The production scheduling component is of vital importance, as it is the layer that translates the economic imperatives of the plan into a sequence of actions to be executed on the plant floor, so as to deliver the optimized economic performance predicted by the higher-level plan.

Overall, recent research is directed toward finding solutions that enable efficient and accurate handling of problems of large size and increasing complexity. However, significant work remains to be done on both model enhancements and improvements in solution algorithms if industrially relevant problems are to be tackled routinely, and if software based on them is to be used regularly by practitioners in the field. In addition, new academic developments are mostly tested on complex but relatively small- to medium-size problems. Therefore, the implementation of new planning and scheduling approaches in real-life industrial case studies constitutes a challenging task.
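
As a small illustration of the planning layer, the sketch below solves a multi-period production-planning problem with PuLP: production levels are chosen to meet per-period demands at minimum production plus inventory holding cost, subject to a capacity limit. All data are hypothetical.

```python
# A minimal multi-period production-planning sketch (hypothetical data):
# choose production levels per period to meet demand at minimum production
# plus inventory holding cost.

import pulp

periods = [1, 2, 3, 4]
demand = {1: 80, 2: 120, 3: 60, 4: 100}          # product demand per period
prod_cost = {1: 5.0, 2: 6.0, 3: 5.5, 4: 7.0}     # production cost per unit
hold_cost = 0.5                                   # holding cost per unit per period
capacity = 110                                    # production capacity per period

model = pulp.LpProblem("production_planning", pulp.LpMinimize)
make = {t: pulp.LpVariable(f"make_{t}", lowBound=0, upBound=capacity) for t in periods}
inv = {t: pulp.LpVariable(f"inv_{t}", lowBound=0) for t in periods}

model += pulp.lpSum(prod_cost[t] * make[t] + hold_cost * inv[t] for t in periods)

# Inventory balance: previous inventory + production - demand = ending inventory.
for t in periods:
    prev = inv[t - 1] if t > 1 else 0
    model += prev + make[t] - demand[t] == inv[t]

model.solve(pulp.PULP_CBC_CMD(msg=False))
for t in periods:
    print(f"period {t}: make {make[t].value():.0f}, inventory {inv[t].value():.0f}")
```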

Uncertainty analysis

The handling of uncertainty in process synthesis, optimization, planning, and scheduling must be addressed in research activities. Uncertainty in process parameters is evaluated through flexibility analysis and through stochastic programming; uncertain demands are considered in planning problems, and uncertain time durations in scheduling problems.
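
A compact way to illustrate scenario-based stochastic programming is a two-stage model: a first-stage capacity decision is made before demand is known, and second-stage production and shortfall variables provide recourse in each scenario. The sketch below, with hypothetical probabilities and costs, again assumes PuLP.

```python
# A minimal two-stage stochastic-programming sketch (hypothetical data):
# choose a first-stage capacity before demand is known, with per-scenario
# recourse production and penalized shortfall, minimizing expected cost.

import pulp

scenarios = {"low": (0.3, 70), "mid": (0.5, 100), "high": (0.2, 140)}  # probability, demand
cap_cost, prod_cost, penalty = 8.0, 2.0, 20.0

m = pulp.LpProblem("two_stage_planning", pulp.LpMinimize)
cap = pulp.LpVariable("capacity", lowBound=0)                           # first-stage decision
prod = {s: pulp.LpVariable(f"prod_{s}", lowBound=0) for s in scenarios}
short = {s: pulp.LpVariable(f"short_{s}", lowBound=0) for s in scenarios}

# Expected cost: capacity investment plus probability-weighted recourse cost.
m += cap_cost * cap + pulp.lpSum(
    p * (prod_cost * prod[s] + penalty * short[s]) for s, (p, _) in scenarios.items()
)
for s, (_, d) in scenarios.items():
    m += prod[s] <= cap                 # cannot produce beyond installed capacity
    m += prod[s] + short[s] >= d        # unmet demand is penalized

m.solve(pulp.PULP_CBC_CMD(msg=False))
print("capacity:", cap.value(), {s: prod[s].value() for s in scenarios})
```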

Machine learning

Machine learning (ML) is a subfield of computer science that evolved from the study of pattern recognition and computational learning theory in artificial intelligence; the theory of probability and statistics also plays a very important role in modern machine learning. The field explores the study and construction of algorithms that can learn from and make predictions on data. ML plays an important role in constructing experience-based models from process data, from which useful information can be extracted, new patterns in the data can be identified, predictions can be made more easily for new data samples, and decisions can be made more quickly and effectively. All these applications of machine learning have fundamentally changed the manufacturing style of the process industry.
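
As a minimal, hedged example of a data-driven model built from process data, the sketch below fits a random-forest surrogate (assuming scikit-learn) to synthetic operating data and predicts a product-quality indicator at a new operating point. The underlying "process" is a made-up function used only to generate training data.

```python
# A minimal sketch of a data-driven surrogate for process data: fit a
# regression model to synthetic operating data (temperature, pressure) and
# predict a product-quality indicator for a new operating point.

import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.uniform([300.0, 1.0], [450.0, 20.0], size=(500, 2))   # temperature (K), pressure (bar)
y = 0.9 / (1.0 + np.exp(-(X[:, 0] - 370.0) / 15.0)) * np.log1p(X[:, 1])  # made-up "conversion"
y += rng.normal(0.0, 0.02, size=len(y))                        # measurement noise

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, y_train)
print("test R^2:", round(model.score(X_test, y_test), 3))
print("prediction at T=400 K, P=10 bar:", model.predict([[400.0, 10.0]])[0])
```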

Digitalization

The rise of digitalization has had a profound impact on virtually all domains of society, as best illustrated by ubiquitous access to the internet and the various services that run on top of it. Clearly, digitalization is one of the mega-trends affecting work and life in a multitude of ways. In some cases the ramifications are obvious and clear; in others they are quite subtle. For instance, the development known as the Internet of Things (IoT) changes how devices connect and interact, but its full relevance for plant infrastructure remains to be seen owing to rather long investment cycles. The relative ease of data acquisition, storage, and access today, as well as the availability of powerful computing hardware such as GPUs, simple coding tools such as Python for deploying machine learning (ML) and artificial intelligence (AI) methods, and even packaged commercial tools, is broadening the scope of application. Similarly, digital twins live and operate on top of these new computing capabilities as well as on existing mechanistic and data-driven modeling and simulation methodologies. They promise efficiency gains through better asset utilization while enabling highly flexible operations.

Computational Fluid Dynamics (CFD)

Computational fluid dynamics (CFD) is the numerical method of solving the mass, momentum, energy, and species conservation equations and modeling related phenomena. Computer-based modeling and simulation have become an important and essential part of the research and development process in science and engineering, in both academia and industry. Computational models essentially represent virtual prototypes that can predict the behavior of physical systems, including behavior that might be impossible to observe on real-world prototypes. Visualization of such models is especially important for understanding very complex systems or systems for which the required measurements are difficult to obtain. CFD-based models thus offer relatively easy, reliable, and cost-effective options to mimic real-world problems, depending on the model assumptions. They are now included in the initial stages of the product research and development process, prior to designing physical prototypes.
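
To make the idea of numerically solving a conservation equation concrete, the sketch below discretizes one-dimensional transient heat conduction (an energy balance) with an explicit finite-difference scheme. It illustrates the discretization principle, not a full CFD solver; all parameter values are illustrative.

```python
# A minimal finite-difference sketch of solving a conservation equation:
# 1-D transient heat conduction with fixed-temperature ends and explicit
# time stepping.

import numpy as np

alpha = 1e-5                  # thermal diffusivity, m^2/s (illustrative)
L, n = 0.1, 51                # rod length (m) and number of grid nodes
dx = L / (n - 1)
dt = 0.4 * dx**2 / alpha      # time step within the explicit stability limit

T = np.full(n, 300.0)         # initial temperature field (K)
T[0], T[-1] = 400.0, 300.0    # boundary conditions

for _ in range(2000):         # march the solution in time
    T[1:-1] += alpha * dt / dx**2 * (T[2:] - 2.0 * T[1:-1] + T[:-2])

print("Temperature profile (K):", np.round(T[::10], 1))
```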