

Research Projects

3TU.BSR

3TU - Big Software on the Run

Description

Millions of lines of code - written in different languages by different people at different times, and operating on a variety of platforms - drive the systems performing key processes in our society. The resulting software needs to evolve and can no longer be controlled a priori, as a range of software problems illustrates. The 3TU.BSR research program will develop novel techniques and tools to analyze software systems in vivo - making it possible to visualize behavior, create models, check conformance, predict problems, and recommend corrective actions.

Staff involved

BOSS

Behavior Oriented Service Substitution

Description

The Service Oriented Computing (SOC) paradigm aims at building complex systems by composing them from less complex systems, called services. Such a (complex) system is a distributed application, often involving several cooperating enterprises. As a system is usually subject to change, individual services will be substituted by other services during the system's life-cycle. Substituting one service with another should not affect the correctness of the overall system. Verification of correctness is challenging, as the overall system is usually not known to any of the involved enterprises. The focus of the BOSS project is to study service substitution for a set of practically relevant correctness notions. The project is funded by NWO.

Staff involved

Core

CORE - Consistently Optimised REsilient secure global supply-chains

Description

The CORE project aims to produce cost-effective, fast, and robust solutions for the worldwide global supply-chain system. The project will implement an ecosystem in which interoperability, security, resilience, and real-time operation are optimized. The role of the AIS group in the project is to employ process mining to determine security risks and other supply-chain threats and their impact on supply-chain flows around the world. The project will demonstrate tangible benefits to the involved stakeholders (transaction, transport, regulatory, and financial operators), thus facilitating its adoption by commercial entities.

Links

Staff involved

CoSeLoG

Configurable Services for Local Governments

Description

The Software as a Service (SaaS) paradigm is particularly interesting for situations where many organizations need to support similar processes. Since the 441 municipalities in the Netherlands all provide similar services and execute similar processes, the use of SaaS technology could potentially be very beneficial for these local governments. Therefore, the aim of the CoSeLoG project is to create a cloud infrastructure for municipalities. Such a cloud would offer services for handling various types of permits, taxes, certificates, and licences.

Although municipalities are similar, their internal processes are typically different. Within the constraints of national laws and regulations, municipalities can differentiate because of differences in size, demographics, problems, and policies. Therefore, the cloud should provide configurable services such that products and processes can be customized while sharing a common infrastructure. The CoSeLoG project aims at the development and analysis of such services. For this we want to use earlier work on configurable process models done at TU/e, QUT, and UT.

One challenge is to actually describe the different variants of a particular municipal service in a single model that can be used to generate the actual configured services. Note that many different variants of a particular service may run in parallel in our cloud. Such a cloud infrastructure for municipalities enables new types of analysis as there is detailed data about the execution of different variants of a given process in different organizations.

Another challenge is to develop new process mining techniques that allow for the comparison of event logs of different variants of the same process. Such techniques should highlight differences and commonalities, and should assist municipalities in configuring services in a better manner.
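To make the idea of comparing variants concrete, here is a minimal sketch, not CoSeLoG's actual techniques: two hypothetical municipal variants of the same permit process are compared via their directly-follows relations, exposing behavior unique to each and behavior they share. All activity names are made up.

```python
from collections import Counter

def dfg(traces):
    """Count directly-follows pairs over a list of traces."""
    return Counter((a, b) for t in traces for a, b in zip(t, t[1:]))

variant_a = [["receive", "check", "inspect", "decide"]]   # municipality A
variant_b = [["receive", "decide"],                       # municipality B
             ["receive", "check", "decide"]]

dfg_a, dfg_b = dfg(variant_a), dfg(variant_b)
only_a = set(dfg_a) - set(dfg_b)   # behavior unique to A
only_b = set(dfg_b) - set(dfg_a)   # behavior unique to B
shared = set(dfg_a) & set(dfg_b)   # commonalities across municipalities
```

Real event logs also carry timestamps, resources, and data attributes, but even this stripped-down comparison shows where two configurations of one service diverge.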

The following municipalities are involved in this so-called Jacquard project: Bergeijk, Bladel, Coevorden, Eersel, Emmen, Gemert-Bakel, Hellendoorn, Reusel de Mierden, and Zwolle.

Links

Staff involved

DAIPEX

DAIPEX - Data and Algorithms for Integrated transport Planning and EXecution

Description

Transport companies often discover that what takes place in day-to-day transportation is not in line with their transport plans. This is largely because the software employed in creating transport plans fails to account for the real-world complexity of transportation and logistics: the approximations and abstractions used fall short of the true complexity of the real world. Direct consequences include violations of time windows, unnecessary delays, underutilized transport capacity, etc.

This project aims to develop algorithms and software that can handle time-dependent, stochastic planning problems employing high volumes of information. We will focus particularly on the complexities that arise when integrating planning problems and stochastic dependencies in Cross Chain Control Centers (4C), because in a 4C: i) the required real-life detail increases, ii) incidents are considerably larger, and iii) more communication is required as the pressure on response time increases.

Links

Staff involved

DeLiBiDa

Desire Lines in Big Data

Description

The goal of process mining is to extract process-related information from event logs, e.g., to automatically discover a process model by observing events recorded by some information system. Despite recent advances in process mining, important challenges remain, in particular with respect to handling large-scale event logs. DeLiBiDa aims to develop new techniques to deal with massive event data. In various settings it is impossible to store events over an extended period. Therefore, we want to develop techniques for storing large event logs efficiently, for example in databases. Furthermore, we aim to develop in-database (pre)processing techniques to support both existing and yet-to-be-developed process mining technology. Finally, we plan to develop query techniques to make event data quickly accessible for processing.
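As a minimal illustration of in-database (pre)processing, not the project's actual tooling, the following sketch stores a tiny event log in a relational database and derives the directly-follows relation with a single SQL query, so the raw events never have to be loaded into application memory. The table layout and activity names are assumptions for the example.

```python
import sqlite3

# Store an event log as (case_id, activity, timestamp) rows.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (case_id TEXT, activity TEXT, ts INTEGER)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [
        ("c1", "register", 1), ("c1", "check", 2), ("c1", "decide", 3),
        ("c2", "register", 1), ("c2", "decide", 2),
    ],
)

# LEAD() pairs each event with its successor within the same case,
# so counting (a, b) pairs yields the directly-follows graph in-database.
query = """
SELECT a, b, COUNT(*) AS freq
FROM (
    SELECT activity AS a,
           LEAD(activity) OVER (PARTITION BY case_id ORDER BY ts) AS b
    FROM events
)
WHERE b IS NOT NULL
GROUP BY a, b
"""
dfg = {(a, b): freq for a, b, freq in conn.execute(query)}
```

The same idea scales to production databases: the expensive scan over the event table happens inside the database engine, and only the small aggregated relation is shipped to the mining tool. (SQLite window functions require version 3.25 or later.)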

Links

Staff involved

DSC/e & NWO Graduate Program

Data Science Center Eindhoven

Description

Recent technological and societal changes have led to an explosion of digitally available data. Exploiting the available data to its fullest extent, in order to improve decision making, increase productivity, and deepen our understanding of scientific questions, is one of today's key challenges. Data science is an emerging area that aims to address this challenge. It is a multi-disciplinary area, in which computer science and mathematics play crucial roles. The Graduate Program on Data Science leverages the presence at the TU/e of excellent research groups in the data-science area and gives highly talented students the opportunity to be educated in and contribute to this exciting area. The positions are funded by the NWO Graduate Program.

The Graduate Program on Data Science is part of the Data Science Center Eindhoven (DSC/e), launched in December 2013. It builds on the excellence of several research groups within the department that together cover many of the core topics in data science: algorithms, visualization, data mining, process mining, statistics and probability, stochastics, operations research, and optimization. This ensures a stimulating and excellent environment for the selected students.

The projects fall at the intersection of computer science and mathematics, and are expected to open up promising connections between these fields. Together with the intended supervisors from the relevant research group(s), the students will have the opportunity to define their own research project. The overall aim is to make fundamental advances in the area of Data Science.

Links

Staff involved

EDSA

European Data Science Academy

Description

The European Data Science Academy (EDSA) will establish a virtuous learning production cycle whereby we: a) analyse the required sector specific skillsets for data analysts across the main industrial sectors in Europe; b) develop modular and adaptable data science curricula to meet these needs; and c) deliver training supported by multi-platform and multilingual learning resources based on our curricula. The curricula and learning resources will be continuously evaluated by pedagogical and data science experts during both development and deployment.

Links

Staff involved

Fluxicon

X-ray for Business Processes

Description

Fluxicon is a spin-off of the process mining research done at TU/e. Two STW Valorisation Grants (Phase 1 & 2) have been granted to set up a process mining company that will develop easy-to-use process mining software.

Links

Staff involved

Philips Flagship

Description

The Data Science Centre Eindhoven (DSC/e) is TU/e's response to the growing volume and importance of data and the need for data & process scientists (http://www.tue.nl/dsce/). The DSC/e has recently started a long-term strategic cooperation with Philips Research Eindhoven on three topics: data science, health, and lighting. As a first concrete action, 70 PhD students are being hired for these three topics using joint funding from the TU/e and Philips; 18 of them work on the data science topic. Together with researchers from the TU/e and Philips, these students form a strong research community working on scientific and industrial challenges.

The following four PhD positions will be related to the topic of process mining:

  1. Product-centric Consumer Data Analytics: Product Usage Lifecycle Analysis [part of the Data Driven Value Proposition theme]. Digital components are being added to Philips lifestyle products. The data from these products as well as from Philips touch points must be combined to optimize user experience and maintain customer satisfaction. Process mining techniques will be used to analyze the usage of products over a longer period of time.
  2. Transforming Event Data into Predictive Models [part of the Healthcare Smart Maintenance theme]. Philips has strong leadership positions in healthcare imaging and patient monitoring systems. In the healthcare domain, reducing equipment downtime and cost of ownership for hospitals is of vital importance. Smart maintenance exploits that professional equipment is connected to the internet and aims to use event and sensor data for overall cost reduction. Process mining techniques will be used to learn dynamic models that can be used for prediction and optimization.
  3. Predictive Analytics for Healthcare Workflows [part of the Optimizing Healthcare Workflows theme]. Processes play an important role in pathology and radiology. It is not just about collecting data and supporting individual activities, but also about improving the underlying end-to-end workflow processes. To improve these operational processes in terms of costs, efficiency, speed, reliability, and conformance, we can learn from the way processes are conducted in practice: one can learn from problems in the past and compare different process variants and process instances. This project aims to obtain insight into these workflows, in order to understand what goes well and what can be improved, using a process mining approach. The cross-fertilization between process mining and visualization will provide a novel angle on workflow improvements in pathology and radiology.
  4. Radiology Workflow Optimization and Orchestration [also part of the Optimizing Healthcare Workflows theme]. Radiology involves complex workflows, especially when seen in its clinical context. This project aims to obtain insight into these workflows and their visualization, in order to understand what goes well and what can be improved, using a visual analytics approach in which automated processing and interactive exploration are tightly integrated.

Optimizing patient care at reduced cost requires the orchestration of multiple clinical workflows. Getting imaging/lab tests done and returning the results to physicians in a timely manner helps diagnose and treat patients quickly, and can save lives. The rapid digitization of diagnostics in radiology and pathology calls for data-driven optimization of the workflows. Process mining will be used to learn models of the as-is situation; process technology will also be used to improve the processes.

Links

Staff involved

RISE BPM

“Propelling Business Process Management by Research and Innovation Staff Exchange”

Description

RISE_BPM is the first favourably evaluated project proposal submitted by the University of Münster in cooperation with ERCIS partners within the Horizon 2020 EU funding programme. The RISE_BPM project is aimed at networking world-leading research institutions and corporate innovators to develop new horizons for Business Process Management (BPM). The project consortium, besides the University of Münster as the coordinator, includes partners from Australia, South Korea, Brazil, Austria, Spain, the Netherlands, and Liechtenstein.

RISE_BPM was set up to ensure the sustainability and further extension of the collaboration ties established during the Networked Service Society (NSS) project. NSS (Project number: APR 10/805) is a multi-national project funded by the International Bureau of the German Federal Ministry of Education and Research (BMBF). The project was conducted from July 2010 until the end of 2014 and was aimed at establishing and strengthening long-term collaboration structures with institutions in the Asian-Pacific region in the areas of Joint Research, Joint Education and Joint Industry Projects.

RISE_BPM started on 1 May 2015 and will run for four years.

Links

Staff involved

Process Mining in Logistics

Process Mining in Logistics is a joint project of the Data Science Center Eindhoven (DSC/e) and Vanderlande Industries.

Description

Logistics processes are notoriously difficult to design, analyze, and improve. Whereas classical processes are scoped around the processing of information associated with a specific, unique case, logistics deals with physical objects that are grouped and processed together with other physical objects in one process at one or more physical locations, then distributed and later re-aggregated with other physical objects in another process at other physical locations. In essence, logistics involves numerous processes, cases, and objects that interact with each other in a multi-dimensional fashion. On the one hand, this subjects logistics processes to many external influences, which can have a negative impact on process outcomes and process performance. On the other hand, when analyzing the performance of flows across logistics networks, the multi-dimensional nature is especially prevalent, and existing data-driven process analysis techniques such as process mining, which assume a single viewpoint, cannot be applied.
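The single-viewpoint limitation can be illustrated with a toy sketch (all event and object names are hypothetical, not Vanderlande data): each event references several physical objects, e.g. a parcel and the tote carrying it, and projecting the same log onto different object types yields different processes.

```python
from collections import defaultdict

events = [  # assumed to be in temporal order
    {"act": "scan", "parcel": "p1", "tote": "t1"},
    {"act": "sort", "parcel": "p1", "tote": "t1"},
    {"act": "scan", "parcel": "p2", "tote": "t1"},
    {"act": "load", "parcel": "p1", "tote": "t2"},
]

def directly_follows(view):
    """Project the log onto one object type and count directly-follows pairs."""
    traces = defaultdict(list)
    for e in events:
        traces[e[view]].append(e["act"])
    pairs = defaultdict(int)
    for acts in traces.values():
        for a, b in zip(acts, acts[1:]):
            pairs[(a, b)] += 1
    return dict(pairs)

parcel_view = directly_follows("parcel")  # {('scan','sort'): 1, ('sort','load'): 1}
tote_view = directly_follows("tote")      # {('scan','sort'): 1, ('sort','scan'): 1}
```

Neither projection alone captures the behavior of the system: the parcel view and the tote view disagree on what follows what, which is exactly the multi-dimensionality the project sets out to handle.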

Vanderlande is the global market leader in baggage handling systems for airports and sorting systems for parcel and postal services, and a leading supplier of warehouse automation solutions. The company recognizes the emerging trend towards more data-driven business models and has made 'big data' a key topic on its technology roadmap. Therefore, under the umbrella of the Data Science Impuls program, the DSC/e and Vanderlande joined forces in a research project.

The project runs from September 2016 until August 2020.

Project Objectives

The goal of the joint research project of DSC/e and Vanderlande is to lift process mining to the multi-dimensional space of logistics, and to allow analyzing logistics processes and systems from all relevant angles and viewpoints. By having thorough and fast insight into logistics and business processes, improvements can be found, predicted, and implemented in the logistics solutions delivered by Vanderlande. We aim to achieve this lift for the entire process mining spectrum:

  • from appropriate data logging and event data extraction techniques from logistics systems,
  • and appropriate conceptual modeling of logistics processes and systems,
  • via process discovery and process replay techniques for multi-dimensional event data,
  • for online and offline deviation detection and process comparison,
  • to predictions of process outcomes and online recommendations based on event data.

Staff Involved


If you have a project item for this page, please send it to Eric Verbeek.