

Master Projects

Documents

Possible assignments

Process mining in Logistics - 3D Visualization and Scalable Process Mining on Big Event Data (2 Topics)

Vanderlande is the global market leader for value-added logistic process automation at airports and in the parcel market. The company is also a leading supplier of process automation solutions for warehouses. Some figures:

  • Vanderlande’s baggage handling systems move 3.7 billion pieces of luggage around the world per year.
  • Our systems are active in 600 airports including 13 of the world’s top 20.
  • More than 39 million parcels are sorted by its systems every day, which have been installed for the world’s leading parcel companies.
  • Many of the largest global e-commerce players and distribution firms have confidence in Vanderlande’s efficient and reliable solutions.

Vanderlande focuses on the optimization of its customers’ business processes and competitive positions. Through close cooperation, we strive for the improvement of our customers’ operational activities and the expansion of their logistical achievements.

For Vanderlande, it is critical that we have state-of-the-art techniques to analyze and optimize our customers’ logistics processes. Reasons are (a) the constantly increasing size and complexity of our material handling solutions, (b) the growing complexity of our software solutions, covering larger parts of our customers’ business processes, and (c) the demand for more advanced service offerings, covering logistics and business services. We believe that process mining is of high value for Vanderlande. Therefore, we work with the Eindhoven University of Technology on making process mining fit for analyzing logistics processes. In this context, we offer graduation projects on process mining and its application in our business.

Topic 1: 3D Visualisation of logistic processes

Within Vanderlande, we use a 3D library to create realistic simulation and emulation models. The goal of this graduation project is to use this existing library within our process mining tooling: we want to visualize the logistic processes in 3D within our process mining tooling.

Deliverables:

  • Become acquainted with both our 3D library and our process mining tooling
  • Come up with a conceptual solution to combine both
  • Realize the solution in a proof-of-concept, where the existing 3D library is connected to our process mining tooling
  • Application of the proof-of-concept on at least one system to test and validate

Topic 2: Implementation of scalable process discovery and conformance checking algorithms

Our systems generate large amounts of data. This is often a limiting factor: existing process mining tooling cannot work with multiple days of system data. Because of this, we perform our analyses on a relatively limited time period, which reduces the certainty of the outcome of our analyses. We are therefore looking for algorithms that allow for scalable process discovery and conformance checking. This is a topic that we would like to work on with a graduate student.

Deliverables of the graduation study:

  • Problem analysis
  • Conceptual solution of algorithms that allow for scalable process discovery and conformance checking
  • Realize the solution in a proof-of-concept
  • Apply the proof-of-concept on at least one system to test and validate

More Information

For more information, please contact Dr. Dirk Fahland.

Want to win?

Win the Process Discovery Contest (PDC) 2018!

Project setting

The Process Discovery Contest is dedicated to the assessment of tools and techniques that discover business process models from event logs. The objective is to compare the efficiency of techniques to discover process models that provide a proper balance between “overfitting” and “underfitting”. A process model is overfitting (the event log) if it is too restrictive, disallowing behavior which is part of the underlying process. This typically occurs when the model only allows for the behavior recorded in the event log. Conversely, it is underfitting (the reality) if it is not restrictive enough, allowing behavior which is not part of the underlying process. This typically occurs if it overgeneralizes the example behavior in the event log.

A number of event logs will be provided. These event logs are generated from business process models that show different behavioral characteristics. The process models will be kept secret: only “training” event logs showing a portion of the possible behavior will be disclosed. The winners are the contestants whose techniques discover process models that are closest to the original process models, in terms of balancing “overfitting” and “underfitting”.

To assess this balance, we take a classification perspective where a “test” event log is used. The test event log contains traces representing real process behavior and traces representing behavior not related to the process. Each trace of the training and test logs records a complete execution of one instance of the business process; in other words, each trace records all events of one process instance from the start state to the end state.

A model is as good in balancing “overfitting” and “underfitting” as it is able to correctly classify the traces in the “test” event log:

  • Given a trace representing real process behavior, the model should classify it as allowed.
  • Given a trace representing a behavior not related to the process, the model should classify it as disallowed.
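As a toy illustration of this classification perspective, the sketch below scores a discovered model against a labeled test log. All names, the trace format, and the model itself are invented for illustration: the model is simplified to a boolean predicate over traces, whereas a real contest entry would use replay or conformance checking.

```python
def classification_score(model_accepts, test_log):
    """Fraction of test traces whose label the model reproduces.
    test_log: list of (trace, is_real_behavior) pairs."""
    correct = sum(1 for trace, is_real in test_log
                  if model_accepts(trace) == is_real)
    return correct / len(test_log)

# Toy model: accepts traces of the form A, (any number of B), C.
def toy_model(trace):
    return (len(trace) >= 2 and trace[0] == "A" and trace[-1] == "C"
            and all(e == "B" for e in trace[1:-1]))

test_log = [
    (["A", "C"], True),            # real behavior, model allows it: correct
    (["A", "B", "B", "C"], True),  # real behavior, allowed: correct
    (["C", "A"], False),           # noise, model disallows it: correct
    (["A", "B"], False),           # noise, also disallowed: correct
]
print(classification_score(toy_model, test_log))  # 1.0
```

A model that rejects real traces (overfitting) or accepts noise traces (underfitting) would score below 1.0 on such a test log.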

The contest is not restricted to any modelling notation and no preference is made. Any procedural (e.g., Petri net or BPMN) or declarative (e.g., Declare) notation is equally welcome. The contest is also not restricted to open-source tools; proprietary tools can participate as well.

Project description

The goal of the Master Project is to participate in the 2018 edition of the Process Discovery Contest, if possible using the techniques that were developed for the 2017 edition (see the figure below for an example model) and that allowed us to classify all traces correctly. However, these techniques did not allow us to win the 2017 edition because the models we generated were considered to be less informative than the BPMN models as discovered by the winning competitor. As a result, a possibility would be to develop a conversion from our models to BPMN models. Furthermore, the 2018 edition might bring new challenges to the Contest, which might require extensions to our techniques. As the call for the 2018 edition is not out yet, it is hard to say what kind of extensions would be needed.

For the earlier editions of the Contest, the prize for the winner included a flight to the BPM conference, the lodging expenses during the conference, and a full registration for the conference. Provided that the same prizes are available for the 2018 edition, and provided that the result of the Master Project wins the Contest, it will be the master student who picks up these prizes and visits the BPM 2018 Conference, which will be held in Sydney.

Time restrictions

Given that the main goal is to participate in the 2018 edition of the Contest, it would be ideal if the master student starts just before the Contest starts. This way, the student can first get acquainted with the techniques using the 2017 edition, and then start applying and extending them for the 2018 edition. As the Contest typically starts in March, it would be ideal if the student starts in February or March.

Project Team

Principal Supervisor
Renata Medeiros de Carvalho
Position: UD
Room: MF 7.146
Tel (internal):
Projects:
Courses: 2IAB0, 2IMC93, 2IMC98, JM0200
Links: Scopus page, TU/e employee page
Daily Supervisor
Eric Verbeek
Position: Scientific Programmer
Room: MF 7.062
Tel (internal): 3755
Projects: CoSeLoG
Courses:
Links: Personal home page, Google scholar page, Scopus page (2nd Scopus page), ORCID page, TU/e employee page
Eric is the scientific programmer in the AIS group. As such, he is the custodian of the process mining framework ProM. If you want access to the ProM repository, or have any questions related to ProM and its development, ask Eric. Recently, he has been working on a decomposition framework for both process discovery and conformance checking in ProM. Earlier, he also worked on ExSpect and Woflan.

When Portfolio Management meets Process Mining: Challenges and Opportunities

FLIGHTMAP is Bicore’s flagship software solution for portfolio management. Since its launch in 2010, a growing group of international clients, such as DAF, Océ, and Fokker, has implemented FLIGHTMAP. With this tooling, they can perform roadmapping, budget and resource planning, scenario analysis, planning and tracking, and more. More information about FLIGHTMAP is available via www.flightmap.com. The figure above shows a screenshot of the tool obtained after a portfolio analysis.

To keep its leading position, Bicore continuously develops its innovative functionality for decision support and the ease of use and embedding. Next to this core functionality, FLIGHTMAP has additional modules, such as the HUB for connecting to external systems. The major next steps in development are upgrades to the latest front-end technologies and a stepwise migration to the cloud.

As an important step to make FLIGHTMAP smarter in decision support, we are looking to leverage new insights from process mining and data analytics into FLIGHTMAP’s functionality.

Bicore is looking for candidates for a Master Thesis internship in the area of process and data mining to bring new insights and link them to FLIGHTMAP’s next releases.

The assignment should look into the best way to harvest best practices in portfolio analysis and portfolio reviews, as well as project selection. Since FLIGHTMAP has been running for more than 7 years, a lot of historical data is available to work with. Where possible, we would like to link the mining results to specific recommendations.

In discussions with academic experts in this field, it emerged that process mining techniques are much more commonly applied in the traditional transaction systems domain. Applying them in the decision support domain of FLIGHTMAP gives rise to interesting research questions, as well as to practical relevance.

You will work closely with the development and delivery team of FLIGHTMAP to align with, and contribute to, the FLIGHTMAP roadmap.

Contact

For more information, contact Massimiliano de Leoni.

Data Science: Developing a Self-Standing Dynamic reporting tool

A huge amount of (transaction) data is generated on a daily basis in ASML’s Development and Engineering department. The data is scattered across different sources. The challenge is to extract data from the relevant sources and create a self-standing dynamic reporting tool (dashboard) demonstrating the performance of (Supplier Quality) Engineers at different granularity levels (department, section, individual) based on a set of pre-defined KPIs.

Are you a master student in Software Engineering (Data Science) with a passion for real-life data challenges? Then we are looking for you!

Job Description

You will be working closely with a team of Supplier Quality Engineers. You will get to know the mission of Supplier Quality Engineers and their way of working in relationship with different stakeholders. You will extract relevant data from different data sources and integrate them at different levels of granularity to examine whether certain KPIs are met.

You will explore different ways of visualization in the form of a dynamic tool (dashboard) with all relevant properties, such as flexibility and scalability. You also have the freedom to explore the data and create meaning out of it beyond the boundaries of the dashboard or this set of KPIs, as long as the result is self-standing. You are also free to take the lead in defining KPIs, not only lag KPIs but also lead KPIs (Balanced Scorecard approach).
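As a minimal illustration of aggregating one KPI at the different granularity levels mentioned above (department, section, individual), the sketch below rolls an assumed handling-time KPI up over invented field names and figures; the real assignment would draw on ASML’s actual data sources.

```python
from collections import defaultdict

# Invented example records; field names are assumptions for illustration.
records = [
    {"department": "D1", "section": "S1", "engineer": "alice", "hours": 4.0},
    {"department": "D1", "section": "S1", "engineer": "bob",   "hours": 6.0},
    {"department": "D1", "section": "S2", "engineer": "carol", "hours": 8.0},
]

def avg_by(records, *keys):
    """Average the 'hours' KPI grouped by the given granularity keys."""
    sums = defaultdict(lambda: [0.0, 0])
    for r in records:
        group = tuple(r[key] for key in keys)
        sums[group][0] += r["hours"]
        sums[group][1] += 1
    return {group: total / n for group, (total, n) in sums.items()}

print(avg_by(records, "department"))             # {('D1',): 6.0}
print(avg_by(records, "department", "section"))  # S1 averages 5.0, S2 averages 8.0
```

The same grouping function serves all three levels, which is the point of a dashboard with selectable granularity.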

Education


You like to explore, learn, and build cool stuff. You love working with data and extracting relevant and easy-to-understand information from it. You have skills in programming, web information retrieval, data mining, data engineering, algorithms, visualization, and statistics for big data. You are motivated for a challenge and are self-assured enough to drive a project.

You are a conceptual thinker. You are fluent in English and have good communication (reporting) skills. Your grades are excellent and you have a strong motivation for ASML as your future employer. As this is an open assignment, please attach a motivational letter with a proposal and a recent grade list to your application.

This is an apprentice internship for 5 days a week with a duration of 3 to 5 months. The start date is as soon as possible.

Please keep in mind that we can only consider students (who are enrolled at a school during the whole internship period) for our internships and graduation assignments.

Other Information

What ASML offers

Your internship will be in one of the leading Dutch corporations, gaining valuable experience in a highly dynamic environment. You will receive a monthly internship allowance of 500 euro (maximum), plus a possible housing or travel allowance. In addition, you’ll get expert, practical guidance and the chance to work in and experience a dynamic, innovative team environment.

ASML: Be part of progress

We make machines that make chips – the hearts of the devices that keep us informed, entertained and safe; that improve our quality of life and help to tackle the world’s toughest problems.

We build some of the most amazing machines that you will ever see, and the software to run them. Never satisfied, we measure our performance in units that begin with pico or nano.

We believe we can always do better. We believe the winning idea can come from anyone. We love what we do – not because it’s easy, but because it’s hard.

ASML: Be part of progress

ASML is leading in the worldwide development, production and sales of high-end lithography systems for the semiconductor industry. Almost 17,000 people worldwide work at ASML at offices in the United States, Asia and at the corporate headquarters in Veldhoven. ASML employees share a passion for technology with a customer focus. At ASML, we work collectively to further develop and implement complex and high-quality technological systems. Working at ASML is therefore challenging and dynamic, with ambitious objectives and high standards key to our continuing success. But hard work here pays off: ASML invests in the development of its people and successes are shared. ASML promises mutual commitment to our growth and yours.

Join ASML’s expanding multidisciplinary teams and help us to continue pushing the boundaries of what’s possible. How will you be part of progress?

Contact

For more information, contact Renata Medeiros de Carvalho.

Real-Time Model Discovery of the Service Order Process Using Stream Process Mining

Kropman Installatietechniek is a Dutch company established in 1934 that has become one of the leading companies in the Dutch installation industry. With about 800 employees, 12 regional locations, and an annual turnover of more than 100 million Euro, Kropman is an integral service provider with a multidisciplinary approach. Kropman is mainly active in office buildings, health care, and industry. It offers design, construction, and maintenance in the field of facility installations. Kropman also has a separate business for process installations and cleanrooms: Kropman Contamination Control. Maintenance (services) is a fast-growing business line. The order process is fully supported within an ERP environment. The service order (SO) process is not a trouble-free process: it takes too long and deviates more often than necessary. The company aims at increasing the throughput of the SO process and decreasing the amount of process deviations by applying process mining and data mining techniques.

To have a better overview of the process, Kropman is aiming at:

  • Exposing bottlenecks happening in the SO process
  • Detecting deviations from the supposed model in the real time they are happening using stream process mining techniques
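As a rough illustration of the second aim, the sketch below flags deviating events the moment they arrive. Everything here is invented for illustration: the reference model is simplified to a map of allowed next activities per case, whereas a real solution would apply streaming conformance checking to Kropman’s ERP event stream.

```python
# Assumed toy service-order lifecycle: create -> plan -> execute -> close.
allowed_next = {
    None: {"create"},
    "create": {"plan"},
    "plan": {"execute"},
    "execute": {"close"},
}

case_state = {}  # last observed activity per service order

def on_event(case_id, activity):
    """Process one streamed event; return True if it conforms,
    False if it is a deviation from the reference model."""
    last = case_state.get(case_id)
    conforms = activity in allowed_next.get(last, set())
    case_state[case_id] = activity  # advance the case regardless
    return conforms

print(on_event("SO-1", "create"))   # True
print(on_event("SO-1", "execute"))  # False: "plan" was skipped
```

Because only one activity per open case is kept in memory, such a check scales to an unbounded event stream, which is the core idea behind stream process mining.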
Contact

For more information, contact Marwan Hassani.

Example completed master projects

Bram in 't Groen

VDSEIR - A graphical layer on top of the Octopus toolset

Description

In his work, Bram in 't Groen introduces a graphical representation for DSEIR (a language used in the Octopus toolset for designing embedded systems) called Visual DSEIR (VDSEIR). By using VDSEIR, users of the toolset can create specifications in DSEIR by means of creating graphical models, removing the need for those users to know how to program in the Octopus API. Bram in 't Groen provides a model transformation from VDSEIR to DSEIR that makes use of an intermediate generator model and a parser that is automatically generated from an annotated JavaCC grammar. The graphical representation for DSEIR consists of several perspectives and it contains a special form of syntactic sugar, namely hierarchy. It is possible to create hierarchical models in the graphical representation without having support for hierarchy in the original DSEIR language, because these hierarchical models can be translated into non-hierarchical DSEIR models. This way, additional expressiveness is created for the user, without modifying the underlying toolset.

Type

AIS / External / ESI

Borana Luka

Model merging in the context of configurable process models

Description

While the role of business process models in the operation of modern organizations becomes more and more prominent, configurable process models have recently emerged as an approach that can facilitate their reuse, thereby helping to reduce costs and effort. Configurable models incorporate the behavior of several model variants into one model, which can be configured and individualized as necessary. The creation of configurable models is a complicated process, and tool support for it is in its early steps. In her thesis, Borana Luka evaluates two existing tool-supported approaches to process model merging and tests an approach to model merging based on the similarity between models. Borana’s work resulted in a paper presented at the 2011 International Workshop on Process Model Collections.

Type

AIS / Internal / CoSeLoG project involving 10 municipalities

Links

Staff involved

Cosmina Cristina Niculae

Guided configuration of industry reference models

Description

Configurable process models are compact representations of process families, capturing both the similarities and differences of business processes and further allowing for the individualization of such processes in line with particular requirements. Such a representation of business processes can be adopted in the consultancy sector and especially in the ERP market, as ERP systems represent general solutions applicable for a range of industries and need further configuration before being implemented to particular organizations. Configurable process models can potentially bring several benefits when used in practice, such as faster delivery times in project implementations or standardization of business processes. Cosmina Niculae conducted her project within To-Increase B.V., a company that specializes in ERP implementations. She developed an approach to make configuration much easier, implemented it, and tested it on real-life cases within To-Increase.

Type

AIS / External / To-Increase

Links

Staff involved

Dennis Schunselaar

Configurable Declare

Description

Declarative languages are becoming more popular for modeling business processes with a high degree of variability. Unlike procedural languages, where the models define what is to be done, a declarative model specifies what behavior is not allowed, using constraints on process events. In his thesis, Dennis Schunselaar studies how to support configurability in such a declarative setting. He takes Declare as an example of a declarative process modeling language and introduces Configurable Declare. Configurability is achieved by using configuration options for event hiding and constraint omission. He illustrated the approach using a case study based on process models of ten Dutch municipalities: a Configurable Declare model was constructed that supports the variations within these municipalities.

Type

AIS / Internal

Links

Staff involved

Erik Nooijen

Artifact-Centric Process Analysis, Process discovery in ERP systems

Description

In his thesis, Erik Nooijen developed an automated technique for discovering process models from enterprise resource planning (ERP) systems. In such systems, several different processes interact to maintain the resources of a business, and all information about these resources is stored in a very large relational database. The challenge in discovering processes from ERP system data is to identify from arbitrary tables how many different processes exist in the system, and to extract event data for each instance of each process. Erik Nooijen identified a number of data mining techniques that can solve these challenges and integrated them in a software tool, so that an event log containing all events of a given ERP process can be extracted automatically. Classical process discovery techniques can then show the different process models of the system. Erik conducted his project within Sligro, where he is actively using his software to improve the company's processes. The thesis resulted in a workshop paper presented at the International Conference on Business Process Management 2012 in Tallinn, Estonia.
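The extraction step that Erik's tool automates can be hinted at with a small sketch. The table and column names below are invented: grouping timestamped rows of an ERP table by a case identifier yields one trace per process instance, which classical discovery techniques can then consume.

```python
# Invented ERP-style rows; column names are assumptions for illustration.
rows = [
    {"order_id": 7, "status": "created", "ts": "2012-01-02"},
    {"order_id": 8, "status": "created", "ts": "2012-01-03"},
    {"order_id": 7, "status": "shipped", "ts": "2012-01-05"},
]

def to_event_log(rows, case_key, activity_key, time_key):
    """Group timestamped rows into one ordered trace per case."""
    log = {}
    for row in sorted(rows, key=lambda r: r[time_key]):
        log.setdefault(row[case_key], []).append(row[activity_key])
    return log

print(to_event_log(rows, "order_id", "status", "ts"))
# {7: ['created', 'shipped'], 8: ['created']}
```

The hard part of the thesis is upstream of this sketch: deciding, for arbitrary tables, which columns play the roles of case, activity, and timestamp.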

Type

AIS / External / Sligro

Links

Staff involved



If you have a master project item for this page, which may include possible master project assignments, on-going master projects, and completed master projects, please send it to Eric Verbeek.