Performance with large logs
After having played a bit with small test log files of 1-2 traces and around 100 events (works perfectly, nice tool!), I am now trying to use a real log of mine.
The CSV file is 108 MB, containing around 1 million events, and it took 6 hours 25 minutes to convert it into a 231 MB XES file using XESame. As this conversion is not performed regularly, it was not really an issue; I just let it run overnight.
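(In case the conversion time matters for anyone else: a plain streaming script can also write a minimal XES file in a single pass. This is only a rough sketch, assuming the CSV has hypothetical `case_id`, `activity`, and `timestamp` columns and is sorted by case; real logs will need more attributes and column mapping, which is what XESame handles properly.)

```python
import csv
from xml.sax.saxutils import quoteattr

def csv_to_xes(csv_path, xes_path):
    """Stream a CSV event log (sorted by case id) into a minimal XES file.

    Assumed columns: case_id, activity, timestamp (ISO 8601) -- adjust to
    match the real log. Writing line by line keeps memory usage flat even
    for logs with millions of events.
    """
    with open(csv_path, newline="") as src, open(xes_path, "w") as dst:
        dst.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        dst.write('<log xes.version="1.0" xmlns="http://www.xes-standard.org/">\n')
        current_case = None
        for row in csv.DictReader(src):
            if row["case_id"] != current_case:
                if current_case is not None:
                    dst.write("  </trace>\n")
                current_case = row["case_id"]
                dst.write("  <trace>\n")
                dst.write('    <string key="concept:name" value=%s/>\n'
                          % quoteattr(current_case))
            dst.write("    <event>\n")
            dst.write('      <string key="concept:name" value=%s/>\n'
                      % quoteattr(row["activity"]))
            dst.write('      <date key="time:timestamp" value=%s/>\n'
                      % quoteattr(row["timestamp"]))
            dst.write("    </event>\n")
        if current_case is not None:
            dst.write("  </trace>\n")
        dst.write("</log>\n")
```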
Some problems arise when using ProM. Importing the file takes around 2 minutes, and applying the Alpha algorithm worked in about 5 minutes.
But trying to visualize the log (summary) never finishes (I killed the process after some time).
So my question is: have any of you had experience with large logs, and how did you manage them? Split the log (which is not really an option, as the underlying process has some very long-running cases whose events could be spread across the entire file)? Is there a way in ProM to speed up the process (ignore certain steps, ...), or to get a summary of the log in another way?
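For what it's worth, since the log summary mostly reports counts, I could probably compute the basic numbers straight from the CSV before it ever reaches ProM. A minimal sketch of what I have in mind, assuming hypothetical `case_id` and `activity` columns (one streaming pass, so memory stays small even for a million events):

```python
import csv
from collections import Counter

def summarize_log(csv_path):
    """One streaming pass over a CSV event log: a cheap stand-in for the
    counts shown in a log-summary view. Assumed columns: case_id, activity.
    """
    events = 0
    cases = set()
    activities = Counter()
    with open(csv_path, newline="") as src:
        for row in csv.DictReader(src):
            events += 1
            cases.add(row["case_id"])
            activities[row["activity"]] += 1
    return {
        "events": events,
        "traces": len(cases),
        "distinct_activities": len(activities),
        "top_activities": activities.most_common(10),
    }
```

That would at least give event/trace/activity counts and the most frequent activities without loading the whole log into memory; it obviously doesn't replace the richer statistics ProM computes.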