Using ProModel to teach process management has a useful side effect: students must become proficient with many tools for working with data and with people. Of course, students learn many aspects of ProModel itself, such as the parts of a process; these include locations, entities, arrival rates, process logic (LEAP), variables, and attributes.
They also learn about graphics, Statfit, batch/group, create, order, wait until, logic statements that operationalize business rules, and many other commands that help model a process. However, to conduct a successful large process improvement project for an organization using discrete event simulation, students need to become proficient with many other tools in order to make the best use of ProModel over the course of a semester-long project.
Project Management Methodology
Understanding and being able to set up a project using project management methodology is critical to a successful ProModel project. As in any project, the scope and expected outcomes need to be delineated. To design the work breakdown structure for the project, it is also critical to understand what tasks must be accomplished to produce the final output, and when those tasks need to be completed.
Tasks include working with the project sponsor to develop the scope and expected outcomes, analyzing and preparing data for entry into ProModel, constructing the base model, verifying and validating the base model, determining which treated models to build, statistically analyzing the outputs of each model relative to the base model, performing a cost/benefit analysis, and writing a report that delineates findings and recommendations. Each team in the class I guide has to complete a Project Execution Plan (PEP) and then discuss in their final paper how well they met their time gates, why they did or did not meet those dates, and what they did to catch up if they fell behind. Along the way, many teams learn the lesson of avoiding the 'student syndrome' (putting work off until a deadline looms).
To do all of the tasks mentioned above well, students have to become accomplished at relationship management. They must visit with their sponsor not only about the scope and expected outcomes, but also about the data and information needed to complete the project. There will be missing data, acronyms that need to be explained, assumptions that have to be made and supported because of the missing data and information, uncertainty about the proper rules to guide the logic, and many other items to discuss with the sponsor on at least a weekly basis. Oftentimes it is discomfort with talking to the sponsor that leads to procrastination and missed time gates.
Data Sets and Simulation
Of course, there is the ever-present need to make sense of large data sets and convert them into information that ProModel can utilize. When nearly 12,000 different types of entities move through one process, in a job shop with 1,400 unique process centers, the data sets become large. The route array that tells ProModel which machine each entity goes to, and when, can grow to 12,000 rows and 200 columns, and the duration array can become too large for a single array, requiring it to be split into four arrays, each with 3,000 rows and 1,400 columns.
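The four-array workaround above can also be scripted before the data ever reaches ProModel. A minimal NumPy sketch, with the shapes taken from the sizes mentioned here but the data itself randomly generated purely for illustration:

```python
import numpy as np

# Hypothetical stand-in for a duration array that is too large for one
# ProModel array: 12,000 entity types x 1,400 process centers. The values
# are random and illustrative only.
rng = np.random.default_rng(0)
durations = rng.uniform(1.0, 60.0, size=(12_000, 1_400))

# Split along the rows into four arrays of 3,000 rows x 1,400 columns,
# matching the four-array split described in the text.
parts = np.array_split(durations, 4, axis=0)

for i, part in enumerate(parts, start=1):
    print(f"duration_array_{i}: {part.shape}")
```

Each of the four pieces could then be exported (for example with `np.savetxt`) for import into its own ProModel array.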
There are many Excel tools that help the students explore their data sets. These tools include, but are not limited to: filters, pivot tables, different types of lookup functions, find and replace, IF and COUNT functions, the 'and' function to build many lines of logic quickly, and different types of conditional formatting.
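The same kinds of exploration can be scripted outside Excel. A small pandas sketch (the column names and sample data are invented for illustration, not drawn from the course data) mirroring a filter, a pivot table, and a lookup:

```python
import pandas as pd

# Invented sample data standing in for a process data set.
df = pd.DataFrame({
    "entity":  ["A", "A", "B", "B", "C"],
    "center":  ["M1", "M2", "M1", "M3", "M2"],
    "minutes": [12.0, 7.5, 9.0, 14.0, 6.0],
})

# Filter: keep only rows processed at center M1 (Excel: AutoFilter).
at_m1 = df[df["center"] == "M1"]

# Pivot table: total minutes per entity per center (Excel: PivotTable).
pivot = df.pivot_table(index="entity", columns="center",
                       values="minutes", aggfunc="sum", fill_value=0)

# Lookup: map each process center to a department (Excel: VLOOKUP).
departments = {"M1": "milling", "M2": "turning", "M3": "assembly"}
df["department"] = df["center"].map(departments)

print(at_m1)
print(pivot)
print(df)
```

Each step corresponds directly to one of the Excel tools listed above, which makes the transition between the two environments a small one for students.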
Once the base model and treated models are created and have each generated 30 replications, the students determine whether the treatments actually made a difference by conducting a hypothesis test. If the null hypothesis (that the sample means have a high probability of being drawn from the same population) is rejected, they proceed to the cost/benefit analysis; if the null is accepted, they try other treatments. When a treatment is successful, Statfit is used to determine the distribution of the output, and a Monte Carlo simulation is then used to generate a larger sample of deltas between the base and treated models; the distribution of those deltas is used to estimate the net benefit. That benefit could be the number of extra units built, a decrease in throughput time or time in system, net present value, or some other form of benefit.
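The test-then-simulate workflow above can be sketched in a few lines. This is not the course's actual tooling: the class fits distributions with Statfit, while this illustration uses SciPy as a stand-in, with invented replication numbers and an assumed normal output distribution.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Hypothetical throughput times (minutes) from 30 replications each of a
# base model and a treated model; values are invented for illustration.
base = rng.normal(loc=100.0, scale=5.0, size=30)
treated = rng.normal(loc=90.0, scale=5.0, size=30)

# Hypothesis test: could both sets of means come from the same population?
t_stat, p_value = stats.ttest_ind(base, treated)

if p_value < 0.05:
    # Null rejected: fit a distribution to each output (Statfit's role in
    # the course; scipy.stats stands in here) and run a Monte Carlo
    # simulation to build a large sample of deltas.
    base_fit = stats.norm(*stats.norm.fit(base))
    treated_fit = stats.norm(*stats.norm.fit(treated))
    deltas = (base_fit.rvs(10_000, random_state=rng)
              - treated_fit.rvs(10_000, random_state=rng))
    print(f"mean reduction in throughput time: {deltas.mean():.1f} minutes")
else:
    print("no significant difference; try another treatment")
```

The resulting distribution of deltas is what feeds the cost/benefit analysis, whether the benefit is measured in extra units, time saved, or net present value.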
While the models are being built, the team is also working on its presentation and written report. Thus, as the students discover assumptions they need to make, they add them to the oral and written reports, thereby learning the value of parallel processing. By the time the last statistical analysis is completed, the presentation and paper are complete and ready to deliver, at least for the teams that do a good job of following their PEP.
As demonstrated above, there are many tools that ProModel users need to be proficient with when conducting a successful discrete event simulation using ProModel. However, perhaps it is not only ProModel and the ancillary tools that need to be taught and modeled when teaching a discrete event simulation class, but also the willingness to say, "I do not know how to do that; let's do some research and discover how." That is the most important trait modelers need to have: the willingness and perseverance to learn new tools and apply them in unique ways to capture unique opportunities.
Meet Professor Scott Metlen, Ph.D.
Dr. Scott Metlen earned his Ph.D. in Business Administration at the University of Utah in 2002 and is currently an Associate Professor of Production Operations Management at the University of Idaho. Dr. Metlen teaches Quality Management and Systems and Simulation, both of which are aspects of Process Management. Prior to his academic career, Dr. Metlen spent 20 years managing products and processes in agriculture and food processing. Through a gift from the Micron Foundation, he has the resources to oversee at least twenty process improvement projects for various organizations per year through the classes he teaches. These projects provide meaningful experiential learning for the 40 to 80 students involved.