An Introduction to Manufacturing Process Simulation at Rose-Hulman Institute of Technology

Dr. Jay McCormack – Associate Professor of Mechanical Engineering, Rose-Hulman Institute of Technology

I teach classes in both the mechanical engineering and engineering design programs at Rose-Hulman Institute of Technology in the areas of design and manufacturing. Rose is a small college in Terre Haute, Indiana, focused on STEM. In the mechanical engineering program, one of the courses that we find differentiates our graduates is a junior-level course on design for manufacturing. Instead of focusing on the sciences of manufacturing processes (which is also clearly valuable), our course focuses on the application of design principles to facilitate the manufacturing of a given product, the comparison of various manufacturing methods, and supporting design best practices related to manufacturing, such as geometric dimensioning and tolerancing.

Two years ago, one member of the teaching team, the original creator of our design for manufacturing class, proposed integrating a process simulation project into the course. Our students are exposed to many manufacturing methods and work in depth with a few, but had never had any exposure to manufacturing process simulation. This is arguably appropriate content for any mechanical engineer, but a robotics minor is popular with our mechanical engineers, and many of the robotics students end up in positions related to manufacturing. Additionally, many biomedical engineers take the course and end up working as process engineers for medical device manufacturers. Therefore, the need was there for students to get a first exposure to manufacturing processes and process simulation. I had some experience with process simulation more than a decade ago and experience with lean manufacturing, so I was elected (appointed) by the teaching team to develop the project. I was familiar with ProModel products, but it had been a while since I used any simulation tools. I evaluated Process Simulator, a tool from ProModel that installs as a plug-in to Microsoft Visio, and several other products, but found that Process Simulator allowed me to get students from zero to their first model quickly. After choosing Process Simulator and discussing the options to get the software from ProModel, I started developing the project.

Project Outcomes

My first objective was to develop the learning outcomes for the project. There were a few factors driving the learning outcomes:

  1. The students were almost exclusively novices. Virtually none of the 175 students in the course had any experience with any process simulation software, so the learning outcomes had to include low Bloom’s level items focused on both manufacturing topics and Process Simulator concepts.
  2. The design for manufacturing course itself has a number of bottlenecks involving other projects. Several hands-on course projects involve specialized equipment and technician time. These projects require creative scheduling to get all students equal access to these resources in an eight-week period. Consequently, the learning outcomes and process simulation project were scoped to allow students to work in a self-directed manner with a given set of tutorial videos, feedback from their instructor, and due dates that varied by project team.

At the successful completion of this project, students will be able to:

  1. Define manufacturing process terms – batch, process, inventory, WIP, workstation, buffer, cycle time.
  2. Define fundamental Process Simulator concepts – entity, resource, activity, routing, arrival, setting simulation properties, batching, buffers, and priority.
  3. Apply Process Simulator to model a manufacturing process using the fundamental concepts.
  4. Redesign a manufacturing process using Process Simulator.

Even with just those basic concepts, the students were able to create useful Process Simulator models of a given manufacturing process. Additionally, the model was sufficiently complex to require creative experimentation and exploration in order to make improvements.

Project Overview

The objective given to students was to use Process Simulator to model the performance of a factory, suggest improvements to the factory, and measure the impact of the improvements. Excerpts from the project description follow and a link to the complete project description is located at the end of this article.

Scenario

You are an engineer at HOBO Inc. (Hands On Bottle Opener, Inc.), producers of a line of extruded, one-handed bottle openers (The Blue Collar, Figure 1) that appeal to customers through durability, reliability, and functionality. You were on the new product development team that designed a new, beautiful, and refined one-handed bottle opener (The Executive, Figure 2) that will allow you to enter an untapped market. The new design is fabricated using an investment casting process that fits well with the geometric complexity and modest volume of production planned for the new model. Because investment casting is not part of HOBO’s core competency, you will outsource the casting. HOBO will receive a shipment of boxes of unfinished casting trees from the fabricator every morning. Each bottle opener will be sawn from the casting tree, then tumbled to remove burrs and to produce a better surface finish. Sawing, tumbling, and the subsequent inspection step are among our core competencies, so we plan to perform these operations in house on an existing production line. Additionally, we have two workers that are available on the day shift to be used as much as they are needed. (Note that they will not be fired if they are not used all day. We have work for them elsewhere in the factory.)

This seems like a great opportunity to try the new process simulation software that you are evaluating for purchase. You gathered some baseline data about the operation from verbal descriptions given by other engineers and managers; that information is in the section called Baseline Factory Description.

Baseline Factory Description

There are four workstations. In order, they are:

  1. Receiving
  2. Sawing
  3. Tumbling
  4. Inspection

There are buffers to store work in process (WIP) located before sawing and before tumbling. A flowchart representing the process is shown in Figure 3. A process box for each workstation and the buffers is shown in Table 1. A more complete description of each is found after the table.

Figure 3. Flow of Materials through the Baseline Factory
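
Students build this model in Process Simulator, but the structure of the line is easy to see in code as well. Below is a minimal discrete event sketch of the baseline factory in Python using the simpy library; every number in it (shipment size, cycle times, batch size) is a hypothetical placeholder rather than a value from the project’s Table 1.

    # A minimal sketch of the HOBO baseline line (pip install simpy).
    # Every number here is a hypothetical placeholder, NOT a value from
    # the project's Table 1.
    import random
    import simpy

    SHIFT_MIN = 8 * 60        # one day shift, in minutes
    OPENERS_PER_TREE = 12     # hypothetical casting-tree size
    TREES_PER_DAY = 40        # hypothetical morning shipment
    finished = 0

    def receiving(env, saw_buffer):
        # Receiving: the morning shipment becomes WIP in the saw buffer.
        for _ in range(TREES_PER_DAY):
            yield saw_buffer.put(OPENERS_PER_TREE)

    def sawing(env, saw_buffer, tumble_buffer):
        # Saw one opener off a casting tree at a time.
        while True:
            yield saw_buffer.get(1)
            yield env.timeout(random.triangular(0.8, 1.5, 1.0))  # minutes
            yield tumble_buffer.put(1)

    def tumbling(env, tumble_buffer, inspect_buffer):
        # Tumble in batches to remove burrs and improve surface finish.
        while True:
            yield tumble_buffer.get(24)   # hypothetical batch size
            yield env.timeout(30)         # hypothetical cycle, minutes
            yield inspect_buffer.put(24)

    def inspection(env, inspect_buffer):
        global finished
        while True:
            yield inspect_buffer.get(1)
            yield env.timeout(0.5)        # hypothetical inspection time
            finished += 1

    env = simpy.Environment()
    saw_buf = simpy.Container(env)    # WIP buffer before sawing
    tum_buf = simpy.Container(env)    # WIP buffer before tumbling
    ins_buf = simpy.Container(env)
    env.process(receiving(env, saw_buf))
    env.process(sawing(env, saw_buf, tum_buf))
    env.process(tumbling(env, tum_buf, ins_buf))
    env.process(inspection(env, ins_buf))
    env.run(until=SHIFT_MIN)
    print(f"Openers finished in one shift: {finished}")

The net income estimate required for the deliverables then falls out of counters like finished: multiply units completed by the margin per unit and subtract labor and material costs.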

Deliverables

  • In order to earn a C: Use Process Simulator software to provide a baseline estimate of net income (revenue – expenses) for the first year of operation. The Baseline Factory must follow all the process rules and procedures outlined in the Baseline Factory Description section. Write a memo summarizing the findings.
  • In order to earn a B: Design meaningful improvements to the Baseline Factory. Describe the Improved Factory in the memo by capturing each of the suggested improvements.
  • In order to earn an A: Use Process Simulator to model the Improved Factory. Report the improvement in yearly net income in the memo.

Details about the baseline factory are provided in subsequent sections, as is a set of tutorial videos that guide students through basic concepts. The videos are a mix of guided examples recorded by me and videos provided by ProModel.

Takeaways

This project was first developed and used in the 2018-2019 academic year. We were pleased with the enthusiasm with which students approached the project and engaged in competition to produce the most profitable manufacturing process among their peers. We revised the project for 2019-2020 to include a grading scale that further encouraged exploration and a set of tutorial videos walking students through a given omelet-station Process Simulator model.

All of the students received a base level of exposure to process simulation, but we were pleased to see that a number of students dove deeper into manufacturing process issues. Students challenged the notion that inspection was required to wait until the last process step, unknowingly suggesting the use of quality at the source, a fundamental lean concept. Those students were able to see the positive impact of quality at the source in their Process Simulator models. Other students had insights about the impact of batch work and how batches served as mechanisms for covering up root-cause process issues. Those students reduced batch sizes where possible and identified the root-cause problems.

The complete project description, scoring rubric, and tutorial video list are linked here. You are welcome to reuse them, modify them, make them better, and/or fix mistakes. If you do, let me know at mccormac@rose-hulman.edu. We look forward to featuring Process Simulator as part of our design for manufacturing course in future years and finding new ways to challenge students to explore manufacturing processes and process simulation.

Bio

Dr. Jay McCormack is an Associate Professor of mechanical engineering at Rose-Hulman Institute of Technology. Dr. McCormack’s teaching and professional development interests are in the areas of design and manufacturing. He teaches courses for the mechanical engineering and engineering design programs as well as the institute’s multidisciplinary design course. Before joining Rose-Hulman, Dr. McCormack was a faculty member at the University of Idaho where he worked with the state’s manufacturing extension partnership. He co-founded Pittsburgh-based CAD tool developer DesignAdvance Systems Inc. after graduating from Carnegie Mellon with a PhD in mechanical engineering.

Top 8 Benefits of Proactive Patient Flow Optimization

Dan Hickman – ProModel CTO

Unpredictably high numbers of scheduled admissions and an uncertain number of available beds.

Stressed staff due to ED boarding, long patient wait times, and off-service placements.

Length of stay and cost per case metrics exceed CMS value-based care efficiency measures.

Sound familiar? 

Patient flow optimization is one of the most cost-effective ways to improve operational effectiveness, the patient stay experience and your hospital’s bottom line. Here’s how.

Top 8 Reasons to Implement Patient Flow Optimization Today

  1. Decrease the Length of Stay (LOS). Find “hidden discharges” (potential candidates for discharge based on diagnosis codes and average LOS metrics) in your current census; a sketch of this comparison follows the list.
  2. Improve Bottleneck and ADT Issue Visibility. Simply having data does not empower decision makers. In fact, too much data can cause clinical operations staff to ignore it altogether. A patient flow optimization system delivers visual data all hospital staff can easily digest and use to make informed decisions that benefit the hospital and the patients.
  3. Right-size Staffing. By coupling accurate census predictions with staff needs, your health system will experience lower labor costs based on predictable admit, discharge and transfer (ADT) cycles, optimal staffing sizes and diminished demand for expensive nursing agency personnel.
  4. Enhance the Patient Journey. Minimize patient frustration by admitting the vast majority of inpatients to on-service units, even during peak periods.
  5. Capture Additional Revenue. Decreasing length of stay increases bed capacity, so fewer patients leave the hospital without being seen.
  6. Increase Access to Care. Patient flow optimization decreases ED boarding duration, speeds up admissions, and lowers left without being seen (LWBS) rates.
  7. Lower Infrastructure Costs. With patient flow optimization, health systems make optimal use of the existing hospital’s physical footprint, avoiding unnecessary costly build outs.
  8. Staff Satisfaction. Welcome to the stress-free huddle. FutureFlow Rx gives your staff a personal heads-up on issues affecting admissions, discharges and transfers, so they can be addressed at huddle meetings. Prescriptive corrective actions from the patient flow optimization system further empower staff with recommendations based on data and simulation.
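
Item 1 above is, at its core, a simple comparison. As an illustration only (this is not FutureFlow Rx’s actual logic, and the field names, codes, and benchmark values below are hypothetical), a census could be screened for hidden-discharge candidates like this:

    # Illustrative only: flag "hidden discharge" candidates whose current
    # length of stay (LOS) exceeds the benchmark average LOS for their
    # diagnosis code. Field names and thresholds are hypothetical.
    from datetime import date

    AVG_LOS_BY_DX = {"DRG-470": 2.3, "DRG-291": 4.6}  # benchmark days

    census = [
        {"patient": "A", "dx": "DRG-470", "admitted": date(2019, 3, 1)},
        {"patient": "B", "dx": "DRG-291", "admitted": date(2019, 3, 3)},
    ]

    def hidden_discharges(census, today, slack_days=0.5):
        """Return patients whose LOS already exceeds benchmark + slack."""
        flagged = []
        for p in census:
            los = (today - p["admitted"]).days
            benchmark = AVG_LOS_BY_DX.get(p["dx"])
            if benchmark is not None and los > benchmark + slack_days:
                flagged.append((p["patient"], los, benchmark))
        return flagged

    print(hidden_discharges(census, today=date(2019, 3, 5)))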

 

About FutureFlow Rx™ Patient Flow Optimization

FutureFlow Rx by ProModel uses historical patient flow patterns, real-time clinical data, and discrete event simulation to reveal key trends, provide operational insights, and deliver specific corrective action recommendations to enhance the patient stay experience, lower costs and drive additional revenues. Our platform accurately predicts future events, helping hospitals make the right operational decisions to reduce risk, decrease LOS and improve operational margins. Schedule a demo.

FutureFlow Rx’s dashboard consists of key performance indicator (KPI) “cards”. The left side of each card shows the last 24 hours; the right side predicts the “Next 24”; and clicking the upper-right “light bulbs” provides prescriptive actions to improve the predicted future.

 

New Year, New Ideas

2016 brings a new year, and it looks to be one of major changes, opportunities and more in the healthcare biz.  Are you ready to try some new methods of improving your medical practice, hospital or clinic?  Simulation has been used by healthcare professionals for over 25 years.  Because of our years of experience in the healthcare industry, ProModel has assembled a collection of demonstration models that quickly show you ways to use simulation that you may never have considered.

Our first demo model is a simple one: a Clinical Access Time Model. This model demonstrates the ability to model parking lots and access times for patients.  It is a very basic model which shows the capabilities of ProModel’s MedModel simulation tool.

Stay tuned over the next couple of blogs and we will share other MedModel demos with you.  Below is a list of the many simulations to come.

  1. Appointment Routine
  2. Use of Independent Arrivals
  3. California City Planning ER & Other Services
  4. Radiology Clinic with Costing Features
  5. Day Surgery
  6. Emergency Department
  7. Emergency Departments with Scenarios
  8. Comparing Defibrillators
  9. Hospital
  10. Eye Clinic
  11. Generic Lab
  12. General Hospital ICU Comparison
  13. Managed Care
  14. Nursing Unit
  15. Operating Room Suite
  16. Pediatric Clinic
  17. Pharmacy
  18. Radiology Clinic
  19. Retaining an Exam Room
  20. Urology Clinic
  21. Women’s Diagnostic Clinic
  22. X Ray Clinic

These and many other solution videos are available on our YouTube Channel.

If you would like more information about ProModel solutions contact us.

 

Demystifying Big Data

Rob Wedertz – Director, Navy Programs

We live in a data-rich world.  It’s been that way for a while now.  “Big Data” is now the moniker that permeates every industry.  For the sake of eliciting a point from the ensuing paragraphs, consider the following:

FA-18 / Extension / Expenditure / Life / Depot / Operations / Hours / Fatigue

Taken independently, the words above mean very little.  However, if placed in context, and with the proper connections applied, we can adequately frame one of the most significant challenges confronting Naval Aviation:

A higher than anticipated demand for flight operations of the FA-18 aircraft has resulted in an increased number of flight hours being flown per aircraft.  This has necessitated additional depot maintenance events to remedy fatigue life expenditure issues in order to achieve an extension of life cycles for legacy FA-18 aircraft.

ARABIAN GULF (June 13, 2012) An F/A-18C Hornet assigned to the Blue Blasters of Strike Fighter Squadron (VFA) 34 launches from the flight deck of the Nimitz-class aircraft carrier USS Abraham Lincoln (CVN 72). Lincoln is deployed to the U.S. 5th Fleet area of responsibility conducting maritime security operations, theater security cooperation efforts and combat flight operations in support of Operation Enduring Freedom. (U.S. Navy photo by Mass Communication Specialist 2nd Class Jonathan P. Idle/Released)

The point here is that it is simply not enough to aggregate data for the sake of aggregation.  The true value in harnessing data is knowing which data are important, which are not, and how to tie the data together.  Oftentimes, subscribing to the “big data” school of thought has the potential to distract and misdirect.  I would argue that any exercise in “data” must first begin with a methodical approach to answering the following questions:

“What challenge are we trying to overcome?”

“What are the top 3 causes of the challenge?”

“Which factors are in my control and which ones are not?”

“Do I have access to the data that affect the questions above?”

“How can I use the data to address the challenge?”

While simply a starting point, the above questions will typically allow us to frame the issue, understand the causal effects of the issue, and most importantly facilitate the process of homing in on the data that are important while systematically ignoring the data that are not.

To apply a real-world example of the methodology outlined above, consider the software application ProModel has provided to the U.S. Navy – the Naval Synchronization Toolset (NST).

“What challenge are we trying to overcome?”

Since 2001, the U.S. Navy has participated in overseas contingency operations (Operation Enduring Freedom and Operation Iraqi Freedom), and the legacy FA-18 aircraft (A-D) has consumed its life expectancy at a higher rate than anticipated.  Coupled with the delay in Initial Operating Capability (IOC) of the F-35C aircraft, the U.S. Navy has been required to develop and sustain a Service Life Extension Program (SLEP) to extend the life of legacy FA-18 aircraft well beyond their six-thousand-hour life expectancy and to schedule and perform high flight hour inspections and major airframe rework maintenance events.  The challenge is: “How does the Navy effectively manage the strike fighter inventory (FA-18) via planned and unplanned maintenance, to ensure strike fighter squadrons are adequately sourced with the right number of FA-18s at the right time?”

“What are the top 3 causes of the challenge?”

  • Delay in IOC of the F-35C
  • Higher flight hour (utilization) and fatigue life expenditure
  • Fixed number of legacy FA-18 in the inventory

“Which factors are in my control and which ones are not?”

In:

  • High flight hour inspection maintenance events
  • Airframe rework (depot events)

Out:

  • Delay in IOC of the F-35C
  • Fixed number of legacy FA-18 in the inventory

“Do I have access to the data that affect the questions above?”

Yes.  The planned IOC of the F-35C, flight hour utilization of FA-18 aircraft, and projected depot capacity and requirements are all data that are available and injected into the NST application.

“How can I use the data to address the challenge?”

Using the forecasted operational schedules of units, users can proactively source FA-18 aircraft to the right squadron at the right time, balanced against maintenance events, depot rework requirements, and the overall service life of each aircraft.
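
As a toy illustration of that balancing act (emphatically not the NST algorithm), a greedy rule might fill each squadron’s requirement from the available aircraft with the most remaining fatigue life, holding back airframes that are due for a depot event:

    # Toy illustration only -- not the NST algorithm. Greedily assign
    # available FA-18s to squadron requirements, preferring airframes with
    # the most remaining fatigue life, skipping those due for depot work.
    from dataclasses import dataclass

    @dataclass
    class Aircraft:
        tail: str
        hours_remaining: float   # fatigue life left before the hour limit
        depot_due: bool          # requires depot rework before flying

    def source_aircraft(fleet, requirements):
        """requirements: {squadron: number_of_jets_needed}"""
        available = sorted(
            (a for a in fleet if not a.depot_due),
            key=lambda a: a.hours_remaining,
            reverse=True,
        )
        plan = {}
        for squadron, need in requirements.items():
            plan[squadron] = [a.tail for a in available[:need]]
            available = available[need:]
        return plan

    fleet = [Aircraft("100", 900, False), Aircraft("101", 150, True),
             Aircraft("102", 450, False), Aircraft("103", 700, False)]
    print(source_aircraft(fleet, {"VFA-34": 2, "VFA-37": 1}))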

Now that the challenge has been framed, the constraints have been identified, and the data identified, the real work can begin.  This is not to say that there is one answer to a tough question or even that there is a big red “Easy” button available.  Rather, this approach ensures that we do not fall victim to fretting over an issue that is beyond our control or spend countless hours wading through data that may not be germane.

NST was designed and developed with the points made above in mind.  The FA-18 is a data-rich aircraft.  However, for the sake of the users, NST was architecturally designed to be mindful of only the key fatigue life expenditure issues that ultimately affect whether an aircraft continues its service life or becomes a museum piece.  In the end, NST’s users are tasked with providing strike fighter aircraft to the units charged with carrying out our national security strategy.  By leveraging the right data, applying rigor to the identification of issues in and out of their control, and harnessing the technology of computational engines, they do precisely that.

ProModel Salutes Founder Charley Harrell for Years of Service…and Getting This Whole Thing Started!

Dr. Charles Harrell founded ProModel in 1988 and was the original developer of the Company’s simulation technology (ProModel PC). Today he serves on the Board of Directors and has been actively involved in new product development, acting as chief technology advisor.  Charley is also an associate professor of Engineering and Technology at Brigham Young University and the author of several simulation books.

In May, Charley officially retired from the company, and to honor his innovative and productive career ProModel held two celebrations this summer at our locations in Orem, Utah and Allentown, Pennsylvania.  The events were attended by ProModel staff and many of Charley’s longtime colleagues who have been with him from the start.

In recent years, Charley has written about his team’s original vision for ProModel back in 1988, “We set out to revolutionize the use of simulation in the business world by introducing the first graphically oriented simulation tool for desktop computers.  We were all convinced that we offered a unique product—a simulation tool that was developed and supported by engineers and specifically designed for engineers.”

Describing the success of ProModel Corporation, Charley writes, “In addition to the impressive growth in ProModel’s predictive simulation technology, it has also been gratifying to see the breadth of application of our technology, not just in Fortune 500 companies, but also in the areas of healthcare, education, homeland security, military readiness and humanitarian aid.”

The entire ProModel family would like to thank Charley for his years of service, guidance and friendship and we wish him all the best in the future! He has made ProModel what we are today.

In 2013 ProModel celebrated its 25th Anniversary, and Charley shared his memories and appreciation for ProModel in this thoughtful BLOG POST.

In the OR with Dale Schroyer

Dale Schroyer – Sr. Consultant & Project Manager

I generally find that in healthcare, WHEN something needs to happen is more important than WHAT needs to happen.  It’s a field that is rife with variation, but with simulation, I firmly believe that it can be properly managed.  Patient flow and staffing are always a top concern for hospitals, but it’s important to remember that utilization levels that are too high are just as bad as levels that are too low, and one of the benefits of simulation in healthcare is the ability to staff to demand.
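
Dale’s point about staffing to demand can be made concrete with a little queueing arithmetic. The sketch below (all numbers are hypothetical) finds the smallest staff count that keeps utilization inside a healthy band, flagging both the overworked and the underused cases:

    # Sketch: choose the smallest staff count that keeps utilization inside
    # a target band. Arrival rate and service time are hypothetical.
    def required_staff(arrivals_per_hr, svc_time_hr, util_lo=0.70, util_hi=0.85):
        offered_load = arrivals_per_hr * svc_time_hr   # Erlangs of work
        staff = 1
        while offered_load / staff > util_hi:          # too busy: burnout risk
            staff += 1
        utilization = offered_load / staff
        if utilization < util_lo:                      # too idle: wasted cost
            print(f"Note: {staff} staff gives only {utilization:.0%} utilization")
        return staff, utilization

    # e.g. 6 patients/hour with a 30-minute service time is 3 Erlangs of work
    print(required_staff(arrivals_per_hr=6, svc_time_hr=0.5))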

Check out Dale’s work with Robert Wood Johnson University Hospital, where they successfully used simulation to manage increased OR patient volume.

About Dale

Since joining ProModel in 2000, Dale has been developing simulation models used by businesses to perform operational improvement and strategic planning. Prior to joining ProModel, Dale spent seven years as a Sr. Corporate Management Engineering Consultant for Baystate Health System in Springfield, MA, where he facilitated quality improvement efforts system-wide, including setting standards and facilitating business re-engineering teams. Earlier he worked as a Project Engineer at the Hamilton Standard Division of United Technologies.

Dale has a BS in Mechanical Engineering from the University of Michigan and a Master of Management Science from Lesley University. He is a certified Six Sigma Green Belt and is Lean Bronze certified.

NEW! ProModel’s Patient Flow Solution:

http://patientflowstudio.com/

ProModel Healthcare Solutions:

http://www.promodel.com/Industries/Healthcare

Teaching Simulation to Graduate Students Using ProModel Products and Real-World Problems

ProModel Guest Blogger:  Larry Fulton, Ph.D. & MSStat – Assistant Professor of Health Organization Management at Texas Tech University Rawls College of Business.  After serving 25 years in military medicine, Dr. Fulton began a second career in teaching and research.

Larry Fulton, Ph.D. & MSStat

Teaching introductory Monte Carlo, Discrete Event, and Continuous simulation to business graduate students requires at least two components beyond a good set of reference materials: realistic or real-world problems and an excellent modeling platform allowing for relatively rapid development.  In the case discussed here, the real-world scenarios derived from the interests and backgrounds of the professor and students (portfolio analysis, sustainability, and military medicine), while ProModel products addressed the platform requirements. Each of the case study scenarios served to underscore various simulation building elements, while ProModel supported rapid product development for a 14-week, lab-intensive course that included some reviews of probability, statistics, queuing, and stochastic processes.

Scenario 1:  Monte Carlo Simulation (Portfolio Analysis)

Business students generally have an affinity for portfolio analysis, and I do as well. Using ProModel features, one of the earliest student projects involves fitting univariate distributions to the return rates of several funds and simulating the results of investment decisions over various time horizons.  Students discuss methods that might account for covariance as well as autoregressive components in these simulations.  While developing the simulations, students also determine sample size requirements to bracket mean return on investment within a specified margin of error at a given confidence level, and they use random number seeds.
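
A stripped-down version of that first project might look like the sketch below, assuming normally distributed monthly returns; the fund names and distribution parameters are hypothetical stand-ins for the ones students fit to real data. Note the fixed seed, and that the half-width of the confidence interval is exactly the quantity students invert to find their sample-size requirement:

    # Minimal Monte Carlo portfolio sketch. Return distributions are
    # hypothetical stand-ins for the ones students fit to real fund data.
    import random
    import statistics

    random.seed(42)                      # fixed seed for reproducible runs

    FUNDS = {                            # (mean, std dev) of monthly return
        "stock_fund": (0.007, 0.045),
        "bond_fund":  (0.003, 0.012),
    }
    WEIGHTS = {"stock_fund": 0.6, "bond_fund": 0.4}

    def simulate_final_value(months, start=10_000):
        value = start
        for _ in range(months):
            r = sum(w * random.gauss(*FUNDS[f]) for f, w in WEIGHTS.items())
            value *= 1 + r
        return value

    runs = [simulate_final_value(months=120) for _ in range(5_000)]
    mean = statistics.mean(runs)
    half_width = 1.96 * statistics.stdev(runs) / len(runs) ** 0.5
    print(f"10-yr mean value: {mean:,.0f} +/- {half_width:,.0f} (95% CI)")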

Scenario 2:  Continuous Simulation using Rainwater Harvesting

Students in this course are generally from a semi-arid region (Central Texas), which has significant water shortages (so much so that desalination is being considered).  I rely 100% on rainwater harvesting for my home water supply, so extending this to each student’s particular home location is trivial. The “Tank Submodule” provides an easy mechanism for developing the simulations.  Students develop conceptual models of the rainwater system as well as flowcharts.  They gather rainfall data from the National Oceanic and Atmospheric Administration and evaluate various roof sizes (capture space), demand figures based on occupants, and tank sizes. They also learn about the importance of order statistics (the distribution of the minimum level in the tank) versus the measures of central tendency that often dominate discussions of simulation. Finally, they incorporate tools and techniques to improve and assess verification and validation (V&V).
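
A minimal daily mass-balance version of that tank model, with hypothetical roof, tank, demand, and rainfall figures, shows why the minimum tank level matters more than the mean:

    # Daily rainwater-tank mass balance. Roof size, demand, and the rainfall
    # series are hypothetical; students pull real NOAA data for their site.
    import random

    ROOF_SQFT = 2500
    GAL_PER_SQFT_PER_INCH = 0.623       # rough roof-capture conversion
    TANK_CAP_GAL = 10_000
    DEMAND_GAL_PER_DAY = 4 * 50         # 4 occupants * 50 gal/day

    random.seed(1)
    rainfall_in = [max(0.0, random.gauss(0.08, 0.25)) for _ in range(365)]

    level = TANK_CAP_GAL / 2            # start half full
    levels, dry_days = [], 0
    for rain in rainfall_in:
        level += rain * ROOF_SQFT * GAL_PER_SQFT_PER_INCH
        level = min(level, TANK_CAP_GAL)   # overflow spills
        level -= DEMAND_GAL_PER_DAY
        if level < 0:
            dry_days += 1
            level = 0                      # tank ran dry
        levels.append(level)

    print(f"minimum level: {min(levels):,.0f} gal; "
          f"mean level: {sum(levels)/len(levels):,.0f} gal; "
          f"dry days: {dry_days}")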

Scenario 3:  Discrete Event Simulation using Military Scenarios

While serving as the Chief of the Operations Research Branch for the Center for Army Medical Department Strategic Studies, I encouraged the use of MedModel for multiple DES projects.  The team built strategic models (resource constrained and unconstrained) for analyzing medical requirements for strategic operations. These same models serve as the basis for a team-based MedModel student capstone project.  The primary entity for these models was the patient, with attributes of severity, injury type, and evacuation type.  The primary processes involved collection, treatment, and evacuation. Resources included ground ambulances, air ambulances, medics, intensive care units, and operating rooms.  Locations were geographic sites throughout the whole of Afghanistan.  Evacuation paths were built, and treatment logic (triage, ground evacuation, air evacuation, etc.) provided the flow.
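
A drastically reduced sketch of that structure (one casualty stream, illustrative times and resource counts) might look like the following in Python with the simpy library; MedModel handles far more, but the entity/resource/process skeleton is the same:

    # Drastically simplified medevac sketch: patients are collected, flown
    # by a limited pool of air ambulances, then treated in a limited number
    # of operating rooms. All counts and times are illustrative.
    import random
    import simpy

    random.seed(7)
    treated = 0

    def patient(env, air_ambulances, operating_rooms):
        global treated
        with air_ambulances.request() as evac:          # wait for an aircraft
            yield evac
            yield env.timeout(random.uniform(20, 60))   # evacuation flight, min
        with operating_rooms.request() as bed:
            yield bed
            yield env.timeout(random.uniform(45, 120))  # treatment, min
        treated += 1

    def casualties(env, air_ambulances, operating_rooms):
        while True:
            yield env.timeout(random.expovariate(1 / 30))  # ~1 casualty/30 min
            env.process(patient(env, air_ambulances, operating_rooms))

    env = simpy.Environment()
    air = simpy.Resource(env, capacity=2)
    ors = simpy.Resource(env, capacity=3)
    env.process(casualties(env, air, ors))
    env.run(until=24 * 60)
    print(f"patients treated in 24 h: {treated}")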

Bottom Line:  The ProModel products are outstanding for use in both teaching and industry.

Larry Fulton Bio:

http://www.depts.ttu.edu/rawlsbusiness/people/faculty/hom/larry-fulton/

Teaching Process Management Using ProModel

ProModel Guest Blogger:  Scott Metlen, Ph.D. – Business Department Head and Associate Professor at University of Idaho

Scott Metlen, Ph.D.

Understanding process management, the design, implementation, management and control, and continuous improvement of the enterprise-wide set of an organization’s processes, is the key to well-deployed strategies. It was not until Tim Cook made Apple’s total set of processes world class, including all supply-chain-linked processes (Brownlee, 2012), that Apple hit its amazing climb to become the world’s most highly valued company, even though the company had cutting-edge products before his arrival. Gaining an effective understanding of process management is not easy due to the strategic variability inherent in the portfolio of products that companies sell and in the markets they serve. This strategic variability (Suri, 2011) in turn drives variability in many of the processes that an organization uses to operate. For instance, different markets require different marketing plans supported by different processes.  Order processes often vary by product and target market. Employee skill sets differ by product, requiring different hiring and training processes. Different products, whether services or goods, that have even a slight variation require, at the very least, an adjustment to the production process. Adding to, and often caused by, the variability just mentioned are multiple process steps, each with different duration times and human resource skills.  Depending on what product is currently being produced, process steps, process step order and duration, interdependency between the process steps, and business rules all vary. Where a product is in its life cycle will drive the experience curve, again creating variation across products. In addition, the numerous interfaces with other processes all vary depending on the product being produced. All of these sources of variability can make process management hard to do, teach, and learn. One tool that helps with process management in the face of variance is discrete event simulation, and one of the best software suites to use is ProModel. ProModel is a flexible program with excellent product support from the company.

Effective process management is a multi-step process. The first step is to determine the process flow while at the same time identifying the value-added and non-value-added process steps. Included in the process flow diagram for each step are the duration times by product, the resources needed at each step, and product routes. Also needed at this time are the business rules governing the process, such as working hours, safety envelopes, quality control, queueing rules, and many others. Capturing this complex interrelated system begins by visiting the process and talking with the process owner and operators. Drawing the diagram and listing other information is a good second step, but actually building and operating the process is when a person truly understands the process and its complexities.  Of course, many of the processes we want to improve are already built and in use. In most cases, students will not be able to do either of these. However, building a verified and validated simulation model is a good proxy for doing the real thing, as the model will never validate against the actual process output unless all of the complexity is included or represented in the model.

In the ‘Systems and Simulation’ course at the University of Idaho, students first learn fundamentals of process management, including lean terms and tools. Then they are given the opportunity to visit a company in the third week of class as members of a team to conduct a process improvement project. In this visit students meet the process owner and operators. If the process is a production process, they walk the floor and discuss the process and the delta between expected and actual output. If the process is an information flow process, such as much of an order process, the students discuss the process and, again, the delta between expected and realized output. Over the next six weeks students take the preliminary data and begin to build a simulation model of the current state of the process. During this time period students discover that they do not have all the data and information they need to replicate the actual process. In many cases they lack the data and/or information because the company does not have it, or because the process is not operated the way it was designed. Students then have to contact the process owner and operators throughout the six weeks to determine the actual business rules used and/or make informed assumptions to complete their model.

Once the model has been validated and the students have a deep understanding of the process, they start modeling process changes that will eliminate waste in the system, increase output, and decrease cost. Examples of methods used to improve the process include changing business rules, adding strategically placed buffers and resources, and reallocating resources. To determine the most effective way to improve the process, a cost-benefit analysis in the form of an NPV analysis is completed. The students use the distribution of outputs from the original model to generate appropriate output and then compare that output to output pulled from the distributions of each improvement scenario. This comparison is then used to determine a 95% confidence interval for the NPV and the probability of the NPV being zero or less. Finally, several weeks before the semester is finished, students travel to the company to present their findings and recommendations.
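
The comparison step is worth seeing in miniature. Below is a sketch, with made-up distributions and dollar figures, of turning paired baseline and improved-scenario outputs into an NPV confidence interval and a probability that the NPV is zero or less:

    # Sketch of the NPV comparison: pair outputs drawn from the baseline and
    # improvement-scenario distributions, convert the throughput gain to an
    # NPV, then summarize. All distributions and dollar figures are made up.
    import random
    import statistics

    random.seed(3)
    N = 2_000
    MARGIN_PER_UNIT = 12.0        # contribution margin, $/unit
    DISCOUNT = 0.10
    YEARS = 3
    INVESTMENT = 150_000          # cost of the proposed changes

    def npv_of_gain(extra_units_per_year):
        cash = extra_units_per_year * MARGIN_PER_UNIT
        return sum(cash / (1 + DISCOUNT) ** t
                   for t in range(1, YEARS + 1)) - INVESTMENT

    baseline = [random.gauss(50_000, 4_000) for _ in range(N)]   # units/year
    improved = [random.gauss(55_000, 4_500) for _ in range(N)]

    npvs = [npv_of_gain(i - b) for b, i in zip(baseline, improved)]
    mean = statistics.mean(npvs)
    hw = 1.96 * statistics.stdev(npvs) / N ** 0.5
    p_loss = sum(v <= 0 for v in npvs) / N
    print(f"NPV 95% CI: {mean:,.0f} +/- {hw:,.0f}; P(NPV <= 0) = {p_loss:.2%}")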

Student learning on these projects is multifaceted. Learning how to use ProModel is the level that the students are most aware of during the semester, as it takes much of their time. However, by the end of the semester they talk about improving their ability to manage processes, work in teams, deal with ambiguity, manage multiple projects, present to high level managers, and maintain steady communication with project owners.

Utilizing external projects and discrete event simulation to teach process management has been used in the College of Business and Economics at the University of Idaho for the past six years. As a result, the Production and Operations area has grown from 40 to 150 students and from five to 20 projects per semester. More importantly, students who complete this course are being sought out and hired by firms based on the transformational learning and skill sets students acquired through the program.

References:

Suri, Rajan. Beyond Lean: It’s About Time. Technical Report, Center for Quick Response Manufacturing, University of Wisconsin-Madison, 2011.

Brownlee, John. “Apple’s Secret Weapon.” CNN.com, June 13, 2012. http://www.cnn.com/2012/06/12/opinion/brownlee-apple-secret/index.html?hpt=hp_t2. Accessed December 2014.

Scott Metlen Bio:

http://www.uidaho.edu/cbe/business/scottmetlen

 

Happy Holidays!

Keith Vadas – President & CEO ProModel Corporation

The ProModel family would like to wish everyone a very joyous holiday season and a prosperous 2015!  We thank you for all your support and business this past year.  As always, our goal is to help you meet or exceed your performance goals.  We hope that our people and solutions were able to assist you in that endeavor this past year.

2014 was a busy year for ProModel, filled with exciting new products like Process Simulator Pro, revamped releases of ProModel, MedModel and Enterprise Portfolio Simulator, and of course our custom solutions designed for a host of clients across all industries. As most of you know, we have an extraordinary team of consultants and software developers always available to help your organization meet the next business challenge. Looking ahead, 2015 is shaping up to be another BIG year here at ProModel as we continue to develop new products including Healthcare solutions and other business improvement tools.

Thank you, and I wish you and your families a happy holiday and a joyful New Year.

 

Team ProModel Conquers Ragnar Once Again!

Team ProModel at the Finish Line

The 2014 Ragnar Relay Recap…according to Jay Wisnosky, Tim Shelton, and Pat Sullivan

So there’s this event where 12 people team together, split up runners into two separate vans and then run a 200 mile relay. It’s called the Ragnar Relay.  Yes, that’s how it first gets explained to you…

Then you get more information like, “you’ll have to run about 15-18 miles tops. It’s tough and there’s a lot of hills, but it’s a lot of fun.”  Fun?

https://www.ragnarrelay.com/

“12 friends, 2 vans, 2 days, 1 night, 200 mile relay…unforgettable stories.”  This is Ragnar. Pat Sullivan’s blog about Ragnar began with this quote last year, and I think it summarizes the event for the rest of us still.

But to get a true picture of Ragnar, you really have to put yourself in a white, 15 person passenger van with 5 other people. It’s close quarters in there. It goes from clean one minute to trashed the next and never smells good or is quiet enough to sleep. Some people are your co-workers, some are friends, and some are complete strangers. You then have to imagine you are about to run anywhere from 4 to 8 miles – it’s now YOUR turn. Whatever routine you had to get ready to run at home is gone…replace that with stretching in a van surrounded by running shoes and gym bags. This is when you start to get nervous because you’re in unfamiliar territory, you’re excited, but also tired, and there’s a good chance you have to go to the bathroom from all that water you’ve been drinking. This is when you hope you trained enough. This is when you tell yourself that after this leg, you still have two more to go…and you probably won’t be sleeping between them. This is when you say, “what did I get myself….” and then one of your teammates asks, “what do you need? Some water? Something to eat?” And you relax, knowing that the collection of people in that van are with you – they have your back and will help you through it, even if you are wishing you had trained for this a lot harder than you actually did.

Kelly handing off to Jason

Another year, another ProModel Ragnar team built on commitment, dedicated teamwork and a great mixture of veteran leadership and new, eager faces.  From October 24-26, Team ProModel meshed as a team in one of America’s most grueling endurance races. The Chattanooga to Nashville Ragnar Relay undoubtedly demanded an often extraordinary level of dedication and sacrifice.  The twelve-person 2014 team consisted of team captain Tim Shelton (ProModel Sr Army Program Manager), Pat Sullivan (ProModel VP for Army Programs), Dan Hickman (ProModel CTO), Clay Gifford (ProModel Developer for DST), Jay Wisnosky (ProModel Technical Writer for DST), Brian Brown, Susan Whitehead, Sheri Shamwell, Mickelle Penn, Kelly Parker, Lisa Reyes and Jason Mcormick.  And of course, with a great deal of support and commitment from Keith Vadas and Carl Napoletano…and the incredible effort of Christine Bunker (ProModel marketing) and our awesome driver (Chief Reyes).

Lisa Reyes kicked off race day at 07:30 Friday morning at a beautiful waterfront setting on the Tennessee River in Chattanooga.  Each runner was scheduled to run three legs during the estimated 34 hours it would take to complete the race.  We planned for each of our 12 runners to complete 16-19 miles.  The two vans of Team ProModel met briefly throughout the race, with 5 exchange points where the baton was handed over from one van to the next.

Lisa Reyes going uphill

Miles and miles passed with each runner facing his or her own set of obstacles. Some ran steep hills (Brian Brown climbed 1300 feet in elevation over 8+ miles on his first leg) or ran through the wee hours of the night with the sounds of dogs barking (and growling sound machines coming from another van), as Mike Penn would come to experience. Others came down the other side of those steep hills and endured the bright autumn mid-afternoon sun; Pat Sullivan can now vouch that 9 miles of beautiful Tennessee countryside is sometimes blurred by surprising heat.  However, Team ProModel banded together to support each other, as well as runners from other teams.
Dan Hickman feeling strong…on his first leg

There were plenty of laughs in between – often over snack choices, foot odor, getting passed on the course by 12 year olds, bathroom strategies, sore muscles that make you walk funny, and delusions caused by lack of sleep. We spotted the little-known Ragnasaurus, our vans were “branded” with magnets and paint from other teams, some people gained nicknames, and we all learned the value of fast restaurant service and having a bed instead of a gym floor to rest.

Team ProModel made it 198 miles through the mountains, into the rolling hills of Tennessee and eventually to the Music City that is Nashville. This group grew to become teammates and friends after starting out with one common goal in mind – just run and have fun! Thanks again for the great support and for allowing us to represent ProModel…we know you would have been proud.

Susan Whitehead with the Ragnar Bear

Tim Shelton running his last leg

Lisa Reyes, Brian Brown, Dan Hickman, Tim Shelton, Kelly Parker  

Knight Runner

Dan Hickman, Clay Gifford, Pat Sullivan, Tim Shelton, Jay Wisnosky

Dan Hickman hands off to Tim Shelton

Team ProModel 2014