Top 8 Benefits of Proactive Patient Flow Optimization

Dan Hickman, ProModel CTO

Unpredictably high numbers of scheduled admissions and an uncertain number of available beds.

Stressed staff due to ED boarding, long patient wait times, and off-service placements.

Length of stay and cost-per-case metrics that exceed CMS value-based care efficiency measures.

Sound familiar? 

Patient flow optimization is one of the most cost-effective ways to improve operational effectiveness, the patient stay experience and your hospital’s bottom line. Here’s how.

Top 8 Reasons to Implement Patient Flow Optimization Today

  1. Decrease the Length of Stay (LOS). Find “hidden discharges” (patients who are likely candidates for discharge based on diagnosis codes and average-LOS benchmarks) in your current census; see the sketch following this list.
  2. Improve Bottleneck and ADT Issue Visibility. Simply having data does not empower decision makers. In fact, too much data can cause clinical operations staff to ignore it altogether. A patient flow optimization system delivers visual data all hospital staff can easily digest and use to make informed decisions that benefit the hospital and the patients.
  3. Right-size Staffing. By coupling accurate census predictions with staff needs, your health system will experience lower labor costs based on predictable admit, discharge and transfer (ADT) cycles, optimal staffing sizes and diminished demand for expensive nursing agency personnel.
  4. Enhance the Patient Journey. Minimize patient frustration by admitting the vast majority of inpatients to on-service units, even during peak periods.
  5. Capture Additional Revenue. Decreasing length of stay frees bed capacity, so the hospital can accommodate additional admissions rather than turning patients away.
  6. Increase Access to Care. Patient flow optimization decreases ED boarding duration, speeds up admissions, and lowers left without being seen (LWBS) rates.
  7. Lower Infrastructure Costs. With patient flow optimization, health systems make optimal use of the existing hospital’s physical footprint, avoiding unnecessary costly build outs.
  8. Improve Staff Satisfaction. Welcome to the stress-free huddle. FutureFlow Rx gives your staff a personal heads-up on issues affecting admissions, discharges and transfers, so they can be addressed at huddle meetings. Prescriptive corrective actions from the patient flow optimization system further empower staff with recommendations based on data and simulation.
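To make the “hidden discharges” idea in item 1 concrete, here is a minimal Python sketch of flagging census patients whose current stay has run past the average for their diagnosis. The field names, diagnosis codes, and benchmark values are hypothetical illustrations, not FutureFlow Rx’s actual data model or logic.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class CensusPatient:
    patient_id: str
    diagnosis_code: str      # e.g. a DRG or ICD-10 code (hypothetical values below)
    admitted_at: datetime

# Illustrative benchmarks: average LOS in days by diagnosis code.
# In practice these would be derived from historical ADT data.
AVG_LOS_DAYS = {"DRG-470": 2.3, "DRG-871": 5.1}

def hidden_discharges(census: list[CensusPatient], now: datetime) -> list[CensusPatient]:
    """Flag patients whose current LOS exceeds the average for their diagnosis."""
    flagged = []
    for p in census:
        benchmark = AVG_LOS_DAYS.get(p.diagnosis_code)
        if benchmark is None:
            continue                      # no benchmark for this diagnosis
        los_days = (now - p.admitted_at).total_seconds() / 86400
        if los_days > benchmark:
            flagged.append(p)             # likely discharge candidate for review
    return flagged
```

A list like this is a starting point for the morning huddle, not an order: each flagged patient still needs clinical review.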

 

About FutureFlow Rx™ Patient Flow Optimization

FutureFlow Rx by ProModel uses historical patient flow patterns, real-time clinical data, and discrete event simulation to reveal key trends, provide operational insights, and deliver specific corrective action recommendations to enhance the patient stay experience, lower costs and drive additional revenues. Our platform accurately predicts future events, helping hospitals make the right operational decisions to reduce risk, decrease LOS and improve operational margins. Schedule a demo.


FutureFlow Rx’s dashboard consists of key performance indicator (KPI) “cards”. The left side of each card shows the last 24 hours; the right side predicts the “Next 24”; and clicking the “light bulbs” in the upper right provides prescriptive actions to improve the predicted future.
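One way to picture the contents of such a card is the sketch below; the field names and sample values are hypothetical, not the product’s actual schema.

```python
from dataclasses import dataclass, field

@dataclass
class KpiCard:
    name: str                 # the KPI, e.g. "Occupied Beds" (hypothetical)
    last_24: float            # observed value over the last 24 hours
    next_24: float            # simulation-based prediction for the next 24 hours
    actions: list[str] = field(default_factory=list)   # prescriptive "light bulb" items

census_card = KpiCard(
    name="Occupied Beds",
    last_24=212,
    next_24=227,
    actions=["Expedite likely discharge candidates on the surgical unit before the morning huddle"],
)
```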

 

New Year, New Ideas

2016 brings a new year, and it looks to be one of major changes and opportunities in the healthcare business. Are you ready to try some new methods of improving your medical practice, hospital, or clinic? Simulation has been used by healthcare professionals for over 25 years. Drawing on our long experience in the healthcare industry, ProModel has assembled a collection of demonstration models that quickly show you applications of simulation you may never have considered.

Our first demo model is a simple one: a Clinical Access Time Model. It demonstrates the ability to model parking lots and access times for patients, and it shows the basic capabilities of ProModel’s MedModel simulation tool.

Stay tuned over the next several blog posts, where we will share other MedModel demos with you. Below is a list of the simulations to come.

  1. Appointment Routine
  2. Use of Independent Arrivals
  3. California City Planning ER & Other Services
  4. Radiology Clinic with Costing Features
  5. Day Surgery
  6. Emergency Department
  7. Emergency Departments with Scenarios
  8. Comparing Defibrillators
  9. Hospital
  10. Eye Clinic
  11. Generic Lab
  12. General Hospital ICU Comparison
  13. Managed Care
  14. Nursing Unit
  15. Operating Room Suite
  16. Pediatric Clinic
  17. Pharmacy
  18. Radiology Clinic
  19. Retaining an Exam Room
  20. Urology Clinic
  21. Women’s Diagnostic Clinic
  22. X-Ray Clinic

These and many other solution videos are available on our YouTube Channel.

If you would like more information about ProModel solutions contact us.

 

Demystifying Big Data

Rob Wedertz – Director, Navy Programs

We live in a data-rich world, and it has been that way for a while now. “Big Data” is the moniker that permeates every industry. To set up the point made in the paragraphs that follow, consider the following:

FA-18 / Extension / Expenditure / Life / Depot / Operations / Hours / Fatigue

Taken independently, the words above mean very little.  However, if placed in context, and with the proper connections applied, we can adequately frame one of the most significant challenges confronting Naval Aviation:

A higher-than-anticipated demand for flight operations of the FA-18 aircraft has resulted in more flight hours being flown per aircraft. This has necessitated additional depot maintenance events to remedy fatigue life expenditure issues and extend the life cycles of legacy FA-18 aircraft.

ARABIAN GULF (June 13, 2012) An F/A-18C Hornet assigned to the Blue Blasters of Strike Fighter Squadron (VFA) 34 launches from the flight deck of the Nimitz-class aircraft carrier USS Abraham Lincoln (CVN 72). Lincoln is deployed to the U.S. 5th Fleet area of responsibility conducting maritime security operations, theater security cooperation efforts and combat flight operations in support of Operation Enduring Freedom. (U.S. Navy photo by Mass Communication Specialist 2nd Class Jonathan P. Idle/Released)

The point here is that it is simply not enough to aggregate data for the sake of aggregation. The true value in harnessing data lies in knowing which data are important, which are not, and how to tie the data together. Oftentimes, subscribing to the “big data” school of thought can distract and misdirect. I would argue that any exercise in “data” must first begin with a methodical approach to answering the following questions:

“What challenge are we trying to overcome?”

“What are the top 3 causes of the challenge?”

“Which factors are in my control and which ones are not?”

“Do I have access to the data that affect the questions above?”

“How can I use the data to address the challenge?”


While simply a starting point, the questions above will typically allow us to frame the issue, understand its causal effects, and, most importantly, facilitate the process of homing in on the data that are important while systematically ignoring the data that are not.

To apply a real-world example of the methodology outlined above, consider the software application ProModel has provided to the U.S. Navy – the Naval Synchronization Toolset (NST).

“What challenge are we trying to overcome?”

Since 2001, the U.S. Navy has participated in overseas contingency operations (Operation Enduring Freedom and Operation Iraqi Freedom), and the legacy FA-18 aircraft (A-D) has consumed its fatigue life at a higher rate than planned. Coupled with the delay in Initial Operating Capability (IOC) of the F-35C aircraft, the U.S. Navy has been required to develop and sustain a Service Life Extension Program (SLEP) to extend the life of legacy FA-18 aircraft well beyond their six-thousand-hour life expectancy, and to schedule and perform high-flight-hour inspections and major airframe rework maintenance events. The challenge is: “How does the Navy effectively manage the strike fighter inventory (FA-18) via planned and unplanned maintenance, to ensure strike fighter squadrons are adequately sourced with the right number of FA-18s at the right time?”

“What are the top 3 causes of the challenge?”

  • Delay in IOC of the F-35C
  • Higher flight-hour utilization and fatigue life expenditure
  • Fixed number of legacy FA-18s in the inventory

“Which factors are in my control and which ones are not?”

In:

  • High flight hour inspection maintenance events
  • Airframe rework (depot events)

Out:

  • Delay in IOC of the F-35C
  • Fixed number of legacy FA-18s in the inventory

“Do I have access to the data that affect the questions above?”

Yes. The planned IOC of the F-35C, the flight-hour utilization of FA-18 aircraft, and projected depot capacity and requirements are all available and injected into the NST application.

“How can I use the data to address the challenge?”

Using the forecasted operational schedules of units, users can proactively source FA-18 aircraft to the right squadron at the right time, balanced against maintenance events, depot rework requirements, and the overall service life of each aircraft.
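As a toy illustration of that sourcing trade-off (not NST’s actual algorithm), the sketch below fills a squadron’s requirement by preferring jets with the most remaining fatigue life, so no airframe is flown past its limit before a depot slot opens. All names and numbers are assumptions.

```python
from dataclasses import dataclass

@dataclass
class Aircraft:
    buno: str                   # bureau number (tail identifier)
    hours_flown: float
    fatigue_life_hours: float   # usable life, e.g. as extended under SLEP

    @property
    def hours_remaining(self) -> float:
        return self.fatigue_life_hours - self.hours_flown

def source_squadron(inventory: list[Aircraft], demand: int,
                    planned_hours: float) -> list[Aircraft]:
    """Pick `demand` jets that can absorb the squadron's planned flight hours,
    preferring those with the most remaining fatigue life."""
    eligible = [a for a in inventory if a.hours_remaining > planned_hours]
    eligible.sort(key=lambda a: a.hours_remaining, reverse=True)
    return eligible[:demand]

inventory = [Aircraft("165123", 5800, 8000),
             Aircraft("165456", 7400, 8000),
             Aircraft("165789", 6100, 9000)]
print([a.buno for a in source_squadron(inventory, demand=2, planned_hours=400)])
```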

Now that the challenge has been framed, the constraints identified, and the data located, the real work can begin. This is not to say that there is one answer to a tough question, or even that there is a big red “Easy” button available. Rather, this approach ensures that we neither fall victim to fretting over an issue that is beyond our control nor spend countless hours wading through data that may not be germane.

NST was designed and developed with the points made above in mind. The FA-18 is a data-rich aircraft. However, for the sake of the users, NST was architecturally designed to consider only the key fatigue life expenditure issues that ultimately determine whether an aircraft continues its service life or becomes a museum piece. In the end, NST’s users are charged with providing strike fighter aircraft to the units that carry out our national security strategy. By leveraging the right data, applying rigor to identifying the issues in and out of their control, and harnessing the technology of computational engines, they do precisely that.

ProModel Salutes Founder Charley Harrell for Years of Service…and Getting This Whole Thing Started!

Dr. Charles Harrell founded ProModel in 1988 and was the original developer of the Company’s simulation technology (ProModel PC). Today he serves on the Board of Directors and has been actively involved in new product development, acting as chief technology advisor.  Charley is also an associate professor of Engineering and Technology at Brigham Young University and the author of several simulation books.

In May, Charley officially retired from the company, and to honor his innovative and productive career, ProModel held two celebrations this summer at our locations in Orem, Utah and Allentown, Pennsylvania. The events were attended by ProModel staff and many of Charley’s longtime colleagues who have been with him from the start.

In recent years, Charley has written about his team’s original vision for ProModel back in 1988, “We set out to revolutionize the use of simulation in the business world by introducing the first graphically oriented simulation tool for desktop computers.  We were all convinced that we offered a unique product—a simulation tool that was developed and supported by engineers and specifically designed for engineers.”

Describing the success of ProModel Corporation, Charley writes, “In addition to the impressive growth in ProModel’s predictive simulation technology, it has also been gratifying to see the breadth of application of our technology, not just in Fortune 500 companies, but also in the areas of healthcare, education, homeland security, military readiness and humanitarian aid.”

The entire ProModel family would like to thank Charley for his years of service, guidance and friendship and we wish him all the best in the future! He has made ProModel what we are today.

In 2013, ProModel celebrated its 25th anniversary, and Charley shared his memories and appreciation for ProModel in this thoughtful blog post.

In the OR with Dale Schroyer


Dale Schroyer – Sr. Consultant & Project Manager

I generally find that in healthcare, WHEN something needs to happen is more important than WHAT needs to happen. It’s a field rife with variation, but I firmly believe that with simulation it can be properly managed. Patient flow and staffing are always top concerns for hospitals, but it’s important to remember that utilization levels that are too high are just as bad as levels that are too low. One of the benefits of simulation in healthcare is the ability to staff to demand.

Check out Dale’s work with Robert Wood Johnson University Hospital, where the team successfully used simulation to manage increased OR patient volume.

About Dale

Since joining ProModel in 2000, Dale has been developing simulation models that businesses use for operational improvement and strategic planning. Prior to joining ProModel, Dale spent seven years as a Sr. Corporate Management Engineering Consultant for Baystate Health System in Springfield, MA, where he facilitated quality improvement efforts system-wide, including setting standards and facilitating business re-engineering teams. Earlier, he worked as a Project Engineer at the Hamilton Standard Division of United Technologies.

Dale has a BS in Mechanical Engineering from the University of Michigan and a Master of Management Science from Lesley University. He is a certified Six Sigma Green Belt and is Lean Bronze certified.

NEW! ProModel’s Patient Flow Solution:

http://patientflowstudio.com/

ProModel Healthcare Solutions:

http://www.promodel.com/Industries/Healthcare

Teaching Simulation to Graduate Students Using ProModel Products and Real-World Problems

ProModel Guest Blogger:  Larry Fulton, Ph.D. & MSStat – Assistant Professor of Health Organization Management at Texas Tech University Rawls College of Business.  After serving 25 years in military medicine, Dr. Fulton began a second career in teaching and research.

Larry Fulton, Ph.D. & MSStat

Teaching introductory Monte Carlo, discrete event, and continuous simulation to business graduate students requires at least two components beyond a good set of reference materials: realistic or real-world problems, and an excellent modeling platform that allows for relatively rapid development. In the case discussed here, the real-world scenarios derived from the interests and backgrounds of the professor and students (portfolio analysis, sustainability, and military medicine), while ProModel products addressed the platform requirements. Each of the case study scenarios served to underscore various simulation-building elements, while ProModel supported rapid product development for a 14-week, lab-intensive course that included some reviews of probability, statistics, queuing, and stochastic processes.

Scenario 1: Monte Carlo Simulation (Portfolio Analysis)

Business students generally have an affinity for portfolio analysis, and I do as well. Using ProModel features, one of the earliest student projects involves fitting univariate distributions to the return rates of several funds and simulating the results of investment decisions over various time horizons. Students discuss methods that might account for covariance as well as autoregressive components in these simulations. While developing the simulations, students also determine the sample size required to bracket mean return on investment within a specified margin of error and confidence level, and they use random number seeds for reproducibility.
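A minimal sketch of such a run, in Python rather than ProModel, assuming independent normally distributed monthly returns with illustrative fund parameters (covariance and autoregressive structure are deliberately omitted, which is exactly what the class discussion probes):

```python
import numpy as np

rng = np.random.default_rng(seed=42)        # fixed seed, as the students practice

# Illustrative (mean, std dev) of monthly returns, as if fitted per fund
funds = {"stock_fund": (0.007, 0.04), "bond_fund": (0.003, 0.01)}
weights = {"stock_fund": 0.6, "bond_fund": 0.4}
months, n_runs = 120, 10_000                # 10-year horizon, 10,000 replications

terminal = np.zeros(n_runs)                 # portfolio growth factor per replication
for name, (mu, sigma) in funds.items():
    monthly = rng.normal(mu, sigma, size=(n_runs, months))
    terminal += weights[name] * np.prod(1.0 + monthly, axis=1)

half_width = 1.96 * terminal.std(ddof=1) / np.sqrt(n_runs)
print(f"mean growth factor: {terminal.mean():.3f} +/- {half_width:.3f} (95% CI)")
```

The half-width calculation is the hook for the sample-size exercise: students solve for the number of replications needed to shrink it below a target margin of error.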

Scenario 2: Continuous Simulation using Rainwater Harvesting

Students in this course are generally from a semi-arid region (Central Texas) with significant water shortages (so much so that desalination is being considered). I rely 100% on rainwater harvesting for my home water supply, so extending this to each student’s particular home location is trivial. The “Tank Submodule” provides an easy mechanism for developing the simulations. Students develop conceptual models and flowcharts of the rainwater harvesting mechanism. They gather rainfall data from the National Oceanic and Atmospheric Administration and evaluate various roof sizes (capture area), demand figures based on occupants, and tank sizes. They also learn about the importance of order statistics (the distribution of the minimum level in the tank) versus the measures of central tendency that often dominate discussions of simulation. Finally, they incorporate tools and techniques to improve and assess verification and validation (V&V).
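A minimal sketch of the daily tank mass balance behind such a model; the roof size, household demand, capture efficiency, and starting level are assumptions for illustration, not course data:

```python
# Daily mass balance for a rainwater harvesting tank (gallons).
GALLONS_PER_INCH_PER_SQFT = 0.623     # one inch of rain on one square foot of roof

def simulate_tank(daily_rain_inches, roof_sqft=2500.0, occupants=4,
                  gal_per_person_day=50.0, capacity=30000.0, efficiency=0.85):
    """Run one replication over a rainfall trace; return the minimum tank level."""
    level = capacity / 2.0                     # assume the tank starts half full
    min_level = level
    for rain in daily_rain_inches:
        level += efficiency * rain * roof_sqft * GALLONS_PER_INCH_PER_SQFT
        level = min(level, capacity)           # overflow is lost
        level = max(level - occupants * gal_per_person_day, 0.0)
        min_level = min(min_level, level)      # track the order statistic of interest
    return min_level
```

Running this over many sampled rainfall years gives the distribution of the minimum, so a student can ask how often the tank runs dry rather than what its average level is.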

Scenario 3: Discrete Event Simulation using Military Scenarios

While serving as the Chief of the Operations Research Branch for the Center for Army Medical Department Strategic Studies, I encouraged the use of MedModel for multiple DES projects. The team built strategic models (resource-constrained and unconstrained) for analyzing medical requirements for strategic operations. These same models serve as the basis for a team-based MedModel student capstone project. The primary entity for these models was the patient, with attributes of severity, injury type, and evacuation type. The primary processes involved collection, treatment, and evacuation. Resources included ground ambulances, air ambulances, medics, intensive care units, and operating rooms. Locations were geographic points throughout Afghanistan. Evacuation paths were built, and treatment logic (triage, ground evacuation, air evacuation, etc.) provided the flow.
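A minimal sketch of the discrete-event pattern behind such a model, in plain Python rather than MedModel; the triage threshold and service-time parameters are invented for illustration:

```python
import heapq
import random

random.seed(1)
events = []          # event calendar: (time, sequence, action, patient) tuples
seq = 0

def schedule(t, action, patient):
    global seq
    heapq.heappush(events, (t, seq, action, patient))
    seq += 1

def collect(t, patient):
    # triage on collection: the most severe casualties get air evacuation
    patient["evac"] = "air" if patient["severity"] > 7 else "ground"
    schedule(t + random.expovariate(1 / 2.0), "treat", patient)    # ~2 h to treatment

def treat(t, patient):
    mean_delay = 1.0 if patient["evac"] == "air" else 6.0          # evac wait, hours
    schedule(t + random.expovariate(1 / mean_delay), "evacuate", patient)

def evacuate(t, patient):
    print(f"{t:6.2f} h: severity-{patient['severity']} patient evacuated by {patient['evac']}")

handlers = {"collect": collect, "treat": treat, "evacuate": evacuate}
for _ in range(5):                             # five casualties in the first three hours
    schedule(random.uniform(0, 3), "collect", {"severity": random.randint(1, 10)})
while events:                                  # the core DES loop
    t, _, action, patient = heapq.heappop(events)
    handlers[action](t, patient)
```

Resources (ambulances, medics, ORs) would add queues and seize/release logic on top of this loop, which is what the MedModel capstone handles for the students.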

Bottom Line:  The ProModel products are outstanding for use in both teaching and industry.

Larry Fulton Bio:

http://www.depts.ttu.edu/rawlsbusiness/people/faculty/hom/larry-fulton/

Teaching Process Management Using ProModel

ProModel Guest Blogger:  Scott Metlen, Ph.D. – Business Department Head and Associate Professor at University of Idaho

Scott Metlen, Ph.D.

Understanding process management (the design, implementation, management and control, and continuous improvement of an organization's enterprise-wide set of processes) is the key to well-deployed strategies. It was not until Tim Cook made Apple's total set of processes world class, including all supply-chain-linked processes (Brownlee, 2012), that Apple began its amazing climb to become the world's most valuable company, even though it had cutting-edge products before his arrival. Gaining an effective understanding of process management is not easy, due to the strategic variability inherent in the portfolio of products that companies sell and in the markets they serve. This strategic variability (Suri, 2011) in turn drives variability in many of the processes an organization uses to operate. For instance, different markets require different marketing plans supported by different processes. Order processes often vary by product and target market. Employee skill sets differ by product, requiring different hiring and training processes. Different products, whether services or goods, that have even a slight variation require, at the very least, an adjustment to the production process. Adding to, and often caused by, the variability just mentioned are multiple process steps, each with different duration times and human resource skills. Depending on what product is currently being produced, the process steps, their order and duration, the interdependencies between them, and the governing business rules all vary. Where a product is in its life cycle drives the experience curve, again creating variation across products. In addition, the numerous interfaces with other processes all vary depending on the product being produced. All of these sources of variability can make process management hard to do, teach, and learn. One tool that helps with process management in the face of variance is discrete event simulation, and one of the best software suites to use is ProModel, a flexible program with excellent product support from the company.

Effective process management is a multi-step process. The first step is to determine the process flow while at the same time identifying the value-added and non-value-added process steps. Included in the process flow diagram for each step are the duration times by product, the resources needed at each step, and product routes. Also needed at this time are the business rules governing the process, such as working hours, safety envelopes, quality control, queueing rules, and many others. Capturing this complex, interrelated system begins by visiting the process and talking with the process owner and operators. Drawing the diagram and listing other information is a good second step, but a person only truly understands a process and its complexities by actually building and operating it. Of course, many of the processes we want to improve are already built and in use, and in most cases students will not be able to do either of these. However, building a verified and validated simulation model is a good proxy for doing the real thing, as the model will never validate against the actual process output unless all of the complexity is included or represented in the model. In the ‘Systems and Simulation’ course at the University of Idaho, students first learn the fundamentals of process management, including lean terms and tools. In the third week of class they visit a company as members of a team that will conduct a process improvement project. In this visit students meet the process owner and operators. If the process is a production process, they walk the floor and discuss the process and the delta between expected and actual output. If the process is an information flow process, such as much of an order process, the students discuss the process and, again, the delta between expected and realized output. Over the next six weeks, students take the preliminary data and build a simulation model of the current state of the process. During this period students discover that they do not have all the data and information they need to replicate the actual process, in many cases because the company does not have that information or because the process is not operated the way it was designed. Students then contact the process owner and operators throughout the six weeks to determine the actual business rules used, and/or make informed assumptions to complete their model.

Once the model has been validated and the students have a deep understanding of the process, they start modeling process changes that will eliminate waste in the system, increase output, and decrease cost. Examples of methods used to improve the process include changing business rules, adding strategically placed buffers and resources, and reallocating resources. To determine the most effective way to improve the process, a cost-benefit analysis in the form of a net present value (NPV) analysis is completed. The students use the distribution of outputs from the original model to generate appropriate output, and then compare that output to output pulled from the distributions of each improvement scenario. This comparison is used to determine a 95% confidence interval for the NPV and the probability of the NPV being zero or less. Finally, several weeks before the semester is finished, students travel to the company to present their findings and recommendations.
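A minimal sketch of that comparison, using two illustrative normal distributions in place of the students' actual ProModel output, and treating each replication's incremental cash flow as a level annuity over the analysis horizon:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# Stand-ins for the two simulated output distributions (annual cash flow, $)
baseline = rng.normal(1_000_000, 120_000, size=n)    # current-state model
scenario = rng.normal(1_150_000, 130_000, size=n)    # proposed improvement
investment = 400_000                                 # one-time cost of the change
rate, years = 0.08, 5

# Incremental annual cash flow per replication, discounted as a level annuity
delta = rng.choice(scenario, n) - rng.choice(baseline, n)
annuity_factor = sum(1 / (1 + rate) ** t for t in range(1, years + 1))
npv = delta * annuity_factor - investment

lo, hi = np.percentile(npv, [2.5, 97.5])
print(f"95% interval for NPV: ${lo:,.0f} to ${hi:,.0f}")
print(f"P(NPV <= 0) = {(npv <= 0).mean():.3f}")
```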

Student learning on these projects is multifaceted. Learning how to use ProModel is the layer students are most aware of during the semester, as it takes much of their time. By the end of the semester, however, they also talk about improving their ability to manage processes, work in teams, deal with ambiguity, manage multiple projects, present to high-level managers, and maintain steady communication with project owners.

Utilizing external projects and discrete event simulation to teach process management has been the practice in the College of Business and Economics at the University of Idaho for the past six years. As a result, the Production and Operations area has grown from 40 to 150 students and from five to 20 projects per semester. More importantly, students who complete this course are being sought out and hired by firms based on the transformational learning and skill sets they acquire through the program.

References:

Suri, Rajan. Beyond Lean: It’s About Time. Technical Report, Center for Quick Response Manufacturing, University of Wisconsin-Madison, 2011.

Brownlee, John. “Apple’s Secret Weapon.” CNN, June 13, 2012. http://www.cnn.com/2012/06/12/opinion/brownlee-apple-secret/index.html?hpt=hp_t2. Accessed December 2014.

Scott Metlen Bio:

http://www.uidaho.edu/cbe/business/scottmetlen