Top 8 Benefits of Proactive Patient Flow Optimization


Dan Hickman ProModel CTO

Unpredictably high numbers of scheduled admissions and an uncertain number of available beds.

Stressed staff due to ED boarding, long patient wait times, and off-service placements.

Length of stay and cost per case metrics exceed CMS value-based care efficiency measures.

Sound familiar? 

Patient flow optimization is one of the most cost-effective ways to improve operational effectiveness, the patient stay experience and your hospital’s bottom line. Here’s how.

Top 8 Reasons to Implement Patient Flow Optimization Today

  1. Decrease the Length of Stay (LOS). Find “hidden discharges” (potential candidates for discharge based on diagnosis codes and average LOS metrics) in your current census.
  2. Improve Bottleneck and ADT Issue Visibility. Simply having data does not empower decision makers. In fact, too much data can cause clinical operations staff to ignore it altogether. A patient flow optimization system delivers visual data all hospital staff can easily digest and use to make informed decisions that benefit the hospital and the patients.
  3. Right-size Staffing. By coupling accurate census predictions with staff needs, your health system will experience lower labor costs based on predictable admit, discharge and transfer (ADT) cycles, optimal staffing sizes and diminished demand for expensive nursing agency personnel.
  4. Enhance the Patient Journey. Minimize patient frustration by admitting the vast majority of inpatients to on-service units, even during peak periods.
  5. Capture Additional Revenue. Decreasing length of stay increases bed capacity, so fewer patients leave the hospital without being seen.
  6. Increase Access to Care. Patient flow optimization decreases ED boarding duration, speeds up admissions, and lowers left without being seen (LWBS) rates.
  7. Lower Infrastructure Costs. With patient flow optimization, health systems make optimal use of the existing hospital’s physical footprint, avoiding unnecessary costly build outs.
  8. Staff Satisfaction. Welcome to the stress-free huddle. FutureFlow Rx gives your staff a personal heads-up on issues affecting admissions, discharges and transfers, so they can be addressed at huddle meetings. Prescriptive corrective actions from the patient flow optimization system further empower staff with recommendations based on data and simulation.
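The "hidden discharge" screen described in item 1 can be sketched in a few lines: compare each patient's current length of stay (LOS) to the historical average LOS for the same diagnosis code. The field names, diagnosis codes, and threshold rule below are illustrative assumptions for a minimal sketch, not FutureFlow Rx internals.

```python
# Hypothetical sketch of a "hidden discharge" screen: flag patients whose
# current LOS meets or exceeds the historical average LOS for their
# diagnosis code. All data and field names here are illustrative.

avg_los_by_dx = {"J18.9": 4.2, "I50.9": 5.1, "N39.0": 3.0}  # days, historical averages

census = [
    {"patient_id": "A101", "dx_code": "J18.9", "current_los": 5.0},
    {"patient_id": "A102", "dx_code": "I50.9", "current_los": 2.5},
    {"patient_id": "A103", "dx_code": "N39.0", "current_los": 3.5},
]

def hidden_discharge_candidates(census, avg_los_by_dx):
    """Return patients at or beyond the average LOS for their diagnosis."""
    return [
        p["patient_id"]
        for p in census
        if p["current_los"] >= avg_los_by_dx.get(p["dx_code"], float("inf"))
    ]

print(hidden_discharge_candidates(census, avg_los_by_dx))  # ['A101', 'A103']
```

In practice such a screen would be one input among many; diagnosis-level averages are a coarse proxy, and clinical judgment always decides the actual discharge.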

 

About FutureFlow Rx™ Patient Flow Optimization

FutureFlow Rx by ProModel uses historical patient flow patterns, real-time clinical data, and discrete event simulation to reveal key trends, provide operational insights, and deliver specific corrective action recommendations to enhance the patient stay experience, lower costs and drive additional revenues. Our platform accurately predicts future events, helping hospitals make the right operational decisions to reduce risk, decrease LOS and improve operational margins. Schedule a demo.


FutureFlow Rx’s dashboard consists of key performance indicator (KPI) “cards”. The left side of each card shows the last 24 hours; the right side predicts the “Next 24”; and clicking the upper-right “light bulbs” provides prescriptive actions to improve the predicted future.

 

Lead Materiel Integrator – Decision Support Tool: Provides Total Asset Visibility and Planning Capability Not Previously Available


Tim Shelton Sr. ProModel Program Manager for Army Programs

With over 6,000 current users, the Lead Materiel Integrator (LMI) Decision Support Tool (DST), developed by ProModel Corporation, is the Army’s sole equipment distribution and redistribution tool.  From sourcing equipment and improving readiness at the tactical level, to cost-based decision making at the strategic level, DST provides the total asset visibility and planning capability that was previously unavailable to Army staff and materiel managers.

“If you are a Logistician in today’s Army and not talking PSDs [Proposed Sourcing Decisions] you are irrelevant.”  – MG Hurley FORSCOM G4

DST not only consumes and displays authoritative data from multiple systems of record (e.g., LMP, GCSS-A, PBUSE, JMAR, DPAS, AE2S), but it also displays the “due-in” and “due-out” transactions.  This comprehensive picture is needed to provide the complete asset visibility and transactional tracking that enables Army planners and materiel managers to make timely, cost-effective distribution and redistribution decisions.

As the Army has continued to restructure, the predictive analytic capability of the tool has become the backbone of Army planning for equipment redistribution.  DST’s blue sky planning capability allows Army planners to match scheduled force structures with contoured authorizations.  Once that process is completed, the tool runs course-of-action (COA) excursions (i.e., simulations) to identify the best solution for achieving key performance metrics for redistribution of materiel.  Users can define strategies and adjust a multitude of variables to determine the optimal solution.  A few of these variables may include: second-destination transportation cost, modernization of equipment, and pure fleeting of equipment.

The impact of the Army restructuring effort is measured in the redistribution of hundreds of thousands of pieces of equipment. Using ProModel’s application framework, DST allows Army planners to run multiple, future COA excursions.  With the tool’s auto-optimization feature, these excursions optimize misaligned equipment across every major command within the Army.  Then, the blue sky planning feature sources from inactive units, and from pure excesses, to fill any confirmed shortages across the Army.  The flexibility of the tool allows newly activated or converting units to be sourced as priority fills.  This type of in-depth analysis, which previously would have taken months to accomplish, can now be accomplished in hours.  The LMI DST is facilitating the change in an ever-changing Army.

Remembering Rob Bateman


Charles Harrell, Founder ProModel Corporation

Last month we lost a long-time member of the ProModel family and, for many of us, a beloved friend. Following a sudden incident of heart failure while working out at the gym, Rob Bateman passed away on October 11, 2015.  We at ProModel will remember him as a warm, energetic, impassioned leader and friend whose life was devoted to the pursuit of excellence and selfless service. His absence will continue to be profoundly felt in the months and years to come.

I first met Rob just over 25 years ago when he was doing graduate studies at BYU. He took a simulation class from me, and I could tell he was excited about the potential benefits of simulation. So after completing a stint with the U.S. State Department as a foreign-service officer in 1990, Rob began working as a ProModel distributor. With his international background and grasp of simulation, Rob eventually became the Vice President of International Operations and later established an independent company (Dynamisis A.G.) to direct all international operations for ProModel.


Robert Bateman

(April 4, 1958 – October 11, 2015)

Rob was an extraordinary individual with remarkable talents. He was one of those individuals who was always on-the-go and seemed to cram more into one day than most of us manage to accomplish in several days. At the same time, he maintained a zest for life and could frequently be seen buzzing around in his sports car with his signature driving cap or biking into work in his cycling shorts and helmet.

Here are a few of the many talents Rob displayed:

  • He was very knowledgeable…about everything. No matter what subject was being discussed, he always had something intelligent to contribute to the discussion. On top of his formal education, which culminated with a Ph.D. in Public Administration/Political Science, Rob filled his spare moments reading books or one of the 14 magazines he subscribed to.
  • As a consummate teacher he was passionate about getting people exposed to simulation. He wrote the first textbook on ProModel for use in college courses. For the past decade, when he wasn’t working with distributors to promote ProModel he was teaching at the local university.
  • He was an effective mentor and gave many individuals their first start in their careers. When several international distributors were asked what they remember about Rob, they invariably said he treated them as valued partners and became someone they could always turn to for advice.
  • He was resourceful and knew how to get by on very little sleep, food and comforts. When there wasn’t sufficient budget or resources to support an initiative he believed in, he somehow always managed to find the means needed to get the job done.
  • He was a real cosmopolitan and world traveler. If you ever called Rob, you would be just as likely reaching him at some airport as in his office. And there didn’t seem to be any country where he felt uncomfortable or couldn’t speak the language.
  • He was a friend to all and he never let business stand in the way of personal relationships. He took time to express an interest in others and always sensed if one was having a bad day or dealing with problems at home. He would do whatever he could to lift them up and help them keep things in perspective.
  • Finally, Rob was funny and had an infectious sense of humor. He could tell endless stories of his travel exploits where he encountered bizarre situations like returning to his car only to find all of his tires stolen. Though Rob took his commitments seriously, he never took himself too seriously.

Here are a few memories related by some of the distributors who worked with him.

A Distributor in Germany and Austria relates, “On my first trip to Utah to visit with Rob as a new ProModel rep, I had the feeling I was meeting with an old friend. I was impressed by his hospitality and the time that he gave me.”

A Brazilian distributor recalled meeting Rob for the first time 21 years ago and thinking to himself, “Who is this guy, who can conduct a meeting with high-level business leaders, comfortably use legal and business terms in both German and English, and then turn around the following day and teach a simulation course in Spanish to a group of engineers? How can one person have so many skills?”

As another of his co-workers commented, “I’ve been in rooms with him teaching and negotiating with Nigerians, Germans, Japanese, Brazilians, Mexicans, and more. No matter the nationality, Rob could relate and connect. He was confident, knowledgeable, and personable.”

On a personal note, one co-worker related: “This past year Rob joined the cycling team that I belong to called Team C4C (“Cycle for Cure”). The team was formed to raise money for health-related charities such as the Huntsman Cancer Center and the National MS Society. Rob immediately identified with the purpose of the group and quickly became one of the strongest riders on the team.”

This same co-worker related how Rob was instrumental in helping him complete a grueling ‘Ultimate Challenge’ cycling event saying, “I will always cherish a picture I have of Rob and me crossing the finish line together at Snowbird after riding 100 miles and climbing 10,000 feet in one day. I could not have made it without his encouragement along the way.”

For all those who have been influenced by his exemplary life, Rob will always be remembered as a leader, mentor and friend. Perhaps it is fitting that he pursued a career in simulation modeling since he seems to have understood the impact that models can have, not only on organizations, but on the people around him. Those who knew Rob, know that he was a model of the best that a human being can be, and for that he will always be remembered.

One of ProModel’s Biggest Supporters Reflects on a Career in Academic Optimization!

ProModel Guest Blogger:  Linda Ann Riley, Ph.D., Adjunct Professor of Engineering, University of New Haven; former Associate Dean and recently retired professor from the School of Engineering, Computing and Construction Management at Roger Williams University.


Linda Ann Riley, Ph.D.

When Sandra Petty, Academic Coordinator at ProModel Corporation, invited me to contribute to ProModel’s guest blog, it gave me an opportunity to reflect on an academic career with one ever-present constant: the ProModel suite of simulation software products.  My universities may have changed, yet each year for the past twenty or so I have taught at least one (and often more) discrete-event simulation course to an undergraduate, graduate, corporate or government audience.  Regardless of the class, whether for a Ph.D. student or a freshman undergraduate, I have continued to use ProModel since its early days as one of the first Windows-based simulation products.  As ProModel Corporation has introduced new products, MedModel, ServiceModel, Process Simulator and Portfolio Simulator, my students have had an invaluable opportunity to be exposed to some of the best simulation products in the industry.

Each simulation class that I teach involves an external project where students work with non-proprietary data from industry, government or non-profit entities. Working only with the ProModel Student Package, I have seen some of the most impactful and innovative uses of ProModel simulation software. From modeling casino floor slot machine layout to nuclear reactor evacuation scenarios, the variety of applications for the software has been virtually limitless.  The simulation skillset acquired by students is one of the primary factors companies have cited when hiring students with ProModel experience.  Through the years, the aerospace, health care, automotive, logistics and defense industries have identified significant value in students graduating with exposure to ProModel’s suite of products.

I, too, have benefited from using ProModel software.  For my entire career, my research has focused on productivity/process analysis and optimization.  For the past twenty years, ProModel software has played a central role as an application tool for this research.  As ProModel Corporation has evolved with additional products and capabilities, so too has my research.  In the early years, I focused on health care process and facility layout improvement, using MedModel to simulate patient queuing alternatives, throughput strategies and the identification of system waste. From there, my research moved to rare-event simulation, such as security breaches and hazardous materials transportation incidents, and hybrid simulation incorporating both discrete-event and continuous elements.  At that time, I used external code and output from other programs as inputs to ProModel. During this period, I also worked with Ph.D. students on new approaches to multi-objective evolutionary algorithms, as well as meta-heuristics for optimizing large-scale discrete-event simulations using SimRunner as a starting point.

More recently, my research has concentrated on managing and controlling risk in complex infrastructure projects using discrete-event simulation for stochastic scheduling.   In the construction industry, traditional project management and scheduling approaches for highly-complex construction projects typically use methods such as CPM (critical path method), PERT (program evaluation and review technique) or Monte Carlo simulation. For the most part, these methods rely on deterministic, tractable mathematical models underlying the schedule. The ability to accurately predict project schedule outcome and manage performance volatility is essential to controlling risk.  Prior to ProModel Corporation introducing Project and Portfolio Simulator, I would simulate the stochastic nature of these schedules in ProModel.

Even though I have recently retired from a full-time academic career, I will continue to teach discrete-event simulation using ProModel in an adjunct faculty capacity.  Looking to the future, my research will focus primarily on the incorporation and design of intelligent interfaces that identify and apply algorithms for the optimization problem and constraints under study. This implies perhaps an additional layer of code incorporated into the optimization process. Ultimately, this intelligent interface could “learn” to recognize common optimization scenarios, select starting and stopping rules, and potentially also interface with the system improvement framework.

As a further extension to the intelligent interface, dynamic algorithmic visualization capabilities might be incorporated into the optimization procedures.  Immersive technologies are used in many simulation arenas.  Incorporating immersive visualization into optimization would serve to bring a transparency between the modeling and optimization processes. This would allow users and decision makers to interactively view, and potentially redirect the optimization process. In essence, this feature would provide the decision maker the ability to immerse him or herself into the model, thus “directing” both the simulation and optimization processes.

In retrospect, discrete-event simulation and the ProModel Corporation have played a central role in my development as both a teacher and researcher.  I look forward to what the future holds for both the company and the field of discrete-event simulation.

About Dr. Linda Ann Riley

Contact Information: linda.ann.riley@gmail.com

Linda Ann Riley, Ph.D. presently serves as an Adjunct Professor of Engineering for the University of New Haven’s graduate program in Engineering and Operations Management. She recently retired as full professor from the School of Engineering, Computing and Construction Management at Roger Williams University (RWU) where she worked for twelve years. At RWU, she held the positions of Associate Dean, Engineering Program Coordinator and Professor of Engineering. She has over thirty years of teaching experience in both engineering and business and is the recipient of a number of corporate, university and national excellence in teaching awards. Dr. Riley is the author/co-author of over 100 articles, technical and research reports, and book contributions. Her area of scholarly interest involves the optimization of stochastic systems using simulation and evolutionary algorithms.

In addition, Dr. Riley is an active researcher with notable success in grant writing, grant and contract management, creating collaborative research partnerships and research administration. She is responsible for developing and writing over 150 competitive research/consulting proposals and has been awarded or procured contracts for clients in excess of twenty-five million dollars. Prior to her position at Roger Williams University, Dr. Riley spent 17 years at New Mexico State University (NMSU) holding positions as Director of the University Center for Economic Development Research and Assistance, Assistant Director for the Center for Business Research and Services and Director of the Advanced Modeling and Simulation Laboratory. She also held faculty positions in both the Colleges of Business and Engineering at NMSU.

In addition to teaching and research, Dr. Riley is active in consulting. She has extensive consulting experience in organizational productivity/process improvement implementing six sigma, lean, system dynamics, simulation and optimization approaches. She has extensive experience in the design, communication and implementation of strategic and economic development plans. Also, she worked for a number of years with the National Laboratories on technology commercialization strategies.

Dr. Riley is actively involved in attracting women and under-represented groups into science, engineering, mathematics and technology fields. She is a national speaker on the challenges of attracting women and under-represented groups into these fields and served as Chair of the American Society for Engineering Education Northeastern Section and National Chairperson of American Society of Mechanical Engineers Diversity Metrics Committee. Dr. Riley is a member of several professional business and engineering societies and has served as reviewer and/or editorial board member for business, healthcare and engineering journals.

Dr. Riley received her undergraduate degree from Boston University, earned an M.B.A from Suffolk University, completed a post-graduate fellowship at Brown University and earned her M.S. in Industrial Engineering and a Ph.D. in Business with a major field in Logistics from New Mexico State University. Also, for eleven years, Dr. Riley held the position of Vice-Chair of the Board for a large financial institution. In conjunction with this position, she completed 56 credits of Board of Directors courses and was awarded the Friedrich W. Raiffeisen and Edward W. Filene Awards.

Demystifying Big Data

Rob Wedertz – Director, Navy Programs


We live in a data-rich world.  It’s been that way for a while now.  “Big Data” is now the moniker that permeates every industry.  For the sake of eliciting a point from the ensuing paragraphs, consider the following:

FA-18 / Extension / Expenditure / Life / Depot / Operations / Hours / Fatigue

Taken independently, the words above mean very little.  However, if placed in context, and with the proper connections applied, we can adequately frame one of the most significant challenges confronting Naval Aviation:

A higher than anticipated demand for flight operations of the FA-18 aircraft has resulted in an increased number of flight hours being flown per aircraft.  This has necessitated additional depot maintenance events to remedy fatigue life expenditure issues in order to achieve an extension of life cycles for legacy FA-18 aircraft.

120613-N-VO377-095  ARABIAN GULF (June 13, 2012) An F/A-18C Hornet assigned to the Blue Blasters of Strike Fighter Squadron (VFA) 34 launches from the flight deck of the Nimitz-class aircraft carrier USS Abraham Lincoln (CVN 72). Lincoln is deployed to the U.S. 5th Fleet area of responsibility conducting maritime security operations, theater security cooperation efforts and combat flight operations in support of Operation Enduring Freedom. (U.S. Navy photo by Mass Communication Specialist 2nd Class Jonathan P. Idle/Released)


The point here is that it is simply not enough to aggregate data for the sake of aggregation.  The true value in harnessing data is knowing which data are important, which are not, and how to tie the data together.  Too often, subscribing to the “big data” school of thought leads to distraction and misdirection.  I would argue that any exercise in “data” must first begin with a methodical approach to answering the following questions:

“What challenge are we trying to overcome?”

“What are the top 3 causes of the challenge?”

“Which factors are in my control and which ones are not?”

“Do I have access to the data that affect the questions above?”

“How can I use the data to address the challenge?”


While simply a starting point, the questions above will typically allow us to frame the issue, understand its causal effects, and, most importantly, facilitate the process of homing in on the data that are important while systematically ignoring the data that are not.

To apply a real-world example of the methodology outlined above, consider the software application ProModel has provided to the U.S. Navy – the Naval Synchronization Toolset (NST).

“What challenge are we trying to overcome?”

Since 2001, the U.S. Navy has participated in overseas contingency operations (Operation Enduring Freedom and Operation Iraqi Freedom), and the legacy FA-18 aircraft (A-D) has consumed its life expectancy at a higher rate than planned.  Coupled with the delay in Initial Operating Capability (IOC) of the F-35C aircraft, the U.S. Navy has been required to develop and sustain a Service Life Extension Program (SLEP) to extend the life of legacy FA-18 aircraft well beyond their six-thousand-hour life expectancy, and to schedule and perform high flight hour inspections and major airframe rework maintenance events.  The challenge is: “How does the Navy effectively manage the strike fighter inventory (FA-18) via planned and unplanned maintenance to ensure strike fighter squadrons are adequately sourced with the right number of FA-18s at the right time?”

“What are the top 3 causes of the challenge?”

  • Delay in IOC of the F-35C
  • Higher flight hour (utilization) and fatigue life expenditure
  • Fixed number of legacy FA-18 in the inventory

“Which factors are in my control and which ones are not?”

 In:

  • High flight hour inspection maintenance events
  • Airframe rework (depot events)

Out:

  • Delay in IOC of the F-35C
  • Fixed number of legacy FA-18 in the inventory

“Do I have access to the data that affect the questions above?”

Yes.  The planned IOC of the F-35C, flight hour utilization of FA-18 aircraft, and projected depot capacity and requirements are all data that are available and injected into the NST application.

“How can I use the data to address the challenge?”

Using the forecasted operational schedules of units, users can proactively source FA-18 aircraft to the right squadron at the right time, balanced against maintenance events, depot rework requirements, and the overall service life of each aircraft.

Now that the challenge has been framed, the constraints identified, and the data identified, the real work can begin.  This is not to say that there is one answer to a tough question, or even that there is a big red “Easy” button available.  Rather, this framing ensures that we do not fall victim to fretting over an issue that is beyond our control, or spend countless hours wading through data that may not be germane.

NST was designed and developed with the points made above in mind.  The FA-18 is a data-rich aircraft.  However, for the sake of its users, NST was architected to consider only the key fatigue life expenditure issues that ultimately determine whether an aircraft continues its service life or becomes a museum piece.  In the end, NST’s users are charged with providing strike fighter aircraft to the units that carry out our national security strategy.  By leveraging the right data, applying rigor to the identification of issues in and out of their control, and harnessing the technology of computational engines, they do precisely that.

Simulating Impatient Customers

ProModel Guest Blogger: Dr. Farhad Moeeni, Professor of Computer & Information Technology, Arkansas State University


Dr. Farhad Moeeni 

Simulation is one of the required courses for the MBA degree with MIS concentration at Arkansas State University.  The course was developed a few years ago with the help of a colleague (Dr. John Seydel).  We use Simulation Using ProModel, Third Ed. (Harrell, Ghosh, and Bowden; McGraw-Hill) for the course.  In addition, students have access to the full version of the ProModel software in our Data Automation Laboratory. The course has attracted graduate students from other areas, including arts and sciences, social sciences and engineering technology, who took the course as an elective or to enhance their research capability.  Students experience the entire cycle of simulation modeling and analysis through comprehensive group projects with a focus on business decision making.

Most elements of waiting lines are shared by various queuing systems regardless of entity types such as human, inanimate, or intangible.  However, a few features are unique to human entities and service systems, two of which are balking and reneging.  One of the fairly recent class projects included modeling the university’s main cafeteria with various food islands. Teams were directed to also model balking and reneging, which was challenging. The project led to studying various balking and reneging scenarios and their modeling implications, which was very informative.

Disregarding the simple case of balking caused by queue capacity, balking and reneging happen because of impatience.  Balking means a customer evaluates the waiting line upon arrival, anticipates the required waiting time (most likely by observing the queue length), and decides whether to join the queue or leave. In short, balking happens when the person’s tolerance for waiting is less than the anticipated waiting time at arrival.  Reneging happens after a person joins the queue but later leaves because he or she feels that waiting is no longer tolerable or no longer has utility.  The literature indicates that both decisions can result from complex behavioral traits, the criticality of the service, and the service environment (servicescape). Therefore, acquiring information about and modeling balking or reneging can be hard.  However, doing so offers additional insight into service effectiveness that is hard to derive from analyzing waiting times and queue lengths alone.

For modeling purposes, balking and reneging behavior is usually converted into probability distributions or rules that trigger it. To alleviate complexity, simplified approaches have been suggested in the literature.  Each treatment is based on simplifying assumptions and only approximates the behavior of customers. This article addresses some approaches to simulating balking; reneging will be covered in future articles.  Scenarios to model balking behavior include:

  1. On arrival, the entity joins the queue only if the queue length is less than a specified number but balks otherwise.
  2. On arrival, the entity joins the queue if the queue length is less than or equal to a specified number. However, if the queue length exceeds that number, the entity joins the queue with probability p and balks with probability 1 − p (Bernoulli distribution).
  3. The same as Model 2, but several conditional (Bernoulli) probability distributions are constructed for various queue lengths (see the example).
  4. On arrival, a maximum tolerable length of queue is determined from a discrete probability distribution for each entity. The maximum number is then compared with the queue length at the moment of arrival to determine whether or not the entity balks.

The first three approaches model the underlying tolerance for waiting implicitly.  Model 4 allows tolerance variation among customers to be modeled explicitly.
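Model 4 can be sketched directly: sample each arrival's maximum tolerable queue length from a discrete distribution, then compare it with the queue length observed at arrival. A minimal Python illustration follows; the tolerance values and their probabilities are assumptions for demonstration (in practice they must come from field data).

```python
import random

# Minimal sketch of Model 4: each arriving customer draws a maximum
# tolerable queue length from a discrete distribution, then balks if the
# observed queue is longer than that. The distribution is illustrative.

random.seed(42)

tolerances = [2, 5, 8, 12]          # possible max tolerable queue lengths
weights    = [0.2, 0.4, 0.3, 0.1]   # assumed probabilities (field data in practice)

def arrives_and_balks(queue_length):
    """Return True if this arrival balks given the current queue length."""
    max_tolerable = random.choices(tolerances, weights=weights)[0]
    return queue_length > max_tolerable

# At a queue of length 6, only customers who drew a tolerance of 2 or 5
# balk, so the long-run balk fraction should be near 0.2 + 0.4 = 0.6.
balked = sum(arrives_and_balks(queue_length=6) for _ in range(10_000))
print(f"balk fraction at queue length 6: {balked / 10_000:.2f}")
```

Because each entity carries its own tolerance, this form makes heterogeneity among customers explicit, which the first three models only capture in aggregate.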

A simulation example of Model 3 is presented. The purpose is to demonstrate the structure of the model, not its efficiency or compactness.  The model includes a single server, FCFS discipline, and random arrivals and service times.  The conditional probability distributions of the balking behavior are presented in the table; in practice, these data must be extracted from the field.  The simulation model is also presented below. After running the model for 10 hours, 55 customers (10%) balked. Balking information can be very useful in designing or fine-tuning queuing systems, in addition to other statistics such as average/maximum waiting time, queue length, etc.

Conditional Probability Distribution

Condition                  Probability of Joining the Queue (p)   Probability of Balking (1 − p)
Queue Length <= 4          1.00                                   0.00
5 <= Queue Length <= 10    0.70                                   0.30
Queue Length > 10          0.20                                   0.80
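For readers without ProModel at hand, the Model 3 logic can be approximated in plain Python: a single-server FCFS queue in which each arrival joins with a state-dependent probability taken from the table above. The original model was built in ProModel; the arrival and service rates below are assumptions for demonstration only.

```python
import random

# Illustrative re-creation of Model 3: single-server FCFS queue with
# state-dependent balking, using the joining probabilities from the table.
# Arrival/service rates are assumed values, not those of the class project.

random.seed(1)

def p_join(queue_length):
    """Probability of joining, from the conditional distribution table."""
    if queue_length <= 4:
        return 1.0
    if queue_length <= 10:
        return 0.7
    return 0.2

def simulate(sim_time=600.0, mean_interarrival=1.0, mean_service=0.9):
    """Event-driven simulation; times in minutes. Returns (arrivals, balks)."""
    t = 0.0
    next_arrival = random.expovariate(1 / mean_interarrival)
    next_departure = float("inf")
    queue = 0            # customers waiting (excluding the one in service)
    busy = False
    arrivals = balks = 0
    while next_arrival <= sim_time or next_departure <= sim_time:
        if next_arrival <= next_departure:        # next event: an arrival
            t = next_arrival
            arrivals += 1
            if random.random() > p_join(queue):   # state-dependent balk
                balks += 1
            elif not busy:
                busy = True
                next_departure = t + random.expovariate(1 / mean_service)
            else:
                queue += 1
            next_arrival = t + random.expovariate(1 / mean_interarrival)
        else:                                     # next event: a departure
            t = next_departure
            if queue > 0:
                queue -= 1
                next_departure = t + random.expovariate(1 / mean_service)
            else:
                busy = False
                next_departure = float("inf")
    return arrivals, balks

arrivals, balks = simulate()
print(f"{balks} of {arrivals} arrivals balked")
```

The exact balk count depends on the assumed rates and random seed; the structural point is that the joining decision is evaluated against the queue length observed at each arrival, just as in the class model.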

[Figure: Dr. Moeeni’s ProModel simulation model of the balking example]

About Dr. Moeeni:

Dr. Farhad Moeeni is professor of Computer and Information Technology and the Founder of the Laboratory for the Study of Automatic Identification at Arkansas State University. He holds an M.S. degree in industrial engineering and a Ph.D. in operations management and information systems, both from the University of Arizona.

His articles have been published in various scholarly outlets, including Decision Sciences Journal, International Journal of Production Economics, International Journal of Production Research, International Transactions in Operational Research, Decision Line, and several others. He has also co-authored two book chapters on the subject of automatic identification with applications in cyber logistics and e-supply chain management, along with several study books in support of various textbooks.

He is a frequent guest lecturer on the subject of information systems at the “Centre Franco Americain”, University of Caen, France.

Current research interests are primarily in the design, analysis and implementation of automatic identification for data quality and efficiency, RFID-based real-time location sensing with warehousing applications, and supply chain management. Methodological interests include design of experiments, simulation modeling and analysis, and other operations research techniques. He is one of the pioneers in instructional design and the teaching of automatic identification concepts within MIS programs and also is RFID+ certified.

Dr. Moeeni is currently the principal investigator of a multi-university research project funded by the Arkansas Science and Technology Authority, Co-founder of the Consortium for Identity Systems Research and Education (CISRE), and a member of the Editorial Board of the International Journal of RF Technologies: Research and Applications.

Contact Information

moeeni@astate.edu

ProModel Solutions Presented at the 2015 Patient Flow Summit

In May ProModel joined a diverse and talented group of healthcare professionals in Las Vegas to share best practices for improving process and positively impacting the quality of patient care.  Presenters provided views on a wide variety of patient flow issues including population health management, RTLS systems, healthcare reform, readmissions, surgical variability and ED processes.

Not only did ProModel have an exhibit at the event, where we officially unveiled our new Patient Flow Rx solution, but we were also very lucky to have ProModel client and user David Fernandez, MHA, there to give an insightful and informative presentation on his successful use of simulation in the healthcare world.  Fernandez is VP of Cancer Hospital, Neuroscience and Perioperative Services at Robert Wood Johnson University Hospital, and his presentation, “Let My Patients Flow! Streamlining the OR Suite,” described his use of lean management principles and simulation modeling to improve patient flow in the OR.

David Fernandez MHA, Robert Wood Johnson University Hospital discusses his use of simulation for improving patient flow in the OR


Among numerous other presentations, keynote speaker Eugene Litvak, PhD, President & CEO of the Institute for Healthcare Optimization, addressed the application of queuing theory to healthcare processes, a methodology he believes will correctly address hospitals’ challenge of matching random patient demand to fixed capacity.

The Patient Flow Summit helped hospital leaders from all over the world learn the latest about optimizing capacity, streamlining operations, improving patient care, and increasing fiscal performance.

Presenters provided views on a wide variety of patient flow issues


ProModel’s (L) Kurt Shampine, VP, and (R) Dan Hickman, CTO, unveiling Patient Flow Rx!
