ProModel and MedModel 2014


Kevin Field – Sr. Product Manager

Regarding this release, I would like to start out by saying, in the words of Nacho Libre, “It’s pretty dang exciting, huh?”

With ProModel and MedModel 2014 we’ve tried to keep our current customers in mind as well as new customers. For current customers, the new logic windows with Intellisense and Syntax Guide should help you build models faster and more easily. And being able to import graphics from third-party graphics programs like Photoshop, Gimp, Paint.Net, etc. should be even more useful now that you can rotate all graphic types in the application. The improvements to the Debug window are a direct result of our work on the new logic windows.

For our new customers, the redesigned Getting Started panel (formerly known as the Control Panel) brings a lot of model building resources to the forefront. We have added new demo models and refreshed several of our previous ones. Did anyone even know we had a Quickstart video, showing you how to build a simple model and analyze results in 10-15 minutes? The most exciting part might be the How To videos our Support team has been producing for several months now. All of our customers will find these extremely helpful.

In this blog I am going to casually comment on some of the new features with the assumption that you have already reviewed What’s New in 2014 and perhaps even viewed the webinar I gave on this release. If not, you might want to consider doing so, otherwise…you can blissfully continue on with me…

New Logic Windows

It’s amazing what a few simple colors can do to help your logic be more readable. As we were developing version 9.1, I found it more and more difficult to go back to 8.6 and “drag” myself through the dreary old plain black text 🙂 It’s funny how refreshing it was to get back to 9.1! Not only the color but also line numbers really make it easy to quickly get around in the logic.


And if you don’t like our default color scheme or want to have something a little easier on the eyes, simply customize the colors in the Logic Color Selection dialog.


We also want to encourage good formatting in the new Logic windows by utilizing white space (spaces, line breaks, etc.) and indentation. Don’t be afraid of it! By automatically indenting and out-denting after begin and end brackets, we hope to make co-workers everywhere more willing to leap in and review your logic with you! Auto-formatting is something we are looking to improve moving forward.

Another step we have taken is to deprecate certain logic elements: Begin, End, and the # comment are the main ones. Don’t worry though, they are not completely gone! They won’t show up in the Intellisense list, but they will still compile if used in logic. Begin and End blocks are easier to read and enter in logic if you use the “squiggly” brackets { and } instead. And we want to reserve the # character for other things, like the new #region statement.

In fact, #region is one of my favorite new additions to 2014. I love the ability it gives you to section your logic and collapse it with a label describing what’s inside that hidden portion of your logic. I hope you’ll find it quite useful.
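To illustrate, here is a hypothetical sketch of the same block of logic written with the deprecated keywords and then with the preferred brackets, wrapped in a collapsible region. The statement parameters, variable names, and region label are invented for illustration, and I’m assuming the region closes with a matching #endregion:

```
// Collapsible section with a descriptive label
#region Process arriving parts

// Deprecated style (still compiles, but hidden from Intellisense):
IF vPartType = 1 THEN
Begin
    WAIT 5 min
End

// Preferred style using the "squiggly" brackets:
IF vPartType = 1 THEN
{
    WAIT 5 min
}

#endregion
```

Collapsing the region hides everything between the markers and leaves just the label visible, which keeps long operation logic easy to scan.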

Intellisense and Syntax Guide

These new features are probably the heroes of this release. Intellisense brings every statement, function, location, entity, variable, subroutine (I’m saying every model element!) right to your fingertips. You should almost never have to remember your element names or copy them from one place in logic to another, or even leave a module to go look it up. Besides that, the days of completely typing any logic or element name are gone. This should increase your productivity by at least 10-15% 🙂 Are you with me on this?!


Intellisense coupled with the Syntax Guide should nearly make the Logic Builder obsolete. There may be a few things we need to add in order to make that happen. Please feel free to share any suggestions you may have. We tried to make both unobtrusive to your logic creation and editing. Because of this, we didn’t add an option to hide Intellisense or the Syntax Guide.


Debug Window

MORE THAN 3 LINES!! I think that’s all I need to say on that.

Ok, I’ll also say that debugging should almost be a joyous endeavor as you are now able to anticipate what logic may get executed next and better understand where you came from.

Routing Probability

I’m going to refer you to the webinar I gave on this new feature. In it I give a great example (if I do say so myself) of how simple it is to set up a routing probability for scenario analysis. One thing to remember, in order to use an array in the routing Probability field, the array must be initialized through an import.

Getting Started Panel

The new panel that appears when you start the program may be geared primarily toward new users; however, current customers may find it just as useful. It provides access to the How To videos, online Help, and additional training resources (like dates of future ProModel training classes and a link to ProModel University, our online self-paced training).


If you haven’t taken advantage of your M&S contract and utilized our Technical Support team, then perhaps the Getting Started panel will help facilitate this. They are a tremendous resource for understanding different techniques for modeling aspects of your systems, troubleshooting your models, getting out of the paper bag you may have coded yourself into, or just having a friendly person to talk to 🙂 We like to call them your “ProModel Friend”.

Speaking of the Support team, they have done a tremendous job of generating a lot of How To and Solution videos for quite some time now. The short videos range from 2 to 5 minutes and offer useful insight into modeling techniques and other software tips. Let us know if you have any suggestions for more videos!

New Graphic Libraries

A final word about our new graphic libraries. In order to create new libraries containing EMF (vector-based) files, which scale nicely when zoomed, we had to support the rotation, flipping, and sizing of these image types within ProModel. This means you no longer have to generate a separate image for every rotation or flip your animation needs, which reduces the size of the graphic library and thus your model footprint as well. So with this new capability, you should be using a third-party graphics program like Photoshop or Gimp (which is free) to create your graphics. (Or perhaps get your coworker to do it, just don’t tell them that I suggested it.)

I can’t talk about the new graphic libraries without mentioning Laif Harwood, a member of our Support team. Laif gets credit for creating all the new graphics in the libraries. And a fine job he did! So if you want any tips on how to do it for yourself, give our Support team a call!

Well, that’s all I have steam for yammering about today. Remember…you have a ProModel Friend that’s just an email (support@promodel.com) or phone call away (888-PROMODEL).

 

Finding Impartiality in S/W Applications


Rob Wedertz – Director, Navy Programs

As long as I can remember I’ve been a fan of and often used the expression, “I don’t want to build the microwave, I just want to press cook.”  (I’ve never been able to remember when or from whom I first heard it – my apologies for the lack of attribution.)  While I’ve sometimes been fascinated by the inner workings and origin of things, in the end I’ve come to adopt the modern world and the pace at which it moves.  I simply don’t have the time of day to dig into the underbellies of things and investigate the underpinnings of how they work or their interdependencies.  My aversion to such activities was upended when I joined ProModel and led (as a PM) our development team’s efforts to support Verification, Validation, and Accreditation at the behest of our sponsor’s modeling & simulation accreditation agent.  While I do not intend to “build the microwave” here, I would like to describe how I learned that the act of “pressing cook” must be accompanied by complete and total impartiality of the software application.

Software, in a variety of formats, is often used to tell a story.  When it comes to entertainment-based software, and for the sake of the longevity of it, the story should be a very good one.  Thus the reason many folks spend countless hours trying to “level up” (it’s all about the journey, not the destination).  During my college days, I was exposed to Pascal and learned that the methodology (computer language) for telling a story was via if, then, else, while, etc. statements.  Truth be told, I didn’t particularly enjoy trying to figure out how to make the computer say “hello” via that methodology.  Again, I am more of a “show me the story” kind of person than a “how did you make the story” kind of person.  In that regard I’m quite fond of the software that exists today.  My iPad is a bevy of mindless apps that keep my 5 year old entertained while putting miles on the family wagon.  However, when it comes to decision-support software, the stuff under the hood REALLY does matter and is often just as important as the story itself.  Through the VV&A journey we’ve traveled to date, I’ve become more and more focused on the inner workings of “the microwave,” both out of necessity and, surprisingly, out of curiosity.

Our software applications tell stories that often culminate in multi-million-dollar and, in some cases, billion-dollar implications, not necessarily to the good.  Not only must the story be stringently accurate, it must also be 100% impartial (or agnostic) toward those who might be directly impacted by the results.  We accomplish that impartiality by ensuring that we never begin our development processes with an end result in mind.  That is not to say that we do not begin with an end-state in mind (i.e., what is it that you want to know?).  The difference is nuanced in print, but significant when it comes to applying the right level of acumen and forethought to software development.  The true genius of leveraging software applications to solve complex problems is that once you’ve figured out “where and why it hurts”, you can use predictive analytics, modeling, and regression analysis to attack the root of the ailment.  In my simplistic mind, our software is being used to treat the condition rather than the symptom.

The rigor that has been applied to the VV&A of our specific DoD program of record is staggering when compared to similar applications.  And it should be.  While many software developers are not particularly fond of documenting source code and defining why a certain script was used, in the end it has made both our customers and us extremely confident about our methodologies, processes, and coding standards.  Frankly (although I’d never admit it to the folks who raked us over the coals), we’ve become a better development team because of it.  Combining the stringent requirements associated with VV&A with our use of the Agile/SCRUM development methodology, we’ve delivered an application that withstands painstaking scrutiny and is adaptive enough to answer evolving customer demands.  In the end, the vetting our software application has endured at the hands of the accreditation agent is not the value-added proposition our customer demanded, although it was a necessary evolution.  What really matters is that we’ve produced a traceable software application that is impartial.  It may not always give you the answer you want, but it will always give the answer that you need – the truth.

REAL PORTFOLIO MANAGEMENT IS MORE IMPORTANT THAN EVER!

One study of the performance of megaprojects in the oil and gas industry over the past decade reveals a 78 percent rate of failure, and large projects in the process industries have even poorer outcomes. More megaprojects than ever are being developed, and each one brings its own complexity.

  • Projects close to the margin must be dropped
  • Resource constraints must be part of the project selection process

The disadvantages to these very large projects are huge, but of course, so are the rewards. Therefore these projects must be scrutinized and evaluated very carefully. Need to make an accurate risk/reward determination for each “mega project” in your portfolio? Learn how one petroleum pipeline organization did it with Portfolio Simulation.

Check out Portfolio Scheduler:

http://portfoliostud.io/#

Project/Portfolio Risk Evaluation:

http://www.promodel.com/pdf/ML-ProjectReview-PipelineRiskEvaluation.pdf

The ProModel Training Experience

Here at ProModel we realize that successful use of our tools usually begins with great training. To that end, we have a variety of training options available. The course you choose will depend on your product and situation. These options are described on our Training page. This post is about our classroom-based training, our facilities, and what you can expect if you choose to join us! Regardless of your experience with business travel, it’s usually nice to know what to expect when you reach your destination.

We have regularly scheduled classes held in Allentown, Pennsylvania, and Orem, Utah. These classes usually last two or three days (depending on the course) and run from 8:30 am to 5:00 pm local time, with an hour break for lunch.

Our classrooms are set up with a computer for each student and a projector screen at the front. Your instructor will demonstrate and explain new concepts and then allow you time for hands-on implementation of the exercises on your training computer. If you bring your own laptop or wireless device, you are welcome to use our classroom Wi-Fi connection to access the internet during breaks.

Usually we have between three and six students in a class at a time, so you’ll have plenty of time and attention from our instructors, as well as an opportunity to get to know other ProModel customers and hear about their experiences and applications of the tool. We provide drinks and snacks throughout the day, but then “set you free” to grab lunch on your own. Frequently students will explore new restaurants together, but we understand that some clients need their lunch hour to catch up with business at the office.

We’ll start with a walkthrough of the Orem Training Facilities. You’ll want to fly into the Salt Lake City International Airport (SLC). Our office is less than an hour south of the SLC airport. For a Google map with directions to our office (and other local amenities), click here. You know you’re in the right place if you see this building:

It might be white and covered with snow, but don’t worry about the weather; that just means good skiing in the mountains. Seriously, though, Utah is well prepared for snowy conditions, and getting around in winter weather is not usually a problem. The snow typically melts within a day or two in “the valley” (where we are) and sticks around up in “the mountains.” Our offices are on the east side of building C. You can take the elevator or the stairs to the third floor. As you exit the elevator (or stairs), you’ll be in our lobby. The entrance to our Orem training room is right there in the lobby.

The training room is equipped with computers for each student.

And a beautiful view out the window of Mount Timpanogos (which rises to 11,749 ft):

Allentown Office

If you plan on joining us in Pennsylvania, you can view a Google map with our location and surrounding amenities by clicking here. If flying, you may want to consider flying into Lehigh Valley (ABE), a very short drive from the office, or into Philadelphia (PHL) or Newark (EWR).

You know you’re in the right place when you see this building:

Our offices are on the third floor (just like in Orem–we must like the third floor). Just head down the hall and you’ll see the entrance to the Allentown training room on your left.

In both offices we have kitchen facilities you are welcome to use, including a microwave, fridge, K-cup coffee machine, and complimentary snacks.

We hope this information helps you feel welcome and excited for a visit to our training facilities. If you have any questions about travel, accommodations, training content or schedules, please don’t hesitate to call or email.

General Training course information can be found here and additional company facility and travel information can be found here.

Rochelle Price, Director of Training Services

rprice@promodel.com

801-223-4667

Enhancing Flow in Healthcare Design with Simulation


Guest Blog Post by Noah M. Tolson – Array Architects Principal and Practice Area Leader, Planning – AIA, LEED AP BD+C, Lean Green Belt

Discrete Event Simulation (DES), which has been utilized across industries for several decades, provides a virtual environment to track and visualize patients, equipment and providers as they move through the steps of care. It is an important tool in supporting Lean Design in the healthcare environment.

In order to achieve the desired physical environment, healthcare architects rely on a vast amount of data – and the tools for harnessing that data are becoming more advanced. Just as BIM (Building Information Modeling) optimizes early decision-making in the design phase, so too can DES influence the design of workflow and patient flow prior to construction.

At Array, we have found that this virtual environment gives us the ability to test a multitude of ‘what if’ scenarios with our clients to understand the impact that different layouts have on workflow, patient flow and resource utilization. The result is an increased confidence that the design will support current needs, as well as provide insight on incorporating flexibility into the design to accommodate the inevitable changes that will come in the future.

There are three key advantages DES provides over other methods of analysis:

 

1) Real-life variability can be applied to critical measurements such as patient demand, times to complete key tasks, and wait times for key resources (or simply waiting for care).

2) Naturally occurring constraints arise when the demand for services or resources exceeds capacity. DES models capture these constraints by identifying the interdependencies between the resources available and the resources required.

3) DES models simulate the passing of time (into the future) and record key metrics such as wait times, processing times, resource utilization, and equipment utilization as they relate to varying patient demand and varying patient acuity. This helps with the daunting task of predicting when care is to be provided and by whom.

Utilizing Lean Design, architectural teams appropriately spend time observing and recording work flow and patient flow to document and understand the current state. Accurately predicting the future-state work flow and patient flow, and arriving at consensus on it, has always been difficult because workflow analysis has traditionally been based on averages. While averages are a good place to start, they don’t tell the entire story. This is where DES, due to the advantages described above, is highly valuable. Using a DES tool like ProModel’s MedModel, Process Simulator or Patient Flow Simulator to model the various hospital processes in the new structure, we can provide much more insight into the effectiveness of potential designs. DES helps evaluate workflow, resources and patient demand more realistically and simultaneously, which allows healthcare decision-makers to be more confident in the design solutions.

Case Study

MedModel was recently used to help one healthcare organization evaluate a newly constructed 220,000 SF outpatient facility. The facility was intended to centralize the services of affiliated specialty practices and education & research centers. This five-floor clinic would allow 65 providers from 13 different practices throughout a specific region to converge in one patient-friendly location. Quality and service were expected to increase greatly by having referring physicians in one collaborative environment.

Spreadsheet models were initially used to study the consolidation and facility design project, but they provided only static information that relied heavily on the use of averages. This made it hard to accurately study the many complex processes that occur continuously in an outpatient setting. With the limited data available, physicians and administration had difficulty reaching consensus on space requirements and efficient room utilization. At issue was whether the newly consolidated practices could operate comfortably on the first and third floors, or whether they needed additional space.

A MedModel solution modeled the proposed first- and third-floor designs of the clinic in order to analyze capacity and resources against the current data on patient flow from all the converging practices. The simulation examined the individual practices over a five-day period (Monday through Friday) during regular business hours. The measuring criteria consisted of the following:

  • Exam room utilization
  • Physician and staff utilization
  • Number of patients in check-in
  • Time spent in the check-in queue
  • Patient activity times
  • Number of patients in the imaging queue

After multiple scenarios were run, the output data confirmed there would be sufficient room for the consolidation of practices on the first and third floors. In fact, the analysis showed that on certain days of the week there were not enough providers to fill the capacity on those two floors.

This is one example that illustrates the advantages of DES. Array and ProModel have used similar methods to more accurately project operational outcomes and compare design solutions.

Mitigating the Hawthorne Effect with Bruce Gladwin


Bruce Gladwin, PMP – Vice President, Consulting Services

With over 25 years of experience in the simulation field, Bruce has worked with major corporations worldwide developing hundreds of models across a wide range of industries. In his 19-year tenure with ProModel, he has served as a Product Manager, Senior Consultant, and Simulation Trainer. Bruce was named VP of Consulting Services in 2005 and has oversight responsibility for ProModel’s Consulting and Customer Service Operations.

Bruce received a BS in Systems Engineering from the University of Arizona and an MBA from Brigham Young University. He is a certified expert in Lean production principles and received his Six Sigma Black Belt certification while employed at General Electric’s Power Systems division.

Key projects include:

  • Capacity analysis for GE Energy Products Europe – determined the maximum production capacity for gas turbine components at GE’s European manufacturing facility, resulting in a savings of $9.6M in capital avoidance.
  • Design of a green-field manufacturing site for production of GE industrial generators – resulted in a savings of $1.2M in capital avoidance and identified the need for an accelerated operator training program.
  • Design and analysis of a disassembly process for the Russian-built SS-25 Intercontinental Ballistic Missile (ICBM) in support of the 1991-92 START treaties between the US and the Soviet Union.

Check out Bruce’s presentation on the Hawthorne Effect from the 2013 Winter Simulation Conference and his work with a major home improvement retailer…

To Expand or Not to Expand? Medical Clinic Simulation with Jennifer Cowden


Jennifer Cowden – Sr. Consultant

Less is More

I once worked with a programmer whose motto was “Pay me by the line of code,” and, not surprisingly, his code was often lengthy, inefficient, and hard to follow.  I’ve always preferred the opposite approach; it is an interesting challenge to try to get the same functionality into as few lines of code (or alternatively, as few process records) as possible.  Also, employing reusable blocks of code cuts down on the opportunities for mistakes and on overall debugging time.  When I was an applications engineer at an automation company, I often had to get assembly lines modeled on a very short turnaround.

Luckily, ProModel’s macro and subroutine modules made implementing reusable code very simple.  For the medical clinic model demonstrated in this post, we took flexibility a step further by using the “ALL” option in the process edit table.  Even though this model was built to simulate eleven different clinic layouts individually and contains over 500 patient locations, it has a total of only seven process records.  Adding new clinic layouts now takes a fraction of the time and can be done with minimal code adjustments.  If you have a repetitive process, or one that needs the flexibility to add workstations quickly, this methodology could save you modeling time as well.

Check out Jennifer’s work on the Medical Clinic simulation model:

 

About Jennifer

Before joining ProModel in 2013, Jennifer spent 15 years in the automation industry working for a custom turnkey integrator. As an Applications Engineer she built simulation models (primarily using ProModel) to demonstrate throughput capacity of proposed equipment solutions for a variety of customers. Jennifer’s experience covers a wide range of industrial solutions – from power-and-free conveyor systems to overhead gantries and robotic storage and retrieval systems. She has also created applications in the pharmaceutical, medical device, automotive, and consumer appliance industries.

Jennifer has a BS in Mechanical Engineering and a Master of Science in Mechanical Engineering from the Georgia Institute of Technology.

ProModel is excited to release AST 6.9!

AST (ARFORGEN Synchronization Toolset) is a custom predictive analytic software platform used by the US Army Forces Command (FORSCOM) to source and synchronize Army resources.

AST is now the authoritative system FORSCOM uses to conduct its unit planning and sourcing process in Army Force Generation (ARFORGEN). AST provides the Army with the means to view the predicted impact of today’s sourcing decisions on tomorrow’s utilization of Army personnel moving through ARFORGEN. AST’s “on screen” capabilities consolidate data from multiple sources, apply existing or “what if” business rules, predict the outcome, and automatically depict the results, thereby eliminating the lengthy manual, linear, presentation-based methods previously employed. AST cuts development time for a single Course of Action from days to minutes, while enabling multiple Courses of Action within the same timeframe.

Some of the new features in AST 6.9 include: Improved Sourcing, Army Reserve, Army National Guard, and HQDA integration (tasks, etc.), additional Army Special Forces integration (risk), improved executive-level reporting (scorecard), improved Unit Cycle management, and dozens more enhancements.

ProModel also recently completed a “Financial Costing” proof of principle for FORSCOM that integrated data from AST and the U.S. Army Force Generation Costing Tool (ACT) for analysis in ProModel’s Enterprise Portfolio Simulator (EPS).

Read more about ProModel Custom Solutions and our work with the US Army:

http://www.promodel.com/custom-solutions.asp

http://www.promodel.com/industries/government-department-of-defense.asp#tabbed-nav=tab3

ProModel at the AUSA Winter Symposium and Exposition

RPS_Business_Portrait

Pat Sullivan – VP, Army Programs

With over 5,700 attendees and over 200 exhibitors, the annual AUSA Winter Symposium and Exposition kept the ProModel team very well occupied. According to Keith Vadas, ProModel’s CEO, the 2014 AUSA (Association of the United States Army) symposium (held February 19-21, 2014 in Huntsville, AL) was by far the most productive that ProModel has attended. When asked by LTG(R) Roger Thompson, AUSA Vice President for Membership and Meetings, if ProModel would come back if AUSA decided on Huntsville for next year, Keith responded with an emphatic “Absolutely!”

Taking advantage of the efficiency of having the undivided attention of an AUSA audience, which was four times larger than that of last year’s winter conference, Team Redstone hosted an exceptional small-business seminar the day before the conference. The seminar was hosted by a team comprised of NASA, Army Materiel Command, Missile Defense Agency, and the Strategic Missile Defense Command, along with the Army’s Office of Small Business. This was a great networking opportunity, and it revealed some great information about opportunities for ProModel in DOD and NASA.

On Wednesday, February 18th, the ProModel team entered the exhibit hall with great excitement and a superb opportunity to demonstrate how our custom DOD solutions and Commercial Off the Shelf (COTS) products are evolving. Many of the attendees expressed that they were on a continual quest for accurate budgeting projections. The Enterprise Portfolio Simulator (EPS) cost module, which is being piloted as a module of the ARFORGEN Synchronization Toolset (AST) at Forces Command, demonstrated a clear visualization of such projections. This EPS capability assists the Army (and it can assist any organization) in applying cost data at the tactical level.  The EPS module then rolls that data up in a package that reflects enterprise budget estimates, which in turn reflect a variety of demand or demand-fulfillment scenarios.

Four Star General Dennis Via, Commander of the US Army Materiel Command (center right) and Major General (Ret) Freeman from Deloitte (center) visit the ProModel booth and discuss the positive impact that DST-SM is having on the Army Materiel Command.


Another highlight was the demonstration of, and interest in, our COTS products like Process Simulator and EPS. DOD elements and industry are seeking ways to gain greater efficiency and to stretch their limited resources. While force structure is being reduced, missions and the need for continual modernization are not. The expectation of those funding DOD is that the military will be increasingly efficient in the execution of prescribed tasks. Therefore, an understanding of how to generate efficiency through Lean practices and events, and of how to predict equipment life-cycle costs in a peacetime environment, is paramount. Additionally, leaders in DOD expressed how they must apply Lean principles to their processes, identify trade-offs, and understand the downstream impacts of change.

Process and portfolio management are significant across the government sector, and they will become even more necessary during this time of decreasing budgets. EPS and Process Simulator, coupled with ProModel’s customized solutions (AST, LMI DST, and NST), provide the foundation for rapid process improvement, budget estimation, and program management. Thanks to the exceptional hospitality of the Tennessee Valley and the great response by our AUSA hosts, ProModel found in Huntsville some fertile ground that will grow much more than cotton.

Major General Collyar, CG at AMCOM, stops by our booth at the AUSA Winter Symposium to talk with ProModel CEO Keith Vadas (right) and ProModel’s Director of Navy Programs Robert Wedertz (left).


Healthcare Guest Blog from Array Architects

Co-Authors:

Florangela Papa, LEED AP – Project Architect and Planner, Array Architects

Ryan Keszczyk – Intern Architect, Array Architects

We are Healthcare Architects.  When designing for the healthcare industry, we must respond to the increasing complexity of demands and restrictions based on spacing limitations, budgets, and resources.  As our healthcare clients adapt to their changing needs and experience a shift in operations and process, we needed to find a way to use real-time information and data to generate tangible, quantifiable statistics that could be used to steer design.  These criteria led us to search for new tools that would allow us to analyze this data in a way that could improve our design process.

Simulation modeling is a tool that has drastically impacted the design process, increasing the value, flexibility, and quality of our designs while staying within the confines and restrictions of each individual project.  The once-static historical data on spreadsheets and charts can now be analyzed in a visually dynamic way.  Using this technology we are able to see system bottlenecks and flawed areas of the process that have the most potential to improve the design, all in a virtual environment, before the project is too far along in the design process.  With simulation we can:

– Analyze existing conditions and identify areas within the project scope that need development and offer the most value and improvement to the facility.

– Create project-specific analyses and solutions that become the guiding force of a design, rather than relying on standard baseline benchmarks.

– Identify and analyze system flows and processes that can be improved with the introduction of Lean design practices.

– Quickly test different scenarios, which gives the client the ability to weigh the outcomes and make an informed and confident decision.

Simulation modeling is used in the early stages of design to influence programmatic developments.  For example, using simulation modeling we are able to calculate the specific number of patient rooms a department may need to minimize wait time and further improve quality of care.  Through the modeling process we are able to ensure that the critical elements perform not just on a typical day, but also under “worst case scenario” circumstances.  With the broad scope of a project determined, modeling can be used at a more focused scale to evaluate the interdependencies of individual elements within the system and influence the design accordingly (e.g., patient room flow, nurse station flow).

Simulation modeling is often considered both an art and a science.  Models can be developed to capture extremely rigorous and complex systems, but they also need to strike the right balance of simplicity and usefulness.  As architects, we needed a simple tool that could give us the benefits of simulation without requiring extensive statistics and engineering expertise – this isn’t our strength, nor is it how we are compensated.  After evaluating different software, Array selected ProModel’s Process Simulator because:

– It has a user-friendly interface with visually dynamic graphics.

– ProModel offers effective training and tutorials backed by great technical support.

– There is a variety of graphics (graphs, charts, histograms, time plots) that are easily customizable to meet the needs of the project through the output viewer.

– The software has the ability to create simple or complex models, which gives us the flexibility to model a variety of project types.  We are able to manage many projects on our own, but can also team with ProModel’s experts as we tackle more complex problems.