Simulation Ensures Patient Safety During Hospital Move

Northwest Community Hospital (NCH) is an acute care hospital in Arlington Heights, Illinois, just outside of Chicago. The staff at NCH faced the complex and delicate task of planning and executing the move of 150 patients to a newly constructed facility on campus. It is a welcome but difficult situation that many healthcare organizations find themselves in today, as technology improvements and rising patient populations demand growth.

See how NCH achieved a flawless transition through predictive analytics and simulation:

Power of Predictive Analytics for Healthcare System Improvement and Patient Flow

Hospitals are under intense pressure to improve both the effectiveness and the efficiency of healthcare delivery in an environment where operating costs are being cut, downsizing and consolidation are the norm, and the cost of care is rising while revenue is declining. At the same time, the systemic effects of peak census and varying demand on patient length of stay (LOS) are creating capacity issues and unacceptable patient wait times, leading to a major decline in patient satisfaction.

Proposals to enhance a hospital’s quality of care are as numerous as the healthcare professionals dedicated to the cause. What hospitals need, however, is the ability to quickly and accurately evaluate the impact of those various operational proposals and to experiment with system behavior without disrupting the actual system, and ProModel’s simulation technology allows them to do just that.

The predictive analytic capability of ProModel simulation allows healthcare professionals to test assumptions and answer those patient flow “what if” questions in minutes or days, not weeks or months. Simply put, it provides a decision support system that helps healthcare leaders make critical decisions quickly, with a higher degree of accuracy and confidence.

Simulation also helps healthcare staff quickly identify room availability and spot high-risk patient flow bottlenecks before extreme problems occur. That knowledge leads to shorter patient wait times and LOS, fewer unnecessary re-admissions, avoidance of costly expansions, and, most importantly, higher overall quality of service and patient satisfaction.
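To make the idea concrete, here is a minimal what-if sketch in Python using the SimPy library (a stand-in chosen purely for illustration; NCH used ProModel’s tools, and every rate and capacity below is an invented assumption, not NCH data). It compares the mean wait for a bed under two capacity scenarios:

import random
import statistics
import simpy  # pip install simpy

def run_scenario(num_beds, sim_hours=24 * 30, seed=1):
    """Return the mean wait (hours) for a bed over one simulated month."""
    random.seed(seed)
    env = simpy.Environment()
    beds = simpy.Resource(env, capacity=num_beds)
    waits = []

    def patient(env):
        arrived = env.now
        with beds.request() as req:
            yield req                                      # wait for a free bed
            waits.append(env.now - arrived)
            yield env.timeout(random.expovariate(1 / 72))  # LOS, ~72 h mean

    def arrivals(env):
        while True:
            yield env.timeout(random.expovariate(1 / 1.5))  # ~1 arrival / 1.5 h
            env.process(patient(env))

    env.process(arrivals(env))
    env.run(until=sim_hours)
    return statistics.mean(waits) if waits else 0.0

for beds in (48, 52):  # the "what if we add four beds?" question
    print(f"{beds} beds -> mean wait for a bed: {run_scenario(beds):.1f} h")

Running the two scenarios side by side answers a single capacity question in seconds; tools like ProModel and MedModel layer animation, statistics gathering, and scenario management on top of this core idea.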

Teaching Process Management Using ProModel

ProModel Guest Blogger: Scott Metlen, Ph.D. – Business Department Head and Associate Professor at the University of Idaho

Scott Metlen, Ph.D.

Understanding process management (the design, implementation, management and control, and continuous improvement of an organization’s enterprise-wide set of processes) is the key to well-deployed strategies. It was not until Tim Cook made Apple’s total set of processes world class, including all supply-chain-linked processes (Brownlee, 2012), that Apple began its amazing climb to become the world’s most highly valued company, even though it had cutting-edge products before his arrival.

Gaining an effective understanding of process management is not easy, due to the strategic variability inherent in the portfolio of products that companies sell and in the markets they serve. This strategic variability (Suri, 2011) in turn drives variability in many of the processes an organization uses to operate. For instance, different markets require different marketing plans supported by different processes. Order processes often vary by product and target market. Employee skill sets differ by product, requiring different hiring and training processes. Different products, whether services or goods, require at the very least an adjustment to the production process even when the variation is slight.

Adding to, and often caused by, the variability just mentioned are multiple process steps, each with different duration times and human resource skills. Depending on what product is currently being produced, the process steps, their order and duration, the interdependencies between them, and the governing business rules all vary. Where a product is in its life cycle drives the experience curve, again creating variation across products. In addition, the numerous interfaces with other processes all vary depending on the product being produced.

All of these sources of variability can make process management hard to do, teach, and learn. One tool that helps with process management in the face of variance is discrete event simulation, and one of the best software suites to use is ProModel, a flexible program with excellent product support from the company.

Effective process management is itself a multi-step process. The first step is to determine the process flow while identifying which steps add value and which do not. The process flow diagram should include, for each step, the duration times by product, the resources needed, and the product routes. Also needed at this stage are the business rules governing the process, such as working hours, safety envelopes, quality control, queueing rules, and many others.

Capturing this complex, interrelated system begins by visiting the process and talking with the process owner and operators. Drawing the diagram and recording the other information is a good second step, but a person only truly understands a process and its complexities by actually building and operating it. Of course, many of the processes we want to improve are already built and in use, and in most cases students will be able to do neither. However, building a verified and validated simulation model is a good proxy for doing the real thing, because the model will never validate against the actual process output unless all of the complexity is included or represented in the model.

In the ‘Systems and Simulation’ course at the University of Idaho, students first learn the fundamentals of process management, including lean terms and tools. In the third week of class they visit a company as members of a team conducting a process improvement project. In this visit students meet the process owner and operators. If the process is a production process, they walk the floor and discuss the process and the delta between expected and actual output. If it is an information flow process, such as much of an order process, the students discuss the process and, again, the delta between expected and realized output.

Over the next six weeks students take the preliminary data and build a simulation model of the current state of the process. During this period they discover that they do not have all the data and information needed to replicate the actual process, in many cases because the company itself does not have it, or because the process is not operated the way it was designed. Students then contact the process owner and operators throughout the six weeks to determine the actual business rules in use, and they make informed assumptions where necessary to complete their model.
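The statistical heart of that validation step can be shown in a small sketch. The course models are built in ProModel; this Python fragment, with made-up weekly output figures, illustrates just one simple check of model output against reality:

from scipy import stats  # pip install scipy

actual  = [132, 140, 127, 151, 138, 145, 129, 136]   # observed weekly output
modeled = [135, 128, 142, 139, 148, 131, 137, 144]   # model replications

# Welch's t-test: a small p-value is evidence that the model's mean output
# differs from the real process, i.e., some complexity is still missing.
t_stat, p_value = stats.ttest_ind(actual, modeled, equal_var=False)
print(f"p = {p_value:.2f} (revisit the model unless p is comfortably high)")

In practice validation also compares variances, time-in-system distributions, and behavior under known scenarios, but the principle is the same: the model earns trust only by reproducing the real system’s output.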

Once the model has been validated and the students have a deep understanding of the process, they begin modeling process changes that will eliminate waste in the system, increase output, and decrease cost. Methods used to improve the process include changing business rules, adding strategically placed buffers and resources, and reallocating resources. To determine the most effective improvement, a cost-benefit analysis in the form of an NPV analysis is completed: the students draw from the output distribution of the baseline model, compare those draws with output drawn from the distributions of each improvement scenario, and use the comparison to determine a 95% confidence interval for the NPV and the probability of the NPV being zero or less. Finally, several weeks before the semester is finished, students travel to the company to present their findings and recommendations.
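A minimal sketch of that Monte Carlo NPV comparison, again in Python with purely illustrative numbers (the margin, investment cost, and output distributions below are assumptions, not course data):

import random
import statistics

MARGIN = 40            # contribution per extra unit, $ (illustrative)
INVESTMENT = 180_000   # cost of the proposed improvement, $ (illustrative)

def simulate_npv(rate=0.10, years=5):
    """One replication: sample annual output from the baseline and improved
    output distributions and discount the incremental profit."""
    npv = -INVESTMENT
    for t in range(1, years + 1):
        base = random.gauss(10_000, 600)      # baseline units/yr (assumed fit)
        improved = random.gauss(11_500, 650)  # improved units/yr (assumed fit)
        npv += (improved - base) * MARGIN / (1 + rate) ** t
    return npv

random.seed(7)
npvs = sorted(simulate_npv() for _ in range(10_000))
lo, hi = npvs[249], npvs[-251]  # ~2.5th and 97.5th percentiles
p_loss = sum(v <= 0 for v in npvs) / len(npvs)
print(f"mean NPV ${statistics.mean(npvs):,.0f}; 95% CI [${lo:,.0f}, ${hi:,.0f}]")
print(f"P(NPV <= 0) = {p_loss:.3f}")

The probability of a non-positive NPV gives decision makers a risk measure that a single point estimate cannot.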

Student learning on these projects is multifaceted. Learning how to use ProModel is the layer students are most aware of during the semester, as it takes much of their time. By the end of the semester, however, they talk about improving their ability to manage processes, work in teams, deal with ambiguity, manage multiple projects, present to high-level managers, and maintain steady communication with project owners.

External projects and discrete event simulation have been used to teach process management in the College of Business and Economics at the University of Idaho for the past six years. As a result, the Production and Operations area has grown from 40 to 150 students and from five to 20 projects per semester. More importantly, students who complete this course are sought out and hired by firms based on the transformational learning and skill sets they acquire through the program.

References:

Suri, Rajan. Beyond Lean: It’s About Time. Technical report, Center for Quick Response Manufacturing, University of Wisconsin-Madison, 2011.

Brownlee, John. “Apple’s Secret Weapon.” CNN, June 13, 2012. http://www.cnn.com/2012/06/12/opinion/brownlee-apple-secret/index.html?hpt=hp_t2. Accessed December 2014.

Scott Metlen Bio:

http://www.uidaho.edu/cbe/business/scottmetlen


Flanagan Industries Brings New Facility Online Thanks To ProModel Solution

Flanagan Industries is a major contract manufacturer of aerospace hardware specializing in highly engineered, high-value machined components and assemblies. Over the years their manufacturing operations had grown steadily to the point where they absolutely needed additional capacity. The original space was not conducive to a manufacturing environment and had become an impediment to taking on more business and staying competitive in the global economy. So Flanagan decided to expand by opening a new facility that could house bigger and better machinery; however, they needed to ensure that the move to the new location would not disrupt their current operations and customer orders.

In the video below, see how Flanagan used a ProModel Simulation Solution to successfully bring their new facility online:


FREE ProModel Webinar: Predictive vs. Prescriptive Analytics

Join ProModel’s CTO, Dan Hickman, and Product Manager, Kevin Jacobson (KJ), on Wednesday, November 5, 2014, at 2:00 PM EST for an informative webinar on predictive vs. prescriptive analytics.

With over 15 years in the industry, Dan has an uncanny understanding of how important both types of analyses are to the success of your business. KJ, with ProModel for over 11 years, manages the Project and Portfolio Simulation product development group. He works closely with our clients on the development of advanced PPM (Project Portfolio Management) predictive and prescriptive analytic tools. He has the hands-on experience to best illustrate how the tool works and how it can help you with your predictive and prescriptive analytic needs.

Together they will show you how ProModel’s Enterprise Portfolio Simulator (EPS) with Portfolio Scheduler provides the benefits prescriptive analysis can bring to resource capacity planning and project selection. You will gain an understanding of the difference between applying predictive and prescriptive analytics to your PPM data, with specific examples focusing on scenario experimentation and portfolio optimization. KJ will demo some of the newer features of EPS that provide logical recipes for modeling and show how these tools can help you represent your unique PPM business rules. The new business rules capabilities of EPS enable portfolio simulation like never before.

CLICK BELOW TO REGISTER FOR THIS WEBINAR NOW!

https://www150.livemeeting.com/lrs/8002083257/Registration.aspx?pageName=k09m7ldp55z3t048&FromPublicUrl=1


Demystifying System Complexity

Charles Harrell, Founder, ProModel Corporation

One can’t help but be awestruck, and sometimes even a little annoyed, by the complexity of modern society. This complexity spills over into everyday business systems, making them extraordinarily challenging to plan and operate. Enter any factory or healthcare facility and you can sense the confusion and lack of coordination that often seems to prevail. Much of what is intended to be a coordinated effort to get a job done ends up being little more than random commotion resulting in chance outcomes. Welcome to the world of complex systems!

A “complex system” is defined as “a functional whole, consisting of interdependent and variable parts.” (Chris Lucas, Quantifying Complexity Theory, 1999, http://www.calresco.org/lucas/quantify.htm) System complexity, therefore, is a function of both the interdependencies and variability in a system. Interdependencies occur when activities depend on other activities or conditions for their execution. For example, an inspection activity can’t occur until the object being inspected is present and the resources needed for the inspection are available. Variability occurs when there is variation in activity times, arrivals, resource interruptions, etc. As shown below, the performance and predictability of a system is inversely proportional to the degree of interdependency and variability in the system.

[Figure: system performance and predictability decline as interdependency and variability increase]

Suppose, for example, you are designing a small work cell or outpatient facility that has five sequential stations with variable activity times and limited buffers or waiting capacity in between. Suppose further that the resources needed for this process experience random interruptions. How does one begin to estimate the output capacity of such a system? More importantly, how does one know what improvements to make to best meet performance objectives?
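For readers who want to experiment, here is a minimal sketch of that five-station line in Python using the SimPy library (an assumed stand-in for illustration; the same experiment is natural in ProModel or MedModel). Variable activity times, limited buffers, and random interruptions are all represented, with every rate below invented:

import random
import simpy  # pip install simpy

NUM_STATIONS = 5
BUFFER_CAP = 3        # limited waiting capacity between stations
SIM_TIME = 8 * 60     # one 8-hour shift, in minutes
completed = 0

def source(env, out):
    """Feed raw parts into the line whenever the first buffer has room."""
    i = 0
    while True:
        yield out.put(f"part-{i}")  # blocks when the buffer is full
        i += 1

def station(env, inbox, outbox):
    """Pull a part, process it for a variable time, push it downstream."""
    while True:
        part = yield inbox.get()                       # starved if empty
        yield env.timeout(random.triangular(2, 6, 3))  # variable activity time
        if random.random() < 0.02:                     # random interruption
            yield env.timeout(random.uniform(10, 30))  # repair time
        yield outbox.put(part)       # blocked if the next buffer is full

def sink(env, inbox):
    global completed
    while True:
        yield inbox.get()
        completed += 1

random.seed(42)
env = simpy.Environment()
buffers = [simpy.Store(env, capacity=BUFFER_CAP)
           for _ in range(NUM_STATIONS + 1)]
env.process(source(env, buffers[0]))
for k in range(NUM_STATIONS):
    env.process(station(env, buffers[k], buffers[k + 1]))
env.process(sink(env, buffers[-1]))
env.run(until=SIM_TIME)
print(f"Parts completed in one shift: {completed}")

Raise BUFFER_CAP or tighten the spread of the activity-time distribution and the completed count climbs, which is exactly the interdependency and variability trade-off described above.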

Obviously, the larger the process and greater the complexity, the more difficult it is to predict how a system will perform and what impact design decisions and operating policies will have. The one thing most systems experts agree on, however, is that increasing complexity tends to have an adverse effect on all aspects of system performance including throughput, resource utilization, time in system and product or service quality.


ProModel and MedModel are powerful analytic tools that account for the complex relationships in a system and eliminate the guesswork in systems planning. Because these simulation tools imitate the actual operation of a system, they provide valuable insights into system behavior along with quantitative measures of system performance.

To help introduce process novices to the way interdependencies and variability impact system performance, ProModel has developed a set of training exercises using an Excel interface to either ProModel or MedModel. Each exercise exposes the initiate to progressively greater system complexity and shows how system performance is affected. Additionally, these exercises demonstrate the fundamental ways system complexity can be mitigated and effectively managed.

ProModel is offering these exercises to students and practitioners who are seeking an introduction to simulation and systems dynamics.


For more information please contact ProModel Academic

Sandra Petty, Academic Coordinator – spetty@promodel.com

Same Venue, Different Challenges


Rob Wedertz – Director, Navy Programs

Just a few weeks ago, I had the privilege of attending the Tailhook Association’s annual conference in Reno, Nevada. It was the first time I attended the conference not as an active duty member of the Naval Aviation community, but as a vendor supporting the enterprise through our role as the software application provider of the Naval Synchronization Toolset (NST). Surprisingly, other than keeping much different hours and standing on the opposite side of the booth table, the conference felt much like it did every year I had attended in the past. There were many “so what are you doing these days?” conversations with old friends and the ever-present aura of “Naval Aviation is special because…” throughout the exhibit hall.

In fact, had I not taken the opportunity to attend some of the panels and engage some of our key stakeholders in pointed conversations, it would have been extremely difficult to differentiate this year’s conference from any other I had attended over the last two decades. But a new vernacular wove its way into this year’s conference. Words like “sequestration,” “draw-down,” and “budget constraints” permeated the Rose A ballroom, and for the first time in many years I sensed a palpable uncertainty among the leadership of Naval Aviation as they extolled the virtues of tailhook aviation’s role in the world theater against the backdrop of future shoestring budgets and unknown warfighting requirements. (Ironically, the Air Boss told a poignant story of a “nugget” strike fighter pilot from CVW-8 expertly delivering ordnance in the fight against ISIS the same day the morning news detailed the withdrawal of forces from Afghanistan as “hostilities in the Middle East come to a close.”)

Given the environment we’re in and the abundance of question marks hovering over the next several years, it should come as no surprise that many attendees, including most of the Naval Aviation Enterprise (NAE) leadership, took a great deal of interest in the “little” ProModel booth nestled among missile mock-ups, Joint Strike Fighter simulators, and high-tech defense hardware displays. In fact, as one of the very few (if not the only) predictive/prescriptive analytics software vendors in attendance at Hook ’14, we were an anomaly.

Tailhook ’14: ProModel’s Keith Vadas and Carl Napoletano speak with VADM Dunaway, Commander, Naval Air Systems Command


A common theme emerged during our discussions with visitors and through comments made during the various panel discussions: decisions must be made with actionable data, courses of action must be modeled and validated, and technology-enabled decision support applications must be agile enough to produce an answer in short order. Thus the interest in ProModel.

While the Naval Synchronization Toolset is, relatively speaking, in its infancy (we achieved initial operating capability just a year ago), ProModel has been delivering enterprise-wide decision support capabilities to its customers (both private and DoD) for over 25 years. As industries have evolved (adopting Lean Six Sigma methodologies, harnessing data collection and aggregation, and leveraging emerging technologies), so has ProModel. We have learned, alongside our customers, that there is significant “power” in diminishing uncertainty through “what-if” analysis and the exploration of alternatives via technology-enabled decision support tools like the NST. The questions the NAE gets asked have answers, and it is discovering that getting to them is a matter of adopting a philosophy centered on modeling the behavior of the system, deciding on the dials (variables), and exploring the alternatives.

The NST is that system. Through our integration efforts with Veracity Forecasting and Analysis, we have delivered a software application that establishes the demand signal (the Master Aviation Plan module), models the behavior of the system (the Carrier Strike Group, Air Wing, and Squadron Schedules), models the behavior of its elements (the Airframe Inventory Management module, which tracks the utilization of the F/A-18 A-F inventory over time), and provides a “sandbox” environment that facilitates optimal disposition of assets in order to meet the requirements of the NAE over time.

We heard during our attendance at Hook ’14 that optimal management of the F/A-18 inventory was one of the focal points of the NAE leadership. And although we have been involved in the development of the NST for more than two years, this is the first time the challenges of inventory management have taken center stage at a venue that has long been unchanged and timeless. We felt privileged to be among the professionals in attendance at Hook ’14 and even more proud to be an integral part of the solution set for Naval Aviation’s challenges going forward. We’ll be back next year, and we hope that by then the NAE is no longer just talking about it.

Busy Season at ProModel

Keith Vadas – ProModel President & CEO

I am pleased to report that ProModel’s second quarter was very positive. Like many businesses in the US, we find ourselves on a serious upswing this summer of 2014. Our consultants are working on several projects in a variety of industries, including shipbuilding, power management, retail, manufacturing, food processing, and government contracting. In all of these projects our experienced team of consultants is working to improve efficiency, save money, and help clients make better decisions.

ProModel’s DoD projects continue to thrive. It is hard to believe it has been eight years since we started working with FORSCOM (US Army Forces Command) on the AST (ARFORGEN Synchronization Tool). LMI-DST (Lead Materiel Integrator – Decision Support Tool) with the LOGSA team (US Army Logistics Support Activity) is also going strong. Our agile team of software developers keeps improving the development process within ProModel, and it shows: just recently the NST Airframe Inventory Management Module was granted full accreditation by the Commander, Naval Air Systems Command.

The time is also ripe for opportunities in healthcare. Our patient flow optimization capabilities are well suited to helping hospitals and outpatient clinics improve efficiency. Now that the Affordable Care Act has been in effect for a couple of years, its impact is being felt by healthcare organizations around the country. The expanded insured base, together with the need for improved processes and different care models, makes it absolutely necessary to consider the value of modeling and simulation. ProModel continues to work with several organizations, including Presbyterian Homes and Services, and Array Architects, which enhances patient flow in healthcare facility design by using MedModel simulation in its design process.

To better support our base of existing customers, we released ProModel/MedModel 2014 in July and PCS Pro 2014 at the end of Q1. EPS 2014 (Enterprise Portfolio Simulator) was released in Q2 and includes a new, easy-to-use, web-based rapid scenario planning tool, Portfolio Scheduler. You can check this tool out online at http://portfoliostud.io/#.

There continue to be lots of exciting things happening at ProModel. We have an outstanding team of consultants, software developers, and designers just looking for an opportunity to PARTNER with you, to help you meet the next business challenge or solve the next unexpected problem.

ProModel and MedModel 2014

Kevin Field – Sr. Product Manager

Regarding this release, I would like to start out by saying, in the words of Nacho Libre, “It’s pretty dang exciting, huh?”

With ProModel and MedModel 2014 we’ve tried to keep our current customers in mind as well as new customers. For current customers, the new logic windows with Intellisense and the Syntax Guide should help you build models faster and more easily. And being able to import graphics from third-party graphics programs like Photoshop, Gimp, Paint.NET, etc. should be even more useful now that you can rotate all graphic types in the application. The improvements to the Debug window are a direct result of our work on the new logic windows.

For our new customers, the redesigned Getting Started panel (formerly known as the Control Panel) brings a lot of model building resources to the forefront. We have added new demo models and refreshed several of our previous ones. Did anyone even know we had a Quickstart video, showing you how to build a simple model and analyze results in 10-15 minutes? The most exciting part might be the How To videos our Support team has been producing for several months now. All of our customers will find these extremely helpful.

In this blog I am going to casually comment on some of the new features with the assumption that you have already reviewed What’s New in 2014 and perhaps even viewed the webinar I gave on this release. If not, you might want to consider doing so, otherwise…you can blissfully continue on with me…

New Logic Windows

It’s amazing what a few simple colors can do to make your logic more readable. As we were developing version 9.1, I found it more and more difficult to go back to 8.6 and “drag” myself through the dreary old plain black text 🙂 It’s funny how refreshing it was to get back to 9.1! Not only the color but also the line numbers make it easy to get around in the logic quickly.


And if you don’t like our default color scheme or want to have something a little easier on the eyes, simply customize the colors in the Logic Color Selection dialog.


We also want to encourage good formatting in the new Logic windows by utilizing white space (spaces, line breaks, etc.) and indentation. Don’t be afraid of it! By automatically indenting and out-denting after begin and end brackets, we hope to make co-workers everywhere more willing to leap in and review your logic with you! Auto-formatting is something we are looking to improve moving forward.

We have also taken steps to deprecate certain logic elements: Begin, End, and the # comment are the main ones. Don’t worry though, they are not completely gone! They won’t show up in the Intellisense list, but they will still compile if used in logic. Begin and End blocks are easier to read and enter in logic if you use the curly brackets { and } instead. And we want to reserve the # character for other things, like the new #region statement.

In fact, #region is one of my favorite new additions to 2014. I love the ability it gives you to section your logic and collapse it with a label describing what’s inside that hidden portion of your logic. I hope you’ll find it quite useful.

Intellisense and Syntax Guide

These new features are probably the heroes of this release. Intellisense brings every statement, function, location, entity, variable, and subroutine (I’m saying every model element!) right to your fingertips. You should almost never have to remember your element names, copy them from one place in logic to another, or even leave a module to go look one up. Besides that, the days of typing out any statement or element name in full are gone. This should increase your productivity by at least 10-15% 🙂 Are you with me on this?!


Intellisense coupled with the Syntax Guide should nearly make the Logic Builder obsolete. There may be a few things we need to add in order to make that happen. Please feel free to share any suggestions you may have. We tried to make both unobtrusive to your logic creation and editing. Because of this, we didn’t add an option to hide Intellisense or the Syntax Guide.


Debug Window

MORE THAN 3 LINES!! I think that’s all I need to say on that.

Ok, I’ll also say that debugging should almost be a joyous endeavor as you are now able to anticipate what logic may get executed next and better understand where you came from.

Routing Probability

I’m going to refer you to the webinar I gave on this new feature. In it I give a great example (if I do say so myself) of how simple it is to set up a routing probability for scenario analysis. One thing to remember: in order to use an array in the routing Probability field, the array must be initialized through an import.

Getting Started Panel

The new panel that appears when you start the program may be geared primarily toward new users, but current customers may find it just as useful. It provides access to the How To videos, online Help, and additional training resources (like dates of future ProModel training classes and a link to ProModel University, our online self-paced training).


If you haven’t taken advantage of your M&S contract and utilized our Technical Support team, then perhaps the Getting Started panel will help facilitate this. They are a tremendous resource for understanding different techniques for modeling aspects of your systems, for troubleshooting your models and helping you get out of the paper bag you may have coded yourself into, or just for being a friendly person to talk to 🙂 We like to call them your “ProModel Friend”.

Speaking of the Support team, they have done a tremendous job of generating How To and Solution videos for quite some time now. These short videos range from 2 to 5 minutes and offer useful insight into modeling techniques and other handy software tips. Let us know if you have any suggestions for more videos!

New Graphic Libraries

A final word about our new graphic libraries. In order to create new libraries containing EMF (vector-based) files, which scale nicely when zoomed, we had to support the rotation, flipping, and sizing of these image types within ProModel. This means you no longer have to generate an image for every possible rotation or flip you need for your animation, which reduces the size of the graphic library and thus your model footprint as well. So with this new capability, you should be using a third-party graphics program like Photoshop or Gimp (which is free) to create your graphics. (Or perhaps get your coworker to do it, just don’t tell them that I suggested it.)

I can’t talk about the new graphic libraries without mentioning Laif Harwood, a member of our Support team. Laif gets credit for creating all the new graphics in the libraries. And a fine job he did! So if you want any tips on how to do it for yourself, give our Support team a call!

Well, that’s all I have steam for yammering about today. Remember…you have a ProModel Friend who is just an email (support@promodel.com) or phone call (888-PROMODEL) away.


Finding Impartiality in S/W Applications

Rob Wedertz – Director, Navy Programs

For as long as I can remember, I’ve been a fan of, and often used, the expression “I don’t want to build the microwave, I just want to press cook.” (I’ve never been able to remember when or from whom I first heard it; my apologies for the lack of attribution.) While I’ve sometimes been fascinated by the inner workings and origins of things, in the end I’ve come to accept the modern world and the pace at which it moves. I simply don’t have the time of day to dig into the underbellies of things and investigate the underpinnings of how they work or their interdependencies. My aversion to such activities was upended when I joined ProModel and led, as a program manager, our development team’s efforts to support Verification, Validation, and Accreditation (VV&A) at the behest of our sponsor’s modeling & simulation accreditation agent. While I do not intend to “build the microwave” here, I would like to describe how I learned that the act of “pressing cook” must be accompanied by the complete and total impartiality of the software application.

Software, in a variety of formats, is often used to tell a story. When it comes to entertainment software, and for the sake of its longevity, the story should be a very good one; thus the reason many folks spend countless hours trying to “level up” (it’s all about the journey, not the destination). During my college days I was exposed to Pascal and learned that the methodology (a computer language) for telling a story was via if, then, else, and while statements. Truth be told, I didn’t particularly enjoy trying to figure out how to make the computer say “hello” via that methodology. Again, I am more of a “show me the story” kind of person than a “how did you make the story” kind of person. In that regard I’m quite fond of the software that exists today; my iPad is a bevy of mindless apps that keep my 5-year-old entertained while putting miles on the family wagon. However, when it comes to decision support software, the stuff under the hood REALLY does matter and is often equally as important as the story itself. Through the VV&A journey we’ve traveled to date, I’ve become more and more focused on the inner workings of “the microwave,” both out of necessity and, surprisingly, out of curiosity.

Our software applications tell stories that often culminate in multi-million dollar, and in some cases billion dollar, implications, not necessarily for the good. Not only must the story be stringently accurate, it must also be 100% impartial (or agnostic) toward those who might be directly impacted by the results. We accomplish that impartiality by ensuring that we never begin our development process with an end result in mind. That is not to say that we do not begin with an end state in mind (i.e., what is it that you want to know?). The difference is nuanced in print but significant when it comes to applying the right level of acumen and forethought to software development. The true genius of leveraging software applications to solve complex problems is that once you’ve figured out “where and why it hurts,” you can use predictive analytics, modeling, and regression analysis to attack the root of the ailment. In my simplistic mind, our software is being used to treat the condition rather than the symptom.

The rigor that has been applied to the VV&A of our specific DoD program of record is staggering when compared to similar applications. And it should be. While many software developers are not particularly fond of documenting source code and explaining why a certain script was used, in the end that discipline has made both our customers and us extremely confident about our methodologies, processes, and coding standards. Frankly (although I’d never admit it to the folks who raked us over the coals), we’ve become a better development team because of it. By combining the exacting requirements associated with VV&A with our use of the Agile/Scrum development methodology, we’ve delivered an application that withstands painstaking scrutiny and is adaptive enough to answer evolving customer demands. In the end, the vetting our software application has endured at the hands of the accreditation agent is not itself the value-added proposition our customer demanded, although it was a necessary evolution. What really matters is that we’ve produced a traceable software application that is impartial. It may not always give you the answer you want, but it will always give you the answer that you need: the truth.