One of ProModel’s Biggest Supporters Reflects on a Career in Academic Optimization!

ProModel Guest Blogger:  Linda Ann Riley, Ph.D., Adjunct Professor of Engineering, University of New Haven; former Associate Dean and recently retired professor from the School of Engineering, Computing and Construction Management at Roger Williams University.


Linda Ann Riley, Ph.D.

When Sandra Petty, Academic Coordinator at ProModel Corporation, invited me to contribute to ProModel’s guest blog, it gave me an opportunity to reflect on an academic career with one ever-present constant: ProModel’s suite of simulation software products.  My universities may have changed, yet each year for the past twenty or so I have taught at least one (and often far more) discrete-event simulation course to an undergraduate, graduate, corporate or government audience.  Regardless of the class, Ph.D. candidates or freshman undergraduates, I have continued to use ProModel since its early days as one of the first Windows-based simulation products.  As ProModel Corporation has introduced new products (MedModel, ServiceModel, Process Simulator and Portfolio Simulator), my students have had an invaluable opportunity to work with some of the best simulation products in the industry.

Each simulation class that I teach involves an external project where students work with non-proprietary data from industry, government or non-profit entities. Working only with the ProModel Student Package, I have seen some of the most impactful and innovative uses of ProModel simulation software. From modeling casino floor slot machine layout to nuclear reactor evacuation scenarios, the variety of applications for the software has been virtually limitless.  The simulation skillset acquired by students is one of the primary factors companies have cited when hiring students with ProModel experience.  Through the years, the aerospace, health care, automotive, logistics and defense industries have identified significant value in students graduating with exposure to ProModel’s suite of products.

I, too, have benefited from using ProModel software.  For my entire career, my research has focused on productivity/process analysis and optimization.  For the past twenty years, ProModel software has played a central role as an application tool for this research.  As ProModel Corporation has evolved with additional products and capabilities, so too has my research.  In the early years, I focused on health care process and facility layout improvement, using MedModel to simulate patient queuing alternatives, throughput strategies and identification of system waste. From there, my research moved to rare-event simulation, such as security breaches and hazardous materials transportation incidents, and to hybrid simulation incorporating both discrete-event and continuous elements.  At that time, I used external code and output from other programs as inputs to ProModel. During this period of research, I also worked with Ph.D. students on new approaches to multi-objective evolutionary algorithms as well as meta-heuristics for optimizing large-scale discrete-event simulations, using SimRunner as a starting point.

More recently, my research has concentrated on managing and controlling risk in complex infrastructure projects using discrete-event simulation for stochastic scheduling.  In the construction industry, traditional project management and scheduling approaches for highly complex construction projects typically use methods such as CPM (critical path method), PERT (program evaluation and review technique) or Monte Carlo simulation. For the most part, these methods rely on deterministic, tractable mathematical models underlying the schedule. The ability to accurately predict project schedule outcomes and manage performance volatility is essential to controlling risk.  Before ProModel Corporation introduced Project and Portfolio Simulator, I would simulate the stochastic nature of these schedules in ProModel.
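The contrast with a deterministic schedule can be made concrete with a small Monte Carlo sketch. The five-activity network and triangular durations below are purely hypothetical (not drawn from any project described here); the point is simply that sampling activity durations and taking the longest path on each replication yields a completion-time distribution, and with it the probability of missing a deadline, rather than a single deterministic date.

```python
import random

# Hypothetical network: duration = (optimistic, most likely, pessimistic) days,
# followed by the list of predecessor activities.
ACTIVITIES = {
    "A": ((3, 5, 9), []),
    "B": ((2, 4, 7), ["A"]),
    "C": ((4, 6, 12), ["A"]),
    "D": ((1, 2, 4), ["B", "C"]),
    "E": ((3, 5, 8), ["D"]),
}

def sample_completion(rng):
    """One replication: sample every duration, propagate finish times."""
    finish = {}
    for name, ((lo, mode, hi), preds) in ACTIVITIES.items():  # order is topological
        start = max((finish[p] for p in preds), default=0.0)
        finish[name] = start + rng.triangular(lo, hi, mode)
    return max(finish.values())

def schedule_risk(deadline, n=10_000, seed=7):
    """Mean completion time and probability of finishing after the deadline."""
    rng = random.Random(seed)
    times = [sample_completion(rng) for _ in range(n)]
    p_late = sum(1 for t in times if t > deadline) / n
    return sum(times) / n, p_late
```

Note that the mean simulated completion time typically exceeds the deterministic critical-path estimate computed from most-likely durations, because the maximum over paths is taken after sampling; this "merge bias" is exactly what deterministic CPM misses.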

Even though I have recently retired from a full-time academic career, I will continue to teach discrete-event simulation using ProModel in an adjunct faculty capacity.  Looking to the future, my research will focus primarily on the incorporation and design of intelligent interfaces that identify and apply algorithms for the optimization problem and constraints under study. This implies perhaps an additional layer of code incorporated into the optimization process. Ultimately, this intelligent interface could “learn” to recognize common optimization scenarios, select starting and stopping rules, and potentially also interface with the system improvement framework.

As a further extension to the intelligent interface, dynamic algorithmic visualization capabilities might be incorporated into the optimization procedures.  Immersive technologies are used in many simulation arenas.  Incorporating immersive visualization into optimization would serve to bring a transparency between the modeling and optimization processes. This would allow users and decision makers to interactively view, and potentially redirect the optimization process. In essence, this feature would provide the decision maker the ability to immerse him or herself into the model, thus “directing” both the simulation and optimization processes.

In retrospect, discrete-event simulation and the ProModel Corporation have played a central role in my development as both a teacher and researcher.  I look forward to what the future holds for both the company and the field of discrete-event simulation.

About Dr. Linda Ann Riley

Contact Information: linda.ann.riley@gmail.com

Linda Ann Riley, Ph.D. presently serves as an Adjunct Professor of Engineering for the University of New Haven’s graduate program in Engineering and Operations Management. She recently retired as full professor from the School of Engineering, Computing and Construction Management at Roger Williams University (RWU) where she worked for twelve years. At RWU, she held the positions of Associate Dean, Engineering Program Coordinator and Professor of Engineering. She has over thirty years of teaching experience in both engineering and business and is the recipient of a number of corporate, university and national excellence in teaching awards. Dr. Riley is the author/co-author of over 100 articles, technical and research reports, and book contributions. Her area of scholarly interest involves the optimization of stochastic systems using simulation and evolutionary algorithms.

In addition, Dr. Riley is an active researcher with notable success in grant writing, grant and contract management, creating collaborative research partnerships and research administration. She is responsible for developing and writing over 150 competitive research/consulting proposals and has been awarded or procured contracts for clients in excess of twenty-five million dollars. Prior to her position at Roger Williams University, Dr. Riley spent 17 years at New Mexico State University (NMSU) holding positions as Director of the University Center for Economic Development Research and Assistance, Assistant Director for the Center for Business Research and Services and Director of the Advanced Modeling and Simulation Laboratory. She also held faculty positions in both the Colleges of Business and Engineering at NMSU.

In addition to teaching and research, Dr. Riley is active in consulting. She has extensive consulting experience in organizational productivity/process improvement implementing six sigma, lean, system dynamics, simulation and optimization approaches. She has extensive experience in the design, communication and implementation of strategic and economic development plans. Also, she worked for a number of years with the National Laboratories on technology commercialization strategies.

Dr. Riley is actively involved in attracting women and under-represented groups into science, engineering, mathematics and technology fields. She is a national speaker on the challenges of attracting women and under-represented groups into these fields and served as Chair of the American Society for Engineering Education Northeastern Section and National Chairperson of American Society of Mechanical Engineers Diversity Metrics Committee. Dr. Riley is a member of several professional business and engineering societies and has served as reviewer and/or editorial board member for business, healthcare and engineering journals.

Dr. Riley received her undergraduate degree from Boston University, earned an M.B.A from Suffolk University, completed a post-graduate fellowship at Brown University and earned her M.S. in Industrial Engineering and a Ph.D. in Business with a major field in Logistics from New Mexico State University. Also, for eleven years, Dr. Riley held the position of Vice-Chair of the Board for a large financial institution. In conjunction with this position, she completed 56 credits of Board of Directors courses and was awarded the Friedrich W. Raiffeisen and Edward W. Filene Awards.

ProModel Salutes Founder Charley Harrell for Years of Service…and Getting This Whole Thing Started!

Dr. Charles Harrell founded ProModel in 1988 and was the original developer of the Company’s simulation technology (ProModel PC). Today he serves on the Board of Directors and has been actively involved in new product development, acting as chief technology advisor.  Charley is also an associate professor of Engineering and Technology at Brigham Young University and the author of several simulation books.

In May, Charley officially retired from the company, and to honor his innovative and productive career ProModel held two celebrations this summer at our locations in Orem, Utah and Allentown, Pennsylvania.  The events were attended by ProModel staff and many of Charley’s longtime colleagues who have been with him from the start.

In recent years, Charley has written about his team’s original vision for ProModel back in 1988, “We set out to revolutionize the use of simulation in the business world by introducing the first graphically oriented simulation tool for desktop computers.  We were all convinced that we offered a unique product—a simulation tool that was developed and supported by engineers and specifically designed for engineers.”

Describing the success of ProModel Corporation, Charley writes, “In addition to the impressive growth in ProModel’s predictive simulation technology, it has also been gratifying to see the breadth of application of our technology, not just in Fortune 500 companies, but also in the areas of healthcare, education, homeland security, military readiness and humanitarian aid.”

The entire ProModel family would like to thank Charley for his years of service, guidance and friendship and we wish him all the best in the future! He has made ProModel what we are today.

In 2013 ProModel celebrated its 25th Anniversary, and Charley shared his memories and appreciation for ProModel in this thoughtful BLOG POST.

Simulating Impatient Customers

ProModel Guest Blogger: Dr. Farhad Moeeni, Professor of Computer & Information Technology, Arkansas State University


Dr. Farhad Moeeni 

Simulation is one of the required courses for the MBA degree with MIS concentration at Arkansas State University.  The course was developed a few years ago with the help of a colleague (Dr. John Seydel).  We use Simulation Using ProModel, 3rd ed. (Harrell, Ghosh and Bowden; McGraw-Hill) for the course.  In addition, students have access to the full version of the ProModel software in our Data Automation Laboratory. The course has attracted graduate students from other areas, including arts and sciences, social sciences and engineering technology, who take the course as an elective or to enhance their research capability.  Students experience the entire cycle of simulation modeling and analysis through comprehensive group projects with a focus on business decision making.

Most elements of waiting lines are shared by various queuing systems regardless of entity types such as human, inanimate, or intangible.  However, a few features are unique to human entities and service systems, two of which are balking and reneging.  One of the fairly recent class projects included modeling the university’s main cafeteria with various food islands. Teams were directed to also model balking and reneging, which was challenging. The project led to studying various balking and reneging scenarios and their modeling implications, which was very informative.

Disregarding the simple case of balking caused by queue capacity, balking and reneging happen because of impatience.  Balking means a customer evaluates the waiting line, anticipates the required waiting time upon arrival (most likely by observing the queue length) and decides whether to join the queue or leave. In short, balking happens when the person’s tolerance for waiting is less than the anticipated waiting time at arrival.  Reneging happens after a person joins the queue but later leaves because he or she feels waiting is no longer tolerable or no longer has utility.  The literature indicates that both decisions can result from complex behavioral traits, the criticality of the service and the service environment (servicescape). Therefore, acquiring information about and modeling balking or reneging can be hard.  However, doing so offers additional insight into service effectiveness that is hard to derive from analyzing waiting times and queue lengths alone.

For modeling purposes, the balking and reneging behavior is usually converted into some probability distributions or rules to trigger them. To alleviate complexity, simplified approaches have been suggested in the literature.  Each treatment is based on simplifying assumptions and only approximates the behavior of customers. This article addresses some approaches to simulate balking.  Reneging will be covered in future articles.  Scenarios to model balking behavior include:

  1. On arrival, the entity joins the queue only if the queue length is less than a specified number but balks otherwise.
  2. On arrival, the entity joins the queue if the queue length is less than or equal to a specified number. However, if the queue length exceeds that number, the entity joins the queue with probability p and balks with probability 1 − p (Bernoulli distribution).
  3. The same as Model 2, but several (Bernoulli) conditional probability distributions are constructed for various queue lengths (see the example).
  4. On arrival, a maximum tolerable length of queue is determined from a discrete probability distribution for each entity. The maximum number is then compared with the queue length at the moment of arrival to determine whether or not the entity balks.

The first three approaches model the underlying tolerance for waiting implicitly.  Model 4 allows tolerance variation among customers to be modeled explicitly.

A simulation example of Model 3 is presented below. The purpose is to demonstrate the structure of the model, not model efficiency or compactness.  The model includes a single server, FCFS discipline, and random arrivals and service.  The conditional probability distributions of the balking behavior are presented in the table; in practice, these data must be estimated from the field.  The simulation model is also presented below. After running the model for 10 hours, 55 customers (about 10%) balked. Balking information can be very useful in designing or fine-tuning queuing systems, in addition to other statistics such as average/maximum waiting time, queue length, etc.

Conditional Probability Distribution

Condition                  Probability of Joining the Queue (p)   Probability of Balking (1 − p)
Queue Length <= 4          1.00                                   0.00
5 <= Queue Length <= 10    0.70                                   0.30
Queue Length > 10          0.20                                   0.80

[Prof. Moeeni’s simulation model chart]
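For readers who want to experiment outside ProModel, the Model 3 logic can be sketched in a few lines of Python. This is a hypothetical stand-in for the classroom model, not the ProModel implementation itself: a single FCFS server with exponential interarrival and service times (the rates here are invented for illustration), where each arrival joins or balks according to the conditional probabilities in the table above.

```python
import heapq
import random

def join_probability(queue_length):
    """Conditional joining probabilities from the table (Model 3)."""
    if queue_length <= 4:
        return 1.00
    if queue_length <= 10:
        return 0.70
    return 0.20

def simulate_balking(hours=10, arrival_rate=60.0, service_rate=55.0, seed=42):
    """Single-server FCFS queue with balking; rates are per hour."""
    rng = random.Random(seed)
    events = [(rng.expovariate(arrival_rate), "arrive")]  # event heap: (time, kind)
    queue = 0                  # customers waiting (excludes the one in service)
    busy = False
    arrivals = balked = served = 0
    while events:
        t, kind = heapq.heappop(events)
        if kind == "arrive":
            if t >= hours:
                continue       # stop generating arrivals at the horizon
            heapq.heappush(events, (t + rng.expovariate(arrival_rate), "arrive"))
            arrivals += 1
            if rng.random() < join_probability(queue):
                if busy:
                    queue += 1
                else:
                    busy = True
                    heapq.heappush(events, (t + rng.expovariate(service_rate), "depart"))
            else:
                balked += 1
        else:                  # "depart": server finishes, pulls next from queue
            served += 1
            if queue > 0:
                queue -= 1
                heapq.heappush(events, (t + rng.expovariate(service_rate), "depart"))
            else:
                busy = False
    return arrivals, balked, served
```

With an arrival rate slightly above the service rate, the queue builds over the run and a noticeable fraction of arrivals balk, which is precisely the behavior the class project had to capture.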

About Dr. Moeeni:

Dr. Farhad Moeeni is professor of Computer and Information Technology and the Founder of the Laboratory for the Study of Automatic Identification at Arkansas State University. He holds a M.S. degree in industrial engineering and a Ph.D. in operations management and information systems, both from the University of Arizona.

His articles have been published in various scholarly outlets including Decision Sciences Journal, International Journal of Production Economics, International Journal of Production Research, International Transactions in Operational Research, Decision Line, and several others. He has also co-authored two book chapters on automatic identification, with applications in cyber logistics and e-supply chain management, along with several study books in support of various textbooks.

He is a frequent guest lecturer on the subject of information systems at the “Centre Franco Americain”, University of Caen, France.

Current research interests are primarily in the design, analysis and implementation of automatic identification for data quality and efficiency, RFID-based real-time location sensing with warehousing applications, and supply chain management. Methodological interests include design of experiments, simulation modeling and analysis, and other operations research techniques. He is one of the pioneers in instructional design and the teaching of automatic identification concepts within MIS programs and also is RFID+ certified.

Dr. Moeeni is currently the principal investigator of a multi-university research project funded by the Arkansas Science and Technology Authority, co-founder of the Consortium for Identity Systems Research and Education (CISRE), and a member of the Editorial Board of the International Journal of RF Technologies: Research and Applications.

Contact Information

moeeni@astate.edu

In the OR with Dale Schroyer


Dale Schroyer – Sr. Consultant & Project Manager

I generally find that in healthcare, WHEN something needs to happen is more important than WHAT needs to happen.  It’s a field that is rife with variation, but with simulation, I firmly believe that it can be properly managed.  Patient flow and staffing are always a top concern for hospitals, but it’s important to remember that utilization levels that are too high are just as bad as levels that are too low, and one of the benefits of simulation in healthcare is the ability to staff to demand.

Check out Dale’s work with Robert Wood Johnson University Hospital where they successfully used simulation to manage increased OR patient volume: 

About Dale

Since joining ProModel in 2000, Dale has been developing simulation models used by businesses to perform operational improvement and strategic planning. Prior to joining ProModel Dale spent seven years as a Sr. Corporate Management Engineering Consultant for Baystate Health System in Springfield, MA where he facilitated quality improvement efforts system wide including setting standards and facilitating business re-engineering teams. Earlier he worked as a Project Engineer at the Hamilton Standard Division of United Technologies.

Dale has a BS in Mechanical Engineering from the University of Michigan and a Masters of Management Science from Lesley University. He is a certified Six Sigma Green Belt and is Lean Bronze certified.

NEW! ProModel’s Patient Flow Solution:

http://patientflowstudio.com/

ProModel Healthcare Solutions:

http://www.promodel.com/Industries/Healthcare

Simulation Ensures Patient Safety During Hospital Move

Northwest Community Hospital is an acute care hospital in Arlington Heights, Illinois, just outside of Chicago.  The staff at NCH had the very complex and delicate task of arranging and accomplishing the move of 150 patients to a newly constructed facility on campus.  This is a welcome but difficult situation that many healthcare organizations find themselves in today, as technology improvements and rising patient populations demand growth.

See how NCH achieved a flawless transition through predictive analytics and simulation:

Teaching Process Management Using ProModel

ProModel Guest Blogger:  Scott Metlen, Ph.D. – Business Department Head and Associate Professor at University of Idaho


Scott Metlen, Ph.D.

Understanding process management (the design, implementation, management and control, and continuous improvement of an organization's enterprise-wide set of processes) is the key to well-deployed strategies. It was not until Tim Cook made Apple's total set of processes world class, including all supply-chain-linked processes (Brownlee, 2012), that Apple began its amazing climb to become the world's highest-valued company, even though the company had cutting-edge products before his arrival. Gaining an effective understanding of process management is not easy, due to the strategic variability inherent in the portfolio of products that companies sell and in the markets they serve. This strategic variability (Suri, 2011) in turn drives variability in many of the processes an organization uses to operate. For instance, different markets require different marketing plans supported by different processes.  Order processes often vary by product and target market. Employee skill sets differ by product, requiring different hiring and training processes. Different products, whether services or goods, that have even a slight variation require at the very least an adjustment to the production process. Adding to, and often caused by, the variability just mentioned are multiple process steps, each with different duration times and human resource skills.  Depending on what product is currently being produced, the process steps, their order and duration, the interdependencies between them, and the governing business rules all vary. Where a product is in its life cycle drives the experience curve, again creating variation across products. In addition, the numerous interfaces with other processes all vary depending on the product being produced. All of these sources of variability can make process management hard to do, teach, and learn.
One tool that helps with process management in the face of variance is discrete-event simulation, and one of the best software suites to use is ProModel, a flexible program with excellent product support from the company.

Effective process management is itself a multi-step process. The first step is to determine the process flow while at the same time identifying the value-added and non-value-added process steps. The process flow diagram includes, for each step, the duration times by product, the resources needed, and the product routes. Also needed at this stage are the business rules governing the process, such as working hours, safety envelopes, quality control, queueing rules, and many others. Capturing this complex, interrelated system begins by visiting the process and talking with the process owner and operators. Drawing the diagram and listing the other information is a good second step, but actually building and operating the process is when a person truly understands it and its complexities.  Of course, many of the processes we want to improve are already built and in use, and in most cases students will not be able to do either of these. However, building a verified and validated simulation model is a good proxy for doing the real thing, as the model will never validate against the actual process output unless all of the complexity is included or represented in the model. In the ‘Systems and Simulation’ course at the University of Idaho, students first learn the fundamentals of process management, including lean terms and tools. Then, in the third week of class, they visit a company as members of a team conducting a process improvement project. In this visit students meet the process owner and operators. If the process is a production process, they walk the floor and discuss the process and the delta between expected and actual output. If the process is an information flow process, such as much of an order process, the students discuss the process and, again, the delta between expected and realized output.
Over the next six weeks students take the preliminary data and begin to build a simulation model of the current state of the process. During this period students discover that they do not have all the data and information they need to replicate the actual process. In many cases the data or information is missing because the company does not have it, or because the process is not operated the way it was designed. Students then have to contact the process owner and operators throughout the six weeks to determine the actual business rules used and/or make informed assumptions to complete their model.

Once the model has been validated and the students have a deep understanding of the process, students start modeling process changes that will eliminate waste in the system, increase output, and decrease cost. Examples of methods used to improve the process include changing business rules, adding strategically placed buffers and resources, and reallocating resources. To determine the most effective way to improve the process, a cost benefit analysis in the form of an NPV analysis is completed. The students use the distribution of outputs from the original model to generate appropriate output and then compare that output to output pulled from the distributions of each improvement scenario. This comparison is then used to determine a 95% confidence interval for the NPV and the probability of the NPV being zero or less. Finally, several weeks before the semester is finished, students travel to the company to present their findings and recommendations.
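The scenario comparison the students perform can be sketched as follows. This is a simplified illustration with made-up numbers, not the students' actual models: each replication supplies a vector of annual benefit estimates (drawn from the difference between the improved and baseline output distributions), and the script computes a 95% confidence interval for the NPV and the empirical probability that the NPV is zero or less.

```python
import math
import random

def npv(cash_flows, rate):
    """NPV where cash_flows[0] is the time-0 investment (negative)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def npv_confidence_interval(replications, rate=0.10, investment=100_000.0, z=1.96):
    """95% CI for NPV and P(NPV <= 0) across simulation replications."""
    npvs = [npv([-investment] + flows, rate) for flows in replications]
    n = len(npvs)
    mean = sum(npvs) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in npvs) / (n - 1))
    half = z * sd / math.sqrt(n)          # half-width of the confidence interval
    p_nonpositive = sum(1 for x in npvs if x <= 0) / n
    return (mean - half, mean + half), p_nonpositive

# Hypothetical example: 200 replications of three years of annual savings.
rng = random.Random(0)
reps = [[rng.gauss(45_000, 5_000) for _ in range(3)] for _ in range(200)]
(lo, hi), p_loss = npv_confidence_interval(reps)
```

The empirical P(NPV ≤ 0) is a direct, distribution-free answer to "what is the chance this improvement loses money," which is the quantity the students report alongside the confidence interval.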

Student learning on these projects is multifaceted. Learning how to use ProModel is the level that the students are most aware of during the semester, as it takes much of their time. However, by the end of the semester they talk about improving their ability to manage processes, work in teams, deal with ambiguity, manage multiple projects, present to high level managers, and maintain steady communication with project owners.

Utilizing external projects and discrete event simulation to teach process management has been used in the College of Business and Economics at the University of Idaho for the past six years. As a result, the Production and Operation area has grown from 40 to 150 students and from five to 20 projects per semester. More importantly, students who complete this course are being sought out and hired by firms based on the transformational learning and skill sets students acquired through the program.

References:

Suri, Rajan. Beyond Lean: It’s About Time. Technical Report, Center for Quick Response Manufacturing, University of Wisconsin-Madison, 2011.

Brownlee, John. “Apple’s Secret Weapon.” 06/13/2012. http://www.cnn.com/2012/06/12/opinion/brownlee-apple-secret/index.html?hpt=hp_t2. Accessed 12/30/2014.

Scott Metlen Bio:

http://www.uidaho.edu/cbe/business/scottmetlen

 

Flanagan Industries Brings New Facility Online Thanks To ProModel Solution

Flanagan Industries is a major contract manufacturer of aerospace hardware specializing in highly engineered, high-value machined components and assemblies.  Over the years their manufacturing operations had been growing steadily to the point where they absolutely needed additional capacity.  The original space was not conducive to a manufacturing environment and had become an impediment to taking on more business and staying competitive in the global economy.  So Flanagan decided to expand by opening a new facility that could house bigger and better machinery; however, they needed to ensure that the move to the new location would not disrupt their current operations and customer orders.

In the video below, see how Flanagan used a ProModel Simulation Solution to successfully bring their new facility online:

 

 

 

Demystifying System Complexity


Charles Harrell, Founder ProModel Corporation

One can’t help but be awestruck, and sometimes even a little annoyed, by the complexity of modern society. This complexity spills over into everyday business systems, making them extraordinarily challenging to plan and operate. Enter any factory or healthcare facility and you can sense the confusion and lack of coordination that often seems to prevail. Much of what is intended to be a coordinated effort to get a job done ends up being little more than random commotion resulting in chance outcomes. Welcome to the world of complex systems!

A “complex system” is defined as “a functional whole, consisting of interdependent and variable parts.” (Chris Lucas, Quantifying Complexity Theory, 1999, http://www.calresco.org/lucas/quantify.htm) System complexity, therefore, is a function of both the interdependencies and variability in a system. Interdependencies occur when activities depend on other activities or conditions for their execution. For example, an inspection activity can’t occur until the object being inspected is present and the resources needed for the inspection are available. Variability occurs when there is variation in activity times, arrivals, resource interruptions, etc. As shown below, the performance and predictability of a system is inversely proportional to the degree of interdependency and variability in the system.


Suppose, for example, you are designing a small work cell or outpatient facility that has five sequential stations with variable activity times and limited buffers or waiting capacity in between. Suppose further that the resources needed for this process experience random interruptions. How does one begin to estimate the output capacity of such a system? More importantly, how does one know what improvements to make to best meet performance objectives?
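One way to see why intuition fails here is a quick numerical sketch. The recursion below is a standard departure-time formulation for serial lines with finite buffers and blocking-after-service; the parameter values (processing-time variability, interruption frequency, repair time) are invented for illustration, not drawn from any real facility.

```python
import random

def line_throughput(n_parts=2000, n_stations=5, buffer=1, mean_time=1.0,
                    cv=0.5, fail_prob=0.02, mean_repair=5.0, seed=1):
    """Estimated throughput (parts per time unit) of a serial line with
    variable activity times, finite interstation buffers, and random
    interruptions, via the departure-time recursion for tandem lines
    with blocking-after-service."""
    rng = random.Random(seed)

    def effective_time():
        t = max(0.05, rng.gauss(mean_time, cv * mean_time))
        if rng.random() < fail_prob:            # random interruption
            t += rng.expovariate(1.0 / mean_repair)
        return t

    # D[j][i] = departure time of part i from station j (row 0 = raw material)
    D = [[0.0] * (n_parts + 1) for _ in range(n_stations + 1)]
    for i in range(1, n_parts + 1):
        for j in range(1, n_stations + 1):
            start = max(D[j - 1][i], D[j][i - 1])   # part ready and station free
            finish = start + effective_time()
            k = i - (buffer + 1)                    # blocking: wait for a space
            if j < n_stations and k >= 1:
                finish = max(finish, D[j + 1][k])
            D[j][i] = finish
    return n_parts / D[n_stations][n_parts]
```

Running this with different buffer sizes shows the characteristic result: throughput falls well below the naive one-part-per-cycle estimate as soon as variability and interruptions interact with limited buffers, and adding buffer capacity recovers some, but never all, of the loss.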

Obviously, the larger the process and greater the complexity, the more difficult it is to predict how a system will perform and what impact design decisions and operating policies will have. The one thing most systems experts agree on, however, is that increasing complexity tends to have an adverse effect on all aspects of system performance including throughput, resource utilization, time in system and product or service quality.


ProModel and MedModel are powerful analytic tools that are able to account for the complex relationships in a system and eliminate the guesswork in systems planning. Because these simulation tools imitate the actual operation of a system, they provide valuable insights into system behavior with quantitative measures of system performance.

To help introduce process novices to the way interdependencies and variability impact system performance, ProModel has developed a set of training exercises using an Excel interface to either ProModel or MedModel. Each exercise exposes the initiate to increasingly greater system complexity and how system performance is affected. Additionally, these exercises demonstrate the fundamental ways system complexity can be mitigated and effectively managed.

ProModel is offering these exercises to students and practitioners who are seeking an introduction to simulation and system dynamics.

 

For more information please contact ProModel Academic

Sandra Petty, Academic Coordinator  spetty@promodel.com

Designing Better Care For Your OR


Jennifer Cowden – Sr. Consultant

Earlier this year, my family and I took a vacation to a certain kid-friendly theme park.  As we wandered from ride to ride, we couldn’t help but note that, even at the peak times on the more popular rides, you rarely saw crowds standing outside waiting. The long lines were all contained within a succession of fairly climate-controlled rooms that obviously took some thought to plan. This particular company is big into predictive analytics, so I would hazard to say that they didn’t just guess at the maximum size of the line at peak time; they are probably not going to go live with a new attraction or other big change unless they simulate it first.  An interesting dynamic that we observed was that when a wait time for an attraction was lowered on their new mobile app, we could literally see the “flash mob” of patrons converge on that ride, causing the line to go from a 10-minute wait to a 30-minute wait in the blink of an eye.  I turned to my husband, who is also an engineer and a geek, and said “I wondered if their model predicted that.”

Theme parks obviously need to be concerned about a positive overall visitor experience; after all, they are always competing for discretionary funds with other sources of entertainment.  Now, more and more hospitals are developing that same mindset: being cognizant of the overall patient experience to the point of modeling new spaces before they go live.  How many ORs should they outfit for opening day, and how many can wait?  How can they make the best use of the spare rooms?  Is there enough space in the corridors that patients won’t feel too crowded?  Is there enough space in the waiting areas for the families of outpatients?  How many staff members do they need in each department to minimize patient wait time?  Are there any unforeseen bottlenecks due to sudden dynamic shifts?  These are just a few of the questions that simulation can answer.
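A question like "how many ORs should open on day one?" comes down to trading room count against patient wait. The sketch below is a deliberately simplified stand-in for a real MedModel study (the 15-minute arrival gap and 40-minute procedure time are invented figures, and real models capture far more detail); it sizes a bank of interchangeable rooms by tracking only when each room next becomes free:

```python
import heapq
import random

def avg_wait(num_rooms, n_patients=20_000, mean_interarrival=15.0,
             mean_procedure=40.0, seed=7):
    """Average minutes a patient waits for a room: each room is modeled
    only by the time it is next available, kept in a min-heap."""
    rng = random.Random(seed)
    free_at = [0.0] * num_rooms          # when each room next frees up
    heapq.heapify(free_at)
    clock, total_wait = 0.0, 0.0
    for _ in range(n_patients):
        clock += rng.expovariate(1.0 / mean_interarrival)  # next arrival
        room_ready = heapq.heappop(free_at)                # earliest-free room
        start = max(clock, room_ready)
        total_wait += start - clock
        heapq.heappush(free_at, start + rng.expovariate(1.0 / mean_procedure))
    return total_wait / n_patients

for rooms in (3, 4, 5):
    print(f"{rooms} rooms: avg wait {avg_wait(rooms):5.1f} min")
```

Running it shows the characteristic shape of the answer: going from three rooms to four slashes the average wait dramatically, while the fifth room buys only a few more minutes, which is the diminishing-returns curve a planning team needs to see before committing to capital spend.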

Check out Jennifer’s Ambulatory Care/OR Suite Model:

About Jennifer

Before joining ProModel in 2013, Jennifer spent 15 years in the automation industry working for a custom turnkey integrator. As an Applications Engineer she built simulation models (primarily using ProModel) to demonstrate throughput capacity of proposed equipment solutions for a variety of customers. Jennifer’s experience covers a wide range of industrial solutions – from power-and-free conveyor systems to overhead gantries and robotic storage and retrieval systems. She has also created applications in the pharmaceutical, medical device, automotive, and consumer appliance industries.

Jennifer has a BS in Mechanical Engineering and a Master of Science in Mechanical Engineering from the Georgia Institute of Technology.

Busy Season at ProModel

Keith Vadas – ProModel President & CEO

I am pleased to report ProModel’s second quarter was very positive.  Like many businesses in the US, we find ourselves on a serious upswing this summer of 2014.  Our consultants are working on several projects in a variety of industries, including shipbuilding, power management, retail, manufacturing, food processing, and government contracting.  In all of these projects, our experienced team of consultants is working to improve efficiency, save money, and enable better decisions for their clients.

ProModel’s DOD projects continue to thrive.  It is hard to believe it has been eight years since we started working with FORSCOM (US Army Forces Command) on AST (ARFORGEN Synchronization Tool).  LMI-DST (Lead Materiel Integrator – Decision Support Tool) with the LOGSA Team (US Army Logistics Support Activity) is also going strong.  Our agile team of software developers keeps improving the development process within ProModel, and it shows. Just recently, the NST Airframe Inventory Management Module was granted full accreditation by the Commander, Naval Air Systems Command.

The time is also ripe for opportunities in healthcare.  Our patient flow optimization capabilities are perfect for helping hospitals and outpatient clinics improve efficiency.  Now that the Affordable Care Act has been around for a couple of years, its impact is being felt by healthcare organizations around the country.  The expanded insured base, along with the need for improved processes and different care models, is making it absolutely necessary to consider the value of modeling and simulation.  ProModel continues to work with several organizations, including Presbyterian Homes and Services and Array Architects, who use MedModel simulation in their design processes to enhance patient flow in healthcare facilities.

To better support our base of existing customers, we released ProModel/MedModel 2014 in July and PCS Pro 2014 at the end of Q1.  EPS 2014 (Enterprise Portfolio Simulator) was released in Q2 and includes a new, easy-to-use, web-based rapid scenario planning tool – Portfolio Scheduler.  You can check this tool out online at http://portfoliostud.io/#.

There continue to be lots of exciting things happening at ProModel.  We have an outstanding team of consultants, software developers, and designers just looking for an opportunity to PARTNER with you to help you meet the next business challenge or solve the next unexpected problem.