One of ProModel’s Biggest Supporters Reflects on a Career in Academic Optimization!

ProModel Guest Blogger:  Linda Ann Riley, Ph.D., Adjunct Professor of Engineering, University of New Haven; former Associate Dean and recently retired professor from the School of Engineering, Computing and Construction Management at Roger Williams University.


Linda Ann Riley, Ph.D.

When Sandra Petty, Academic Coordinator at ProModel Corporation, invited me to contribute to ProModel’s guest blog, it gave me an opportunity to reflect on an academic career with one ever-present constant: ProModel’s suite of simulation software products.  My universities may have changed, yet each year for the past twenty or so I have taught at least one (and often far more) discrete-event simulation course to an undergraduate, graduate, corporate or government audience.  Regardless of the class, whether Ph.D. students or freshman undergraduates, I have continued to use ProModel since its early days as one of the first Windows-based simulation products.  As ProModel Corporation has introduced new products (MedModel, ServiceModel, Process Simulator and Portfolio Simulator), my students have had an invaluable opportunity to be exposed to some of the best simulation products in the industry.

Each simulation class that I teach involves an external project in which students work with non-proprietary data from industry, government or non-profit entities. Working only with the ProModel Student Package, I have seen some of the most impactful and innovative uses of ProModel simulation software. From modeling casino floor slot machine layouts to nuclear reactor evacuation scenarios, the variety of applications for the software has been virtually limitless.  The simulation skill set students acquire is one of the primary factors companies cite when hiring graduates with ProModel experience.  Through the years, the aerospace, health care, automotive, logistics and defense industries have identified significant value in students graduating with exposure to ProModel’s suite of products.

I, too, have benefited from using ProModel software.  For my entire career, my research has focused on productivity/process analysis and optimization.  For the past twenty years, ProModel software has played a central role as an application tool for this research.  As ProModel Corporation has evolved with additional products and capabilities, so too has my research.  In the early years, I focused on health care process and facility layout improvement, using MedModel to simulate patient queuing alternatives, throughput strategies and the identification of system waste. From there, my research moved to rare-event simulation, such as security breaches and hazardous materials transportation incidents, and to hybrid simulation incorporating both discrete-event and continuous elements.  At that time, I used external code and output from other programs as inputs to ProModel. During this period, I also worked with Ph.D. students on new approaches to multi-objective evolutionary algorithms, as well as meta-heuristics for optimizing large-scale discrete-event simulations using SimRunner as a starting point.

More recently, my research has concentrated on managing and controlling risk in complex infrastructure projects using discrete-event simulation for stochastic scheduling.  In the construction industry, traditional project management and scheduling approaches for highly complex construction projects typically use methods such as CPM (critical path method), PERT (program evaluation and review technique) or Monte Carlo simulation. For the most part, these methods rely on deterministic or analytically tractable mathematical models underlying the schedule. The ability to accurately predict project schedule outcomes and manage performance volatility is essential to controlling risk.  Before ProModel Corporation introduced Project and Portfolio Simulator, I simulated the stochastic nature of these schedules directly in ProModel.
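The contrast between a deterministic critical-path estimate and a stochastic schedule simulation can be sketched in a few lines. The activities, durations and distributions below are entirely hypothetical, chosen only to illustrate why a simulated schedule's expected completion, and especially its 80th-percentile completion, typically exceed the deterministic estimate when activity durations are right-skewed:

```python
import random

# Hypothetical three-activity serial schedule; durations in days.
# Each activity gets a triangular distribution (min, mode, max) rather
# than the single point estimate a deterministic CPM pass would use.
ACTIVITIES = {
    "excavation": (10, 12, 20),
    "foundation": (15, 18, 30),
    "framing":    (20, 24, 45),
}

def simulate_once():
    """One replication: sample each activity's duration and sum the path."""
    return sum(random.triangular(lo, hi, mode)
               for lo, mode, hi in ACTIVITIES.values())

def simulate(n=10_000, seed=42):
    random.seed(seed)
    runs = sorted(simulate_once() for _ in range(n))
    return {
        "deterministic_cpm": sum(m for _, m, _ in ACTIVITIES.values()),
        "mean": sum(runs) / n,
        "p80": runs[int(0.8 * n)],   # 80th-percentile completion date
    }

results = simulate()
```

Because each duration distribution has a long right tail, the simulated mean completion comes in well above the deterministic sum of most-likely durations, and the 80th percentile higher still; that gap is precisely the schedule risk the deterministic methods hide.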

Even though I have recently retired from a full-time academic career, I will continue to teach discrete-event simulation using ProModel in an adjunct faculty capacity.  Looking to the future, my research will focus primarily on the design and incorporation of intelligent interfaces that identify and apply appropriate algorithms for the optimization problem and constraints under study. This would likely require an additional layer of code incorporated into the optimization process. Ultimately, such an intelligent interface could “learn” to recognize common optimization scenarios, select starting and stopping rules, and potentially interface with the system improvement framework as well.

As a further extension of the intelligent interface, dynamic algorithmic visualization capabilities might be incorporated into the optimization procedures.  Immersive technologies are already used in many simulation arenas.  Incorporating immersive visualization into optimization would bring transparency to the relationship between the modeling and optimization processes, allowing users and decision makers to interactively view, and potentially redirect, the optimization as it runs. In essence, this feature would let decision makers immerse themselves in the model, thus “directing” both the simulation and optimization processes.

In retrospect, discrete-event simulation and the ProModel Corporation have played a central role in my development as both a teacher and researcher.  I look forward to what the future holds for both the company and the field of discrete-event simulation.

About Dr. Linda Ann Riley

Contact Information: linda.ann.riley@gmail.com

Linda Ann Riley, Ph.D. presently serves as an Adjunct Professor of Engineering for the University of New Haven’s graduate program in Engineering and Operations Management. She recently retired as full professor from the School of Engineering, Computing and Construction Management at Roger Williams University (RWU) where she worked for twelve years. At RWU, she held the positions of Associate Dean, Engineering Program Coordinator and Professor of Engineering. She has over thirty years of teaching experience in both engineering and business and is the recipient of a number of corporate, university and national excellence in teaching awards. Dr. Riley is the author/co-author of over 100 articles, technical and research reports, and book contributions. Her area of scholarly interest involves the optimization of stochastic systems using simulation and evolutionary algorithms.

In addition, Dr. Riley is an active researcher with notable success in grant writing, grant and contract management, creating collaborative research partnerships and research administration. She has developed and written over 150 competitive research/consulting proposals and has been awarded or procured contracts for clients in excess of twenty-five million dollars. Prior to her position at Roger Williams University, Dr. Riley spent 17 years at New Mexico State University (NMSU), holding positions as Director of the University Center for Economic Development Research and Assistance, Assistant Director of the Center for Business Research and Services and Director of the Advanced Modeling and Simulation Laboratory. She also held faculty positions in both the Colleges of Business and Engineering at NMSU.

In addition to teaching and research, Dr. Riley is active in consulting. She has extensive consulting experience in organizational productivity/process improvement, implementing Six Sigma, Lean, system dynamics, simulation and optimization approaches. She also has extensive experience in the design, communication and implementation of strategic and economic development plans, and worked for a number of years with the National Laboratories on technology commercialization strategies.

Dr. Riley is actively involved in attracting women and under-represented groups into science, engineering, mathematics and technology fields. She is a national speaker on the challenges of attracting women and under-represented groups into these fields and served as Chair of the American Society for Engineering Education Northeastern Section and National Chairperson of American Society of Mechanical Engineers Diversity Metrics Committee. Dr. Riley is a member of several professional business and engineering societies and has served as reviewer and/or editorial board member for business, healthcare and engineering journals.

Dr. Riley received her undergraduate degree from Boston University, earned an M.B.A. from Suffolk University, completed a post-graduate fellowship at Brown University and earned her M.S. in Industrial Engineering and a Ph.D. in Business with a major field in Logistics from New Mexico State University. For eleven years, Dr. Riley also held the position of Vice-Chair of the Board for a large financial institution. In conjunction with this position, she completed 56 credits of Board of Directors courses and was awarded the Friedrich W. Raiffeisen and Edward W. Filene Awards.

Simulating The Impact Of New Laws On Probation Systems


Jennifer Cowden – Sr. Consultant

The U.S. Justice Department recently announced that it plans to release 6,000 inmates near the end of the month under new sentencing policies for non-violent drug offenders.  Most of the prisoners will be placed in halfway houses and drug rehab centers as part of the “largest one-time release of federal prisoners” in U.S. history, which raises the question: are these rehabilitation centers going to be ready for this sudden influx?

One state recently enacted a similar law change and is rightly concerned about the impact the new sentencing structure will have on its probation system and ancillary support services.  ProModel consultants have been working with this state’s Administrative Office of Probation to build a series of models around different aspects of the probation system.  The model from the previous phase studied the movement of youths through the juvenile probation system, while the model discussed in the video below addresses the adult probationer population.

In addition to gaining insight into bottlenecks in the process, the Probation Office was interested in using predictive analytics to assess the impact the new law will have on probation office workload and the local county jail occupancy rate.  Under the law change, convicts guilty of certain felonies will spend part of their sentence on probation instead of spending all of it in prison.  These felons are at a higher risk level than the current average probationer, and will likely cause a disproportionate workload increase for probation officers, as well as take up county jail space should custodial sanctions need to be imposed.  The model will be used to help quantify the increased demand so that the appropriate adjustments can be made ahead of time.
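A back-of-the-envelope sketch shows the kind of workload arithmetic such a model quantifies with far more fidelity. All figures below are hypothetical, not the state's actual data: higher-risk probationers require more supervision hours per month, so shifting the caseload mix toward high-risk felons raises officer demand disproportionately to the head-count increase:

```python
# Hypothetical supervision hours per probationer per month, by risk level.
HOURS_PER_MONTH = {"low": 1.0, "medium": 3.0, "high": 6.0}

def officer_demand(caseload_mix, hours_per_officer=120):
    """Estimate full-time officers needed for a given caseload mix."""
    total_hours = sum(count * HOURS_PER_MONTH[risk]
                      for risk, count in caseload_mix.items())
    return total_hours / hours_per_officer

# Before the law change: mostly low- and medium-risk probationers.
before = {"low": 2400, "medium": 1200, "high": 300}
# After: some felons serve part of their sentence on probation,
# shifting the mix toward high-risk supervision.
after = {"low": 2400, "medium": 1200, "high": 900}

demand_before = officer_demand(before)   # 65.0 officers
demand_after = officer_demand(after)     # 95.0 officers
```

In this toy mix, head count rises about 15 percent while officer demand rises about 46 percent, which is exactly the "disproportionate" effect a simulation model makes visible before it hits the real system.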

The next step for this model is to combine it with the juvenile model in order to predict more accurately the demand on shared services and resources.

Project Portfolio Management Made Easy!

In this 3-minute overview of Portfolio Scheduler, one of the many capabilities within Enterprise Portfolio Simulator (EPS), Dave Higgins demonstrates how this innovative function lets you recognize resource supply/demand constraints and reveal alternative portfolio delivery options.  Check it out!

To learn more about Portfolio Scheduler contact Dave Higgins at:

dhiggins@promodel.com  

717-884-8002

Demystifying Big Data

Rob Wedertz – Director, Navy Programs


We live in a data-rich world.  It’s been that way for a while now.  “Big Data” is now the moniker that permeates every industry.  For the sake of eliciting a point from the ensuing paragraphs, consider the following:

FA-18 / Extension / Expenditure / Life / Depot / Operations / Hours / Fatigue

Taken independently, the words above mean very little.  However, if placed in context, and with the proper connections applied, we can adequately frame one of the most significant challenges confronting Naval Aviation:

A higher than anticipated demand for flight operations of the FA-18 aircraft has resulted in an increased number of flight hours being flown per aircraft.  This has necessitated additional depot maintenance events to remedy fatigue life expenditure issues in order to achieve an extension of life cycles for legacy FA-18 aircraft.

120613-N-VO377-095  ARABIAN GULF (June 13, 2012) An F/A-18C Hornet assigned to the Blue Blasters of Strike Fighter Squadron (VFA) 34 launches from the flight deck of the Nimitz-class aircraft carrier USS Abraham Lincoln (CVN 72). Lincoln is deployed to the U.S. 5th Fleet area of responsibility conducting maritime security operations, theater security cooperation efforts and combat flight operations in support of Operation Enduring Freedom. (U.S. Navy photo by Mass Communication Specialist 2nd Class Jonathan P. Idle/Released)


The point here is that it is simply not enough to aggregate data for the sake of aggregation.  The true value in harnessing data lies in knowing which data are important, which are not, and how to tie the data together.  Oftentimes, subscribing to the “big data” school of thought has the potential to distract and misdirect.  I would argue that any exercise in “data” must first begin with a methodical approach to answering the following questions:

“What challenge are we trying to overcome?”

“What are the top 3 causes of the challenge?”

“Which factors are in my control and which ones are not?”

“Do I have access to the data that affect the questions above?”

“How can I use the data to address the challenge?”


While simply a starting point, the questions above will typically allow us to frame the issue, understand its causal effects, and, most importantly, facilitate the process of homing in on the data that are important while systematically ignoring the data that are not.

To apply a real-world example of the methodology outlined above, consider the software application ProModel has provided to the U.S. Navy – the Naval Synchronization Toolset (NST).

“What challenge are we trying to overcome?”

Since 2001, the U.S. Navy has participated in overseas contingency operations (Operation Enduring Freedom and Operation Iraqi Freedom), and the legacy FA-18 aircraft (A-D) has consumed its life expectancy at a higher rate than planned.  Coupled with the delay in Initial Operating Capability (IOC) of the F-35C, the U.S. Navy has been required to develop and sustain a Service Life Extension Program (SLEP) to extend the life of legacy FA-18 aircraft well beyond their six-thousand-hour life expectancy, and to schedule and perform high flight hour inspections and major airframe rework maintenance events.  The challenge is: “how does the Navy effectively manage the strike fighter inventory (FA-18), via planned and unplanned maintenance, to ensure strike fighter squadrons are adequately sourced with the right number of FA-18s at the right time?”

“What are the top 3 causes of the challenge?”

  • Delay in IOC of the F-35C
  • Higher flight hour (utilization) and fatigue life expenditure
  • Fixed number of legacy FA-18 in the inventory

“Which factors are in my control and which ones are not?”

 In:

  • High flight hour inspection maintenance events
  • Airframe rework (depot events)

Out:

  • Delay in IOC of the F-35C
  • Fixed number of legacy FA-18 in the inventory

“Do I have access to the data that affect the questions above?”

Yes.  The planned IOC of the F-35C, the flight hour utilization of FA-18 aircraft, and projected depot capacity and requirements are all data that are available and injected into the NST application.

“How can I use the data to address the challenge?”

Using the forecasted operational schedules of units, users can proactively source FA-18 aircraft to the right squadron at the right time, balanced against maintenance events, depot rework requirements, and the overall service life of each aircraft.
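One way to picture this kind of sourcing decision is a simple greedy rule: give each squadron the available airframe whose remaining flight hours most tightly cover its forecast usage, so that high-life airframes are held in reserve. This is an illustrative sketch with made-up tail numbers and hours, not NST's actual algorithm:

```python
# Illustrative sketch only: a greedy sourcing rule, not NST's real logic.
# Each aircraft has remaining flight hours before its next depot event;
# each squadron deployment will consume a forecast number of hours.

aircraft = {"A1": 800, "A2": 350, "A3": 1200, "A4": 500}   # hours remaining
demands = [("VFA-34", 700), ("VFA-81", 300)]               # forecast usage

def source(aircraft, demands):
    """Assign each squadron the aircraft whose remaining hours most
    tightly cover its forecast usage, preserving high-life airframes."""
    pool = dict(aircraft)
    plan = {}
    for squadron, hours in demands:
        # Candidates that can cover the deployment without a depot event.
        fits = [tail for tail, rem in pool.items() if rem >= hours]
        if not fits:
            plan[squadron] = None      # flags a depot/SLEP decision instead
            continue
        tail = min(fits, key=lambda t: pool[t])   # tightest fit first
        plan[squadron] = tail
        del pool[tail]
    return plan

plan = source(aircraft, demands)
```

Here the 700-hour deployment draws the 800-hour jet rather than the 1200-hour one, leaving the longest-lived airframe available for future demands; a real decision-support tool layers depot schedules, inspections and inventory constraints on top of this basic trade-off.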

Now that the challenge has been framed, the constraints have been identified, and the data identified, the real work can begin.  This is not to say that there is one answer to a tough question, or even that there is a big red “Easy” button available.  Rather, this approach ensures that we do not fall victim to fretting over an issue that is beyond our control, or spend countless hours wading through data that may not be germane.

NST was designed and developed with the points made above in mind.  The FA-18 is a data-rich aircraft.  However, for the sake of its users, NST was architecturally designed to track only the key fatigue life expenditure issues that ultimately determine whether an aircraft continues its service life or becomes a museum piece.  In the end, NST’s users are charged with providing strike fighter aircraft to the units that carry out our national security strategy.  By leveraging the right data, applying rigor to the identification of issues in and out of their control, and harnessing the technology of computational engines, they do precisely that.