Yes – DevSecOps Can Be Done, and Done Well


Rob Wedertz – VP DoD Programs

DevSecOps Diagram

Inarguably, the pace of change in the technology environment outpaces program and acquisition oversight within the Department of Defense. I don’t believe this is a controversial statement. C-SPAN is riddled with testimony from senior-ranking DoD officials asserting the same, and the National Defense Authorization Act (NDAA) is littered with language encouraging the Department to accelerate the adoption of rapid acquisition methodologies. Nowhere is the delta between advanced technology capabilities and the Department’s ability to procure them more apparent than in software (e.g., Artificial Intelligence, Machine Learning, and Discrete Event Simulation). More specifically, it is the adoption of modern development methodologies such as DevSecOps that often befuddles program managers, contracting officers, and even leadership, because these methodologies run counter to traditional acquisition guidelines and requirements oversight.

In an effort to close the delta, the Department has established bodies (e.g., the DoD Enterprise DevSecOps Community of Practice – a joint effort among DoD CIO, OUSD(A&S), and DISA – the Defense Innovation Board, the Joint Artificial Intelligence Center, and others) to “sanctify” best practices, and it is actively campaigning to align acquisition and procurement with best-in-breed enabling technologies and development methodologies. Because we have been charged with designing, developing, and implementing the Joint Staff’s Global Force Management Decision Support Platform (ORION), we are actively “leaning out over our skis” to demonstrate that DevSecOps can and should be done.

As a software development company tasked to deliver leading-edge, technology-enabled decision support platforms to the Joint Staff, there is little more deflating than telling our platform leads that they cannot implement best-in-breed capabilities (e.g., open-source software, enablers, and architectures) because the product is evolving so quickly that we cannot introduce it into the Risk Management Framework accreditation sphere.

Fortunately for us, we were introduced to the Defense Innovation Unit (DIU) (then with an “experimental” on the tail) early in the ORION development process. They were encouraged by the startup mentality we had developed in support of our commercial products, and they encouraged our government oversight to think about things like Minimum Viable Products (MVPs), continuous user engagement, and leveraging modern technology and platforms. During their assessment of the ORION Joint Platform (at the time known as the Joint Force Capabilities Catalog (JFCC) / Global Laydown Server (GLS)), DIU acknowledged that we were already accomplishing the things they suggested. They passed as much to the Chairman of the Joint Chiefs of Staff and his support staff. Achieving this level of maturity didn’t happen overnight.

We lived the painfully slow migration from “waterfall” acquisition and its associated development practices to Agile, and we are now on the leading edge of DevSecOps. In fact, as DoD CIO, OUSD(A&S), and DISA work through “sanctifying” the DoD Enterprise DevSecOps maturity model (via a Community of Practice), and the Defense Innovation Board awaits the response to its Software Acquisition and Practices (SWAP) study published in April of this year, we’re already demonstrating that the DevSecOps model works, can be implemented at no additional cost to the government, and, perhaps most importantly, is scalable. Case in point – when we began the ORION project, we were squarely in the “rapid prototyping” phase of development while the overarching requirements were being developed and oversight was being codified. The early days required rapid deliveries and constant engagement with users, all while adhering to information assurance and cybersecurity requirements. (Note – we were, and are, deploying code to the SIPRNet, a production environment, every two weeks – functionality that is Beta, IOC, and FOC simultaneously.) Achieving and sustaining this level of software development maturity is difficult and often requires a champion.
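The mechanics of shipping Beta, IOC, and FOC functionality in the same bi-weekly release aren’t described here, but one common way to do it is feature gating. The sketch below is purely illustrative and assumes a simple environment-driven flag helper; the feature names, maturity levels, and the ENABLE_BETA variable are hypothetical placeholders, not ORION’s actual implementation.

```python
import os
from enum import Enum


class Maturity(Enum):
    """Release maturity levels that can coexist in one deployed build."""
    BETA = 1   # early functionality, visible to pilot users only
    IOC = 2    # initial operating capability
    FOC = 3    # full operating capability


# Hypothetical mapping of features to their current maturity level.
FEATURE_MATURITY = {
    "force_laydown_view": Maturity.FOC,
    "capability_catalog_search": Maturity.IOC,
    "predictive_sourcing": Maturity.BETA,
}


def feature_enabled(name: str, user_opted_into_beta: bool = False) -> bool:
    """Return True if the named feature should be exposed to this user.

    IOC and FOC features are always on; BETA features are gated behind an
    opt-in flag, so they can ride in the same release without being
    broadly visible until they mature.
    """
    maturity = FEATURE_MATURITY.get(name)
    if maturity is None:
        return False
    if maturity is Maturity.BETA:
        return user_opted_into_beta or os.getenv("ENABLE_BETA") == "1"
    return True


if __name__ == "__main__":
    for feature in FEATURE_MATURITY:
        print(feature, feature_enabled(feature, user_opted_into_beta=False))
```

With a pattern like this, a single production deployment can carry functionality at every maturity level, and promoting a feature is a one-line change to its flag rather than a separate release.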

Advocacy is paramount. It is not enough to be an innovative company with technical “chops”. You MUST have a program sponsor who endorses the DevSecOps methodology and removes the legacy barriers that prevent innovation at the speed required. It does not hurt that our advocacy came as a shared understanding and endorsement from the sitting CJCS and the leadership of DIU. That we were doing it was the result of technical leadership and guidance provided by our Joint Staff J35S Program Manager; that we are continuing to do it is the result of the senior leaders of the DoD acknowledging that this is the way it SHOULD be done. Early in the project, the J35 Deputy Director of Regional Operations briefed the entirety of the Joint Staff (J-DIRs, Director, and Chairman) and the Deputy Secretary of Defense. Paraphrasing his remarks: “these guys are pushing the envelope on s/w development. They sprint, they fail, they recover, they deliver, they iterate – we win.”

Perhaps the linchpin in achieving technical maturity in an oftentimes legacy environment is the simple acknowledgement that requirements WILL change. When we started ORION, Globally Integrated Operations and Dynamic Force Employment were not yet established in policy. Had we developed and delivered an application that reflected solely the original requirements specifications, both the program and our platform would now be obsolete. Fortunately, we’ve been allowed to iterate throughout the software development lifecycle. Continuous user feedback and rapid development cycles have kept the platform relevant and viable, ultimately enabling the Joint Staff to make Better Decisions, Faster.

Aligning the DevSecOps methodology with the Scaled Agile Framework has also ensured that ProModel is spreading best practices not only across our DoD vertical but also in our COTS and Healthcare spaces. Our collective roadmap is articulated in the Defense Innovation Board’s Software Acquisition and Practices (SWAP) study graphic below. Our objective is to live in the “Do’s” and demonstrate that we can and should avoid the “Don’ts.” ORION is validation that it can be done.

Finding Impartiality in S/W Applications


Rob Wedertz – Director, Navy Programs

For as long as I can remember, I’ve been a fan of, and have often used, the expression, “I don’t want to build the microwave, I just want to press cook.” (I’ve never been able to remember when or from whom I first heard it – my apologies for the lack of attribution.) While I’ve sometimes been fascinated by the inner workings and origins of things, in the end I’ve come to accept the modern world and the pace at which it moves. I simply don’t have the time of day to dig into the underbellies of things and investigate the underpinnings of how they work or their interdependencies. My aversion to such activities was upended when I joined ProModel and led (as a PM) our development team’s efforts to support Verification, Validation, and Accreditation (VV&A) at the behest of our sponsor’s modeling & simulation accreditation agent. While I do not intend to “build the microwave” here, I would like to describe how I learned that the act of “pressing cook” must be accompanied by complete and total impartiality of the software application.

Software, in a variety of formats, is often used to tell a story. When it comes to entertainment-based software, and for the sake of its longevity, the story should be a very good one. That is why many folks spend countless hours trying to “level up” (it’s all about the journey, not the destination). During my college days, I was exposed to Pascal and learned that the method (the computer language) for telling a story was via if, then, else, while, and similar statements. Truth be told, I didn’t particularly enjoy trying to figure out how to make the computer say “hello” that way. Again, I am more of a “show me the story” kind of person than a “how did you make the story” kind of person. In that regard, I’m quite fond of the software that exists today. My iPad is a bevy of mindless apps that keep my 5-year-old entertained while putting miles on the family wagon. However, when it comes to decision-support software, the stuff under the hood REALLY does matter and is often just as important as the story itself. Through the VV&A journey we’ve traveled to date, I’ve become more and more focused on the inner workings of “the microwave”, both out of necessity and, surprisingly, out of curiosity.

Our software applications tell stories that often culminate in multi-million-dollar and, in some cases, billion-dollar implications – not necessarily to the good. Not only must the story be stringently accurate, it must also be 100% impartial (or agnostic) toward those who might be directly impacted by the results. We accomplish that impartiality by ensuring that we never begin our development process with an end result in mind. That is not to say that we do not begin with an end state in mind (i.e., what is it that you want to know?). The difference is nuanced in print, but significant when it comes to applying the right level of acumen and forethought to software development. The true genius of leveraging software applications to solve complex problems is that once you’ve figured out “where and why it hurts”, you can use predictive analytics, modeling, and regression analysis to attack the root of the ailment. In my simplistic mind, our software is being used to treat the condition rather than the symptom.
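To make “treat the condition rather than the symptom” concrete, here is a minimal sketch of the kind of regression analysis mentioned above: fit several candidate drivers against an outcome and see which one actually explains it. The data, driver names, and coefficients are synthetic and purely illustrative; they are not drawn from any ORION model.

```python
import numpy as np

# Synthetic, illustrative data only: each row is one observed scenario,
# the columns of X are candidate "root cause" drivers, and y is the
# outcome we care about (e.g., a readiness or sourcing shortfall metric).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = 3.0 * X[:, 0] + 0.1 * X[:, 1] + rng.normal(scale=0.5, size=200)

# Ordinary least-squares fit: which driver actually explains the outcome?
A = np.column_stack([X, np.ones(len(X))])   # add an intercept column
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

for name, c in zip(["driver_1", "driver_2", "intercept"], coef):
    print(f"{name}: {c:+.2f}")

# A large coefficient on driver_1 and a near-zero one on driver_2 points
# the analysis at the condition (driver_1) rather than the symptom.
```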

The rigor that has been applied to the VV&A of our specific DoD program of record is staggering when compared to similar applications. And it should be. While many software developers are not particularly fond of documenting source code and explaining why a certain script was used, in the end it has made both our customers and us extremely confident in our methodologies, processes, and coding standards. Frankly (although I’d never admit it to the folks who raked us over the coals), we’ve become a better development team because of it. Combining the exacting requirements associated with VV&A with our use of the Agile/Scrum development methodology, we have delivered an application that withstands painstaking scrutiny and is adaptive enough to answer evolving customer demands. In the end, the vetting our software application has endured at the hands of the accreditation agent is not itself the value-added proposition our customer demanded, although it was a necessary evolution. What really matters is that we’ve produced a traceable software application that is impartial. It may not always give you the answer you want, but it will always give the answer that you need – the truth.
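As a small illustration of the kind of source-code documentation and traceability described above, the sketch below pairs a function with a docstring that records which requirement it satisfies and how it is verified. The requirement and test identifiers are hypothetical placeholders for illustration only, not artifacts of the actual ORION VV&A package.

```python
def compute_fill_rate(requested_units: int, sourced_units: int) -> float:
    """Percentage of a force request that could actually be sourced.

    Traceability (hypothetical identifiers, for illustration):
        Requirement: ORION-REQ-042 "Report sourcing fill rate per request"
        Verification: test_compute_fill_rate (unit test), VV&A case TC-17

    Raises:
        ValueError: if requested_units is not a positive number.
    """
    if requested_units <= 0:
        raise ValueError("requested_units must be positive")
    # Cap at 100%: over-sourcing a request does not raise the fill rate.
    return 100.0 * min(sourced_units, requested_units) / requested_units


def test_compute_fill_rate():
    """Unit test cited in the docstring's traceability block."""
    assert compute_fill_rate(10, 7) == 70.0
    assert compute_fill_rate(10, 12) == 100.0


if __name__ == "__main__":
    test_compute_fill_rate()
    print("fill rate for 7 of 10 units sourced:", compute_fill_rate(10, 7))
```

Documentation like this is what lets an accreditation agent walk from a requirement to the code that implements it and the test that proves it, which is the traceability the paragraph above is describing.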