Subject: RE: [soa-rm] SOA-RM Jan 3: delayed but something to think about - Continuous Engineering
Hi Ken,

What you recommend below seems necessary to me; however, I recommend considering it in the context of Continuous Engineering as that concept was originally (?) envisioned, dating back to at least 1999 (https://rd.springer.com/chapter/10.1007/978-3-540-49020-3_2) and, more recently (2014), as in Continuous Software Engineering and Beyond: Trends and Challenges (https://scalare.org/wp-content/uploads/2015/06/rcose20141.pdf).

A more concerted adaptation of that thinking to the contemporary [and note the “temporary” part of that word!] Agile thrust would definitely benefit many individuals and organizations. (The Continuous Engineering term has been somewhat co-opted, and possibly distorted, by its use in the IoT domain, especially as marketed by IBM. I’m not criticizing that initiative, but it is something “smaller” than the original conceptualizations.)

YMMV,
BobN

From: soa-rm@lists.oasis-open.org [mailto:soa-rm@lists.oasis-open.org]
On Behalf Of Ken Laskey

I have a meeting tomorrow morning that is scheduled to run until noon, so I suggest we have an abbreviated meeting from noon to 1300. Does this work for everyone?
Please review the minutes from 6 Dec 2017; I seem to recall that some of us were supposed to augment what Rex captured. I sent in some edits. Check whether you want to make some contributions.
Finally, some thoughts to consider. I submit the following as a thought piece on something I’ve been kicking around. It may have value or it may be a rathole; I’ll leave that up to you to decide.

There are a couple of weak spots that always show up in discussions of Agile, mostly raised by people with Waterfall experience who are not convinced Agile should take over the world. The issues are really carryovers where the Agile approaches don’t adequately address a known Waterfall need. The areas of interest here are requirements at the front end and documentation as we approach the back end.

From a Waterfall perspective, requirements are the starting point for what the system needs to accomplish. The traditional problems are (1) how well do we know at the start what the system is supposed to do, (2) do we really know the system in the detail to which the requirements (we believe) need to be specified, and (3) how likely are we to keep the official requirements up to date as we learn more about the system and about how our vision of the system really meets the user needs?

For our second issue: in Waterfall, documentation is often done at the end of the project, when staff are exhausted and lack patience. Depending on how the project went, there may not be adequate resources left to create quality documentation. If documentation is instead to be delivered throughout the project, then we run into the same problem as with requirements: we need attention and resources for updates that capture changes and learning along the way. The Agile emphasis on avoiding documentation that will never be updated and never be read is often interpreted as an excuse to avoid documentation altogether, rather than as a push to improve the documentation as a whole (where sometimes less is an improvement).

Agile and DevOps approaches emphasize “continuous”, whether that be continuous testing, continuous integration, or “continuous” almost anything else. DevOps also emphasizes automation to perform the enabling processes consistently and efficiently. We can look at “Continuous Requirements” as the collection of user stories/features/epics, but there are numerous complaints that user stories can become a collection of special cases, missing the common thread that would be revealed through traditional requirements analysis. Agile development counts on refactoring as a way to address this, but refactoring can be time consuming, and it still leaves us with the question of how we collect the stories (and eventually the refactored versions) in a useful way that provides an informative whole. The real problem is that these tasks still tend to be manual, essentially not continuous, and sometimes just not done.

What fundamental artifacts and enablers do we have to consistently address these issues? My thought is that this should be approachable as part of Continuous Testing. The capability delivered by a feature is not really the user stories ticked off the backlog but what has been tested and proven to work. The requirements satisfied are the collection of what the tests prove the system can do, and why we want the system to do the tested things, rather than a list of nice-to-haves. Can we aggregate the information that defines our tests to assemble our documentation? To what extent can this be automated? Can we create the idea of Continuous Documentation?

Have you seen this done (or attempted) before? What do we know about doing something like this, and what do we need to find out? How many of you have been involved with Continuous Testing? How were tests specified? How were tests documented? How were the tests configuration controlled? What do we do to spit out microservices and containers at the output end?

In summary, think of Continuous Requirements as the capture of our evolving thinking on what user stories/features/epics we implement, and of the rationale for our capability decisions. Think of Continuous Documentation as the capture of what the system can actually accomplish. Can we do this better than we typically do?
Ken