There’s some good evolution here. Some initial thoughts:
It might be useful to defer to the Semantic Versioning (SemVer) framework for defining what can happen at each of the MAJOR.MINOR.PATCH numbering points. Moving to the language of major change, minor change and patch might also help build understanding with those who don’t immediately map those concepts onto integer and decimal version numbers.
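For reference, the SemVer convention could be sketched roughly as follows (an illustrative example only, not part of the draft; the `bump` function name is hypothetical):

```python
def bump(version: str, change: str) -> str:
    """Return the next MAJOR.MINOR.PATCH version for a given change type.

    change: 'major' (backwards-incompatible change),
            'minor' (backwards-compatible addition), or
            'patch' (backwards-compatible fix).
    """
    major, minor, patch = (int(part) for part in version.split("."))
    if change == "major":
        return f"{major + 1}.0.0"   # breaking change resets minor and patch
    if change == "minor":
        return f"{major}.{minor + 1}.0"  # new feature resets patch
    if change == "patch":
        return f"{major}.{minor}.{patch + 1}"
    raise ValueError(f"unknown change type: {change}")

print(bump("1.4.2", "major"))  # 2.0.0
print(bump("1.4.2", "minor"))  # 1.5.0
print(bump("1.4.2", "patch"))  # 1.4.3
```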
When thinking about a ‘beta’ of each version for testing, ‘Release Candidate’ might be good terminology to use.
Does all of the documentation fall under ‘additional rules and guidelines’?
- It seems to me it might be worth separating rules (which are normative, and affect ‘validity’) from guidelines (which might be thought of as advisory, and affect ‘utility’ or ‘quality evaluation’).
- If there is documentation that falls outside this governance process, is it clearly identified?
Terminology-wise, I wonder if intention and implementation might better capture the point than content and implementation. I tend to find there is a three-stage process from intent (we want to do X with data), via content (what kind of data do people hold, what can be extracted from existing systems, what can be queried effectively), through to implementation (what exactly the data element should look like).
I think moving away from tight timings makes sense given the realities of the process, but it would be good to keep some principles on minimum consultation periods, just to manage expectations about how long people will get to review major or minor changes.
The details of ‘step vi’ (final approval) may need some working out. In OCDS, the governance group’s responsibility at that point is just to confirm the process has been followed, not to have any say on the substantive content of the standard. I think there is a risk that, as the governing board were involved in setting intention, their involvement at this step could raise substantive implementation issues that should have been picked up earlier.
Happy to join a more in depth discussion as useful.