IP Maturity Matrix

The aim of this framework is to measure maturity across the disciplines required to build, test, deploy, reuse, support & sell Intellectual Property. This flexible framework is based on a number of conventions to measure "quality"; many of these measures are subjective rather than quantitative, and need to be reviewed by people rather than automated by tools.

Our IP exists at different levels of fidelity:

Level Type
0 Script / Template / Code file
1 Component
2 Tool
3 HTTP APIs
4 Solution
5 Product

Each of these has different nuances when it comes to the categories in the IP Maturity Matrix outlined below.

Categories

We have, collectively, defined the following categories as quality measures for our IP. In many ways these measures can be thought of as a more traditional "definition of done". These quality measures cover different aspects of the lifecycle of IP, from maintenance to adoption, to operational support, and also include commercial and legal concerns.

Shared Engineering Standards

We have developed a standardised set of configuration files for projects and IDEs. Using these creates a pit of quality for internal and external developers to follow.

Score Measure
0 None
+1 Configured

Coding Standards

We have agreed upon coding standards for our primary languages, and in most languages these are enforced by linters or other code analysis tools.

Score Measure
0 None
+1 Enforced via tooling

Executable Specifications

Executable Specifications are our fundamental design tool; whether via Gherkin or OpenAPI, they create a shared understanding of behaviour and a common domain language.

Score Measure
0 None
+1 Specs which cover golden path / APIs have full OpenAPI Definition
+1 Specs which cover common failure cases
+1 Specs which explore edge cases

Coverage

Code coverage should be used as an ancillary measure of how well we have written our Executable Specifications. We would expect a similar score across both categories; a discrepancy between scores requires further inspection & analysis.

Score Measure
0 0-25%
+1 26-50%
+1 51-75%
+1 76%+

Benchmarks

We are commonly asked to write high performance / low latency code; understanding the performance characteristics from a memory and CPU perspective is vital, as production performance issues have a big reputational impact.

Score Measure
0 None
+1 Benchmarks which cover baseline performance
+1 Benchmarks which demonstrate failure conditions

Reference Documentation

Code should be self-documenting, but long-term support and maintenance of code requires context and narrative as well as purpose.

Score Measure
0 None
+1 Good quality
+1 Technical Fellow quality

Design & Implementation Documentation

One of our constant failings is undervaluing the thought and effort that goes into creating solutions, and failing to capture all that knowledge in a format we can use to take customers along on the journey.

Score Measure
0 None
+1 Up-to-date architecture & high level conceptual docs & diagrams
+1 Logical, infrastructure, security & ops views
+1 Constraints & extensibility (obtained from benchmarks & specs)

How-to Documentation

The use of IP can be nuanced. Effective documentation allows users to be self-starters. Questions should be captured as FAQs.

Score Measure
0 None
+1 Common scenarios
+1 Extensibility Scenarios
+1 FAQs / Troubleshooting

Demos

Understanding how to get started with our IP, or how different elements of IP can be used together, is fundamental for increased productivity.

Score Measure
0 None
+1 Common scenarios
+1 Extensibility Scenarios
+1 End to end integration (with other IP)

Date of Last IP Review

How recently IP was reviewed is a powerful code smell. We exist at the bleeding edge, which means that change is a constant. We need to be vigilant to change, especially when we depend on cloud PaaS services that can change beneath our feet.

Score Measure
0 Unknown
1 > 3 months ago
2 1-3 months ago
3 < 1 month ago

Framework Version

The majority of our IP is based on the .NET Framework. This is now a rapidly evolving ecosystem. Staying up to date is no small feat.

Score Measure
0 Using an unsupported flavour of .NET
+1 Using a LTS version of a flavour of .NET
+1 Using the most current LTS flavour of .NET

Associated Work Items

The number of associated work items is another code smell; it can signal either chaos or order. We need to distinguish between these two states.

Score Measure
0 None
1 Bugs & Features
2 Curated Bugs & Features
3 Active Roadmap

Source Code Availability

One of the most overlooked aspects of our use of IP is supporting our contractual clauses, allowing customers to access the source code for the binaries we use. Historically, we have been approached 5 years after the actual engagement as part of acquisition / due diligence processes.

Score Measure
0 None
1 Snapshot archive for escrow
2 Private OSS / Mirrored Repo
3 Public OSS

License

A foundation of our commercial success is establishing the licensing of our IP.

Score Measure
0 None
+1 Copyright headers in each source file
+1 License in Source & Packages
+1 Contributor license in Repo

Production Use

A good measure of the quality of our IP is how many times it is being used in a production environment by our customers.

Score Measure
0 None
+1 Accepted by a customer
+1 In production use by a customer
+1 In production use by multiple customers

Insights

When things go wrong, we need the infrastructure in place to help us quickly resolve the situation.

Score Measure
0 None
+1 Telemetry, Diagnostics & Debugging
+1 Perf Counters
+1 Operational Insights (Custom Queries defining abnormal behaviour)

Packaging

It's one thing to create reusable IP; it's entirely another for it to be in a form that is easy to find and use.

Score Measure
0 None
+1 Packaged
+1 Versioned
+1 Discoverable

Deployment

The final hurdle is getting the IP into an environment where it can be used.

Score Measure
0 None
+1 Scripted & Documented
+1 Templated
+1 Multi-tenanted - as a Service

Ops

We need to consider other personas, whose function is to support the IP we create. How do we make their experience "delightful"?

Score Measure
0 TBD

The IMM Schema

The IMM schema is designed to be very flexible as you may decide that you want to add or alter categories. The current RuleSet is defined in the Endjin.Imm.App project in the solution.

Each rule needs a unique, GUID-based Id, a name, and a data type, which can be either Discrete (meaning it can only be scored by a single item listed in the schema) or Continuous (meaning that the score can be cumulative). These two data types are used to select and run the correct rules engine to calculate the values used to render the badges.
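
As a purely illustrative sketch of how such a rule set might be expressed (the property names and GUIDs below are assumptions for the sake of example, not the actual schema, which is defined in the Endjin.Imm.App project), a Discrete and a Continuous rule could look something like this:

```yaml
# Hypothetical rule-set sketch; property names and GUIDs are illustrative
# assumptions, not the schema actually defined in Endjin.Imm.App.
rules:
  - id: 00000000-0000-0000-0000-000000000001   # unique, GUID-based Id
    name: Shared Engineering Standards
    dataType: Discrete        # scored by exactly one of the items below
    measures:
      - score: 0
        description: None
      - score: 1
        description: Configured
  - id: 00000000-0000-0000-0000-000000000002   # unique, GUID-based Id
    name: Executable Specifications
    dataType: Continuous      # scores for each matching item are accumulated
    measures:
      - score: 1
        description: Specs which cover golden path
      - score: 1
        description: Specs which cover common failure cases
      - score: 1
        description: Specs which explore edge cases
```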

How to use the IP Maturity Matrix

Each project that adopts the IMM simply adds an imm.yaml file into the root of the repo. Endjin.Ip.Maturity.Matrix.Host is an Azure Function which, when given the path to the GitHub repo, will render the IMM Measure as a badge; this can be displayed in the repo and provides an "at a glance" view of the quality of the project.
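
As a minimal sketch of what that file might contain (the property names below are assumptions rather than the exact imm.yaml schema), a repo could record its scores against the rule Ids like this:

```yaml
# Hypothetical imm.yaml placed in the root of the repo; property names are
# illustrative assumptions rather than the exact schema.
assessments:
  - ruleId: 00000000-0000-0000-0000-000000000001   # Shared Engineering Standards
    score: 1
  - ruleId: 00000000-0000-0000-0000-000000000002   # Executable Specifications
    score: 2
```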

The Azure Function exposes an HTTP Trigger, which has two actions.

Total Score

Call the Function: api/imm/github/<Org_Name>/<Repo>/total?cache=false

And it will render a badge - the example below is for the AIS.NET Project (you may need to refresh the page to wake the Function up!)

[IMM total score badge for AIS.NET]
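
Since the Function returns the badge as an image, it can be embedded directly in a repo's README with a standard Markdown image link; as a sketch (the Function host name is a placeholder for wherever your instance of Endjin.Ip.Maturity.Matrix.Host is deployed):

```markdown
<!-- <your-function-host> is a placeholder for your deployed Function instance -->
![IMM](https://<your-function-host>/api/imm/github/<Org_Name>/<Repo>/total?cache=false)
```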

IMM Measures

To display a badge for each measure, you need to call api/imm/github/<Org_Name>/<Repo>/rule/<RuleId>?nocache=true

Here are examples for the AIS.NET Project:

Shared Engineering Standards

Coding Standards

Executable Specifications

Code Coverage

Benchmarks

Reference Documentation

Design & Implementation Documentation

How-to Documentation

Date of Last IP Review

Framework Version

Associated Work Items

Source Code Availability

License

Production Use

Packaging

Licenses

The IP Maturity Matrix is available under the Apache 2.0 open source license.

For any licensing questions, please email licensing@endjin.com

Project Sponsor

This project is sponsored by endjin, a UK-based Microsoft Gold Partner for Cloud Platform, Data Platform, Data Analytics, DevOps, and a Power BI Partner.

For more information about our products and services, or for commercial support of this project, please contact us.

We produce two free weekly newsletters; Azure Weekly for all things about the Microsoft Azure Platform, and Power BI Weekly.

Keep up with everything that's going on at endjin via our blog, or follow us on Twitter or LinkedIn.

Our other Open Source projects can be found at https://endjin.com/open-source

Code of conduct

This project has adopted a code of conduct adapted from the Contributor Covenant to clarify expected behavior in our community. This code of conduct has been adopted by many other projects. For more information see the Code of Conduct FAQ or contact hello@endjin.com with any additional questions or comments.
