
metrics's Introduction

CHAOSS Metrics

Welcome to the CHAOSS Metrics repository. CHAOSS Metrics work makes metrics for open source community health more transparent and actionable.

Current CHAOSS Metrics Release

The current release of CHAOSS metrics can be found at: https://chaoss.community/metrics/.

Working on CHAOSS Metrics

The current release of CHAOSS metrics represents the published work of the CHAOSS Metrics Working Groups. The work in the WGs does not necessarily coincide with the formally released metrics, as the WGs are developing new, yet-to-be-released metrics.

Work being done in the CHAOSS WGs can be found at:

CHAOSS Common Metrics Working Group

CHAOSS Diversity & Inclusion Working Group

CHAOSS Evolution Working Group

CHAOSS Value Working Group

CHAOSS Risk Working Group

Metrics Resources

Metrics Template

Contribute and Participate in the CHAOSS Project and Metrics Working Groups

Contribute and Participate.

Repository Maintainers

How to become a maintainer

License

All contributions to implementation-agnostic metrics and standards, including associated scripts, SQL statements, and documentation, will be received and made available under the MIT License (https://opensource.org/licenses/MIT).

metrics's People

Contributors

a-hodges, amunz, arnetillmann, arnom-ms, damienlegay, datagoggles, downeymj, elizabethn, geekygirldawn, georglink, germonprez, harshalmittal4, illuminatian, jeremiah, jgbarah, jlrifer, klumb, lawrencehecht, lucasgonze, marierere, nebrethar, pmonks, polaris000, ritik-malik, satyam-bhalla, sgoggins, tobie, yash-yp, zxiiro


metrics's Issues

Describe path to maintainership

We should describe a path to maintainership for the metrics repository and Metrics Committee.

This feeds into succession planning and inclusion efforts.

New Metrics: Support of source code related metrics

After the idea of adding support to GrimoireLab for producing source code related metrics using Graal was introduced in chaoss/grimoirelab#182, with a supporting discussion thread chaoss/grimoirelab#182 (comment) and a response chaoss/grimoirelab#182 (comment), I had some thoughts to get the discussion started here in order to understand the current state and future scope of metric additions.


Some of the ideas aligned were:

  1. License & Copyright analysis related

    • All licenses: List of licenses present in the repository.
    • License Count: Number of licenses found in a software repository.
    • Package License Declaration: A list of license declarations on the software
      package.
    • Copyright Declaration: The degree to which the project properly declares
      copyright details.
    • License Coverage: Number of files with a file notice (copyright notice + license
      notice) and ones without a file notice.
  2. Code complexity analysis related (cyclomatic complexity, LOC)

    • LOC: Total Lines of Code
    • CCN: Cyclomatic Code Complexity of a software repository.
    • Complex Files: A list of files with higher complexity in terms of Cyclomatic Code
      Complexity (CCN) & Lines of Code (LOC).
    • Functions: Number of functions
    • Comments: Total comment lines present in the repository.
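
The counting metrics in item 2 can be sketched with a naive line classifier. This is a rough illustration only: real tools such as Graal use language-aware parsers, and the single-prefix comment heuristic below is an assumption that misses block and inline comments.

```python
def count_lines(source: str, comment_prefix: str = "#") -> dict:
    """Classify lines of a source file as code, blank, or comment.

    Illustrative sketch only; a real implementation needs
    language-aware parsing (block comments, strings, etc.).
    """
    total = blank = comments = 0
    for line in source.splitlines():
        total += 1
        stripped = line.strip()
        if not stripped:
            blank += 1
        elif stripped.startswith(comment_prefix):
            comments += 1
    # LOC here means non-blank lines; comment lines are counted separately.
    return {"total": total, "loc": total - blank, "comments": comments}

sample = """# add two numbers
def add(a, b):
    # return the sum
    return a + b
"""
print(count_lines(sample))  # {'total': 4, 'loc': 4, 'comments': 2}
```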

Would like to know if this is the right place for the discussion.
I'd be more than happy to join a call for the discussion :)
Would also like to have some pointers from @jgbarah @valeriocos @aswanipranjal

Thanks

We need a standard naming convention for our individual and aggregate metrics

I am going through the metrics markdown files and finding a lot of repetition. I would like to push a fix that will allow us to get these pages up to date and easier to work with.

Besides some basic editing, I think we need to agree on a standard naming convention for these files. I propose:
"the name of the resource being measured"-"more_description_if_necessary"-"the_measurement_if_necessary".md

Ex:
issues-open-average_time_distribution.md

This example metric describes the average time distribution for open issues.

By doing this we can easily sort and identify related metrics.
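
A small sketch of what the proposed convention could look like as a filename check; the regular expression below is my own guess at formalizing the proposal (lowercase words, hyphens between parts, underscores within a part), not an agreed rule.

```python
import re

# Hypothetical pattern for:
#   "resource"-"more_description_if_necessary"-"measurement_if_necessary".md
PATTERN = re.compile(r"^[a-z]+(_[a-z]+)*(-[a-z]+(_[a-z]+)*){0,2}\.md$")

def matches_convention(filename: str) -> bool:
    """Return True if a metric filename follows the proposed convention."""
    return bool(PATTERN.match(filename))

print(matches_convention("issues-open-average_time_distribution.md"))  # True
print(matches_convention("Issues Open.md"))                            # False

# Sorting then groups related metrics by their resource prefix:
files = ["issues-open-average_time_distribution.md",
         "commits-count.md",
         "issues-closed.md"]
print(sorted(files))  # commits first, then the two issues-* files together
```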

Thoughts?

October 21 metrics release PDF can't be searched

Platform: Mac (Monterey)

  1. Get PDF of the metrics release (https://chaoss-workspace.slack.com/files/U020YDNUXL4/F02JHLNCGLB/english-release-2021-10-21.pdf)
  2. Open it either in Preview or in the native PDF reader in Firefox
  3. Do a search within the file. In Preview that is Command+F
  4. Search for a string you are certain is in the file, such as "Metrics."

Result: no items are found

Why this matters: one of the use cases for this combined file is to simplify looking up metrics

Infrastructure for releasing CHAOSS metrics

The proposed infrastructure for releasing CHAOSS metrics has two components:

  1. Index page of release metrics
  • https://chaoss.community/metrics
  • manually curated markdown file in chaoss/website
  • Templated through WordPress
  • list of: metric name, link to specification on WG GHpage
  • Note at the beginning that our aim is to routinely publish metrics via working groups and that implementation of those metrics in tooling being developed will be an important consideration for what is published and worked on. We want to deliver not only metrics definitions but ways that people can examine those metrics.
  2. Templated specification pages in WG repos
  • Each WG repo has same template, GHpages
  • Subdomain, e.g. https://gmd.chaoss.community/
  • In addition to released metrics pages, WGs can publish additional pages by defining what gets published through a settings file

To-Do's:

  • AI Georg, Tobie: develop template for repo pages
  • AI CHAOSS community: determine sub-domain names for WGs (e.g., gmd.chaoss.community, di.chaoss.community, common.chaoss.community)
  • AI WGs: determine metrics for release
  • AI Kevin: change https://chaoss.community/metrics

Linter Metrics

I wanted to share the idea of a metric based on a code linter.

There's a tool on GitHub (with an article) that provides an XML output on code quality based on the linter of a C/C++ program.

I haven't used it yet; I am just providing it as an example.

To release or not to release?

Posted to mailing list:

  • What does version mean for CHAOSS metrics?

    • What should people understand a version to be?
    • Does it mean a metric will not change? - Probably not, because the metrics definition is a living document.
    • Does it mean that we want implementors to focus on the released version, rather than the most current version? - Probably not, because implementors should implement the latest changes and we update the definition to reflect any challenges (which would not be documented and fixed in the released version).
  • Issues with version:

    • people accidentally implement an outdated version without realizing it
    • can cause a distance between our definitions and implementations
    • can cause confusion for users when metrics definitions change between versions (what version am I on and what does this mean?)
    • Software versions are becoming less important due to frequent updates, and our metrics should reflect this new reality.
  • benefit of versions

    • Gives us a reason to release an announcement (public relations)

Participation vs. Inclusion

Just thinking about Participation vs. Inclusion in the D&P metric. I go back and forth on this a bit. I'm starting to think that Inclusion is the better word, as it is more applied/used in practice.

Remove Documents Folder + Content

In an effort to reduce the repo complexity, I think we should keep random notes off the repo. Maybe wait for the notes to be transcribed and then remove them.

New Issue - Test

I'm testing to see if I can submit a new issue. Nothing is wrong.

[BUG REPORT TEMPLATE]

Steps to Reproduce:

  1. ...step 1 description...
  2. ...step 2 description...
  3. ...step 3 description...

Expected Result:

...description of what you expected to see...

Actual Result:

...what actually happened, including full exception stack traces, log entries, screen shots etc. where appropriate...

Environment:

...your OS & version, browser, etc. etc....

[ENHANCEMENT REQUEST TEMPLATE]

Description of Problem:

...what problem are you trying to solve that the project doesn't currently solve?

...please resist the temptation to describe your request in terms of a solution. Job Story form ("When [triggering condition], I want to [motivation/goal], so I can [outcome].") can help ensure you're expressing the problem, rather than potential solutions.

Dependency Metric

An interesting note: this is the second time that we ran a session where we asked people to consider metrics (Diversity, Risk, Lifecycle, and Dependency) and Dependency was not attended by anyone.

Rename Second Level metrics folder

Rename metrics/metrics
to
metrics/activitymetrics

Also, move the activity-metrics.md into that folder. Overall, I think we should reduce the noise on that front page.

TODO: Metrics-related Notes that need to find a home

TODO

Please help find a home for each of the following:

  • Appeared in Event Diversity - Attendees Demographics:

    • if you ask before and after the event or collect data at different points in time, the data may not be able to work together. For example, asking for perception and “objective” numbers should stay together.

    • Stipulative terminology section in surveys can help with consistent answers across respondents. [Survey best practices]

    • Note about survey design: We can use the emoji-scale:
 ◯ 😔 ◯ 😟 ◯ 😐 ◯ 😊 ◯ 🤩


Please feel free to add/check items as needed!

New metric under Growth-Maturity-Decline

All,

Before I do another pull request, I thought I'd discuss this to get a consensus. Under the "code development" section of G-M-D, I'm thinking of another metric for the number of code reviews over a time period. In OPNFV Gerrit, I wanted to know, for each contributor, how many reviews they have done (and even how many (sub)projects they contributed reviews to) over a time period, and realized that this was not part of our dashboard. After talking to Dani at Bitergia, I created a feature request in the GrimoireLab repo at chaoss/grimoirelab#58

Any thoughts/feedback on this?

Cross check metrics with workgroups

We now have a new README. The next step is to add a new column that indicates in which workgroup the metric is advanced. New workgroup metrics may have to be added to this list.

Prototype:

Metric | Description | Workgroup
       |             | D&I
       |             | GMD
       |             | D&I; GMD

Deprecating the Metrics Repository

During the last board meeting and monthly CHAOSS meeting, we decided to deprecate the metrics repository. There is good information in here that needs to be retained but I 100% agree that the metrics repository is not the place for maintaining canonical metrics lists, reflecting the work going on in D&I and GMD.

So, I propose:

  1. https://chaoss.community/metrics/ gets updated to directly point to the work group lists (i.e. focus areas)
  2. The metrics repository simply becomes the Activity Metrics List - this will allow us to capture new metrics without creating unnecessary noise in the work groups.
  3. We only keep the metric name and description -- the detail behind (known implementations; visuals) is simply too difficult to maintain and keep updated.

New Metric? User Experience

In the session on 9/27, one metric that attracted a lot of interest was Contributor Experience. This included things like:

  • Issue Resolution Bot: a bot that queries those on the issue/PR, asking how satisfied they were with the resolution of the issue.
  • Surveying the community on experience
  • Focus groups with the community

New metric: corporate vs community driven

If a project is driven by a broad community, this is visible in the steady, slow, continuous growth of SLOC and a high number of contributors and commits (even ones just fixing typos in the documentation). If a project is driven by corporate entities, this is visible in ladder-style or hiccup growth: development happens within the company, and at release time there is a sudden, significant contribution in one shot. In addition, a corporate-driven project has a smaller number of committers, all employees or contracted developers. This metric is not stating which kind of project is better or worse, just attaching a "corporate driven" or "community driven" label to a project.

As a comparison, look at Webkit vs Mozilla Firefox. Webkit is driven by Apple and Google mainly for their Safari and Chrome browsers while Firefox is a community effort.

Diversity Pipeline

Idea for a new metric: Diversity Pipeline

While attending the Open Source Summit Europe - diversity track, I scribbled down in my notes: "Diversity Pipeline - of organizations with multiple community members, what is the diversity of the organization's representation within the community?"

Today, I interpret this to show how well an organization is doing in using the diversity it has internally to represent itself in open source communities. The idea is that if an organization has a gender diversity of 20% and is represented in an open source community by 5 men and 5 women, then its Diversity Pipeline metric is 50%/20% = 250%!

From a community perspective, this metric only applies to organizations that have multiple employees working on the same open source project. From an organization perspective, all employees engaged in open source could be considered and the Diversity Pipeline metric is not specific to an individual community.
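
The arithmetic in the example above can be written down as a small helper. The formula (community representation diversity divided by internal diversity) is my reading of the proposal, not an agreed definition.

```python
def diversity_pipeline(org_diversity: float, community_diversity: float) -> float:
    """Hypothetical Diversity Pipeline metric, as a percentage.

    org_diversity: the organization's internal diversity as a fraction
                   (e.g. 0.20 for 20% gender diversity).
    community_diversity: the diversity of that organization's
                   representation in the community, as a fraction.
    """
    return community_diversity / org_diversity * 100

# Example from the issue: 20% internal diversity; 5 of the 10 community
# members the organization sends are women, i.e. 50% representation.
print(diversity_pipeline(0.20, 5 / 10))  # 250.0
```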

Thoughts?

Maintainability: Clear Code and Clean Code

I'm not sure if this could be D&I or Risk or Value (or all of the above).

Maintainability:

If a stranger can modify your code and fix a bug in less than an hour, it is maintainable.

Test it. Ask someone who is outside of the project to take a look at your code and tell you how clear it is. Not how beautiful your classes and code constructs are—that’s what makes it clean. Instead, ask someone to fix a bug in just 30 minutes and see how they react. You will realize how clear the code is and whether it speaks the language a stranger can understand.

Full reference: https://www.yegor256.com/2018/09/12/clear-code.html

Idea: Metric quality checklist

The latest additions to the template, like #192, #191, and #189, are all requirements or suggestions.

Idea: Pull together all requirements in a checklist that we go through before releasing a metric, including content quality checks and technical requirement checks.

This could be an issue template that WGs use for new/updated metrics and look like:


CHAOSS Metric Quality Checklist

Content Quality

  • Required headings are filled in, including Question.
  • Description provides context to metric
  • Objectives list sample uses for the metric and desired outcomes
  • DEI uses of the metric, if any, are included in Objectives
  • ...

Technical Requirements

  • Metric file follows naming convention
  • Images are included using markdown and relative links
  • Focus area entry created and links to the metric
  • ...

Process Requirements

  • Release issue for comments during review period
  • Entry to release notes
  • New/updated metric announced on mailing list
  • Issue was created to kick-off translation to other languages
  • ...

New to the team

I want to contribute to your GSoC projects. I have not been attending your weekly calls, but from now on I will be attending them.

Common Metrics repository README and other documents

Question: Is the common metrics work group going to continue to work out of the metrics repository or will it be getting its own repository?

Regardless of the answer, can we create a README.md for the group that I can link to from the website?

Additionally, it would be nice to have some documented processes for how to contribute.

I like the idea of keeping it in the metrics repository but how it is structured within the repository probably needs some discussion.

Consider an optional section on related metrics / dependencies

In the Value Working Group, we regularly refer to other metrics as related to a given metric.

At times, it's in a dependent way: I require number of forks in order to calculate popularity.

At other times, it's to offer a different framing: what an organization may consider popularity, an individual may consider a sign of skill demand.

These see-it-from-every-angle trains of thought can make the Description sections a bit lengthy. I wonder if this could be better served as a separate (optional) section?

cc @germonprez @ElizabethN @sgoggins @GeorgLink since we were talking about it.

Get Metrics Together

Is there a good naming convention to get all of the meta-metrics together? Something like:

METRIC.Risk.md
METRIC.Diversity.md

I hate to keep using the word METRIC but right now the metrics are sort of scattered on that front page.

require DCO for all new commits

This issue is to activate probot/dco (or a similar bot) to check that all commits have a sign-off. The CHAOSS Project Charter requires that all contributions are signed off.

For users of the git command line interface, a sign-off is accomplished with the -s flag as part of the commit command: git commit -s -m 'This is a commit message'

For users of the GitHub interface, a sign-off is accomplished by writing Signed-off-by: Your Name <[email protected]> into the commit comment field. This can be automated by using a browser plugin like scottrigby/dco-gh-ui
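
The core check such a bot performs can be sketched as a trailer match on the commit message. This is a minimal illustration assuming the standard "Signed-off-by: Name <email>" trailer format; real bots like probot/dco do more (e.g. comparing the trailer against the commit author).

```python
import re

# Match a Developer Certificate of Origin sign-off trailer on its own line.
SIGNOFF_RE = re.compile(r"^Signed-off-by: .+ <.+@.+>$", re.MULTILINE)

def has_signoff(commit_message: str) -> bool:
    """Return True if the commit message contains a sign-off trailer."""
    return SIGNOFF_RE.search(commit_message) is not None

msg = "Add metric template\n\nSigned-off-by: Jane Doe <jane@example.com>"
print(has_signoff(msg))          # True
print(has_signoff("Fix typo"))   # False
```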

New to the open source and your organisation

I am new to the fascinating open source community. I have not attended any of your meetings but wish to contribute to your community. I know little time is remaining, so kindly give me an overview and guidance.
Thank you.

Metric: Downloads

I know that we have talked about downloads and how difficult those numbers are to get and interpret.

Jeremy Katz wrote for Tidelift why Downloads is a terrible metric:
https://blog.tidelift.com/dont-believe-the-download-numbers-when-evaluating-open-source-projects

I am happy to report that I did not find "Downloads" in our 'activity-metrics' folder (against my intuition). This issue is simply to document that we have thought about this metric and have chosen not to include it (for now).

Template links to missing file

In metric-template.md, it reads:

The first few sentences should match the description in the [metrics list](../activity-metrics-list.md).

That file no longer exists. Is there a new location for that information or can we offer a short description of the goal in that file?
