
Gerrit metrics: details about review queues
Closed, Declined · Public

Description

In bug 58424 and bug 58426 we are extracting the median time to review contributions received in Gerrit. However, we are missing details about those review queues beyond the average data. For instance, dealing with most reviews in 3-5 days is not the same as dealing with some within a day while leaving the rest stuck for weeks.

Let's look at how the review queue looks in each repository, e.g. changes reviewed in <2 days, <1 week, <1 month, <3 months, >3 months... (values may be fine-tuned based on actual data).
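Just to make the proposal concrete (this is not part of korma, and the input format is an assumption): a minimal Python sketch of how changesets could be counted per review-time bucket, given per-changeset review durations already extracted from Gerrit. The thresholds are the ones proposed above.

```python
from collections import Counter
from datetime import timedelta

def bucket_review_times(durations):
    """Count changesets per review-time bucket: <2 days, <1 week, <1 month,
    <3 months, >=3 months. Thresholds could be fine-tuned on actual data."""
    buckets = [
        ("< 2 days", timedelta(days=2)),
        ("< 1 week", timedelta(weeks=1)),
        ("< 1 month", timedelta(days=30)),
        ("< 3 months", timedelta(days=90)),
    ]
    counts = Counter()
    for d in durations:
        for label, limit in buckets:
            if d < limit:
                counts[label] += 1
                break
        else:
            counts[">= 3 months"] += 1
    return counts

# Made-up example for one repository: durations from submission to first review.
sample = [timedelta(hours=20), timedelta(days=4), timedelta(days=45), timedelta(days=120)]
print(bucket_review_times(sample))
```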

It would be great if we could visualize whether there is any difference between contributions from staff versus volunteers, e.g. staff get quick reviews while volunteers don't.


Version: unspecified
Severity: enhancement

Details

Reference
bz58428

Event Timeline

bzimport raised the priority of this task to Lowest. Nov 22 2014, 2:19 AM
bzimport set Reference to bz58428.

Ok Quim, let's talk about it at the next meeting and plan its execution.

http://korma.wmflabs.org/browser/gerrit_review_queue.html

Once we agree on how to count the age of changesets (see Bug 37463 Comment 15), maybe this one is simply about calculating the age of the youngest and the oldest 25%, in addition to the median.

This would show us e.g. who is really fast with incoming traffic while being unable to beat the old stuff, and who really has a lot of very old changesets shamefully awaiting review.
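As an illustration only (not necessarily how korma computes it, and the input list is hypothetical): a minimal Python sketch of reporting the first and third quartiles alongside the median, given the ages in days of the changesets currently awaiting review.

```python
import statistics

def queue_age_summary(ages_in_days):
    """Return the median queue age plus the ages bounding the youngest and
    the oldest 25% of the queue (first and third quartiles)."""
    q1, median, q3 = statistics.quantiles(ages_in_days, n=4)
    return {
        "youngest_25%_under": q1,  # 25% of open changesets are younger than this
        "median": median,
        "oldest_25%_over": q3,     # 25% of open changesets are older than this
    }

# Made-up example: a repo that reviews new changes fast but carries an old backlog.
print(queue_age_summary([1, 2, 2, 3, 4, 90, 180, 400]))
```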

I don't think we need to apply this to the graph of organizations. We can probably add these two lines to "How long does it take to review code contributions?" and "review_time_days_median" in the list of repos.

After using http://korma.wmflabs.org/browser/gerrit_review_queue.html on a daily basis, I think it already offers the information we need to act -- and more. I'd rather focus on improving the data points we have and on what is really relevant. Resolving as WONTFIX.