[OpenAFS-devel] Gerrit reviews and the rate of acceptance

Jason Edgecombe jason@rampaginggeek.com
Wed, 11 Apr 2012 13:33:49 -0400


On 04/11/2012 01:00 PM, Jeffrey Altman wrote:
> This week I have received private mail from several individuals who
> are concerned about the rate of OpenAFS development.  There were several
> topics raised across the e-mails and I'm going to pick out themes and
> try to address them in separate mails to this list and openafs-info as
> appropriate.
>
> The first theme is that it takes too long for patches to be accepted
> into OpenAFS.  Cited as evidence of the overwhelming delays are the
> nearly 100 patchsets that have been sitting in Gerrit for more than two
> weeks, many of them for nearly two years.
>
> Many of these patchsets are ideas that were sent to Gerrit either so
> they wouldn't be lost or so others could use them as a starting point.
> Some of the patchsets are items that have been reviewed but have
> outstanding issues that were never addressed.  But the vast majority of
> the items sitting in Gerrit for an extended period of time are patches
> that have simply not received sufficient reviews to decide what to do
> with them.
>
> The fact is that the number of active reviewers is exceedingly small.
> Here are the statistics for the 4180 patches contributed to the 'master'
> branch since Gerrit was deployed.
>
>     3178     Reviewed-by: Derrick Brashear
>     1355     Reviewed-by: Jeffrey Altman
>      217     Reviewed-by: Simon Wilkinson
>      155     Reviewed-by: Russ Allbery
>      110     Reviewed-by: Andrew Deason
>       68     Reviewed-by: Marc Dionne
>       51     Reviewed-by: Rod Widdowson
>       49     Reviewed-by: Asanka Herath
>       43     Reviewed-by: Alistair Ferguson
>       24     Reviewed-by: Peter Scott
>       21     Reviewed-by: Matt Benjamin
>       15     Reviewed-by: Tom Keiser
>       12     Reviewed-by: Dan Hyde
>       11     Reviewed-by: Michael Meffie
>        9     Reviewed-by: Chaz Chandler
>        6     Reviewed-by: Hartmut Reuter
>        6     Reviewed-by: Benjamin Kaduk
>        5     Reviewed-by: Stefan Kueng
>        5     Reviewed-by: Garrett Wollman
>        3     Reviewed-by: Jacob Thebault-Spieker
>        2     Reviewed-by: Thomas L. Kula
>        2     Reviewed-by: Steve Simmons
>        2     Reviewed-by: Phillip Moore
>        2     Reviewed-by: Mickey Lane
>        2     Reviewed-by: Jonathan A. Kollasch
>        2     Reviewed-by: Jeffrey Hutzelman
>        2     Reviewed-by: Claudio Bisegni
>        2     Reviewed-by: Adam Megacz
>        2     Reviewed-by: Ken Dreyer
>        1     Reviewed-by: sanket
>        1     Reviewed-by: Tharidu Fernando
>        1     Reviewed-by: Steven Jenkins
>        1     Reviewed-by: Stephan Wiesand
>        1     Reviewed-by: Rainer Toebbicke
>        1     Reviewed-by: Jason Edgecombe
>        1     Reviewed-by: Christof Hanke
>        1     Reviewed-by: Chaskiel Grundman
>        1     Reviewed-by: Chas Williams
>        1     Reviewed-by: Alexander Ivan Redinger
>        1     Reviewed-by: Alex Chernyakhovsky
>
> All reviews are performed by volunteers.   It is nobody's day job to
> perform Gerrit reviews.  Obviously Derrick, Russ, and I, as the
> gatekeepers who review patches as part of the approval process, have
> performed the largest number of reviews.  A special note of thanks to
> Simon, Andrew, Marc and Rod for being at the top of the non-gatekeepers
> list.
>
> In the last year alone there have been 44 contributors whose work has
> been committed to openafs master.  Yet there are fewer than that number
> of individuals who have reviewed someone else's work since Gerrit was
> implemented.
>
> No one is being paid to review code.  Not even the gatekeepers.  The
> nearly 100 patchsets that are sitting in Gerrit represent approximately
> 2% of all of the patches that have been submitted.  The role of the
> gatekeepers in the review process is important but they should not be
> the sole review mechanism for OpenAFS submissions.
>
> OpenAFS needs a much more active reviewer community.
>
> If we were to impose a requirement that every patchset must receive a
> total vote of +4 or +5 before it could be submitted, would that help to
> obtain reviews?
>
> The floor is open for discussion.
>
> Jeffrey Altman

Are there agreed-upon criteria for review? If so, where are they 
documented? I ask this because I consider my C skills to be poorly developed.

What is the goal for patchset review? More eyeballs on fewer patches, or 
fewer eyeballs on more patches? In other words, where do we want to be 
on the spectrum of quality vs. quantity?

Other ideas:
* giving more people rights to approve a patchset (more gatekeepers?)
* automatically approving a patchset after the voting threshold is met 
(robo-gatekeeper?)
* different levels of votes with an increased voting threshold (e.g. 
gatekeepers = 3 votes, active contributors = 2 votes, everyone else = 1 
vote; 3 votes approves a patch? A rough sketch of how Gerrit might 
express the tiers is below, after these lists.)
* maybe we should keep a regularly-updated scoreboard and award brownie 
points to the most active reviewers.

crazier ideas:
* are there other non-openafs communities that we could tap into? If 
stackoverflow.com did code reviews, that would be awesome.
* could we tie in with some other shared reputation/reward system? (e.g. 
earning stackoverflow credits, slashdot karma, bitcoins, 1 free support 
incident from an openafs partner)
* rewards for marketing openafs to the outside world. This might draw in 
fresh blood.
* frequent reviewers earn more votes on gerrit, with some type of 
metamoderation to deter abuse.
* require a committer to review X patches, from a different person, for 
each patchset that they commit. (X could be 1-3, based on what the group 
decides.)
* each reviewer gets X votes per patch that may be used to vote for 
feature prioritization.
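
To make the tiered-votes idea above a bit more concrete, here is a rough 
sketch of how per-group voting ranges might be expressed in Gerrit's 
project.config (the access rules kept in refs/meta/config). This is only 
an illustration, not a tested configuration: the group names 
"Gatekeepers" and "Active Contributors" are invented for the example, and 
the exact syntax depends on which Gerrit version we run. Note also that 
Gerrit's built-in label handling looks at the maximum vote rather than a 
sum, so a true "total score" threshold would still need a custom submit 
rule on top of something like this:

    [access "refs/heads/*"]
        # Gatekeepers keep the full blocking/approving range.
        label-Code-Review = -2..+2 group Gatekeepers
        # Everyone else can vote +1/-1 but cannot approve or block.
        label-Code-Review = -1..+1 group Registered Users
        # Only gatekeepers may push the Submit button.
        submit = group Gatekeepers
        # A hypothetical middle tier, if we created such a group:
        # label-Code-Review = -1..+1 group Active Contributors

The point is only that Gerrit can already distinguish who may cast which 
votes; deciding which combination of votes should count as "approved" is 
the policy question on the table.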

Jason