openstack team mailing list archive
Message #04884
Re: Metrics around code
I think I missed the larger discussion around why these metrics are getting published.
tl;dr metrics good; these metrics, maybe not so good
My worry about publishing the number of commits by author is that it could make Gerrit, and our current review process, a mess. I agree that smaller commits are better, but they do not play well with the Gerrit workflow (in particular, if a commit that has dependencies within the same branch is rejected, the entire branch has to be redone). My other fear is pointless bickering (however good-natured it may be) about who committed what (we've already seen it: "looks strange to me to see Jenkins up there").
I don't know if something similar would happen with the bug database, but I would hate to see people touching bugs just to get counted. The problem with software development metrics is that programmers optimize processes for a living. When presented with a metric, they will often subconsciously optimize for it.
None of these ideas are original to me; I've heard this argument against software metrics many times before. The usual solution is to pick a metric that has a tangible meaning to your end users. Taken to the extreme, you end up where the Lean Startup movement is: pick something measurable (time to allocate an IP), make a hypothesis (your patch), push it to half your users, and observe the results (A/B testing).
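The measure/hypothesize/split loop above can be made concrete. A minimal sketch of deterministic user bucketing for an A/B test; the hashing scheme, the experiment name, and the 50/50 split are all illustrative assumptions, not anything OpenStack actually ran:

```python
import hashlib

def bucket(user_id: str, experiment: str = "ip-allocation-patch") -> str:
    """Deterministically assign a user to arm A or B of an experiment.

    Hashing the (experiment, user) pair means the same user always lands
    in the same arm, and different experiments split users independently.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"
```

With stable assignment like this, you can compare a user-facing metric (say, time to allocate an IP) between the two arms instead of counting commits.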
I really appreciate the desire to measure the project, I just don't think this is the best way to go about it.
thanks,
Aaron
On Oct 19, 2011, at 6:30 AM, Rohit Karajgi wrote:
> I agree with the HTML report, and it can be published on OpenStack Jenkins for world readability. We have some similar metrics for bugs. The following metrics can be easily pulled from Launchpad's bug database using the python-launchpadlib API.
>
> 1. Bug Distribution - By Status
> 2. Bug Distribution - By Importance
> 3. Bug Distribution - By Milestone
> 4. Bug Distribution - By Bug owners
> 5. Bug Distribution - By Fixed-by
> 6. # of times a file was modified
> 7. # of lines modified per file
>
> #6 and #7 specifically can be quite useful to identify those modules that can be good targets for unit tests.
>
> Cheers,
> Rohit
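Distributions #1 and #2 from Rohit's list reduce to tallying bug tasks by an attribute. A minimal sketch with python-launchpadlib; the application name and project are placeholders, and note that `searchTasks()` returns only open bug tasks by default, so a full by-status distribution would need the status list passed explicitly:

```python
from collections import Counter

def tally(tasks, attr):
    """Count bug tasks by one attribute, e.g. 'status' or 'importance'."""
    return Counter(getattr(task, attr) for task in tasks)

def bug_distribution(project_name="nova"):
    # Imported lazily so the counting helper above works without
    # launchpadlib installed; requires network access to Launchpad.
    from launchpadlib.launchpad import Launchpad

    lp = Launchpad.login_anonymously("bug-metrics-demo", "production")
    project = lp.projects[project_name]
    tasks = list(project.searchTasks())  # open bug tasks only, by default
    return tally(tasks, "status"), tally(tasks, "importance")
```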
>
> -----Original Message-----
> From: openstack-bounces+rohit.karajgi=vertex.co.in@xxxxxxxxxxxxxxxxxxx [mailto:openstack-bounces+rohit.karajgi=vertex.co.in@xxxxxxxxxxxxxxxxxxx] On Behalf Of Thierry Carrez
> Sent: Wednesday, October 19, 2011 3:04 PM
> To: openstack@xxxxxxxxxxxxxxxxxxx
> Subject: Re: [Openstack] Metrics around code
>
> Stefano Maffulli wrote:
>> You'll find there also the implementation details to answer the
>> question:
>>
>> Who committed to an OpenStack repo, and how many times, in the past 30
>> days?
>>
>> and a demo report built with Pentaho Reporting representing the total
>> number of commits per repository in the past 30 days:
>> http://wiki.openstack.org/CommunityMetrics/Code?action=AttachFile&do=get&target=2011-11-commits30daysallrepo-obfuscated.pdf
>> [note: the email addresses are hidden on purpose]
>
> Can an HTML report be produced and posted instead? It feels like that sort of information should be pullable rather than pushed, from a well-known website, and PDF adds an extra step to access it for no real value (is anybody going to print this?)
>
>> First of all: do the numbers seem correct to you? In other words, does
>> the SQL query seem correct? Does the demo report look interesting to
>> you? What/how would you change?
>
> I can't really answer that question, but it looks strange to me to see Jenkins up there (I bet he didn't author any patch).
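The per-author count Stefano describes, including filtering out automation accounts like the Jenkins one Thierry flags, can be approximated from a local clone with git itself rather than SQL. A sketch; the exclusion list is an illustrative assumption:

```python
import subprocess
from collections import Counter

def count_authors(author_emails, exclude=("jenkins",)):
    """Tally commits per author email, skipping automation accounts
    whose email contains any of the `exclude` substrings."""
    return Counter(
        email for email in author_emails
        if not any(bot in email.lower() for bot in exclude)
    )

def commits_last_30_days(repo_path="."):
    # One author email per commit in the last 30 days (%ae placeholder).
    out = subprocess.run(
        ["git", "log", "--since=30 days ago", "--format=%ae"],
        cwd=repo_path, capture_output=True, text=True, check=True,
    ).stdout
    return count_authors(out.splitlines())
```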
>
>> Then, I would like your feedback to refine the other questions we want
>> to see answered regularly, regarding code (we'll move on to bugs,
>> docs, etc later).
>>
>> Are the following reports interesting? Do we want to have them run
>> monthly or weekly?
>
> If the reports are not pushed, they can run more often. Maybe something like "last 30 days" (refreshed every week), and then a report per milestone (generated at the end of every milestone)? I think it would be good to know who committed code for a given milestone, rather than for a given arbitrary month.
>
>> Is this too much information or too little? What else would you like
>> to see regarding code?
>>
>> * Total number of commits across all repos aggregated per month
>> * Total number of commits per repository aggregated per month
>
> Maybe per-milestone would be more useful, though it's a bit more difficult to do (especially since not all projects follow the common milestone plan).
>
> --
> Thierry Carrez (ttx)
> Release Manager, OpenStack
>
> _______________________________________________
> Mailing list: https://launchpad.net/~openstack
> Post to : openstack@xxxxxxxxxxxxxxxxxxx
> Unsubscribe : https://launchpad.net/~openstack
> More help : https://help.launchpad.net/ListHelp
>