
[Launchpad-users] Suggestion for large scale bug report management



Problem:

Large open source projects are drowning in bug reports.
Developers have a hard time getting an overview of all the bugs and getting them all prioritised. This is not true of all projects, but it is true for some, and the problem usually grows with the size of the project.

In a perfect world, the developers would have a global/total ordering of all bugs, ranking them from those that should be fixed first to those that should be fixed last. In that world, all a developer would have to do when fixing bugs is start working on the first bug in the queue, knowing that his time will be best spent fixing exactly that bug. How can we get closer to this scenario?

As it is now, a privileged group of people takes care of reviewing and prioritising bugs. That is all well and good, except for the fact that they can't keep up with the flood of incoming bug reports. This is evidenced by the number of bugs with the status 'New' in e.g. Launchpad. This leaves most of the bugs languishing in darkness...

I suggest using crowd sourcing to mitigate this problem. The crowd is often interested in seeing bugs fixed (else they wouldn't report them), and this eagerness can be harvested for the benefit of the developer :)
Now, a single person from the crowd may not be as good at bug triaging as experienced bug triagers are. Fortunately, as long as the crowd member has at least a slight idea of what he is talking about, the law of large numbers will ensure that the crowd-sourced triaging is trustworthy as well. Maybe even more trustworthy (it's the same debate as the trustworthiness of e.g. Wikipedia vs. e.g. Encyclopedia Britannica).

Great, so we want to source the crowd, but how? We want an ordering, but to order, we need criteria. I suggest the following criteria, but others could be used:

1. The negative impact of the bug on the usefulness of the software in question
2. The amount of work needed to fix the bug
3. The amount of expertise needed to be able to fix the bug
(The developer will be free to ignore one or more criteria if he doesn't think them relevant)

The total ordering will then be created by balancing these criteria against each other. E.g. for a given bug, each criterion can be given a number between 0 and 100, and the total score would then be the sum of the three numbers, or their product, or a weighted product.
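
As a rough sketch of the arithmetic (in Python, with function name and weights chosen purely for illustration; a product or weighted product would be coded the same way):

    # Minimal sketch: combine the three criteria (each rated 0-100)
    # into one aggregate score. The weights are illustrative only.
    def total_score(impact, effort, expertise, weights=(1.0, 1.0, 1.0)):
        w_impact, w_effort, w_expertise = weights
        return (w_impact * impact
                + w_effort * effort
                + w_expertise * expertise)

    # Example: a high-impact bug needing moderate work and little expertise.
    print(total_score(impact=90, effort=40, expertise=20))  # 150.0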

How could this be implemented in a bug tracker? Every bug should be equipped with options to register one's suggestion for how it should be ranked on those three criteria. Also, for each ranking, one should be able to state how confident one feels about that ranking.
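
To give an idea of what one such crowd rating could look like as data, and of how the stated confidence could be used to weight the aggregation, here is a small sketch (all names are hypothetical, not an existing bug tracker API):

    from dataclasses import dataclass

    @dataclass
    class CrowdRating:
        bug_id: int
        impact: int        # criterion 1, rated 0-100
        effort: int        # criterion 2, rated 0-100
        expertise: int     # criterion 3, rated 0-100
        confidence: float  # 0.0-1.0, how sure the rater feels

    def weighted_impact(ratings):
        # Confidence-weighted average of the impact criterion.
        total = sum(r.confidence for r in ratings)
        if total == 0:
            return None
        return sum(r.impact * r.confidence for r in ratings) / total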

The crowd-sourced data should automatically become available to the developers. If a developer likes the data, he should be able to opt in to automatic triaging based on it. As an example, he could say that when more than X people have rated a bug at level Y (where Y is the aforementioned aggregated ranking), it should automatically change status from 'New' to 'Confirmed' and be given the corresponding importance (e.g. based on criterion no. 1).
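
A sketch of what such an opt-in rule could look like, with X and Y as thresholds chosen by the developer (the dictionary fields and the mapping from impact to importance are only assumptions):

    def auto_triage(bug, min_raters, min_score):
        # Promote a bug from 'New' to 'Confirmed' once more than
        # min_raters people have scored it at min_score or above.
        scores = bug["scores"]    # aggregated score per rater (0-100)
        impacts = bug["impacts"]  # criterion 1 ratings per rater
        if bug["status"] == "New" and sum(s >= min_score for s in scores) > min_raters:
            bug["status"] = "Confirmed"
            # importance derived from criterion 1 (impact)
            avg_impact = sum(impacts) / len(impacts)
            bug["importance"] = "High" if avg_impact >= 75 else "Medium"
        return bug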

The crowd should also be able to see the crowd-sourced data, so they can see to what extent the developers embrace it. If the developers of a given project don't use it, the crowd can spend their time better rating bugs in other projects.


Kind regards,
Jon Loldrup
Denmark

