
dolfin team mailing list archive

Re: Demos, documentation, tests and coverage

 

On Fri, Feb 04, 2011 at 01:47:00PM +0100, Marie E. Rognes wrote:
>
> DOLFIN currently has 7 carefully documented demos and 67 undocumented
> demos.
>
> In theory, we could write careful documentation for the remaining 67
> demos, and continuously maintain and update the documentation for all
> 74 demos. Somehow, I do not see this happening. However, I would like
> to see the demo documentation more complete.
>
> So, the plan: evaluate the list of demos with an eye to the following:
>
> * What functionality does this demo demonstrate?
> * Are there other demos that demonstrate the same functionality?
> * Does this demo demonstrate an important point?
> * Does this demo illustrate the "best practice" for implementing a
>   feature?
> * Can we scratch this demo?
>
> Example:
>
>   We have (at least) 4 Stokes demos: stokes-iterative, stokes-mini,
>   stokes-stabilized, stokes-taylor-hood. All of these illustrate the
>   use of mixed elements, boundary conditions on mixed elements, and
>   the use of mesh functions for marking sub-domains. Of these, I
>   would scratch all except stokes-iterative.

I think we should be allowed to keep a collection of demos that are
not documented. If someone is interested in Taylor-Hood, or how to
implement a stabilized solver for Stokes, they can look at those
demos. They may not be documented on the web, but most of the demos
are self-explanatory and can be understood without reading a text.
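For reference, the shared machinery those Stokes demos exercise (a
mixed Taylor-Hood space, a boundary condition on a subspace, a mesh
function for marking sub-domains) is only a handful of lines in the
Python interface. Roughly, and only as an illustrative sketch, since
the exact names vary between DOLFIN versions:

  from dolfin import *

  # Taylor-Hood mixed space: P2 velocity, P1 pressure
  mesh = UnitSquareMesh(16, 16)
  P2 = VectorElement("Lagrange", mesh.ufl_cell(), 2)
  P1 = FiniteElement("Lagrange", mesh.ufl_cell(), 1)
  W = FunctionSpace(mesh, P2 * P1)

  # Mark the boundary facets with a mesh function (illustrative sub-domain)
  boundaries = MeshFunction("size_t", mesh, mesh.topology().dim() - 1, 0)
  class Walls(SubDomain):
      def inside(self, x, on_boundary):
          return on_boundary
  Walls().mark(boundaries, 1)

  # No-slip condition on the velocity subspace of the mixed space
  noslip = DirichletBC(W.sub(0), Constant((0.0, 0.0)), boundaries, 1)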

My suggestion would be to

1. Identify which demos are worth documenting
2. Identify which demos can be scratched
3. Leave the rest under "undocumented" as a resource pool for users

--
Anders


> After that, I would suggest removing a large number of demos and
> adding careful documentation for those remaining.
>
> On the other hand, the current demo collection serves a very useful
> testing purpose. If we scratch a bunch of demos, more functionality
> will be left untested. One solution would be to replace the scratched
> demos by more unit tests.
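
A unit test covering the same machinery as a scratched demo can be
quite small. As a hypothetical sketch (made-up test names, plain
unittest, exact DOLFIN syntax depending on the version):

  import unittest
  from dolfin import *

  class MixedSpaceTest(unittest.TestCase):
      "Hypothetical unit test replacing a scratched mixed-element demo."

      def test_taylor_hood_subspaces(self):
          mesh = UnitSquareMesh(8, 8)
          P2 = VectorElement("Lagrange", mesh.ufl_cell(), 2)
          P1 = FiniteElement("Lagrange", mesh.ufl_cell(), 1)
          W = FunctionSpace(mesh, P2 * P1)
          # The mixed space should expose both component spaces
          self.assertEqual(W.num_sub_spaces(), 2)

  if __name__ == "__main__":
      unittest.main()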
>
> However, in order to know which functionality the demos actually test
> and which tests are missing, it would be good to have some kind of
> code coverage report. We had that a while back. (I seem to remember
> that the code coverage was pretty bleak..., so maybe that is why it
> was removed... ;-)) Could we put that back in with one of the
> buildbots?
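
One way to do that on a buildbot, assuming the Python test suite and
the coverage.py package (the source and test paths below are made up):

  import coverage
  import unittest

  # Measure coverage of the dolfin Python package while the unit tests run.
  cov = coverage.Coverage(source=["dolfin"])
  cov.start()

  suite = unittest.defaultTestLoader.discover("test/unit")  # hypothetical path
  unittest.TextTestRunner().run(suite)

  cov.stop()
  cov.save()
  cov.report()                                 # plain-text summary
  cov.html_report(directory="coverage-html")   # browsable report for the buildbot

The C++ library would need something like gcov/lcov instead, but the
idea is the same.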
>
> Also, note that I'm volunteering for the above tasks (except code
> coverage reports) ... :-)
>
