Demos, documentation, tests and coverage

DOLFIN currently has 7 carefully documented demos and 67 undocumented
demos.

In theory, we could write careful documentation for the remaining 67
demos and continuously maintain and update the documentation for all
74 demos. Somehow, I do not see this happening. However, I would like
the demo documentation to become more complete.

So, here is a plan: evaluate the list of demos with an eye to the
following questions:

* What functionality does this demo demonstrate?
* Are there other demos that demonstrate the same functionality?
* Does this demo demonstrate an important point?
* Does this demo illustrate the "best practice" for implementing a
  feature?
* Can we scratch this demo?

Example:

  We have (at least) 4 Stokes demos: stokes-iterative, stokes-mini,
  stokes-stabilized and stokes-taylor-hood. All of these illustrate
  the use of mixed elements, boundary conditions on mixed elements,
  and the use of mesh functions for marking sub-domains. Of these, I
  would scratch all except stokes-iterative.
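
To make the overlap concrete, here is a minimal sketch of the pattern
the four demos share, written against the legacy DOLFIN Python
interface. The mesh size, marker value and element degrees below are
illustrative, not taken from the actual demos:

  from dolfin import *

  # Mesh and a mesh function marking boundary sub-domains
  mesh = UnitSquareMesh(16, 16)
  sub_domains = MeshFunction("size_t", mesh, mesh.topology().dim() - 1)
  sub_domains.set_all(0)

  # Taylor-Hood mixed space: P2 velocity, P1 pressure
  V = VectorFunctionSpace(mesh, "Lagrange", 2)
  Q = FunctionSpace(mesh, "Lagrange", 1)
  W = V * Q

  (u, p) = TrialFunctions(W)
  (v, q) = TestFunctions(W)
  f = Constant((0.0, 0.0))

  # Stokes variational forms
  a = (inner(grad(u), grad(v)) - div(v)*p + q*div(u))*dx
  L = inner(f, v)*dx

  # Boundary condition on the velocity sub-space via the marker function
  bc = DirichletBC(W.sub(0), Constant((0.0, 0.0)), sub_domains, 0)

  A, b = assemble_system(a, L, bc)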

After that, I would suggest removing a large number of demos and
adding careful documentation for those that remain.

On the other hand, the current demo collection serves a very useful
testing purpose. If we scratch a bunch of demos, more functionality
will be left untested. One solution would be to replace the scratched
demos by more unit tests.
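
For instance, the mixed-element functionality exercised by the
scratched Stokes demos could be pinned down in a unit test along the
following lines. This is only a rough sketch using the standard
unittest module; the class name, mesh size and assertions are
invented for illustration:

  import unittest
  from dolfin import *

  class MixedElementTest(unittest.TestCase):
      """Sketch: mixed elements, boundary conditions on a mixed
      sub-space, and assembly over a mixed space."""

      def test_assemble_stokes_system(self):
          mesh = UnitSquareMesh(8, 8)
          V = VectorFunctionSpace(mesh, "Lagrange", 2)
          Q = FunctionSpace(mesh, "Lagrange", 1)
          W = V * Q

          (u, p) = TrialFunctions(W)
          (v, q) = TestFunctions(W)
          a = (inner(grad(u), grad(v)) - div(v)*p + q*div(u))*dx
          L = inner(Constant((0.0, 0.0)), v)*dx

          # Dirichlet condition on the velocity sub-space
          bc = DirichletBC(W.sub(0), Constant((0.0, 0.0)),
                           DomainBoundary())

          A, b = assemble_system(a, L, bc)

          # The assembled system should match the mixed-space dimension
          self.assertEqual(A.size(0), W.dim())
          self.assertEqual(A.size(1), W.dim())
          self.assertEqual(b.size(), W.dim())

  if __name__ == "__main__":
      unittest.main()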

However, in order to know which functionality the demos actually
test and which tests are missing, it would be good to have some kind
of code coverage report. We had that a while back. (I seem to
remember that the code coverage was pretty bleak..., so maybe that is
why it was removed... ;-)) Could we put that back in with one of the
buildbots?
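
For the Python side, something along these lines could be run by a
buildbot using the third-party coverage.py package to produce an HTML
report. This is just a sketch: the test directory and report location
are examples, and it would only cover the Python layer, not the C++
library (that would need gcov or similar):

  import unittest
  import coverage  # third-party coverage.py package

  # Measure coverage of the dolfin Python module while the unit
  # tests run, then write an HTML report the buildbot can publish.
  cov = coverage.Coverage(source=["dolfin"])
  cov.start()

  suite = unittest.defaultTestLoader.discover("test/unit/python")
  unittest.TextTestRunner(verbosity=2).run(suite)

  cov.stop()
  cov.save()
  cov.html_report(directory="coverage-html")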

Also, note that I'm volunteering for the above tasks (except code
coverage reports) ... :-)

--
Marie
