
fenics team mailing list archive

Documentation; suggestions and questions from SUNY Stony Brook.

 

Hello all,
    First, I wanted to say I was elated to find the FEniCS project at
such an advanced stage, and to say thanks for all your labors.
Despite the sensation that I'm automating myself out of a job, it's
been fun trying to incorporate dolfin into my abstractions.
I'm sure many of my suggestions are already in the works and my
questions just need a pointer to certain directories, but hopefully
the struggles I've been having are typical of new users and can be
constructive.

1.  It would be nice to have a class index of the assorted objects
that one can navigate in a browser; an example would be something
like CERN's ROOT class index:

http://root.cern.ch/root/html/ClassIndex.html

though an improvement on their web page could be made by also linking
to tutorials that illustrate the use of the objects.  I suspect this
is in the works, so I was hoping for access to unofficial
documentation while the official documentation is being rendered.

2.  Despite being decently trained in functional methods, I still
seem to struggle, in particular with boundary conditions or when time
is involved, to see how the variational problems of the tutorials
relate back to the original PDE being solved.  It would be nice if a
few lines were devoted to a LaTeXed equation stating abstractly the
variational problem being solved, if not a whole derivation of such
an equation from the PDE/ODE.

For example, let me state some confusion I have with the
advection-diffusion example.

I would expect a term something like 0.5*k*v*div(velocity)*u to appear
as well in
a = v*u*dx + 0.5*k*(v*dot(velocity, grad(u))*dx + c*dot(grad(v), grad(u))*dx)
L = v*u0*dx - 0.5*k*(v*dot(velocity, grad(u0))*dx + c*dot(grad(v), grad(u0))*dx) + k*v*f*dx

If you could pardon some LaTeX, is the underlying PDE not (writing
\Delta for the Laplacian, i.e. the divergence of the gradient)

\partial_t u = c \, \Delta u - \nabla \cdot (velocity \, u) + f

?  And what would change f from staying zero in the demo?
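
To check my own reading (this is only my guess at the weak form, not
something taken from the demo's documentation): those two lines look
like a Crank-Nicolson discretization, with time step k, of the
non-conservative equation

\partial_t u + velocity \cdot \nabla u = c \, \Delta u + f ,

namely

\int v u^{n+1} dx
    + (k/2) ( \int v (velocity \cdot \nabla u^{n+1}) dx
            + c \int \nabla v \cdot \nabla u^{n+1} dx )
  = \int v u^{n} dx
    - (k/2) ( \int v (velocity \cdot \nabla u^{n}) dx
            + c \int \nabla v \cdot \nabla u^{n} dx )
    + k \int v f dx .

This agrees with the conservative form I wrote above only when
\nabla \cdot velocity = 0, which would explain the absent
div(velocity) term.  Is that the intended assumption?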

# Set up boundary condition
g  = Constant(1.0)
bc = DirichletBC(Q, g, sub_domains, 1)

Is this setting u = 1 at time zero?
Is it implied then that homogeneous Neumann conditions are in place
on the spatial boundaries?

I'm trying to develop an example in which the diffusion, c, is a
spatially dependent tensor, but am having trouble finding the right
objects to do so.  How would one implement a term like dot(grad(v),
c*grad(u))*dx, where c would be a matrix expression of sorts that
could, say, rotate and shrink grad(u) before taking the dot product
with grad(v)?  Any suggestions?
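
Here is a rough sketch of the kind of thing I am imagining (purely my
guess at the API, so the names and the Expression/as_matrix usage may
well be wrong, and newer DOLFIN versions may want a degree or element
argument for Expression):

from dolfin import *

mesh = UnitSquare(32, 32)   # UnitSquareMesh in newer DOLFIN releases
Q = FunctionSpace(mesh, "CG", 1)

u = TrialFunction(Q)
v = TestFunction(Q)

# Four scalar Expressions forming a spatially varying 2x2 coefficient;
# any matrix that rescales or rotates grad(u) could be built this way.
c00 = Expression("1.0 + x[0]")
c01 = Expression("0.1*x[1]")
c10 = Expression("0.1*x[1]")
c11 = Expression("0.5")

# Collect the entries into a UFL matrix
C = as_matrix([[c00, c01],
               [c10, c11]])

# Diffusion term with the matrix-valued coefficient
a_diff = dot(grad(v), C*grad(u))*dx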

Thank you very much for existing and for your attention,
   Nathan Borggren
   SUNY Stony Brook, Physics and Astronomy




On Tue, Mar 30, 2010 at 12:44 AM, Johan Hake <johan.hake@xxxxxxxxx> wrote:
> On Monday March 29 2010 04:28:24 Kristian Oelgaard wrote:
>> On 29 March 2010 12:55, Anders Logg <logg@xxxxxxxxx> wrote:
>> > On Mon, Mar 29, 2010 at 12:19:22PM +0200, Kristian Oelgaard wrote:
>> >> On 26 March 2010 19:10, Anders Logg <logg@xxxxxxxxx> wrote:
>> >> >On Fri, Mar 26, 2010 at 02:19:31PM +0100, Kristian Oelgaard wrote:
>> >> >>On 23 March 2010 20:59, Anders Logg <logg@xxxxxxxxx> wrote:
>> >> >>>It looks like we have converged towards reST/Sphinx to be used for
>> >> >>>both the C++ and Python interfaces, as well as for docstrings.
>> >> >>>
>> >> >>>I have summarized some of my conclusions here:
>> >> >>>
>> >> >>> https://blueprints.launchpad.net/fenics-doc
>> >> >>
>> >> >>Looks good; the only thing I don't agree with (possibly because I
>> >> >>don't understand it) is why we will use reST and Sphinx to generate
>> >> >>the doc strings for the Python modules.  As I see it, Sphinx focuses
>> >> >>on handwritten documentation with the ability to auto-generate
>> >> >>documentation from doc-strings if needed. So I don't see a use for
>> >> >>generating a doc-string in Sphinx and then adding it to some function;
>> >> >>what good will that do anyway to a developer who has opened a *.py
>> >> >>file to modify it?
>> >> >
>> >> >The motivation is the following:
>> >> >
>> >> >1. We want to split out the documentation so that it is not part of
>> >> >the code. The reasons for this are:
>> >> >
>> >> > 1.a) It may otherwise be hidden deep inside the code and difficult
>> >> > to edit, the prime example being the documentation of Expressions in
>> >> > Python, which is now hidden deep inside a very complex piece of
>> >> > code that handles the metaclass magic for Expressions.
>> >> >
>> >> > 1.b) It may otherwise clutter the code (especially for the otherwise
>> >> > clean C++ header files) and result in something like 90%
>> >> > documentation and 10% code. Then it's not code with documentation,
>> >> > it's documentation with some function declarations here and there.
>> >> >
>> >> > 1.c) It's easier to edit, spell-check, grammar-check, translate, etc.,
>> >> > if it is maintained separately.
>> >>
>> >> I agree with all this, so instead of the rather extensive doc-strings
>> >> we have now, we will just have one-liners (which will be overwritten
>> >> when importing the dolfin module) like we have in the C++ part of
>> >> DOLFIN to help developers working in the source files.
>> >
>> > Yes, something like that. It can either be one-liners or it can be
>> > empty. I'm not sure which is best.
>>
>> A last (for now) remark on the doc-strings: when generated in a separate
>> folder/project, I think the doc-strings for the dolfin module should be
>> dumped in a module, say docs.py, and copied to the correct location in the
>> DOLFIN source tree manually to avoid a dependency between the two
>> projects.
>
> Agree, or maybe docstrings.py?
>
>> Also, do we want to handle doc-strings for the pure Python
>> projects like FFC, UFL, etc. in the same way, or should we extract
>> documentation from doc-strings, since these modules have a much shallower
>> tree structure which makes the documentation more manageable?  Or should
>> we strive to keep things uniform across the projects?
>
> Maybe use docstrings for now, and if it gets out of proportion, which I do not
> think it will, we can consider putting it in its own documentation files.
>
> Johan
>
>
>> Kristian
>>
>> >> >2. The same documentation should appear in the docstrings (when typing
>> >> >help(Expression) or help(assemble) in Python) as in the manual (to
>> >> >avoid duplication of effort). Since by (1) we don't extract the
>> >> >documentation from the code, we must either do the opposite (which I
>> >> >prefer), or generate both documentation and docstrings from a third
>> >> >source (using a preprocessor as suggested by Hans Petter).
>> >>
>> >> I also prefer Hake's suggestion.
>> >>
>> >> >>As far as I can tell, Sphinx can't process doc strings from C++, but
>> >> >>there might be some workarounds using Doxygen if we decide that we
>> >> >>need it; otherwise we can still use Sphinx and simply write
>> >> >>everything by hand.
>> >> >
>> >> >We could use Breathe (as pointed out by Andy), but my suggestion
>> >> >would be to have hand-written documentation (and then some script to
>> >> >check for missing documentation). Perhaps there's a way to use Doxygen
>> >> >to extract the function signatures and the one-line compact comment,
>> >> >then fill in the rest by hand.
>> >>
>> >> Sounds like a lot of work to get just one line of documentation; if
>> >> we can just run a script to check for functions/classes with missing
>> >> documentation, that will be enough.
>> >
>> > Agree.
>> >
>> > --
>> > Anders
>> >
>


