
yade-dev team mailing list archive

Re: KLA

 

>  This would be the maximalist way, but there 
> are so many functions and so many possible traversal routes that it 
> is not something I'm considering at the moment (Sylvain Pion already 
> started wrapping CGAL in Python, but not the regular triangulation: 
> http://cgal-python.gforge.inria.fr/).
> 
> Conclusion: I think I'll implement the methods and give Python access 
> to them for now. Access to the data itself will perhaps come in a 
> later step.

I need this for post-processing (mostly), so I would suggest making one
function that returns plenty of data; that way you don't need to call
CGAL functions to get something meaningful. (Even if CGAL is wrapped in
Python, it will take years before the wrapper reaches the
distributions.)
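
Something like this is what I have in mind on the Python side (the
function name and the fields are invented here, just to show the shape
of the data, not an existing API):

    import numpy as np

    def triangulation_data():
        """Return, in one call, everything post-processing needs."""
        return {
            'cells':   np.empty((0, 4), dtype=int),  # 4 body ids per tetrahedron
            'volumes': np.empty(0),                  # volume of each cell
            'strain':  np.empty((0, 3, 3)),          # per-cell strain tensor
        }

One dict of numpy arrays like that and I never have to touch CGAL from
Python at all.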

> Problem: with this minimalist approach (or even with the one you 
> described, with access to vertices and more), there is still no way to 
> get incident cells on a vertex/edge/facet. "Traversing" the 
> triangulation structure, and queries on connectivity, really needs to 
> wrap most CGAL functions.
I don't see any reason not to compute all of this inside the C++ code.
You could build an array that represents the topology, right?
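
A minimal sketch of what I mean, in plain numpy (the (N,4) layout is
just an assumption about how the cells would be exported):

    import numpy as np
    from collections import defaultdict

    cells = np.array([[0, 1, 2, 3],
                      [1, 2, 3, 4]])   # each row: the 4 vertex/body ids of one cell

    incident = defaultdict(list)       # vertex id -> indices of incident cells
    for c, verts in enumerate(cells):
        for v in verts:
            incident[v].append(c)

    print(incident[2])                 # cells touching vertex 2 -> [0, 1]

With such a table, incidence queries become simple lookups and no CGAL
traversal is needed on the Python side.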

> The minimalist approach would be to give numpy arrays of cells, i.e. 
> the four body ids of each, nothing more.
> It would be enough to compute strain per cell for instance, as you can 
> use python to get the displacement of each body and integrate on the 
> cell's contour.
I would be for the "maximalist" approach, though. I know I could
compute strain per cell myself, but the point is that I shouldn't have
to do it in my own code. Also, it is probably more efficient to compute
some unneeded data in C++ than to compute the data you actually need in
Python.
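
Just to illustrate what every user would otherwise have to reinvent:
per-cell linear strain of a single tetrahedron from vertex
displacements (refPos/pos below are stand-ins for state->refPos and
state->pos):

    import numpy as np

    refPos = np.array([[0., 0., 0.], [1., 0., 0.],
                       [0., 1., 0.], [0., 0., 1.]])
    pos = 1.01 * refPos                    # small homogeneous stretch

    D = (refPos[1:] - refPos[0]).T         # edge vectors, reference configuration
    U = (pos[1:] - pos[0]).T - D           # relative edge displacements
    G = U @ np.linalg.inv(D)               # displacement gradient
    eps = 0.5 * (G + G.T)                  # small-strain tensor
    print(eps)                             # diag(0.01, 0.01, 0.01) here

Doing this once in C++ for all cells is clearly better than every
script repeating it.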

For myself, what I need is (off the top of my head):

1. volume of the cells associated with particles (it would be the
volume of the Voronoi cells if different radii were not in play; I
guess you know what I mean)

2. a per-particle porosity measure (something like [sphere
volume]/[cell volume] or similar), but this is easily computed from the
volume (see the sketch after this list)

3. per-particle "strain" (the exact definition is not important now),
computed from the deformation of adjacent tetrahedra between the
vertices at state->pos and state->refPos.
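
For items 1 and 2, the consumer side would then be as trivial as this
(array names are hypothetical; cellVolume is what the triangulation
code would return):

    import numpy as np

    radii = np.array([0.010, 0.012, 0.011])           # per-particle radii
    cellVolume = np.array([1.2e-5, 1.9e-5, 1.5e-5])   # per-particle cell volume

    sphereVolume = (4. / 3.) * np.pi * radii**3
    porosity = 1 - sphereVolume / cellVolume          # 1 - [sphere]/[cell]
    print(porosity)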

> Actually, my highest priority now is to compare triax and periTriax as 
> planned.
> I'll test the triangulation code in periTriax. With very few changes 
> (inserting wrapped points), it will compute strain in all cells except 
> the ones crossing the period limit. With a bit more changes, it could 
> consider all cells (duplicating shifted spheres and inserting them in 
> the triangulation). Just to be sure: did you start thinking about it or 
> should I start on my side? What do you plan in terms of triangulation? 
> I'd be glad to help.
> 
> Last thing: we will commit Luc's files for the capillary law. They 
> are just big tables of numbers, used by the engine to interpolate the 
> capillary force. I think I'll zip them and commit them somewhere in 
> trunk, with documentation and messages helping people find and unzip 
> them into /bin. What do you think of this solution?

What size are those files? If they are more than, say, 0.5 MB
compressed, I would put them on the web for download. (If they are in
the sources, they can be uncompressed and installed by scons.) Use lzma
or bzip2 compression; they tend to be a lot more effective than zip.
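
Checking which compression pays off takes only a few lines (bz2, gzip
and lzma all ship with Python; 'capillary.dat' is just a placeholder
name):

    import bz2, gzip, lzma

    data = open('capillary.dat', 'rb').read()
    for name, mod in [('gzip', gzip), ('bzip2', bz2), ('lzma', lzma)]:
        print(name, len(mod.compress(data)))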

What I think is important is to include the code that generates those
tables. Otherwise you're giving people data they have to trust without
knowing how it was computed.

And lastly, if it takes less than a few hours to generate the data, we
could just distribute the code that precomputes them (and writes them
to text files) instead of the tables themselves.
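
The pattern I mean, with a dummy function standing in for the real
capillary-force computation (not Luc's actual formula):

    import numpy as np

    def capillary_force(suction, distance):    # placeholder formula
        return 1.0 / (1.0 + suction * distance)

    suctions = np.linspace(0., 10., 101)
    distances = np.linspace(0., 1., 101)
    table = np.array([[capillary_force(s, d) for d in distances]
                      for s in suctions])
    np.savetxt('capillary_table.txt', table)   # the engine interpolates from this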

Cheers, Václav





