
dorsal team mailing list archive

Re: Installation problems on Mac OS X 10.6.4

 

 On 11.10.10 16:04, Andy Ray Terrel wrote:
On Sat, Oct 9, 2010 at 8:46 AM, Andre Massing <massing@xxxxxxxxx> wrote:
  On 08.10.10 16:07, Andy Ray Terrel wrote:
Andre, did you get any demo programs to work with the mpich2 port?

I got an error that mpi_init wasn't called correctly again.  When I
disabled MPI but kept PETSc enabled, I got an error that a pointer was
freed before being allocated (the stack trace points at the global
KrylovSolver parameters dying in the default Parameters copy
constructor ... most values were optimized away, so I stopped digging).
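
As a minimal sketch of that failure mode in general (hypothetical
names, not DOLFIN's actual Parameters class), a compiler-generated
copy constructor that shallow-copies a raw pointer produces exactly
this kind of "pointer freed before allocated" abort:

#include <cstring>

class Params
{
public:
    Params() : data(new char[8]) { std::memset(data, 0, 8); }
    ~Params() { delete[] data; }
    // No user-defined copy constructor: the compiler-generated one
    // copies the pointer, not the buffer (a rule-of-three violation).
private:
    char* data;
};

int main()
{
    Params a;
    Params b(a);   // shallow copy: b.data == a.data
    return 0;      // double delete when a and b are destroyed
}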

Tested it out; my setup is:
- MacPorts mpich2 +gcc44 (gcc44 is also now my standard compiler)
- used petsccore

Everything compiled fine, but the demos failed in general:

Sat Oct 09 15:23 andre@beluga:cpp 74>./poisson-demo
Solving linear variational problem
  Applying boundary conditions to linear system.
  Solving linear system of size 1089 x 1089 (PETSc Krylov solver).
  PETSc Krylov solver (gmres, ilu) converged in 39 iterations.
Plotting u (a function), press 'q' to continue...
Attempting to use an MPI routine before initializing MPICH
*** Warning: Unable to plot.

The same applies to the mixed Poisson example. It seems to be caused
somehow by the plot function; both examples run smoothly when I
comment it out. Maybe I have to recompile vtk5 with mpich2 now? It is
really fun to dig into all these library, compiler etc. dependencies ..
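
For what it's worth, MPICH aborts with exactly that "Attempting to use
an MPI routine before initializing MPICH" message when any MPI call
runs before MPI_Init. A minimal standalone check (plain MPI, nothing
DOLFIN- or vtk-specific) uses MPI_Initialized/MPI_Finalized, which are
legal to call at any time, unlike every other MPI routine:

#include <mpi.h>
#include <cstdio>

int main(int argc, char* argv[])
{
    // Safe to call even before MPI_Init.
    int initialized = 0;
    MPI_Initialized(&initialized);
    if (!initialized)
        MPI_Init(&argc, &argv);

    int rank = 0;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    std::printf("rank %d\n", rank);

    // Likewise safe to call at any time.
    int finalized = 0;
    MPI_Finalized(&finalized);
    if (!finalized)
        MPI_Finalize();
    return 0;
}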

What kind of failure did you experience? Does it work now on your machine?
So I got the same type of failure, although I didn't try commenting
out the plotting. I probably won't have much time for debugging this,
as it takes so much time to futz with MacPorts.

-- Andy
I dug a bit deeper, and it is actually the loading of the
SWIG-generated Python bindings that fails (this even materializes in
the C++ demos, since the plot function in DOLFIN just does a *system*
call to Viper, which in turn loads the Python counterpart of DOLFIN).
A little bit interwoven :) I already wrote to the DOLFIN mailing list
about this problem with mpich2.
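
To illustrate the indirection (a hypothetical sketch with made-up
names, not DOLFIN's actual plot code): the C++ side shells out to a
separate Python viewer process, so a failure to import the
SWIG-generated module surfaces even in a pure C++ demo:

#include <cstdlib>
#include <cstdio>
#include <string>

// Hypothetical: a C++ plot() that delegates to an external Python
// viewer. If that process trips MPICH's init check while importing
// the Python bindings, the C++ program only sees a plot warning.
void plot_via_viewer(const std::string& datafile)
{
    const std::string cmd = "viper " + datafile;  // runs Python under the hood
    if (std::system(cmd.c_str()) != 0)
        std::fputs("*** Warning: Unable to plot.\n", stderr);
}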

--
Andre
Switching over to an all-MacPorts system could be better, but for
that super-easy install the dmg would probably be easier to support.
I stopped using the Apple gcc because I wanted to use OpenMP for some
code. Of course, I'm accustomed to fine-tuning my builds of software,
so unless there is a large need out there I wouldn't waste too many
cycles on it.

Yes, you are right. Then at some point I have to look at the
CUDA/NVIDIA examples, to see whether they can easily be adapted.
But that's another story.

--
Andre

-- Andy



On Fri, Oct 8, 2010 at 6:07 AM, Garth N. Wells <gnw20@xxxxxxxxx> wrote:
On 08/10/10 11:58, Anders Logg wrote:
On Fri, Oct 08, 2010 at 10:10:02AM +0100, Garth N. Wells wrote:
On 08/10/10 08:18, Anders Logg wrote:
On Thu, Oct 07, 2010 at 04:29:05PM -0500, Andy Ray Terrel wrote:
Hello,

I wanted to send in my unsuccessful account of installing FEniCS with
Dorsal; hopefully someone can benefit from this.

My installation was attempted by updating MacPorts and installing the
list of ports requested by Dorsal.

The first problem I encountered was that vtk-devel +python26 doesn't
work in MacPorts unless you are using the gcc originally installed by
Apple (/usr/bin/g++-4.2).

Second, PETSc complained that Fortran for MPI can't be used. At this
point I tried out the openmpi port instead, but that caused many
headaches later. Perhaps setting --with-fortran=0 would be good here.

Next, PETSc could not link to METIS correctly; it turns out the octave
port had installed an older version of METIS that was getting in the way.

Finally, everything installs, but when I run any demo, the demo runs
and then openmpi gives an error claiming 1) a system-required
executable could not be found and 2) mpi_init is called after
finalize. I couldn't figure out what is going on here, as the MPI
init and finalize should be set correctly by the subsystem manager.
The problem did not occur when running the PETSc demos.
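
As a sketch of the discipline a subsystem manager is expected to
enforce (hypothetical code, not DOLFIN's actual SubSystemsManager):
MPI may be initialized at most once per process, and calling MPI_Init
after MPI_Finalize is exactly the error openmpi reports:

#include <mpi.h>

// Hypothetical guard class (made-up name).
class MpiGuard
{
public:
    static void init()
    {
        int initialized = 0, finalized = 0;
        MPI_Initialized(&initialized);
        MPI_Finalized(&finalized);
        // Initialize only if MPI has never been started;
        // re-initializing after MPI_Finalize is forbidden.
        if (!initialized && !finalized)
        {
            MPI_Init(0, 0);  // MPI-2 allows null argc/argv
            we_initialized = true;
        }
    }

    static void finalize()
    {
        int finalized = 0;
        MPI_Finalized(&finalized);
        // Finalize only what we started ourselves, and only once.
        if (we_initialized && !finalized)
            MPI_Finalize();
    }

private:
    static bool we_initialized;
};

bool MpiGuard::we_initialized = false;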

At this point I just installed everything by hand. Since Anders says
things work, I wanted to make sure there was some knowledge of where
things are breaking in the wild. If you want me to try something
different for testing, let me know.
Thanks for the report. It seems like most of the issues result from
using a different configuration than what Harish and I have been using
when testing (not using GCC from Apple, etc.). I'm not sure how to fix
this; it would require quite a bit more work on the build scripts.

One thing that might solve this issue is a Mac bundle (.dmg) for the
whole of FEniCS (which would not even require MacPorts). It might not
be the best option for you if you want to control which GCC etc. is
used, but it might help others.

Another thing that we've discussed is using MacPorts rather than the
Apple-supplied compilers and libs. If we use MacPorts for everything
and have the configuration system check for MacPorts versions, we
can be more confident in what's being used. Hopefully the PETSc
Fortran issue would then go away too.

Garth
Yes, that's an option, but it would require quite a bit of work, and
Harish didn't think it was worth the effort.

It didn't seem to me that the work was the issue. My recollection is
that Harish favoured using the system libs over MacPorts.

Garth

--
Anders
_______________________________________________
Mailing list: https://launchpad.net/~dorsal
Post to     : dorsal@xxxxxxxxxxxxxxxxxxx
Unsubscribe : https://launchpad.net/~dorsal
More help   : https://help.launchpad.net/ListHelp




