fenics team mailing list archive
Message #00448
Segfault while executing FEniCS program
Hi there,
I'm writing to ask if you could help me with a problem:
I'm using Ubuntu 8.04 x86_64 with kernel 2.6.24-19-generic and have
installed FEniCS on my computer.
But when I run demo.py, I soon get a fatal MPI error.
I have already googled this issue, but the only "solution" I found is this:
http://www.mail-archive.com/dolfin-dev@xxxxxxxxxx/msg02795.html
However, that solution (to follow the installation instructions on the
FEniCS homepage) is exactly what I did to install FEniCS.
I hope you can give me some more information or support to solve this issue.
If you need any additional information, don't hesitate to ask.
Here is the output of demo.py (which I have attached to this mail):
==========================
jiping@iwr22:~/Documents/FEniCS$ python demo.py
[iwr22:06710] mca: base: component_find: unable to open osc pt2pt: file
not found (ignored)
libibverbs: Fatal: couldn't read uverbs ABI version.
--------------------------------------------------------------------------
[0,0,0]: OpenIB on host iwr22 was unable to find any HCAs.
Another transport will be used instead, although this may result in
lower performance.
--------------------------------------------------------------------------
Solving linear PDE.
Assembling matrix over cells (finished).
Assembling vector over cells (finished).
Assembling vector over exterior facets (finished).
Computing Dirichlet boundary values, topological search (finished).
Applying boundary conditions to linear system.
Solving linear system of size 1089 x 1089 (PETSc LU solver, umfpack).
Saved function u (discrete function) to file poisson.pvd in VTK format.
*** An error occurred in MPI_Attr_get
*** after MPI was finalized
*** MPI_ERRORS_ARE_FATAL (goodbye)
[iwr22:6710] Abort before MPI_INIT completed successfully; not able to
guarantee that all other processes were killed!
*** An error occurred in MPI_Comm_rank
*** after MPI was finalized
*** MPI_ERRORS_ARE_FATAL (goodbye)
*** An error occurred in MPI_Type_free
*** after MPI was finalized
*** MPI_ERRORS_ARE_FATAL (goodbye)
Segmentation fault
==========================
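If I read the messages correctly, some object (presumably a PETSc one) is
being torn down after MPI has already been finalized during interpreter
shutdown. Just to illustrate what I mean, the ordering hazard can be sketched
in plain Python with stand-in classes (FakeMPI and Vector are hypothetical
names, no real MPI is involved):

```python
class FakeMPI:
    """Stand-in for an MPI library that rejects calls after finalization."""
    def __init__(self):
        self.finalized = False

    def get_rank(self):
        if self.finalized:
            raise RuntimeError("MPI call after MPI was finalized")
        return 0

    def finalize(self):
        self.finalized = True


class Vector:
    """Stand-in for a PETSc object whose cleanup needs a live MPI."""
    def __init__(self, lib):
        self.lib = lib

    def destroy(self):
        self.lib.get_rank()  # cleanup talks to the MPI library


mpi = FakeMPI()
v = Vector(mpi)

# Wrong order: finalize MPI first, clean up afterwards -> the same
# class of error as in the log above.
mpi.finalize()
try:
    v.destroy()
except RuntimeError as e:
    print("error:", e)

# Right order: destroy dependent objects before finalizing MPI.
mpi2 = FakeMPI()
w = Vector(mpi2)
w.destroy()
mpi2.finalize()
print("clean shutdown")
```

So my guess is that the order in which Python destroys module-level objects
at exit is what triggers the crash, but I may well be wrong about that.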
Here is what /var/log/messages tells me:
==========================
Jul 31 17:15:55 iwr22 kernel: [ 454.120388] python[6710]: segfault at 58
rip 7f020373c666 rsp 7fff0dfae960 error 4
==========================
I hope you can help me and thank you in advance,
Jiping Xin
--
Jiping Xin
Engineering Mathematics, Mathematics School of Chalmers University of
Technology
Brahegatan 9, 1207 Room, Gothenburg
from dolfin import *

# Create mesh and finite element
mesh = UnitSquare(32, 32)
element = FiniteElement("Lagrange", "triangle", 1)

# Source term
class Source(Function):
    def __init__(self, element, mesh):
        Function.__init__(self, element, mesh)

    def eval(self, values, x):
        dx = x[0] - 0.5
        dy = x[1] - 0.5
        values[0] = 500.0*exp(-(dx*dx + dy*dy)/0.02)

# Neumann boundary condition
class Flux(Function):
    def __init__(self, element, mesh):
        Function.__init__(self, element, mesh)

    def eval(self, values, x):
        if x[0] > DOLFIN_EPS:
            values[0] = 25.0*sin(5.0*DOLFIN_PI*x[1])
        else:
            values[0] = 0.0

# Sub domain for Dirichlet boundary condition
class DirichletBoundary(SubDomain):
    def inside(self, x, on_boundary):
        return bool(on_boundary and x[0] < DOLFIN_EPS)

# Define variational problem
v = TestFunction(element)
u = TrialFunction(element)
f = Source(element, mesh)
g = Flux(element, mesh)
a = dot(grad(v), grad(u))*dx
L = v*f*dx + v*g*ds

# Define boundary condition
u0 = Function(mesh, 0.0)
boundary = DirichletBoundary()
bc = DirichletBC(u0, mesh, boundary)

# Solve PDE and plot solution
pde = LinearPDE(a, L, mesh, bc)
u = pde.solve()
plot(u)

# Save solution to file
file = File("poisson.pvd")
file << u
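For reference, the attached demo appears to solve the standard Poisson
problem, with a Gaussian source f, a sinusoidal Neumann flux g away from the
left edge, and homogeneous Dirichlet data on the left edge (this is my
reading of the forms a and L above); in weak form:

% Weak form assembled by demo.py: find u with u = 0 on
% Gamma_D = { x : x_0 = 0 } such that, for all test functions v,
\int_\Omega \nabla v \cdot \nabla u \,\mathrm{d}x
  = \int_\Omega v\, f \,\mathrm{d}x + \int_{\Gamma_N} v\, g \,\mathrm{d}s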