dolfin team mailing list archive

Re: Buildbot failure summary

Thanks for the summary!

The first error involving as_backend_type is probably my fault. I will
take a look.

And I realize now the rename from down_cast conflicts with an example
in the book... Oops. :-)
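
For reference, here is roughly the call in question (an untested sketch on
my part, assuming the failing slaves end up with the uBLAS backend and the
current trunk Python interface); the last line is also the spelling the
book example will need to switch to:

    from dolfin import *

    # Assumption: force the uBLAS backend, matching the "uBLAS Matrix"
    # mentioned in the error message quoted below.
    parameters["linear_algebra_backend"] = "uBLAS"

    mesh = UnitSquare(4, 4)
    V = FunctionSpace(mesh, "CG", 1)
    u, v = TrialFunction(V), TestFunction(V)
    A = assemble(u*v*dx)

    # Old spelling used in the book example:
    #   A = down_cast(A)
    # New spelling after the rename; this is the call that now fails in
    # the unit tests with "Compatability check of tensor failed":
    A = as_backend_type(A)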

--
Anders


On Fri, Sep 28, 2012 at 01:55:15PM +0200, Johannes Ring wrote:
> The buildbot page for DOLFIN-trunk
> (http://fenicsproject.org:8010/waterfall?project=dolfin&category=dolfin.trunk)
> shows that most of the buildslaves (ignoring the quick builders) are
> red, that is, they are failing one or more of the tests. Only two of
> the nine buildslaves are currently green. These are running Debian sid
> and Ubuntu quantal (development branch). Below is a summary of the
> failures on each of the buildslaves.
>
> * lucid-amd64
> Buildslave info: http://fenicsproject.org:8010/buildslaves/lucid-amd64
> Last build: http://fenicsproject.org:8010/builders/dolfin-trunk-full-lucid-amd64/builds/101
> Unit tests failed: la (test, Matrix, Vector), fem (Assembler), book (chapter_1)
>   stdio: http://fenicsproject.org:8010/builders/dolfin-trunk-full-lucid-amd64/builds/101/steps/make%20run_unittests/logs/stdio
> Comments: All of the unit tests fail with error messages that look like this:
>
> ======================================================================
> ERROR: test_matrix_data (__main__.EpetraTester)
> ----------------------------------------------------------------------
> Traceback (most recent call last):
>   File "./test.py", line 285, in test_matrix_data
>     A = as_backend_type(A)
>   File "/home/buildbot/fenicsbbot/trunk/dolfin-full/lib/python2.6/site-packages/dolfin/cpp/la.py",
> line 5133, in as_backend_type
>     subclass = get_tensor_type(tensor)
>   File "/home/buildbot/fenicsbbot/trunk/dolfin-full/lib/python2.6/site-packages/dolfin/cpp/la.py",
> line 5119, in get_tensor_type
>     "dolfin/swig/la/post.i")
>   File "/home/buildbot/fenicsbbot/trunk/dolfin-full/lib/python2.6/site-packages/dolfin/cpp/common.py",
> line 2041, in dolfin_error
>     return _common.dolfin_error(*args)
> RuntimeError:
>
> *** -------------------------------------------------------------------------
> *** DOLFIN encountered an error. If you are not able to resolve this issue
> *** using the information listed below, you can ask for help at
> ***
> ***     https://answers.launchpad.net/dolfin
> ***
> *** Remember to include the error message listed below and, if possible,
> *** include a *minimal* running example to reproduce the error.
> ***
> *** -------------------------------------------------------------------------
> *** Error:   Unable to Most probably you are trying to do something
> with a uBLAS Matrix. Tensor type: 'Matrix'.
> *** Reason:  dolfin/swig/la/post.i.
> *** Where:   This error was encountered inside Compatability check of
> tensor failed..
> *** Process: 0
> *** -------------------------------------------------------------------------
>
> Regression tests failed: demo/undocumented/mesh-generation/cpp
>   stdio: http://fenicsproject.org:8010/builders/dolfin-trunk-full-lucid-amd64/builds/101/steps/make%20run_regressiontests/logs/stdio
>   demo.log: http://fenicsproject.org:8010/builders/dolfin-trunk-full-lucid-amd64/builds/101/steps/make%20run_regressiontests/logs/demo.log
> Comments: Running the mesh-generation demo results in a segmentation fault.
>
>
> * oneiric-amd64
> Buildslave info: http://fenicsproject.org:8010/buildslaves/oneiric-amd64
> Last build: http://fenicsproject.org:8010/builders/dolfin-trunk-full-oneiric-amd64/builds/53
> Regression tests failed: demo/pde/stokes-iterative/python
>   stdio: http://fenicsproject.org:8010/builders/dolfin-trunk-full-oneiric-amd64/builds/53/steps/make%20run_regressiontests/logs/stdio
>   demo.log: http://fenicsproject.org:8010/builders/dolfin-trunk-full-oneiric-amd64/builds/53/steps/make%20run_regressiontests/logs/demo.log
> Comments: This failure has already been reported in bug #1052801
> (https://bugs.launchpad.net/dolfin/+bug/1052801).
>
>
> * osx-10.6
> Buildslave info: http://fenicsproject.org:8010/buildslaves/osx-10.6
> Last build: http://fenicsproject.org:8010/builders/dolfin-trunk-full-osx-10.6/builds/74
> Comments: Same error as on oneiric-amd64.
>
>
> * osx-10.7
> Buildslave info: http://fenicsproject.org:8010/buildslaves/osx-10.7
> Last build: http://fenicsproject.org:8010/builders/dolfin-trunk-full-osx-10.7/builds/107
> Unit tests failed: quadrature (BaryCenter), la (KrylovSolver)
>   stdio: http://fenicsproject.org:8010/builders/dolfin-trunk-full-osx-10.7/builds/107/steps/make%20run_unittests/logs/stdio
> Comments: The quadrature failure has been there for a very long time,
> reported as bug #918178
> (https://bugs.launchpad.net/dolfin/+bug/918178). The KrylovSolver test
> fails when run in parallel with the following error:
>
> One or more unit tests failed for la (KrylovSolver, Python):
> Process 0: Number of global vertices: 1089
> Process 0: Number of global cells: 2048
> Small graph
> Process 0: Partitioned mesh, edge cut is 61.
> Process 1: Partitioned mesh, edge cut is 61.
> Process 2: Partitioned mesh, edge cut is 61.
>
> Testing DOLFIN la/KrylovSolver interface
> ----------------------------------------
>
> Testing DOLFIN la/KrylovSolver interface
> ----------------------------------------
>
> Testing DOLFIN la/KrylovSolver interface
> ----------------------------------------
> FIXME: Preconditioner 'ilu' does not work in parallel, skipping
> FIXME: Preconditioner 'icc' does not work in parallel, skipping
> FIXME: Preconditioner 'ilu' does not work in parallel, skipping
> FIXME: Preconditioner 'icc' does not work in parallel, skipping
> FIXME: Preconditioner 'ilu' does not work in parallel, skipping
> FIXME: Preconditioner 'icc' does not work in parallel, skipping
> FIXME: Preconditioner 'jacobi' does not work in parallel, skipping
> FIXME: Preconditioner 'jacobi' does not work in parallel, skipping
> FIXME: Preconditioner 'jacobi' does not work in parallel, skipping
> FIXME: Preconditioner 'hypre_amg' does not work in parallel, skipping
> FIXME: Preconditioner 'hypre_amg' does not work in parallel, skipping
> FIXME: Preconditioner 'hypre_amg' does not work in parallel, skipping
> [0]PETSC ERROR: --------------------- Error Message
> ------------------------------------
> [0]PETSC ERROR: Unknown type. Check for miss-spelling or missing
> external package needed for type
>  seehttp://www.mcs.anl.gov/petsc/petsc-as/documentation/installation.html#external!
> [0]PETSC ERROR: Unable to find requested KSP type chebyshev!
> [0]PETSC ERROR:
> ------------------------------------------------------------------------
> [0]PETSC ERROR: Petsc Release Version 3.2.0, Patch 6, Wed Jan 11
> 09:28:45 CST 2012
> [0]PETSC ERROR: See docs/changes/index.html for recent updates.
> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
> [0]PETSC ERROR: See docs/index.html for manual pages.
> [0]PETSC ERROR:
> ------------------------------------------------------------------------
> [0]PETSC ERROR: Unknown Name on a darwin11. named opeth.local by
> buildbot Fri Sep 28 04:30:14 2012
> [0]PETSC ERROR: Libraries linked from /Users/buildbot/local/lib
> [0]PETSC ERROR: Configure run at Tue Aug 21 11:37:44 2012
> [0]PETSC ERROR: Configure options COPTFLAGS=-O2 --with-debugging=0
> --with-shared-libraries=1 --with-clanguage=cxx --with-parmetis=1
> --with-parmetis-dir=/Users/buildbot/local --download-umfpack=1
> --download-hypre=1 --download-mumps=1 --download-scalapack=1
> --download-blacs=1 --download-ptscotch=1 --download-scotch=1
> --with-ml=1 --with-ml-lib=/Users/buildbot/local/lib/libml.so
> --with-ml-include=/Users/buildbot/local/include/trilinos
> --prefix=/Users/buildbot/local
> [0]PETSC ERROR:
> ------------------------------------------------------------------------
> [0]PETSC ERROR: KSPSetType() line 652 in
> /Users/buildbot/local/src/petsc-3.2-p6/src/ksp/ksp/interface/itcreate.c
> [0]PETSC ERROR: KSPSetFromOptions() line 292 in
> /Users/buildbot/local/src/petsc-3.2-p6/src/ksp/ksp/interface/itcl.c
> [0]PETSC ERROR: PCSetUp_MG() line 575 in
> /Users/buildbot/local/src/petsc-3.2-p6/src/ksp/pc/impls/mg/mg.c
> [0]PETSC ERROR: PCSetUp_ML() line 789 in
> /Users/buildbot/local/src/petsc-3.2-p6/src/ksp/pc/impls/ml/ml.c
> [0]PETSC ERROR: PCSetUp() line 819 in
> /Users/buildbot/local/src/petsc-3.2-p6/src/ksp/pc/interface/precon.c
> [0]PETSC ERROR: KSPSetUp() line 260 in
> /Users/buildbot/local/src/petsc-3.2-p6/src/ksp/ksp/interface/itfunc.c
> [0]PETSC ERROR: KSPSolve() line 379 in
> /Users/buildbot/local/src/petsc-3.2-p6/src/ksp/ksp/interface/itfunc.c
> [opeth:18527] *** Process received signal ***
> [opeth:18527] Signal: Abort trap: 6 (6)
> [opeth:18527] Signal code:  (0)
> [opeth:18528] *** Process received signal ***
> [opeth:18528] Signal: Abort trap: 6 (6)
> [opeth:18528] Signal code:  (0)
> [opeth:18527] [ 0] 2   libsystem_c.dylib
> 0x00007fff915abcfa _sigtramp + 26
> [opeth:18527] [ 1] 3   ???
> 0x0000000000000050 0x0 + 80
> [opeth:18527] [ 2] 4   libpetsc.dylib
> 0x000000010548b777
> _Z26PetscTraceBackErrorHandlerP19ompi_communicator_tiPKcS2_S2_i14PetscErrorTypeS2_Pv
> + 679
> [opeth:18527] [ 3] 5   libpetsc.dylib
> 0x0000000105488592
> _Z10PetscErrorP19ompi_communicator_tiPKcS2_S2_i14PetscErrorTypeS2_z +
> 338
> [opeth:18527] [ 4] 6   libpetsc.dylib
> 0x00000001058f33e2 _Z10KSPSetTypeP6_p_KSPPKc + 674
> [opeth:18527] [ 5] 7   libpetsc.dylib
> 0x00000001058e945b _Z17KSPSetFromOptionsP6_p_KSP + 3867
> [opeth:18527] [ 6] 8   libpetsc.dylib
> 0x00000001058a467d _Z10PCSetUp_MGP5_p_PC + 3677
> [opeth:18527] [ 7] 9   libpetsc.dylib
> 0x00000001058ffe5a _Z10PCSetUp_MLP5_p_PC + 7354
> [opeth:18527] [ 8] 10  libpetsc.dylib
> 0x00000001058ad5c1 _Z7PCSetUpP5_p_PC + 817
> [opeth:18527] [ 9] 11  libpetsc.dylib
> 0x00000001058eb55c _Z8KSPSetUpP6_p_KSP + 2828
> [opeth:18527] [10] 12  libpetsc.dylib
> 0x00000001058ebe1c _Z8KSPSolveP6_p_KSPP6_p_VecS2_ + 1676
> [opeth:18527] [11] 13  libdolfin.1.0.dylib
> 0x00000001038248f0
> _ZN6dolfin17PETScKrylovSolver5solveERNS_11PETScVectorERKS1_ + 1894
> [opeth:18527] [12] 14  libdolfin.1.0.dylib
> 0x0000000103825076
> _ZN6dolfin17PETScKrylovSolver5solveERKNS_15PETScBaseMatrixERNS_11PETScVectorERKS4_
> + 186
> [opeth:18527] [13] 15  _la.so
> 0x000000010bc5b7f9 _wrap_PETScKrylovSolver_solve + 5625
> [opeth:18527] [14] 16  Python
> 0x00000001022652a5 PyEval_EvalFrameEx + 18709
> [opeth:18527] [15] 17  Python
> 0x00000001022683e3 fast_function + 179
> [opeth:18527] [16] 18  Python
> 0x000000010226509d PyEval_EvalFrameEx + 18189
> [opeth:18527] [17] 19  Python
> 0x00000001022682a7 PyEval_EvalCodeEx + 2103
> [opeth:18527] [18] 20  Python
> 0x00000001021f558b function_call + 347
> [opeth:18527] [19] 21  Python
> 0x00000001021ccbe1 PyObject_Call + 97
> [opeth:18527] [20] 22  Python
> 0x000000010226595a PyEval_EvalFrameEx + 20426
> [opeth:18527] [21] 23  Python
> 0x00000001022682a7 PyEval_EvalCodeEx + 2103
> [opeth:18527] [22] 24  Python
> 0x00000001021f558b function_call + 347
> [opeth:18527] [23] 25  Python
> 0x00000001021ccbe1 PyObject_Call + 97
> [opeth:18527] [24] 26  Python
> 0x00000001021dea17 instancemethod_call + 503
> [opeth:18527] [25] 27  Python
> 0x00000001021ccbe1 PyObject_Call + 97
> [opeth:18527] [26] 28  Python
> 0x0000000102227e0e slot_tp_call + 94
> [opeth:18527] [27] 29  Python
> 0x00000001021ccbe1 PyObject_Call + 97
> [opeth:18527] [28] 30  Python
> 0x00000001022652b5 PyEval_EvalFrameEx + 18725
> [opeth:18527] [29] 31  Python
> 0x00000001022682a7 PyEval_EvalCodeEx + 2103
> [opeth:18527] *** End of error message ***
> [opeth:18528] [ 0] 2   libsystem_c.dylib
> 0x00007fff915abcfa _sigtramp + 26
> [opeth:18528] [ 1] 3   ???
> 0x0000000000000050 0x0 + 80
> [opeth:18528] [ 2] 4   libpetsc.dylib
> 0x00000001052ec777
> _Z26PetscTraceBackErrorHandlerP19ompi_communicator_tiPKcS2_S2_i14PetscErrorTypeS2_Pv
> + 679
> [opeth:18528] [ 3] 5   libpetsc.dylib
> 0x00000001052e9592
> _Z10PetscErrorP19ompi_communicator_tiPKcS2_S2_i14PetscErrorTypeS2_z +
> 338
> [opeth:18528] [ 4] 6   libpetsc.dylib
> 0x00000001057543e2 _Z10KSPSetTypeP6_p_KSPPKc + 674
> [opeth:18528] [ 5] 7   libpetsc.dylib
> 0x000000010574a45b _Z17KSPSetFromOptionsP6_p_KSP + 3867
> [opeth:18528] [ 6] 8   libpetsc.dylib
> 0x000000010570567d _Z10PCSetUp_MGP5_p_PC + 3677
> [opeth:18528] [ 7] 9   libpetsc.dylib
> 0x0000000105760e5a _Z10PCSetUp_MLP5_p_PC + 7354
> [opeth:18528] [ 8] 10  libpetsc.dylib
> 0x000000010570e5c1 _Z7PCSetUpP5_p_PC + 817
> [opeth:18528] [ 9] 11  libpetsc.dylib
> 0x000000010574c55c _Z8KSPSetUpP6_p_KSP + 2828
> [opeth:18528] [10] 12  libpetsc.dylib
> 0x000000010574ce1c _Z8KSPSolveP6_p_KSPP6_p_VecS2_ + 1676
> [opeth:18528] [11] 13  libdolfin.1.0.dylib
> 0x00000001036858f0
> _ZN6dolfin17PETScKrylovSolver5solveERNS_11PETScVectorERKS1_ + 1894
> [opeth:18528] [12] 14  libdolfin.1.0.dylib
> 0x0000000103686076
> _ZN6dolfin17PETScKrylovSolver5solveERKNS_15PETScBaseMatrixERNS_11PETScVectorERKS4_
> + 186
> [opeth:18528] [13] 15  _la.so
> 0x000000010babc7f9 _wrap_PETScKrylovSolver_solve + 5625
> [opeth:18528] [14] 16  Python
> 0x00000001020c92a5 PyEval_EvalFrameEx + 18709
> [opeth:18528] [15] 17  Python
> 0x00000001020cc3e3 fast_function + 179
> [opeth:18528] [16] 18  Python
> 0x00000001020c909d PyEval_EvalFrameEx + 18189
> [opeth:18528] [17] 19  Python
> 0x00000001020cc2a7 PyEval_EvalCodeEx + 2103
> [opeth:18528] [18] 20  Python
> 0x000000010205958b function_call + 347
> [opeth:18528] [19] 21  Python
> 0x0000000102030be1 PyObject_Call + 97
> [opeth:18528] [20] 22  Python
> 0x00000001020c995a PyEval_EvalFrameEx + 20426
> [opeth:18528] [21] 23  Python
> 0x00000001020cc2a7 PyEval_EvalCodeEx + 2103
> [opeth:18528] [22] 24  Python
> 0x000000010205958b function_call + 347
> [opeth:18528] [23] 25  Python
> 0x0000000102030be1 PyObject_Call + 97
> [opeth:18528] [24] 26  Python
> 0x0000000102042a17 instancemethod_call + 503
> [opeth:18528] [25] 27  Python
> 0x0000000102030be1 PyObject_Call + 97
> [opeth:18528] [26] 28  Python
> 0x000000010208be0e slot_tp_call + 94
> [opeth:18528] [27] 29  Python
> 0x0000000102030be1 PyObject_Call + 97
> [opeth:18528] [28] 30  Python
> 0x00000001020c92b5 PyEval_EvalFrameEx + 18725
> [opeth:18528] [29] 31  Python
> 0x00000001020cc2a7 PyEval_EvalCodeEx + 2103
> [opeth:18528] *** End of error message ***
> --------------------------------------------------------------------------
> WARNING: A process refused to die!
>
> Host: opeth.local
> PID:  18527
>
> This process may still be running and/or consuming resources.
>
> System test failed:
>   stdio: http://fenicsproject.org:8010/builders/dolfin-trunk-full-osx-10.7/builds/107/steps/make%20run_systemtests/logs/stdio
> Comments: It seems that SciPy is needed for the test:
>
> ======================================================================
> ERROR: test_diff_then_integrate (__main__.IntegrateDerivatives)
> ----------------------------------------------------------------------
> Traceback (most recent call last):
>   File "assembly_derivatives.py", line 152, in test_diff_then_integrate
>     F_diff = F((x1,)) - F((x0,))
>   File "/Users/buildbot/fenicsbbot/trunk/lib/python2.7/site-packages/ufl/exproperators.py",
> line 369, in _call
>     return _eval(self, arg, mapping)
>   File "/Users/buildbot/fenicsbbot/trunk/lib/python2.7/site-packages/ufl/exproperators.py",
> line 361, in _eval
>     return f.evaluate(coord, mapping, component, index_values)
>   File "/Users/buildbot/fenicsbbot/trunk/lib/python2.7/site-packages/ufl/mathfunctions.py",
> line 259, in evaluate
>     error("You must have scipy installed to evaluate bessel functions
> in python.")
>   File "/Users/buildbot/fenicsbbot/trunk/lib/python2.7/site-packages/ufl/log.py",
> line 148, in error
>     raise self._exception_type(self._format_raw(*message))
> UFLException: You must have scipy installed to evaluate bessel
> functions in python.
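>
> A quick check on the slave should confirm this (a sketch, not run; I
> assume UFL falls back to scipy.special for the Bessel functions here,
> and jv is just one example of what it needs):
>
>     # hypothetical snippet: check for the missing dependency
>     try:
>         from scipy.special import jv  # Bessel function of the first kind
>     except ImportError:
>         print("scipy missing - UFL cannot evaluate Bessel functions in Python")
>     else:
>         print("J_0(1.0) = %g" % jv(0, 1.0))  # ~0.765198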
>
>
> * precise-amd64
> Buildslave info: http://fenicsproject.org:8010/buildslaves/precise-amd64
> Last build: http://fenicsproject.org:8010/builders/dolfin-trunk-full-precise-amd64/builds/58
> Comments: Same error as on oneiric-amd64.
>
>
> * precise-i386
> Buildslave info: http://fenicsproject.org:8010/buildslaves/precise-i386
> Last build: http://fenicsproject.org:8010/builders/dolfin-trunk-full-precise-i386/builds/44
> Unit tests failed: la (Matrix, KrylovSolver) plus possibly others
>   stdio: http://fenicsproject.org:8010/builders/dolfin-trunk-full-precise-i386/builds/44/steps/make%20run_unittests/logs/stdio
> Comments: The tests sometimes hang and the buildslave times out. This
> is probably the same problem as in bug #1036992
> (https://bugs.launchpad.net/dolfin/+bug/1036992). I have tried both
> PETSc 3.2 and the latest 3.3. The error message when running the
> KrylovSolver.py test looks like this:
>
> [2]PETSC ERROR: --------------------- Error Message
> ------------------------------------
> [2]PETSC ERROR: Error in external library!
> [2]PETSC ERROR: Error in HYPRE_IJMatrixCreate()!
> [2]PETSC ERROR:
> ------------------------------------------------------------------------
> [2]PETSC ERROR: Petsc Release Version 3.3.0, Patch 3, Wed Aug 29
> 11:26:24 CDT 2012
> [2]PETSC ERROR: See docs/changes/index.html for recent updates.
> [2]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
> [2]PETSC ERROR: See docs/index.html for manual pages.
> [2]PETSC ERROR:
> ------------------------------------------------------------------------
> [2]PETSC ERROR: Unknown Name on a linux-gnu named prceise32-bbot by
> buildbot Fri Sep 28 13:16:17 2012
> [2]PETSC ERROR: Libraries linked from /home/buildbot/local/lib
> [2]PETSC ERROR: Configure run at Wed Sep 26 14:49:09 2012
> [2]PETSC ERROR: Configure options COPTFLAGS=-O2 --with-debugging=0
> --with-shared-libraries=1 --with-clanguage=cxx --with-c-support=1
> --download-umfpack=1 --download-hypre=1 --download-mumps=1
> --download-scalapack=1 --download-blacs=1 --download-ptscotch=1
> --download-scotch=1 --download-metis=1 --download-parmetis=1
> --with-ml=1 --with-ml-lib=/home/buildbot/local/lib/libml.so
> --with-ml-include=/home/buildbot/local/include/trilinos
> --prefix=/home/buildbot/local
> [2]PETSC ERROR:
> ------------------------------------------------------------------------
> [2]PETSC ERROR: MatHYPRE_IJMatrixCreate() line 71 in
> /home/buildbot/local/src/petsc-3.3-p3/src/dm/impls/da/hypre/mhyp.c
> [2]PETSC ERROR: PCSetUp_HYPRE() line 100 in
> /home/buildbot/local/src/petsc-3.3-p3/src/ksp/pc/impls/hypre/hypre.c
> [2]PETSC ERROR: PCSetUp() line 832 in
> /home/buildbot/local/src/petsc-3.3-p3/src/ksp/pc/interface/precon.c
> [2]PETSC ERROR: KSPSetUp() line 278 in
> /home/buildbot/local/src/petsc-3.3-p3/src/ksp/ksp/interface/itfunc.c
> [2]PETSC ERROR: KSPSolve() line 402 in
> /home/buildbot/local/src/petsc-3.3-p3/src/ksp/ksp/interface/itfunc.c
> ^Cmpirun: killing job...
>
> Regression tests failed: demo/undocumented/entityintersection/python
> (plus others)
>   stdio: http://fenicsproject.org:8010/builders/dolfin-trunk-full-precise-i386/builds/44/steps/make%20run_regressiontests/logs/stdio
>   demo.log: Not available due to timeout
> Comments: Some of the demos hang and make the buildslave time out. I am
> not sure which demo, but it is probably the same problem as with the
> unit tests. The entityintersection demo fails with the following error
> message:
>
> *** Failed
> *** Warning: The UnitSphere class is broken and should not be used for
> computations. It generates meshes of very bad quality (very thin
> tetrahedra).
> Total number of cells in Cube: 108
> Total number of cells in Sphere: 162
> Intersecting pairwise cells of a cube and sphere mesh
> Cube cell index | Sphere cell index
> ------------------------------
> Traceback (most recent call last):
>   File "demo_entityintersection.py", line 48, in <module>
>     if do_intersect(c1,c2):
>   File "/home/buildbot/fenicsbbot/trunk/dolfin-full/lib/python2.7/site-packages/dolfin/cpp/mesh.py",
> line 348, in do_intersect
>     return _mesh.PrimitiveIntersector_do_intersect(*args)
> StandardError: CGAL ERROR: precondition violation!
> Expr: k.coplanar_orientation_3_object() (*p,*q,*r) == POSITIVE
> File: /usr/include/CGAL/Triangle_3_Triangle_3_do_intersect.h
> Line: 89
>
>
> * squeeze-amd64
> Buildslave info: http://fenicsproject.org:8010/buildslaves/squeeze-amd64
> Last build: http://fenicsproject.org:8010/builders/dolfin-trunk-full-squeeze-amd64/builds/62
> Comments: Same unit test errors as on lucid-amd64. These are the only
> buildslaves with Python 2.6. Maybe that is the reason?
>
>
> Johannes