Neumann boundary conditions not working in parallel

Bug #426473 reported by Anders Logg
Affects: DOLFIN
Status: Fix Released
Importance: High
Assigned to: Niclas Jansson
Milestone: 0.93

Bug Description

Neumann boundary conditions (exterior facet integrals) don't seem to be working in parallel. To reproduce this error, go to test/system/parallel-assembly-solve/ and run

  mpirun -n 2 python solver.py

I get the following output:

(unitsquare.xml.gz, 1): *** ERROR (norm = 25.19288137326567, reference = 9.547454087327344, diff = 15.64542728593833)
(unitsquare.xml.gz, 2): OK (norm = 18.42366670418566, reference = 18.42366670418527, diff = 3.872457909892546e-13)
(unitsquare.xml.gz, 3): OK (norm = 27.29583104741751, reference = 27.29583104741712, diff = 3.907985046680551e-13)
(unitsquare.xml.gz, 4): OK (norm = 36.16867128090711, reference = 36.1686712809094, diff = 2.295053036505124e-12)
(unitcube.xml.gz, 1): *** ERROR (norm = 14.62062082718494, reference = 8.876490653853809, diff = 5.744130173331129)
(unitcube.xml.gz, 2): *** ERROR (norm = 32.93811914105483, reference = 19.99081167299566, diff = 12.94730746805917)
(unitcube.xml.gz, 3): OK (norm = 33.85477561286864, reference = 33.85477561286852, diff = 1.20792265079217e-13)
(unitcube.xml.gz, 4): OK (norm = 49.97357666762994, reference = 49.97357666762962, diff = 3.197442310920451e-13)

Strangely enough, it works for polynomial degrees q = 2, 3, 4 in 2D but not q = 1, and for q = 3, 4 in 3D but not q = 1, 2.
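
For reference, the kind of form involved is sketched below. This is a minimal illustration written against the later legacy DOLFIN Python API (constructor details differed slightly in the 0.9.x series); the actual forms in solver.py may differ, and the source term f and Neumann datum g are made up for illustration.

  # Poisson-type problem with a Neumann term, i.e. an exterior facet
  # integral (ds). Hypothetical sketch, not the actual solver.py.
  from dolfin import *

  mesh = Mesh("unitsquare.xml.gz")   # mesh from the test, as in the report
  q = 1                              # the failing degree in 2D
  V = FunctionSpace(mesh, "CG", q)

  u = TrialFunction(V)
  v = TestFunction(V)
  f = Constant(1.0)                  # source term (assumed)
  g = Constant(1.0)                  # Neumann datum (assumed)

  a = dot(grad(u), grad(v))*dx + u*v*dx   # reaction term keeps the
                                          # pure-Neumann problem nonsingular
  L = f*v*dx + g*v*ds                # ds is the exterior facet integral

  # In parallel, ds must touch only true exterior facets; the bug made it
  # also hit facets on the partition boundary between processes, inflating
  # the assembled right-hand side (and hence the solution norm).
  b = assemble(L)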

Anders Logg (logg)
Changed in dolfin:
importance: Undecided → High
milestone: none → 0.93
Anders Logg (logg)
Changed in dolfin:
status: New → Confirmed
assignee: nobody → Niclas Jansson (njansson)
Revision history for this message
Niclas Jansson (njansson) wrote : Re: [DOLFIN-dev] [Bug 426473] Re: Neumann boundary conditions not working in parallel

logg <email address hidden> writes:

> [quoted bug report trimmed; see the description above]

Should work now.

BoundaryComputation needs facet connectivity and the inter-process
overlap, which were not computed for P1. Without them, boundary
conditions are also applied to the "interior" boundary between
processes.

Niclas
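
To illustrate the failure mode Niclas describes, here is a toy sketch in plain Python (not DOLFIN's actual C++ internals; all names below are made up): with facet-to-cell adjacency counted only locally, a facet on the partition boundary has one visible neighbour on each process and so looks exterior, unless the overlap information rules it out.

  # facet -> locally visible neighbouring cells on one process
  local_facet_cells = {
      "f0": ["c0"],        # true exterior facet: one neighbour globally
      "f1": ["c1"],        # partition-boundary facet: second cell is remote
  }
  shared_facets = {"f1"}   # facets in the inter-process overlap

  def is_exterior(facet):
      # Exterior means exactly one neighbouring cell *globally*; a locally
      # single facet that lies in the overlap is really interior.
      return len(local_facet_cells[facet]) == 1 and facet not in shared_facets

  print(is_exterior("f0"))   # True  -> Neumann term assembled here
  print(is_exterior("f1"))   # False -> skipped (the fixed behaviour)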

Revision history for this message
Niclas Jansson (njansson) wrote :

Niclas Jansson <email address hidden> writes:

> [previous message quoted in full; trimmed]

Anders Logg (logg)
Changed in dolfin:
status: Confirmed → Fix Committed
Changed in dolfin:
status: Fix Committed → Fix Released
Anders Logg (logg)
Changed in dolfin:
status: Fix Released → Fix Committed
Anders Logg (logg)
Changed in dolfin:
status: Fix Committed → Fix Released