ParaView volume rendering crashes on NVIDIA cards for large datasets

Bug #1355683 reported by Steffen Kieß
This bug affects 1 person
Affects: paraview (Ubuntu) | Status: New | Importance: Undecided | Assigned to: Unassigned

Bug Description

ParaView volume rendering crashes on NVIDIA cards (using the proprietary driver) for large datasets (larger than 96 MB).
This happens when using the GPU Based renderer or the Smart renderer (which I suppose falls back to the GPU Based renderer).

I've attached a testcase for the problem which can be run with "pvpython testcase.py". Output on my system is:
ERROR: In /build/buildd/paraview-4.0.1/VTK/Common/ExecutionModel/vtkAlgorithm.cxx, line 1387
vtkImageResample (0x27c36f0): Attempt to get connection index 0 for input port 0, which has 0 connections.

Segmentation fault (core dumped)

When recompiling the paraview package with "nvidia-settings" installed, the problem disappears. As far as I can see, this is because ParaView then uses libXNVCtrl to determine the available GPU memory, whereas otherwise it falls back to a default value of 128 MB. When the GPU renderer then decides that the data won't fit into memory, something goes wrong and ParaView crashes.
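For reference, the fallback behaviour described above can be sketched as follows. This is a hypothetical illustration, not ParaView's actual code: ParaView queries libXNVCtrl when built against nvidia-settings, and the `nvidia-smi` call here is just one alternative way to obtain the same number. The function names are invented for the sketch.

```python
import shutil
import subprocess

# Default that ParaView reportedly assumes when it cannot query the
# driver: 128 MB, expressed in bytes.
DEFAULT_GPU_MEMORY = 128 * 1024 * 1024

def detect_gpu_memory():
    """Return the GPU memory size in bytes, falling back to the
    128 MB default when the driver cannot be queried.

    Hypothetical sketch of the fallback described in the report;
    ParaView itself uses libXNVCtrl rather than nvidia-smi.
    """
    if shutil.which("nvidia-smi") is None:
        return DEFAULT_GPU_MEMORY
    try:
        out = subprocess.check_output(
            ["nvidia-smi", "--query-gpu=memory.total",
             "--format=csv,noheader,nounits"],
            text=True,
        )
        # nvidia-smi reports MiB; convert to bytes.
        return int(out.splitlines()[0]) * 1024 * 1024
    except (subprocess.CalledProcessError, ValueError, OSError):
        return DEFAULT_GPU_MEMORY

def fits_in_gpu_memory(dataset_bytes, gpu_memory=None):
    """True when the dataset is expected to fit in GPU memory."""
    if gpu_memory is None:
        gpu_memory = detect_gpu_memory()
    return dataset_bytes <= gpu_memory
```

With the 128 MB default in effect, a 96 MB dataset is judged to fit while anything larger than 128 MB is not, which matches the size threshold at which the crash appears.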

So one way of fixing this would be to add nvidia-settings to Build-Depends.
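The change would look roughly like this in the package's debian/control (a sketch only; the surrounding dependencies are placeholders, and the actual Build-Depends list of the paraview source package will differ):

```
Source: paraview
Build-Depends: debhelper (>= 9),
               nvidia-settings,
               [existing build dependencies unchanged]
```

With nvidia-settings present at build time, ParaView's build detects libXNVCtrl and compiles in the GPU memory query instead of the 128 MB default.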
