staff client patron search results continuously eat memory

Bug #1110817 reported by Jason Etheridge
This bug affects 8 people
Affects: Evergreen
Status: Won't Fix
Importance: Medium
Assigned to: Unassigned
Milestone: (none)

Bug Description

Open the patron search interface and do a search that returns results (I used smith with the stock test data). There is no need to select any of the returned patrons. Let the interface sit and watch the evergreen process slowly increase its memory usage over time. My first suspect is exec.chain in search_result.js, though it shouldn't be running continuously.
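
To put numbers on the "watch it grow" step rather than eyeballing Task Manager, a rough sampler along the following lines can log the client's resident memory at intervals. This is only a sketch: the process names, the one-minute interval, and the psutil approach are assumptions for illustration, not anything from the Evergreen code base.

# Rough memory-watch sketch (not part of Evergreen). Assumes the staff client
# process is named "evergreen.exe" (Windows) or "evergreen" (Linux); adjust to match.
import time
import psutil

TARGET_NAMES = {"evergreen.exe", "evergreen"}  # assumption: the real name may differ
INTERVAL_SECONDS = 60

def find_client():
    """Return the first running process whose name matches the staff client."""
    for proc in psutil.process_iter(["name"]):
        if (proc.info["name"] or "").lower() in TARGET_NAMES:
            return proc
    return None

proc = find_client()
if proc is None:
    raise SystemExit("staff client process not found")

# Print resident set size once a minute; a steady climb while the patron
# search results sit idle is the behaviour described in this bug.
while proc.is_running():
    rss_kib = proc.memory_info().rss // 1024
    print(f"{time.strftime('%H:%M:%S')}  RSS = {rss_kib} KiB")
    time.sleep(INTERVAL_SECONDS)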

Jason Etheridge (phasefx) wrote:

Whatever it is, it doesn't appear to be util.exec. Galen mentioned that what it seems to be leaking is nsXPCWrappedJS( nsIDOMXULControlElement ).

Ben Shum (bshum)
Changed in evergreen:
status: New → Triaged
Ben Shum (bshum)
tags: added: memoryleak
tags: added: staffclient
Changed in evergreen:
importance: Undecided → High
status: Triaged → Confirmed
Jason Etheridge (phasefx) wrote:

An update on this: GPLS funded more investigation into this specific issue, a report for which can be found here:

http://nox.esilibrary.com/~jason/patron_search/patron_search.html

From the abstract, "We have observed a cyclical pattern where memory usage increases and decreases over a period of roughly 4 hours. It is not yet clear if any memory is actually being leaked during these cycles." If memory is leaked, it's a very small amount for what is really an atypical use of the staff client.

Steven Chan (schan2) wrote:

Here at Sitka (BC Evergreen), the leak appears to depend on the OS platform. One of us can see the leak on a Windows 7 machine running staff client 2.4 after a 2-minute delay.

On Windows XP SP3 with the same version of the staff client: after running a wide patron search (i.e., the result list fills the screen), I do not see an increase in memory use in Task Manager after waiting 20 to 30 minutes.

On Ubuntu 12.04 with the 32-bit Linux version: I do not see an increase in resident memory use in htop.

Jason Etheridge (phasefx) wrote:

I'm not convinced there is a leak here, but the behavior was disconcerting when I first noticed it. If you watch for hours, you'll see it go up and down on the affected systems.

Chris Sharp (chrissharp123) wrote:

I would be interested to hear whether sites that have applied the fix for bug 1086458 are still experiencing problems that appear to be related to this suspected leak. The GPLS/PINES project that Jason led at ESI found that there is not a leak.

Changed in evergreen:
status: Confirmed → Incomplete
Chris Sharp (chrissharp123) wrote:

Marked the bug "Incomplete" until we know more from sites.

Holly Brennan (hollyfromhomer) wrote:

I just stumbled on this bug report, and I have no techie input, but I'm here so why not.

We launched Evergreen just about a year ago (2.3.4) and noticed this "leak" problem immediately. Our circ computers were crashing every day. By 4:30 or 5pm, Evergreen was using 1,800,000 K - and that was the breaking point for our computers. Staff learned to either notice the subtle signs of an impending crash... or we'd keep an eye on the clock and just do a reboot when there was a quiet moment (not often during that after-school hour).

Last week we finally upgraded to 2.5.2 and we have not experienced crashes at our circ computers. In fact, it's 4:35pm RIGHT NOW and our two most-used circ computers are running at 650,000 and 860,000 K. These circ computers are running Windows XP SP3.

That's still a lot more than my own computer used for cataloging (Windows 7), which is at 160,700 K.

Maybe this helps, maybe not. But there's definitely a difference in memory use between those versions.

Chris Sharp (chrissharp123) wrote:

Requesting comment from anyone experiencing this issue on currently supported versions of Evergreen (2.6+).

no longer affects: evergreen/2.3
Jason Stephenson (jstephenson) wrote:

Given the recent silence on this bug, I am setting the importance to Medium.

Changed in evergreen:
importance: High → Medium
Josh Stompro (u-launchpad-stompro-org) wrote:

Hello, my co-worker has been cleaning up patron records via the staff client and has reported the staff client freezing up on her several times this week. She checked the memory this last time, and the client was using 1.4 GB of RAM when it became unresponsive.

She has been using the patron search UI to pull up the patrons by barcode. I've asked her to try using the checkout UI, or retrieving patrons by database ID, to see if the memory leak still happens in those situations.

EG 2.8.4

Josh

Josh Stompro (u-launchpad-stompro-org) wrote:

I forgot to add that the computer this is happening on is a 64-bit Windows 10 Dell laptop. I just read the note above about this problem being platform dependent.
Josh

Jeff Davis (jdavis-sitka) wrote:

We have a multibranch system with Linux workstations that has reported slowness/freezing/crashing issues with the XUL client. This system makes heavy use of patron search to retrieve patron records. So I did a quick-and-dirty test on Ubuntu 16.04.1 / EG 2.10.2 / XULRunner 14.0.1:

1. Launch the client.
2. Load the patron search interface.
3. Execute a search (but don't select or retrieve any of the results).
4. Leave the client idle, and monitor its memory usage over time.

After 30 minutes, resident memory (physical RAM usage) gradually increased from 133.0MiB to 156.3MiB, even though the client was idle; virtual and shared memory did not change significantly. Further testing would be required to confirm this result and to see if Linux shows the same 4-hour cycle seen in GPLS/ESI's testing on Windows XP in 2013.
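
To extend this kind of spot check over a longer window and see whether Linux shows the 4-hour cycle described in the GPLS report rather than a steady climb, something like the following Linux-only sampler could append VmRSS readings to a CSV for later plotting. This is a rough illustrative sketch only; the PID argument, the five-minute interval, and the output format are assumptions, not part of Evergreen or of the test above.

# Long-run sampler for Linux: log the staff client's VmRSS to a CSV every few
# minutes so a multi-hour trace can show whether memory cycles or keeps growing.
import csv
import sys
import time

def vm_rss_kib(pid: int) -> int:
    """Read the resident set size (in KiB) from /proc/<pid>/status."""
    with open(f"/proc/{pid}/status") as f:
        for line in f:
            if line.startswith("VmRSS:"):
                return int(line.split()[1])  # value is reported in kB
    raise RuntimeError("VmRSS not found in /proc status file")

def main(pid: int, interval_s: int = 300, out_path: str = "rss_trace.csv") -> None:
    with open(out_path, "w", newline="") as out:
        writer = csv.writer(out)
        writer.writerow(["timestamp", "vmrss_kib"])
        while True:
            writer.writerow([time.strftime("%Y-%m-%d %H:%M:%S"), vm_rss_kib(pid)])
            out.flush()  # keep the trace usable even if the client crashes
            time.sleep(interval_s)

if __name__ == "__main__":
    main(int(sys.argv[1]))  # usage: python rss_trace.py <staff-client-pid>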

Terran McCanna (tmccanna) wrote:

No more memory leaks!

Changed in evergreen:
status: Incomplete → Won't Fix