Possible reference loops lead to high memory usage when idle
Affects | Status | Importance | Assigned to | Milestone
---|---|---|---|---
OpenStack Heat | Fix Released | High | Zane Bitter |
Bug Description
Deploying a TripleO overcloud uses a lot of memory, and the heat-engine process is one of the top consumers.
However, it seems that we hold on to the memory after the deployment (until heat-engine is restarted), so I think we may have more reference loops similar to bug #1454873.
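For context, a minimal sketch of the kind of cycle involved (the Stack/Resource classes here are illustrative, not Heat's actual ones): objects that refer to each other are never freed by reference counting alone and stay in memory until the cyclic garbage collector runs.

import gc

class Stack(object):
    def __init__(self):
        self.resources = []

class Resource(object):
    def __init__(self, stack):
        # back-reference creates a cycle: stack -> resource -> stack
        self.stack = stack

stack = Stack()
stack.resources.append(Resource(stack))

del stack              # refcounting alone cannot free this pair of objects
collected = gc.collect()  # they linger until the cycle detector runs
print('cycle objects reclaimed: %d' % collected)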
Before (just after a heat-engine restart):
12932 heat 20 0 368408 71504 6564 S 0.0 0.9 0:02.43 heat-engine
12942 heat 20 0 372536 70972 2076 S 0.0 0.9 0:00.27 heat-engine
12943 heat 20 0 372504 70932 2076 S 0.0 0.9 0:00.26 heat-engine
12944 heat 20 0 372612 70928 2076 S 0.0 0.9 0:00.24 heat-engine
12945 heat 20 0 372600 70936 2076 S 0.0 0.9 0:00.25 heat-engine
24436 heat 20 0 364336 66760 4100 S 0.0 0.8 0:49.87 heat-api-cfn
24510 heat 20 0 357592 59436 3584 S 0.0 0.7 0:00.56 heat-api
24542 heat 20 0 369376 69624 2024 S 0.0 0.9 3:32.44 heat-api
24543 heat 20 0 369812 70360 2012 S 0.0 0.9 3:32.41 heat-api
After:
[root@instack ~]# top -b -n1 | grep heat
12932 heat 20 0 368408 71504 6564 S 0.0 0.9 0:22.23 heat-engine
12942 heat 20 0 602812 296144 3164 S 0.0 3.7 10:43.63 heat-engine
12943 heat 20 0 579648 269740 3276 S 0.0 3.3 7:02.62 heat-engine
12944 heat 20 0 647048 323432 4280 S 0.0 4.0 12:23.91 heat-engine
12945 heat 20 0 697316 372644 4196 S 0.0 4.6 10:50.47 heat-engine
24436 heat 20 0 364340 66764 4100 S 0.0 0.8 0:52.44 heat-api-cfn
24510 heat 20 0 357592 59436 3584 S 0.0 0.7 0:00.56 heat-api
24542 heat 20 0 369376 69624 2024 S 0.0 0.9 4:02.81 heat-api
24543 heat 20 0 369812 70360 2012 S 0.0 0.9 4:02.74 heat-api
We can see heat-engine has gone from ~70M per worker to about 300M, which means there's nearly a gig not freed after the deployment completes.
I've done some investigation with objgraph and heapy but have not yet isolated the cause(s).
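In case it helps anyone reproduce the investigation, this is roughly the kind of probing I mean (assuming the objgraph and guppy packages are installed; these are example calls, not necessarily the exact ones used):

import gc
import objgraph
from guppy import hpy

gc.collect()

# Top object types by count; run before and after a deployment and diff
objgraph.show_most_common_types(limit=20)

# Types whose counts grew since the last call -- a quick leak smell test
objgraph.show_growth(limit=20)

# heapy's view of which types are holding the bytes
hp = hpy()
print(hp.heap())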
tags: added: tripleo
Changed in heat:
importance: Undecided → High
There may be reference loops, but they may not be the cause of this high memory use in Heat.
Instead it is most likely a high-water-mark / heap-fragmentation effect. This thread is interesting: http://www.gossamer-threads.com/lists/python/python/1162114
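To illustrate the high-water-mark effect, here is a minimal, Linux-only sketch (it reads RSS from /proc and is not Heat code): allocate a large number of small objects, free them, and note that RSS stays well above the baseline because the interpreter keeps most of its heap pages mapped for reuse.

import gc

def rss_kb():
    # Current resident set size in kB (Linux only)
    with open('/proc/self/status') as f:
        for line in f:
            if line.startswith('VmRSS:'):
                return int(line.split()[1])

print('baseline rss: %d kB' % rss_kb())

# Simulate a memory-hungry operation with lots of small objects
data = [{'id': i, 'props': str(i) * 20} for i in range(1000000)]
print('peak rss:     %d kB' % rss_kb())

del data
gc.collect()

# The objects are gone, but the allocator typically keeps much of the
# heap mapped for reuse, so RSS does not fall back to the baseline
print('after free:   %d kB' % rss_kb())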
Our best options for memory-consuming stacks may be:
1. moving to process-based workers which exit at the end of their work (a rough sketch follows below)
2. using less memory in the first place
We'll be discussing both at the optimisation session in Austin.
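To make option 1 concrete, here is a rough sketch using the standard multiprocessing module (do_deployment_work and the job names are placeholders, not Heat's actual worker model): each job runs in a worker process that is retired after a single task, so the operating system reclaims its memory no matter how fragmented the heap became.

import multiprocessing

def do_deployment_work(job):
    # Stand-in for the real, memory-hungry stack operation
    return 'done: %s' % job

if __name__ == '__main__':
    # maxtasksperchild=1 retires each worker after one job, so all of the
    # memory it touched goes back to the OS when the process exits
    pool = multiprocessing.Pool(processes=4, maxtasksperchild=1)
    results = pool.map(do_deployment_work, ['stack-%d' % i for i in range(8)])
    pool.close()
    pool.join()
    print(results)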