Comment 1 for bug 1352851

Revision history for this message
Michele Simionato (michele-simionato) wrote :

My analysis of real-life use cases (especially the USA computations) shows that the duplication is negligible. For instance, you may have 200,000 sources of which 180,000 are filtered out, so the duplicate filtering happens only for the remaining 20,000 sources and takes a negligible amount of time. On the other hand, the speedup is minor because a lot of the time is spent transferring the sources back and forth. Still, it is worth doing because we can remove the parameter source_max_weight and keep only the parameter concurrent_tasks, which is much more transparent to the end user.
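
To make the argument concrete, here is a minimal sketch (not the actual OpenQuake code; the predicate and counts are made up to match the numbers above) showing why refiltering only the surviving sources is cheap:

```python
# Hypothetical sketch: the duplicate filtering pass only touches the
# sources that survived the first pass, so its cost is proportional
# to the survivors, not to the full source model.
def filter_sources(sources, keep):
    """Drop sources rejected by the filtering predicate."""
    return [s for s in sources if keep(s)]

# Toy model: 200,000 sources, 90% filtered out on the first pass.
sources = list(range(200_000))
keep = lambda s: s % 10 == 0  # toy predicate keeping 10% of sources

survivors = filter_sources(sources, keep)    # first pass: 200,000 checks
refiltered = filter_sources(survivors, keep)  # duplicate pass: 20,000 checks

# The duplicated work is only 10% of the original filtering cost.
assert len(survivors) == 20_000
assert len(refiltered) == 20_000
print(len(survivors) / len(sources))  # 0.1
```

The fraction printed at the end (10% in this toy example) is the share of sources that get filtered twice; when most sources are rejected in the first pass, the duplicated work is negligible.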