OP is intentionally caching the results of what are presumably expensive computations or network calls. It's a cache; there is no memory leak. OP just didn't detail how cache invalidation/replacement happens. The 2nd comment adds a rather rudimentary mechanism: removing items from the cache as soon as they resolve. You get the benefit of batching requests that arrive before the resolve, but the downside that requests arriving right after the resolve hit the expensive process again. Requests don't neatly line up at the door and stop coming in as soon as your database query returns.
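A minimal sketch of that remove-on-resolve pattern (names like `expensiveFetch` are hypothetical stand-ins, not anything from the original code): concurrent callers share the one in-flight promise, and the entry is evicted the moment it settles, so the next caller pays full price again.

```typescript
// Cache of in-flight promises, keyed by request key.
const inFlight = new Map<string, Promise<string>>();
let calls = 0; // counts how many times the expensive path actually runs

// Hypothetical stand-in for the expensive computation or network call.
function expensiveFetch(key: string): Promise<string> {
  calls++;
  return new Promise((resolve) => setTimeout(() => resolve(`value:${key}`), 10));
}

function get(key: string): Promise<string> {
  let p = inFlight.get(key);
  if (!p) {
    // Evict as soon as the promise settles -- the "2nd comment" approach.
    p = expensiveFetch(key).finally(() => inFlight.delete(key));
    inFlight.set(key, p);
  }
  return p;
}

// Requests issued before the resolve are batched into one call;
// a request issued after the resolve triggers the expensive path again.
async function demo(): Promise<number> {
  await Promise.all([get("a"), get("a")]); // batched: one call
  await get("a");                          // after settle: second call
  return calls;
}
```

This dedupes the thundering herd during the in-flight window, but provides no caching at all once the result is available, which is exactly the downside described above.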
Typically you would use an LRU mechanism. All caches (memcached, redis, etc.) have a memory limit (whether fixed or unbounded, i.e. all RAM) and a cache replacement policy.
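An LRU replacement policy can be sketched in a few lines by leaning on the insertion-order iteration of a JS `Map`: a hit reinserts the key to mark it most-recently-used, and an insert past capacity evicts the Map's first (oldest) key. This is an illustrative sketch, not production code.

```typescript
// Minimal LRU cache using Map's guaranteed insertion-order iteration.
class LruCache<K, V> {
  private map = new Map<K, V>();
  constructor(private capacity: number) {}

  get(key: K): V | undefined {
    if (!this.map.has(key)) return undefined;
    const value = this.map.get(key)!;
    this.map.delete(key); // reinsert to refresh recency
    this.map.set(key, value);
    return value;
  }

  set(key: K, value: V): void {
    this.map.delete(key); // so an update also refreshes recency
    this.map.set(key, value);
    if (this.map.size > this.capacity) {
      // Evict least-recently-used: the first key in iteration order.
      const oldest = this.map.keys().next().value as K;
      this.map.delete(oldest);
    }
  }
}

const cache = new LruCache<string, number>(2);
cache.set("a", 1);
cache.set("b", 2);
cache.get("a");    // "a" becomes most recently used
cache.set("c", 3); // capacity exceeded: evicts "b", the LRU entry
```

Real caches like Redis make the limit and policy explicit config (e.g. a max-memory setting plus an eviction policy) rather than baking them into the data structure.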