I've been told that I need to split up my requests because they're "big" (I will defend to the grave that they shouldn't have been classified as big even 20 years ago...)
Anyway, so now I have 100 requests, each for a month's worth of data.
Can I submit them all in parallel with the same token?
Are there any limits on how many requests I can have in flight at once?
And what happens server-side? Are they just queued sequentially and executed in order anyway?
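For reference, here's roughly what I'm planning to do (Python sketch; `fetch_month` is a stand-in for the real HTTP call, and the `max_workers` cap is just my guess at a polite limit, not something from the API docs):

```python
from concurrent.futures import ThreadPoolExecutor

TOKEN = "my-token"  # hypothetical: the one token shared by all requests

def fetch_month(month):
    # placeholder for the real HTTP call, something like:
    # requests.get(API_URL, params={"month": month},
    #              headers={"Authorization": f"Bearer {TOKEN}"})
    return f"data-for-{month}"

# 12 months here as an example; 100 requests in my real case
months = [f"2023-{m:02d}" for m in range(1, 13)]

# cap in-flight requests at 10 in case the server throttles per token
with ThreadPoolExecutor(max_workers=10) as pool:
    results = list(pool.map(fetch_month, months))

print(len(results))  # one result per month
```

So the question is really whether firing these off concurrently with one token is fine, or whether I should just loop.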