Hi @peter.meszaros,
Have a look at this article on chains which discusses the different types of chains and different ways to process them.
@peter.meszaros referring to the article linked by Nick above, if you are able to predict or at least guess the names of the chain links, you could proactively request them in large batches, but wait until you get all the link data back to confirm your guesses. Similarly with the actual linked records: for example, if you're looking at an index chain as opposed to, say, a top-gainers chain, you could proactively batch-request the records you think will be in it while waiting for the links. Just how much performance that gains you is something you'll have to experiment with.
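A rough sketch of that "guess ahead" idea, not tied to any particular API: fetch_fields and fetch_records below are hypothetical callables standing in for whatever batch snapshot/request mechanism your library provides, and the "N#.FTSE" naming pattern is just an example of a guessable index chain.

```python
LINK_FIELDS = [f"LONGLINK{i}" for i in range(1, 15)] + ["LONGNEXTLR"]

def traverse_chain_with_guesses(root, fetch_fields, fetch_records, guess_depth=10):
    """fetch_fields(names, fields) -> {name: {field: value}}, fetch_records(names) -> records.
    Both are placeholders for your API's batch request calls."""
    suffix = root.split("#", 1)[1]                        # e.g. "0#.FTSE" -> ".FTSE"
    guessed = [f"{i}#{suffix}" for i in range(guess_depth)]

    # One proactive batch request for every guessed chain record name.
    responses = fetch_fields(guessed, LINK_FIELDS)

    constituents, expected = [], root
    while expected:
        fields = responses.get(expected)
        if fields is None:
            # A guess missed: confirm by requesting the real next record explicitly.
            fields = fetch_fields([expected], LINK_FIELDS)[expected]
        constituents += [fields[f"LONGLINK{i}"]
                         for i in range(1, 15) if fields.get(f"LONGLINK{i}")]
        expected = fields.get("LONGNEXTLR")               # follow the chain

    return fetch_records(constituents)                    # batch request the constituents
```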
I'm not sure if you are suggesting this, but if you are traversing an unknown chain one link at a time, discovering the contents as you go, you should not hold on to the LONGLINKs from one chain record in order to batch them with ones from the next link that hasn't even been requested yet.
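For contrast, a minimal sketch of the link-at-a-time case: each chain record's LONGLINKs are dispatched for retrieval as soon as that record arrives, rather than being held back to batch with links from records that haven't been requested yet. Again, fetch_fields and request_constituents are hypothetical stand-ins for your API's calls.

```python
def traverse_unknown_chain(root, fetch_fields, request_constituents):
    fields_wanted = [f"LONGLINK{i}" for i in range(1, 15)] + ["LONGNEXTLR"]
    name = root
    while name:
        record = fetch_fields([name], fields_wanted)[name]
        links = [record[f"LONGLINK{i}"]
                 for i in range(1, 15) if record.get(f"LONGLINK{i}")]
        request_constituents(links)          # fire immediately, don't accumulate
        name = record.get("LONGNEXTLR")      # then move on to the next chain record
```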
15 items should be the optimal batch size for your use case, because each chain record carries at most 14 LONGLINK fields plus 1 LONGNEXTLR.
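A minimal sketch of that arithmetic, assuming the standard long chain record layout:

```python
# At most 14 LONGLINK fields plus one LONGNEXTLR pointer per chain record,
# so a batch of 15 covers exactly one record's worth of links.
CHAIN_RECORD_FIELDS = [f"LONGLINK{i}" for i in range(1, 15)] + ["LONGNEXTLR"]
assert len(CHAIN_RECORD_FIELDS) == 15
```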