I have items in an SQS queue. I pull 100 items at a time from the queue, process them, and then want to store the processed items in S3.
However, the problem is that if I write each batch of 100 items as its own object, I will end up with a very large number of small log files in S3.
One way to solve this is to append each batch of 100 items to a local file and keep accumulating batches until the file reaches a certain size threshold, and only upload to S3 after that.
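To make the idea concrete, here is a minimal sketch of that buffering approach. The `BatchBuffer` class, the `threshold` value, and the key naming scheme are all my own illustrative choices, not anything prescribed by AWS; the `upload` callable stands in for a real S3 write (e.g. a thin wrapper around boto3's `put_object`):

```python
import json

class BatchBuffer:
    """Accumulate processed items locally; flush to S3 once a threshold is hit.

    `upload` is any callable taking (key, body) -- in production it would
    wrap an S3 client call; here it is injected so the sketch runs anywhere.
    """
    def __init__(self, upload, threshold=1000):
        self.upload = upload          # stand-in for the real S3 write
        self.threshold = threshold    # flush once this many items accumulate
        self.items = []
        self.flushes = 0

    def add_batch(self, batch):
        """Add one processed SQS batch; flush automatically at the threshold."""
        self.items.extend(batch)
        if len(self.items) >= self.threshold:
            self.flush()

    def flush(self):
        """Write all buffered items as a single object and reset the buffer."""
        if not self.items:
            return
        key = f"logs/part-{self.flushes:05d}.json"   # hypothetical key scheme
        self.upload(key, json.dumps(self.items))
        self.flushes += 1
        self.items = []

# Usage with a stand-in uploader: five SQS batches of 100 items each,
# buffered and written as far fewer S3 objects.
uploads = []
buf = BatchBuffer(lambda key, body: uploads.append((key, body)), threshold=250)
for _ in range(5):
    buf.add_batch([{"id": i} for i in range(100)])
buf.flush()   # push any remainder on shutdown so nothing is lost
print(len(uploads))   # 2 objects instead of 5
```

One caveat with any local-buffering scheme: items sitting in the buffer are lost if the worker crashes before the flush, so you may want to delay deleting the SQS messages until after the corresponding upload succeeds.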
Is this how it is supposed to work?
Thanks.