However, if you want to loop over a list of URLs of indeterminate size to find out when each page last changed, it can get hard. I dreaded writing this bit until I remembered tail recursion, a relatively simple technique I don't see many people talking about. So I thought I'd write it up.
1. Arrange all the URLs in an array.
2. Write a routine that reads one URL.
3. When it completes, recursively call yourself for the next item in the array.
4. When you fall off the end of the array, do nothing but return.
I wrote a demo with a dummy routine called httpRequest. All the demo does is sum the sizes of all the pages in the array.