I suggest staying away from wget for your task; it makes your life complicated for no reason. PHP is perfectly fine for fetching downloads.
I would add all URLs to a database (which might just be a text file, as in your case). Then I would use a cronjob to trigger the script.
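
A crontab entry for that could look like the following; the interval and script path are placeholders you would adjust to your setup:

```
# run the fetch script every 15 minutes (hypothetical path)
*/15 * * * * /usr/bin/php /path/to/fetch_feeds.php
```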
On each run I would check a fixed number of sites and put their RSS feeds into a folder. With `file_get_contents` and `file_put_contents` you are good to go; this gives you full control over what to fetch and how to save it.
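
A rough sketch of such a fetch script, assuming the URL list lives in a hypothetical `feeds.txt` and downloads land in a `feeds/` directory:

```php
<?php
// Read the URL list; file names here are placeholders.
$urls  = file('feeds.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
$batch = array_slice($urls, 0, 10); // fixed number of sites per cron run

foreach ($batch as $url) {
    $xml = file_get_contents($url);
    if ($xml === false) {
        continue; // feed unreachable right now; pick it up on a later run
    }
    // Derive a stable filename from the URL so repeat fetches overwrite.
    file_put_contents('feeds/' . md5($url) . '.xml', $xml);
}
```

To rotate through the whole list instead of always taking the first ten, you would persist an offset between runs, but the batching idea stays the same.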
Then I would use another script that goes over the files and does the parsing. Separating the scripts from the beginning will help you scale later on.
For a simple site, just sorting the files by `mtime` should do the trick. For a big scale-out, I would use a job queue.
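
A sketch of the `mtime` ordering for the parsing script, reusing the hypothetical `feeds/` directory from above:

```php
<?php
// Process the oldest downloads first by sorting on modification time.
$files = glob('feeds/*.xml');
usort($files, fn ($a, $b) => filemtime($a) <=> filemtime($b));

foreach ($files as $file) {
    $feed = simplexml_load_file($file); // parse the saved RSS XML
    if ($feed === false) {
        continue; // skip files that are not valid XML
    }
    // ... extract the items and store them wherever you need
}
```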
The overhead in PHP is minimal, while the additional complexity of using wget is a big burden.