Question

I have a bunch of Google Alerts set up as RSS feeds that update in real time. What I want is to store the new data each feed pushes out in a database.

After looking around I found that Google and Superfeedr both offer hubs that do most of the work for you; however, they both require a callback URL (obviously). I do have an Apache server running on the machine I'm working on, and it already has Python enabled, so I can run Python scripts on it. However, at the moment it's only accessible from within my LAN.

My real question is: what do I do next? I know that in PHP you would just have a callback file that handles the requests, but I'm lost as to what to do in Python. Would I write a script and give the Google/Superfeedr services a URL to that script? What would be in the script? Are specific imports needed?

Also, I just read that if you use XMPP you don't need a callback URL. How does that work?


Solution

For the local LAN problem, the most commonly used solution is a tunneling tool such as Passageway, which will temporarily expose a local port of your machine to the "outer" web.

Now, as for the implementation, it's fairly easy to set things up. Python is similar to PHP in the sense that you'll have to write a script that listens for incoming network connections and then handles the HTTP requests you get from Superfeedr or Google. (It looks like you're not familiar with Python, so why not stick with PHP?)
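If you do stay with Python, here is a minimal sketch of what such a callback endpoint could look like, assuming a PubSubHubbub-style hub (both Google's and Superfeedr's hubs follow that pattern: a GET for subscription verification, then POSTs with the new entries) and SQLite as the database. The port, path handling, and table layout are arbitrary choices for illustration, not anything the hubs require.

```python
# Rough sketch of a PubSubHubbub-style callback endpoint using only the
# standard library. DB_PATH, the port, and the table layout are placeholders.
import sqlite3
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs

DB_PATH = "feeds.db"  # hypothetical SQLite file for the pushed entries

class CallbackHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Subscription verification: the hub sends hub.challenge and expects
        # it echoed back in the response body.
        params = parse_qs(urlparse(self.path).query)
        challenge = params.get("hub.challenge", [""])[0]
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(challenge.encode("utf-8"))

    def do_POST(self):
        # Content notification: the hub POSTs the new feed entries (Atom/RSS)
        # in the request body. Store the raw payload; splitting it into
        # individual entries (e.g. with feedparser) can happen later.
        length = int(self.headers.get("Content-Length", 0))
        body = self.rfile.read(length).decode("utf-8", errors="replace")
        with sqlite3.connect(DB_PATH) as db:
            db.execute(
                "CREATE TABLE IF NOT EXISTS pushes ("
                "received_at TEXT DEFAULT CURRENT_TIMESTAMP, payload TEXT)"
            )
            db.execute("INSERT INTO pushes (payload) VALUES (?)", (body,))
        self.send_response(200)
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), CallbackHandler).serve_forever()
```

Since you already have Apache with Python enabled, you would more likely wire this logic into a CGI or WSGI script behind Apache rather than run a standalone server; either way, the publicly reachable URL of this endpoint (for example via the tunnel mentioned above) is what you hand to the hub as the callback.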

Finally, XMPP is a feature that only we (Superfeedr) offer. It solves the problem of exposing local ports because your machine opens an outbound connection to the XMPP server and receives notifications over that connection, so it works from behind the firewall with no callback URL.
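For a rough idea of what the XMPP side might look like, here is a sketch using the third-party slixmpp library and its XEP-0060 (publish-subscribe) plugin. The account JID and password are placeholders, and the Superfeedr-specific details (which pubsub service to subscribe to and how feed nodes are named) are assumptions you should check against their documentation.

```python
# Sketch of an XMPP subscriber with slixmpp (pip install slixmpp).
# Credentials below are placeholders.
import slixmpp

class FeedListener(slixmpp.ClientXMPP):
    def __init__(self, jid, password):
        super().__init__(jid, password)
        self.register_plugin("xep_0060")  # XMPP publish-subscribe
        # Fired by the XEP-0060 plugin when a pubsub item is pushed to us.
        self.add_event_handler("pubsub_publish", self.on_publish)
        self.add_event_handler("session_start", self.on_start)

    async def on_start(self, event):
        self.send_presence()
        await self.get_roster()

    def on_publish(self, msg):
        # msg carries the pushed entry; store it in your database here
        # instead of printing it.
        print(msg["pubsub_event"]["items"]["node"])
        print(msg["pubsub_event"]["items"]["item"]["payload"])

if __name__ == "__main__":
    xmpp = FeedListener("you@example.com", "secret")  # placeholder account
    xmpp.connect()
    xmpp.process(forever=True)
```

The key point is that the script dials out and keeps a persistent connection open, so nothing on your LAN needs to be reachable from the internet.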

Licensed under: CC-BY-SA with attribution
Not affiliated with Stack Overflow