We have middleware written for node (expressjs) that lets you determine if the user is a bot, and then forward the request on to a server that serves up the prerendered HTML. The code can be found here: https://github.com/prerender/prerender-node
It works by supporting the _escaped_fragment_ protocol, so we only check the user agent for services that don't support the protocol (Facebook, Twitter, LinkedIn, etc.).
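The detection logic described above can be sketched roughly like this. This is an illustrative simplification, not the actual prerender-node source; the function and variable names are made up for the example:

```javascript
// Partial list of crawlers that don't support _escaped_fragment_,
// so they have to be detected by user agent instead.
const crawlerUserAgents = [
  'facebookexternalhit', // Facebook link preview scraper
  'twitterbot',          // Twitter card scraper
  'linkedinbot'          // LinkedIn preview scraper
];

// Hypothetical helper: decide whether a request should be forwarded
// to the prerender server instead of the normal app.
function shouldPrerender(url, userAgent) {
  // Crawlers that support the protocol rewrite #! URLs into
  // ?_escaped_fragment_= URLs, so the query string tells us directly.
  if (url.includes('_escaped_fragment_')) return true;

  // Otherwise, fall back to matching the user agent against the
  // known crawlers that don't support the protocol.
  const ua = (userAgent || '').toLowerCase();
  return crawlerUserAgents.some((bot) => ua.includes(bot));
}

console.log(shouldPrerender('/page?_escaped_fragment_=', 'Mozilla/5.0')); // true
console.log(shouldPrerender('/page', 'facebookexternalhit/1.1'));         // true
console.log(shouldPrerender('/page', 'Mozilla/5.0 (Macintosh)'));         // false
```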
It's currently built to forward requests to our Prerender.io service, which handles all of the work of turning javascript pages into static HTML, but we also have our phantomjs server open sourced: https://github.com/prerender/prerender. That way you can host everything on your own if you'd like.
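Wiring it up in an Express app looks something like the sketch below. The `prerenderServiceUrl` setting is what you'd use to point the middleware at your own self-hosted phantomjs server instead of the hosted service (check the prerender-node README for the current option names; this is a sketch, not a drop-in config):

```javascript
const express = require('express');
const app = express();

// Forward bot requests to a self-hosted prerender server instead of
// the hosted Prerender.io service. The localhost URL is an example;
// point it wherever you run the open-sourced phantomjs server.
app.use(
  require('prerender-node').set('prerenderServiceUrl', 'http://localhost:3000')
);

// Normal routes for real users follow as usual.
app.get('/', (req, res) => res.sendFile(__dirname + '/index.html'));

app.listen(8080);
```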
This isn't considered cloaking because you're serving the same content the user sees. Just don't change the static HTML to be any different from what the user actually sees when visiting those pages and you'll be fine. Our service serves over 1 million pages to search engines every day and none of our users have been penalized for cloaking.
Check it out and let me know if you have any questions.