Question

I would like to include a public/robots.txt file with the staging deployment of my Meteor app (at *.meteor.com), basically to prevent this version of the site from being crawled at all. How can I achieve this? I'm using the meteor deploy command to deploy to staging.
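For reference, a robots.txt that disallows all crawlers (the same file the script in the solution below writes) is just two lines:

User-agent: *
Disallow: /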

Solution

I've come up with a (hopefully temporary?) solution: use a deployment script that creates public/robots.txt before invoking meteor deploy, then deletes public/robots.txt once the deploy finishes.

The Script:

#!/usr/bin/env python
import os
import subprocess

# Work from the directory this script lives in (the app root),
# so the relative 'public' path always resolves correctly.
dpath = os.path.dirname(__file__)
if dpath:
    os.chdir(dpath)

# Create public/ if the app does not have one yet.
if not os.path.exists('public'):
    os.makedirs('public')

robots = None
try:
    # Write a robots.txt that disallows all crawlers, in text mode
    # so the str literal also works on Python 3.
    with open('public/robots.txt', 'w') as robots:
        robots.write("""User-agent: *
Disallow: /
""")

    subprocess.check_call(["meteor", "deploy", "<myapp>.meteor.com"])
finally:
    # Always clean up, so robots.txt never sneaks into a production deploy.
    if robots is not None:
        os.remove(robots.name)
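
Save the script at the root of your Meteor project (next to the .meteor directory, where public/ lives) and replace <myapp> with your app's subdomain; a file name like deploy_staging.py is only a suggestion. Deploying to staging is then a single command:

python deploy_staging.py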