You can tell Google and other search engines which parts of your web presence you don't want indexed.
Just put a file named robots.txt
in the root of your publicly reachable website, so that all robots can find it at http://www.example.com/robots.txt, where www.example.com is your domain.
robots.txt is a plain text file; the examples below show how to use it to give instructions to web robots.
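Because the file always lives at the root of the host, the robots.txt URL for any page can be derived from the page's own URL. A minimal sketch using Python's standard urllib.parse module (the function name robots_url is just an illustrative choice):

```python
from urllib.parse import urlsplit, urlunsplit

def robots_url(page_url):
    # robots.txt lives at the root of the host,
    # regardless of how deep the page path is.
    parts = urlsplit(page_url)
    return urlunsplit((parts.scheme, parts.netloc, "/robots.txt", "", ""))

print(robots_url("http://www.example.com/some/deep/page.html"))
# -> http://www.example.com/robots.txt
```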
The Wikipedia article at http://en.wikipedia.org/wiki/Robots_exclusion_standard summarizes it as follows:
"The standard specifies the instruction format to be used to inform the robot about which areas of the website should not be processed or scanned."
This example allows all robots to visit all files: the wildcard * addresses every robot, and the empty Disallow line disallows nothing:
User-agent: *
Disallow:
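You can verify how a robot would interpret these rules with Python's standard urllib.robotparser module. This sketch parses the rules above from a string instead of fetching them from a real site:

```python
from urllib.robotparser import RobotFileParser

# The "allow everything" rules from the example above.
rules = """\
User-agent: *
Disallow:
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# An empty Disallow line disallows nothing, so any path is allowed.
print(parser.can_fetch("*", "http://www.example.com/any/page.html"))
# -> True
```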
This example tells all robots to stay away from one specific file:
User-agent: *
Disallow: /directory/file.html
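Checked the same way with urllib.robotparser, these rules block exactly the one listed file and leave everything else reachable:

```python
from urllib.robotparser import RobotFileParser

# The single-file rules from the example above.
rules = """\
User-agent: *
Disallow: /directory/file.html
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

print(parser.can_fetch("*", "http://www.example.com/directory/file.html"))  # -> False
print(parser.can_fetch("*", "http://www.example.com/other.html"))           # -> True
```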
This example tells all robots not to enter three directories:
User-agent: *
Disallow: /cgi-bin/
Disallow: /tmp/
Disallow: /junk/
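The three-directory rules behave the same way: any URL whose path starts with one of the disallowed prefixes is blocked, everything else is allowed. A quick check with urllib.robotparser:

```python
from urllib.robotparser import RobotFileParser

# The three-directory rules from the example above.
rules = """\
User-agent: *
Disallow: /cgi-bin/
Disallow: /tmp/
Disallow: /junk/
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# Paths are hypothetical examples; each Disallow line matches by prefix.
for path in ("/cgi-bin/script.cgi", "/tmp/cache.dat", "/junk/old.html", "/index.html"):
    print(path, parser.can_fetch("*", "http://www.example.com" + path))
```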