You can place a robots.txt file in your site's root folder. All standards-abiding crawlers will obey this file and skip the paths it disallows (note that it is purely advisory: badly behaved bots simply ignore it).
Example robots.txt that tells all (*) crawlers to move on and index nothing:
User-agent: *
Disallow: /
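If you only want to keep crawlers out of part of the site, you can disallow individual paths instead. A minimal sketch, assuming a hypothetical /private/ directory is what you want hidden:

User-agent: *
Disallow: /private/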
You could also use .htaccess files to fine-tune what your server (assuming Apache) serves and which directory indexes are visible, in which case you would add
IndexIgnore *
to your .htaccess file. This hides all files from Apache's auto-generated directory listings (it does not stop a crawler from fetching a file whose URL it already knows).
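If you would rather switch directory listings off entirely instead of just emptying them, the usual directive (assuming your host permits Options overrides in .htaccess) is:

Options -Indexes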
Update (credit to https://stackoverflow.com/users/1714715/samuel-cook):
If you want to stop a specific bot/crawler and know its user-agent string, you can do so in your .htaccess:
<IfModule mod_rewrite.c>
RewriteEngine on
# Match requests whose user-agent string contains "Googlebot"...
RewriteCond %{HTTP_USER_AGENT} Googlebot
# ...and answer every such request with 403 Forbidden
RewriteRule ^.* - [F,L]
</IfModule>
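The same pattern extends to several bots at once. A sketch, assuming Googlebot and bingbot are the user agents you want to block ([NC] makes the match case-insensitive):

<IfModule mod_rewrite.c>
RewriteEngine on
# Block any listed bot, regardless of letter case
RewriteCond %{HTTP_USER_AGENT} (Googlebot|bingbot) [NC]
RewriteRule ^.* - [F,L]
</IfModule>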
Hope this helps.