Likely Impossible
I do not think what you are trying to do is possible. All the reduce function does is aggregate the word counts across documents that share a key; it will always return something for every key your map function has emitted.
Consider reduce/rereduce
Even if you can accept a result containing nulls, you have a potential bug. Have a read of: https://wiki.apache.org/couchdb/Introduction_to_CouchDB_views#Reduce_vs_rereduce
Assuming a key has a few thousand emits, subsets of those emits will typically be reduced in smaller segments first and then revisited by a rereduce call across the partial results.
Unless those segments (whose size CouchDB manages) each cover more than 3000 elements, your query would generate a lot of nulls and then rereduce them. If anything, your code should read:
function (keys, values, rereduce) {
  // Only apply the cut-off once partial sums are being recombined;
  // during the first-pass reduce the segments are still incomplete.
  if (rereduce && sum(values) < 3000) {
    return 0;
  }
  return sum(values);
}
Alternative Setup
I assume your documents simply contain too many distinct words to query them all. I would test whether you can use part of each word as the key: for instance, for the words "couch" and "couchdb" you would emit them as part of one document under a key such as "co" or "cou", with a value like
{ "couch" : 1, "couchdb" : 15 }
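A map function for that layout might look like the sketch below. The tokenizing regex and the 3-character prefix length are my assumptions, not part of the original answer; emit() is supplied by CouchDB at index time.

```javascript
// Sketch of a map function that buckets per-word counts under a short
// prefix key, so one row carries e.g. "cou" -> { "couch": 1, "couchdb": 15 }.
function map(doc) {
  // Count each word in the document body (tokenizer is an assumption).
  var counts = {};
  var words = (doc.text || "").toLowerCase().match(/[a-z]+/g) || [];
  words.forEach(function (w) {
    counts[w] = (counts[w] || 0) + 1;
  });
  // Group the counts by the first three characters of each word.
  var byPrefix = {};
  Object.keys(counts).forEach(function (w) {
    var prefix = w.slice(0, 3);
    byPrefix[prefix] = byPrefix[prefix] || {};
    byPrefix[prefix][w] = counts[w];
  });
  // One emit per prefix, carrying the whole word-count object as the value.
  Object.keys(byPrefix).forEach(function (prefix) {
    emit(prefix, byPrefix[prefix]);
  });
}
```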
You'd still have a limited number of keys to parse, and you could apply the 3000-rule in the rereduce. You are, however, at risk of falling foul of the following rule of thumb on the size of values after the reduce call:
https://wiki.apache.org/couchdb/Introduction_to_CouchDB_views#reduced_value_sizes
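A matching reduce for that layout would merge the per-word objects rather than sum plain numbers. A minimal sketch (the merging logic is my assumption; the under-3000 cut could then be applied at rereduce time as in the earlier snippet, or by the client reading the final row):

```javascript
// Sketch of a reduce for the prefix layout: each value is an object
// like { "couch": 1, "couchdb": 15 }, merged word by word.
function reduce(keys, values, rereduce) {
  var totals = {};
  values.forEach(function (value) {
    Object.keys(value).forEach(function (word) {
      totals[word] = (totals[word] || 0) + value[word];
    });
  });
  return totals;
}
```

Note that the reduced value here is a whole object of word counts, which is exactly where the reduced-value-size rule of thumb above starts to bite; keeping the prefixes short limits how many words land under one key.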
Disclaimer
For this type of full-text search problem you may want to look at couchdb-lucene. (I have not used it, so I don't know whether it will solve your issue.)