Generally, filtering is also done to disambiguate over-approximations of languages. We write ambiguous but clear context-free grammars for programming languages, then use tree walkers or other mechanisms to remove the unwanted derivations, as in the sketch below.
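For instance, here is a minimal sketch of such a filter over a parse forest, assuming an ambiguous rule E ::= Int | E "+" E and parse trees represented as nested tuples; the representation and names are hypothetical, just to illustrate the idea:

# The two derivations the ambiguous rule yields for 1 + 2 + 3:
forest = [
    ("+", ("+", 1, 2), 3),  # (1 + 2) + 3 -- left-associative, wanted
    ("+", 1, ("+", 2, 3)),  # 1 + (2 + 3) -- unwanted derivation
]

def left_associative(tree):
    # Reject any tree in which "+" has another "+" as its right child,
    # which forces "+" to associate to the left.
    if isinstance(tree, int):
        return True
    _, left, right = tree
    if isinstance(right, tuple) and right[0] == "+":
        return False
    return left_associative(left) and left_associative(right)

print([t for t in forest if left_associative(t)])  # keeps only (1 + 2) + 3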
On the other hand, you could also consider a type checker that processes abstract syntax trees to be such a filter. Type checkers reject trees produced by a parser based on non-local (context) information. For example:
1 + "1"
is accepted by the grammar because:
E ::= Int | String | E "+" E;
but the type checker says that addition does not work between integers and strings, and rejects the sentence from the language. The type checker does this by traversing the tree after parsing, identifying the addition symbol, and then possibly looking up the valid combinations of operand types in a table; if the combination is not valid, it starts complaining. I guess that is typically how compilers work; see the Aho et al. dragon book. It sounds more interesting if you talk about it abstractly :-)
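To make that concrete, here is a minimal sketch of such a table-driven check, assuming the tree for the expression above is built from an Add node with literal leaves; the node shape and names are hypothetical:

class Add:
    def __init__(self, left, right):
        self.left, self.right = left, right

# Table of operand-type combinations for which "+" is defined.
VALID_ADD = {("int", "int"): "int", ("str", "str"): "str"}

def typecheck(node):
    # Traverse the tree bottom-up, returning the type of each subtree.
    if isinstance(node, Add):
        combo = (typecheck(node.left), typecheck(node.right))
        if combo not in VALID_ADD:
            raise TypeError("'+' does not work between %s and %s" % combo)
        return VALID_ADD[combo]
    if isinstance(node, int):
        return "int"
    if isinstance(node, str):
        return "str"
    raise ValueError("unknown node")

print(typecheck(Add(1, 1)))    # int + int is fine
print(typecheck(Add(1, "1")))  # raises TypeError: the tree is rejected

The table lookup is the whole point: the grammar stays simple and over-approximates the language, and the checker carries the context-dependent knowledge about which operand combinations are allowed.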