Flume was originally built as the pipeline into Hadoop/HBase, and it lets you do pretty much any sort of decorating, transforming, and intercepting before events reach final storage, so Flume is a natural place for pre-processing (alerting, in your case). The Flume sink can be Elasticsearch, which means the logs will eventually end up in Elasticsearch. To answer your question: it makes perfect sense to trigger all your alerting/alarms/notifications in the pipeline before the logs reach their final destination; both the old Flume and flume-ng architectures are customisable and powerful in this regard.
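To give a rough idea of the shape this takes in flume-ng, here is a minimal agent config sketch: a source tails a log file, an interceptor hook does the alerting, and the stock Elasticsearch sink ships events on. The interceptor class `com.example.AlertingInterceptor` and the file paths are placeholders you would replace with your own; treat the property names as a sketch to check against the Flume user guide for your version.

```properties
# Hypothetical flume-ng agent: tail a log, run a custom alerting
# interceptor, then deliver events to Elasticsearch.
agent.sources  = src
agent.channels = ch
agent.sinks    = es

# Source: tail the application log (path is illustrative)
agent.sources.src.type = exec
agent.sources.src.command = tail -F /var/log/app.log
agent.sources.src.channels = ch

# Interceptor: this is where alerting/pre-processing plugs in.
# com.example.AlertingInterceptor is a placeholder for your own class
# implementing org.apache.flume.interceptor.Interceptor.
agent.sources.src.interceptors = alert
agent.sources.src.interceptors.alert.type = com.example.AlertingInterceptor$Builder

# Channel: in-memory buffer between source and sink
agent.channels.ch.type = memory

# Sink: the bundled Elasticsearch sink
agent.sinks.es.type = org.apache.flume.sink.elasticsearch.ElasticSearchSink
agent.sinks.es.hostNames = localhost:9300
agent.sinks.es.indexName = logs
agent.sinks.es.channel = ch
```

The same idea applies to the old Flume architecture via decorators; the point is that the alerting logic sits in the pipeline, upstream of the sink, so it fires regardless of where the events finally land.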
Another thing to mention: Elasticsearch is excellent for full-text search, but for analytics it can't compete with the Hadoop ecosystem. Cloudera CDH 4.3 added SolrCloud to Hadoop, which makes the combination of flume + HDFS or HBase + Solr even more attractive. It's worth looking at this mix as well.