Yes, and we have!
Please use the latest version of crawler4j, as I have added several methods for catching different types of failures.
Now, when you extend WebCrawler, you can simply override the callbacks you need: https://github.com/yasserg/crawler4j/blob/master/src/main/java/edu/uci/ics/crawler4j/crawler/WebCrawler.java
For example: onPageBiggerThanMaxSize, onUnexpectedStatusCode, onContentFetchError, onUnhandledException, etc.
Just please note that when one of those methods is called, the page wasn't processed for a specific reason, so adding it again as a seed usually won't change the outcome...
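For instance, a crawler that logs each failure category separately might look like the following. This is a minimal sketch, not a definitive implementation: the method signatures below are taken from the 4.x line of WebCrawler.java (linked above) and may differ slightly in other versions, and it assumes the protected `logger` field that WebCrawler provides.

```java
import edu.uci.ics.crawler4j.crawler.Page;
import edu.uci.ics.crawler4j.crawler.WebCrawler;
import edu.uci.ics.crawler4j.url.WebURL;

public class LoggingCrawler extends WebCrawler {

    // Called when a page exceeds the configured maximum download size.
    @Override
    protected void onPageBiggerThanMaxSize(String urlStr, long pageSize) {
        logger.warn("Skipped {} ({} bytes over the configured max size)", urlStr, pageSize);
    }

    // Called for HTTP status codes that crawler4j does not handle itself.
    @Override
    protected void onUnexpectedStatusCode(String urlStr, int statusCode,
                                          String contentType, String description) {
        logger.warn("Unexpected status {} for {}: {}", statusCode, urlStr, description);
    }

    // Called when the page content could not be fetched at all.
    @Override
    protected void onContentFetchError(WebURL webUrl) {
        logger.warn("Content fetch error: {}", webUrl.getURL());
    }

    // Catch-all for exceptions thrown while fetching or processing a page.
    @Override
    protected void onUnhandledException(WebURL webUrl, Throwable e) {
        logger.warn("Unhandled exception at {}", webUrl == null ? "<unknown>" : webUrl.getURL(), e);
    }

    @Override
    public void visit(Page page) {
        // Normal processing of successfully fetched pages goes here.
    }
}
```

Because each callback tells you *why* a page failed, logging here is usually more useful than re-seeding the URL and hoping for a different result.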
Anyway, the latest version of crawler4j handles many pages much better, so just by upgrading to v4.1 (the current release) or later you will be able to crawl many more pages.