This example is problematic because one variable has a single 1 and all of its other values are zero. This is a case where unregularized logistic regression can diverge: driving that coefficient to infinity (plus or minus, depending on the response for that observation) predicts that observation perfectly without affecting anything else.
Now, the model here is regularized, so this should not happen, but it still causes problems. I found that making alpha smaller (toward ridge; 0.5 for this example) made the problem go away.
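For concreteness, here is a minimal sketch of the scenario with simulated data (the sample size, number of predictors, seed, and variable names are my own stand-ins, not the original example):

```r
library(glmnet)

# Simulated stand-in for the scenario: the 5th predictor is 1 for a
# single observation and 0 everywhere else.
set.seed(1)
n <- 100
x <- matrix(rnorm(n * 5), n, 5)
x[, 5] <- c(1, rep(0, n - 1))
y <- rbinom(n, 1, 0.5)

# Unregularized fit: the coefficient on x[, 5] only affects observation 1,
# so it gets driven toward plus or minus infinity; glm() typically warns
# that fitted probabilities of 0 or 1 occurred.
fit <- glm(y ~ x, family = binomial)
coef(fit)["x5"]

# Elastic-net workaround: alpha = 1 is the lasso, alpha = 0 is ridge.
# Moving alpha toward ridge (0.5 here) avoided the problem in my example.
cvfit <- cv.glmnet(x, y, family = "binomial", alpha = 0.5)
plot(cvfit)
```

With more ridge in the mix, the quadratic penalty keeps that lone coefficient finite even though it would otherwise grow without bound.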
The real problem has to do with the lambda sequence used for each fold, but that gets a little technical. I will try to make a fix to cv.glmnet that makes this problem go away.
Trevor Hastie (glmnet maintainer)