You seem to be confusing a few things here:
- The polynomial kernel is not a "2-dimensional kernel": a degree-d polynomial kernel on m-dimensional input implicitly maps to a feature space whose dimension is on the order of m^d (exactly C(m+d, d), the number of monomials of degree at most d) -- see the first sketch after this list.
- The empirical VC dimension is not the true VC dimension. The true VC dimension is an analytical object that cannot be computed directly "from the data"; it requires a rigorous proof. One of the few known results says that the VC dimension of linear classifiers in an n-dimensional space is n+1, and this holds no matter how you "get" to that space (the second sketch below illustrates it for n=2).
- The number of support vectors is related to the generalization ability of the model, and so is the VC dimension. Unfortunately, there is no simple one-to-one dependency between the number of support vectors and the VC dimension of the model; the closest classical result is Vapnik's leave-one-out argument, which bounds the expected error of a hard-margin SVM by the expected fraction of support vectors. In fact, as far as I know, there is no known proof of the VC dimension of the SVM model (we know it is a gap-tolerant classifier, which should have a lower VC dimension, but that is far from a dimension proof). The last sketch below shows how to inspect the support-vector count in practice.
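For the first point, here is a minimal sketch (assuming scikit-learn is available; the values m = 10 and d = 3 are arbitrary) that counts the explicit feature-space dimension of a degree-d polynomial map and checks it against the closed form C(m+d, d):

```python
from math import comb

import numpy as np
from sklearn.preprocessing import PolynomialFeatures

m, d = 10, 3  # input dimension and polynomial degree (arbitrary example values)
X = np.zeros((1, m))

# Explicit polynomial feature map of degree d (what the kernel computes implicitly)
n_features = PolynomialFeatures(degree=d).fit_transform(X).shape[1]

print(n_features)      # 286
print(comb(m + d, d))  # 286 -- the closed form C(m+d, d), on the order of m^d
```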
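For the second point, a hedged sketch of the n+1 claim in R^2 (n = 2): it brute-forces all labelings of a point set and uses a linear SVM with a very large C as a stand-in for a hard-margin separator. Three points in general position are shattered; the classic 4-point XOR configuration is not (and in fact no 4 points in R^2 can be):

```python
from itertools import product

import numpy as np
from sklearn.svm import SVC

def shatterable(X):
    """Check whether a linear classifier realizes every +/-1 labeling of X."""
    for labels in product([-1, 1], repeat=len(X)):
        if len(set(labels)) < 2:  # all-same labelings are trivially realizable
            continue
        clf = SVC(kernel="linear", C=1e10)  # large C approximates hard margin
        clf.fit(X, labels)
        if (clf.predict(X) != np.array(labels)).any():
            return False  # this labeling is not linearly separable
    return True

three = np.array([[0, 0], [1, 0], [0, 1]])          # general position in R^2
four  = np.array([[0, 0], [1, 1], [0, 1], [1, 0]])  # XOR configuration

print(shatterable(three))  # True  -- n + 1 = 3 points are shattered in R^2
print(shatterable(four))   # False -- the XOR labeling cannot be realized
```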
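And for the last point, a small sketch (the data generator and its parameters are placeholders) of how you would inspect the support-vector count of a fitted SVM. Keep in mind that the leave-one-out bound holds only in expectation for the hard-margin case, so the printed ratio is a rough hint, not a guarantee:

```python
from sklearn.datasets import make_blobs
from sklearn.svm import SVC

# Toy two-class data; generator and parameters are arbitrary placeholders
X, y = make_blobs(n_samples=200, centers=2, cluster_std=2.0, random_state=0)

clf = SVC(kernel="linear", C=1.0).fit(X, y)

n_sv = len(clf.support_)  # indices of the support vectors
print(n_sv, "support vectors out of", len(X))

# Vapnik's leave-one-out bound (hard-margin case): E[test error] <= E[#SV] / n
print("SV ratio:", n_sv / len(X))
```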