The conditional probability table (CPT) for 'class' should have 8 (2*2*2) entries in this case: two binary parents plus the binary 'class' node itself. The posterior output (marg.T) of the inference engine looks right for a binary variable.
It reads as: "the 'class' node is in state 1 with probability 0.8 and in state 2 with probability 0.2". From this point on, it is up to the user to decide whether to assign 'class' to state 1 or state 2.
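To make the sizes concrete, here is a small Python sketch of the same situation; the network structure and all numbers are hypothetical, purely for illustration. A binary 'class' node with two binary parents has a CPT with 2*2*2 = 8 entries, and summing the parents out gives a posterior over the two class states that sums to 1:

```python
import numpy as np

# Hypothetical binary network: two parent nodes P1, P2 -> class.
# The class CPT has 2*2*2 = 8 entries, indexed [p1, p2, class].
p1 = np.array([0.6, 0.4])          # P(P1)
p2 = np.array([0.7, 0.3])          # P(P2)
cpt = np.array([[[0.9, 0.1],       # P(class | P1=1, P2=1)
                 [0.6, 0.4]],      # P(class | P1=1, P2=2)
                [[0.7, 0.3],       # P(class | P1=2, P2=1)
                 [0.2, 0.8]]])     # P(class | P1=2, P2=2)
assert cpt.size == 8

# With no evidence on the parents, the marginal of 'class' is
# obtained by summing (marginalizing) the parents out.
posterior = np.einsum('i,j,ijc->c', p1, p2, cpt)
print(posterior, posterior.sum())  # two probabilities summing to 1
```

This is the same kind of two-element vector that marg.T holds for a binary node.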
When it comes to classification, in the simplest (and not very advisable) case, you can define a posterior probability threshold of 0.5 and say:
if marg.T(1) > 0.5   % marg.T(1) is the posterior probability of state 1
    predicted_class = 1;
else
    predicted_class = 2;
end
In assessing the performance of your binary classifier, you can look at predictive accuracy or the Area Under the ROC Curve (AUC), or do more intelligent things that take the prior probabilities of the 'class' states into account.
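As a sketch of those two metrics, here is a self-contained Python example; the held-out labels and posterior probabilities are made up for illustration. Accuracy is computed at the naive 0.5 threshold, and AUC is computed directly as the probability that a randomly chosen positive example gets a higher score than a randomly chosen negative one (the Mann-Whitney formulation, with ties counted as half):

```python
import numpy as np

# Hypothetical held-out data: true labels (1 = positive class) and
# posterior probabilities P(class=1) from the inference engine.
y_true = np.array([1, 0, 1, 1, 0, 0, 1, 0])
p_pos  = np.array([0.9, 0.4, 0.8, 0.45, 0.3, 0.55, 0.7, 0.2])

# Predictive accuracy at the naive 0.5 threshold.
y_pred = (p_pos > 0.5).astype(int)
accuracy = (y_pred == y_true).mean()

# AUC as the probability that a random positive outranks a random
# negative (ties counted as 0.5).
pos = p_pos[y_true == 1]
neg = p_pos[y_true == 0]
wins = (pos[:, None] > neg[None, :]).sum() \
     + 0.5 * (pos[:, None] == neg[None, :]).sum()
auc = wins / (len(pos) * len(neg))
print(accuracy, auc)
```

Note how the two metrics can disagree: a classifier can rank nearly all positives above negatives (high AUC) while the fixed 0.5 threshold still misclassifies some of them.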
P.S. You say the junction tree engine does not work in this case, but it should. You may be missing something: there should be a junction tree example (I don't remember the exact name of the .m file) in the BNT toolbox .zip file. If you use the junction tree inference engine, you will see that you get the same answer as with variable elimination.