Semantics is applied to the programs generated by a grammar. When you generate a program from the grammar, you will not see any non-terminals in it, so you don't need to define semantics for them directly.
Consider an example:
Exp ::= Num | Exp + Exp
Num ::= 0 | 1 | 2 | ...
In this example it is necessary to define the semantics of Num case by case, but it is also important to define the semantics of Exp, because one of its productions, even though it is not a terminal (it does have some notion of terminality, since the + is never going to change), has a special meaning assigned to it: addition.
So you will do as follows:
[[0]] = 0 // the first 0 is just a string, the second is a number from N
[[1]] = 1 ...
[[Exp + Exp]] = [[Exp]] + [[Exp]] // the first + sign is just a string,
                                  // the second is addition in N
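These semantic equations can be sketched directly as code. Below is a minimal sketch in Python; the class names Num and Add and the function denote are hypothetical, chosen just to mirror the two productions of the grammar:

```python
from dataclasses import dataclass

@dataclass
class Num:
    digits: str  # the terminal string, e.g. "2" (syntax, not yet a number)

@dataclass
class Add:
    left: object   # an Exp
    right: object  # an Exp

def denote(e):
    """[[e]]: map a syntactic Exp to a natural number."""
    if isinstance(e, Num):
        # base cases: [["0"]] = 0, [["1"]] = 1, ...
        return int(e.digits)
    if isinstance(e, Add):
        # recursive case: [[Exp + Exp]] = [[Exp]] + [[Exp]]
        # the + on this line is addition in N, not the syntactic + sign
        return denote(e.left) + denote(e.right)
    raise ValueError("not an Exp")

print(denote(Add(Num("1"), Add(Num("2"), Num("3")))))  # prints 6
```

Note how the structure of denote follows the grammar exactly: one clause per production.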
As you can see, when you define semantics you are striving to give a meaning to every possible program that can be generated by your grammar.
You can think of it as follows: when a production can be applied recursively, you need to define semantics for it; that is the case for Exp + Exp. Num, however, leads straight to a terminal no matter what you do.
Sidenote: it is worth mentioning why we are given a grammar to define the semantics on. Grammars are the handiest definitions of a language for this purpose, because the semantics can then be defined by induction over the structure of the language: we supply rules for the base cases (the terminals) and for the non-trivial productions, like + in our example.
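To see the induction at work, unfold the equations on a concrete program (a worked example, using the same bracket notation as above):

[[1 + (2 + 3)]] = [[1]] + [[2 + 3]]         // recursive case
                = 1 + ([[2]] + [[3]])       // recursive case again
                = 1 + (2 + 3)               // base cases
                = 6

Each step replaces one syntactic construct by its meaning, and the recursion is guaranteed to terminate because every branch eventually bottoms out at a Num.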