Question

I think I'm pretty confused about what's called a calculus and what's called a programming language.

I tend to think, and might have been told, that a calculus is a formal system for reasoning about the equivalence of programs. Programs have an operational semantics specified by a machine, which should (I think?) be deterministic. In this way, a (correct) calculus for a language $L$ is a proof method for program equivalence.
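
For concreteness, here is my own toy example of the split I have in mind: the $\lambda$-calculus as a *calculus* would be the equational theory generated by the $\beta$ axiom, $(\lambda x.\, b)\, a = b[a/x]$, closed under congruence and symmetry; the *language* would fix a deterministic strategy, say call-by-value, where $(\lambda x.\, b)\, v \to b[v/x]$ fires only when $v$ is a value. The calculus then *proves* equations that the machine merely *computes*.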

This seems like a reasonable split to me, but is this the commonly accepted meaning? Or maybe it's even wrong?

Relatedly, why are some operational semantics nondeterministic (assuming the reduction relation is confluent)? What is gained from leaving the choice of reduction strategy open? (See the sketch below for the kind of semantics I mean.)
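
To make the nondeterminism question concrete, here is a minimal sketch of what I mean (my own illustration, in Haskell; the names `Term`, `step`, and `normalize` are made up, and substitution is deliberately naive, assuming closed terms with distinct binders). The one-step relation returns *all* possible $\beta$-contractions, so the semantics leaves the redex choice open; by confluence (Church–Rosser), any strategy that terminates reaches the same normal form, and a deterministic machine is just one refinement of the relation.

```haskell
-- Untyped lambda terms.
data Term = Var String | Lam String Term | App Term Term
  deriving (Eq, Show)

-- Naive substitution: no capture avoidance (assumes distinct binders).
subst :: String -> Term -> Term -> Term
subst x s (Var y)   = if x == y then s else Var y
subst x s (Lam y b) = if x == y then Lam y b else Lam y (subst x s b)
subst x s (App f a) = App (subst x s f) (subst x s a)

-- All one-step beta reducts: the semantics is nondeterministic because
-- it does not commit to which redex is contracted.
step :: Term -> [Term]
step (App (Lam x b) a) =
  subst x a b
    : [App (Lam x b') a | b' <- step b]
   ++ [App (Lam x b) a' | a' <- step a]
step (App f a) = [App f' a | f' <- step f] ++ [App f a' | a' <- step a]
step (Lam x b) = [Lam x b' | b' <- step b]
step (Var _)   = []

-- One deterministic strategy (roughly leftmost-outermost): always take
-- the first reduct. This is a particular "machine" refining `step`.
normalize :: Term -> Term
normalize t = case step t of
  []       -> t
  (t' : _) -> normalize t'

-- E.g. normalize (App (Lam "x" (Var "x")) (Var "y"))  ==  Var "y"
```

Every terminating strategy picks some path through `step`, and confluence guarantees the paths cannot diverge to different normal forms. That much I understand; what I am asking is why one would *specify* the semantics as the open relation rather than as one fixed machine.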

I'd really appreciate some clarification on these points, and concrete references even more! Thanks!
