Question

I think I'm pretty confused about what's called a calculus and what's called a programming language.

I tend to think, and might have been told, that a calculus is a formal system for reasoning about the equivalence of programs. Programs have an operational semantics specified by a machine, which should (I think?) be deterministic. In this way, a (correct) calculus for a language $L$ is a proof method for program equivalence.
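To make my mental model concrete, here is a small Haskell sketch (entirely my own illustration, not taken from any reference): a deterministic call-by-value "machine" for untyped lambda terms, plus two terms that the beta rule of a calculus would equate. A sound calculus, as I understand it, should only equate terms that such a machine cannot tell apart.

```haskell
module Main where

-- Untyped lambda terms with named variables.
data Term
  = Var String
  | Lam String Term
  | App Term Term
  deriving (Eq, Show)

-- Naive substitution; capture-avoiding substitution would be needed in
-- general, but the example terms below are chosen so this is safe.
subst :: String -> Term -> Term -> Term
subst x s (Var y)   = if x == y then s else Var y
subst x s (Lam y b) = if x == y then Lam y b else Lam y (subst x s b)
subst x s (App f a) = App (subst x s f) (subst x s a)

isValue :: Term -> Bool
isValue (Lam _ _) = True
isValue (Var _)   = True
isValue _         = False

-- Deterministic call-by-value small-step semantics: at most one rule
-- applies to any term, so the "machine" never has a choice.
stepCBV :: Term -> Maybe Term
stepCBV (App (Lam x b) v) | isValue v = Just (subst x v b)
stepCBV (App f a)
  | not (isValue f) = (\f' -> App f' a) <$> stepCBV f
  | otherwise       = App f <$> stepCBV a
stepCBV _ = Nothing

eval :: Term -> Term
eval t = maybe t eval (stepCBV t)

-- Two terms that the beta rule of the calculus equates:
--   (\x. x) ((\y. y) z)   and   (\y. y) z
t1, t2 :: Term
t1 = App (Lam "x" (Var "x")) (App (Lam "y" (Var "y")) (Var "z"))
t2 = App (Lam "y" (Var "y")) (Var "z")

main :: IO ()
main = do
  -- If the calculus is sound for this machine, both should evaluate
  -- to the same result.
  print (eval t1)  -- Var "z"
  print (eval t2)  -- Var "z"
```

Running this prints `Var "z"` twice; the point is only that the deterministic machine fixes what programs *do*, while the calculus's equations are claims about which programs the machine cannot distinguish.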

This seems like a reasonable split to me, but is it the commonly accepted meaning? Or is it perhaps even wrong?

Relatedly, why are some operational semantics nondeterministic (assuming they are confluent)? What is gained by leaving the choice of reduction strategy open?
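For example (if I understand the situation correctly), full beta reduction leaves open which redex to contract, yet confluence guarantees that every order reaches the same normal form:

$$
(\lambda x.\, x)\,((\lambda y.\, y)\, z) \;\to_\beta\; (\lambda y.\, y)\, z \;\to_\beta\; z
\qquad\text{and}\qquad
(\lambda x.\, x)\,((\lambda y.\, y)\, z) \;\to_\beta\; (\lambda x.\, x)\, z \;\to_\beta\; z.
$$

So my question is really: what is bought by not fixing one of these orders in the semantics itself?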

I'd really appreciate some clarification on these points, and concrete references even more. Thanks!

No correct solution
