Question

I haven't done any programming in ML, but I am studying functional programming. ML is really particular about types in general, but I wanted to know if you could do typecasting to get around its strict type limitations? If you can't, is it because ML wants to lessen the chances of incorrect types, or is it something that just wasn't implemented?

Solution

"Type casting" can mean a lot of different things. There are at least 3 completely different use cases:

  • Converting between specific types (e.g. numeric types). That is possible in ML through dedicated conversion functions; they just aren't disguised as "casts".

  • Casting back and forth from/to a top type (like Object or void*). This is just a hack to work around inexpressive type systems. Fortunately, ML has proper parametric polymorphism and algebraic datatypes, so it doesn't normally need this.

  • Doing nasty low-level trickery, subverting the type system completely. That is generally not supported in ML, since it is designed to be a safe, high-level language.
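
To make the first two points concrete, here is a minimal sketch in OCaml (one of the main ML dialects). The conversion functions `float_of_int` and `int_of_float` are real OCaml standard-library functions; the `shape` type and `area` function are hypothetical illustrations of how an algebraic datatype replaces casting through a top type:

```ocaml
(* Point 1: conversions are ordinary, explicit functions, not casts. *)
let x : float = float_of_int 3     (* int -> float *)
let y : int   = int_of_float 2.7   (* float -> int, truncates toward zero *)

(* Point 2: instead of storing values as a top type (Object / void*)
   and casting them back, an algebraic datatype enumerates exactly
   which shapes a value can take, and pattern matching recovers them
   safely -- the compiler checks that every case is handled. *)
type shape =
  | Circle of float           (* radius *)
  | Rect of float * float     (* width, height *)

let area s =
  match s with
  | Circle r     -> Float.pi *. r *. r
  | Rect (w, h)  -> w *. h
```

Note that there is no way to "cast" a `shape` into something it is not: the only way to get at the `float` inside a `Circle` is to match on the `Circle` constructor, so the invalid-cast runtime errors of Object-style code cannot occur.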

What form of type casting are you interested in in particular, and why?

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow