Question

Python, and others, use an import technique to pull in external functionality.

C, and others, use include (and C++, for example, has the attendant namespace headaches).

What is the reason to pick one over the other (or use both like Objective-C can) in designing a language?

I see Apple is proposing some updates/changes in a paper to the LLVM project, and I wonder why the differences exist.

Clarification based on @delnan's answer

Given that there are multiple ways to implement import (of which I was unaware until his answer), what is the overall benefit of import versus include? The import technique seems to find only individual subcomponents based on the paths given to them (at least in Python, whose [apparent] method is the only one I know).

How do other implementations of the import methodology differ from that? When would using the 'old style' include method still make sense in modern language design and implementation (if ever)?


Solution

The approach of C, which C++ and Objective-C simply inherited, is very simple to define and implement (in a nutshell: when the preprocessor encounters an #include, it replaces it with the contents of the named file and continues), but it has serious problems. Some of these problems are named in the presentation you've seen (and elsewhere). There are idioms and best practices (also discussed in that presentation and elsewhere) and minor extensions (#pragma once, precompiled headers) that alleviate some of the problems, but at the end of the day the approach is fundamentally too limited to handle what software engineers have come to expect of a module system. Pretending it does what more recent alternatives do (see below) is quite a leaky abstraction.
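
To make the textual model concrete, here is a minimal sketch of the classic include-guard idiom mentioned above (the file and type names are invented for illustration, and the code compiles as either C or C++). Every translation unit that writes #include "point.h" literally gets the header's text pasted in; the guard only stops the body from being processed twice within one translation unit, it does nothing about the copy-per-unit cost or the shared global namespace:

/* point.h -- a hypothetical header protected by a classic include guard */
#ifndef POINT_H
#define POINT_H   /* skip the body if this header was already included in this translation unit */

struct Point {
    double x;
    double y;
};

#endif /* POINT_H */
/* Many compilers also accept the non-standard `#pragma once` as shorthand for this guard. */

/* main.cpp -- the preprocessor replaces each #include with the header's text */
#include "point.h"
#include "point.h"   /* harmless only because of the guard above */

int main(void) {
    struct Point p = { 1.0, 2.0 };
    return (int)(p.x + p.y) - 3;   /* returns 0 */
}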

Nowadays, everyone with an opinion on language design seems to agree that you shouldn't do it if you can help it. C++ and Objective-C didn't have that choice due to the need for backwards compatibility (though both had and still have the choice to add another mechanism, and Objective-C did it). It is "fair for its day", in that it was a rather good decision back when it was made (it worked well enough, and it still kind of works if you have discipline), but the world has moved on and settled on better ways to split code into modules and then pull them back together. (Note that such ways already existed back in the early C days, but apparently they didn't catch on for a while.)

What you describe as "the" import technique is actually a pretty large design space. Many module systems are almost, but not quite, entirely unlike each other - and the rest still have enough subtle differences to ruin your day. It can be anything from just executing the imported file in a new scope (Python, PHP) to full-blown ML-style functors. There are some similarities, in that all these module systems give each "module" (whatever that means in the respective system) its own scope/namespace, (usually) permit separate compilation of modules, and generally go out of their way to fix the problems of C-style textual includes (or whatever other problem the creator sees with the alternatives). That's about as much as one can say in general.
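
For contrast with the include sketch above, here is a minimal, hedged sketch of what one point in that import design space looks like within the C family itself: the C++20 modules feature that C++ eventually standardized (a later relative of, not the same thing as, the Clang modules work referenced in the question). The module names and functions are invented for illustration. The key difference from #include is that the compiler sees a module with its own scope rather than pasted text, and only exported names are visible to importers:

// math_utils.cppm -- a C++20 module interface unit (file extension varies by compiler)
export module math_utils;        // this file defines the module; there is no textual header

export int add(int a, int b) {   // exported: visible to code that imports the module
    return a + b;
}

int helper() {                   // not exported: private to the module
    return 42;
}

// main.cpp
import math_utils;               // pulls in the compiled interface; nothing is textually pasted

int main() {
    return add(2, 3) - 5;        // helper() is not visible here; returns 0
}

Build steps and file extensions for modules still differ between compilers, which is itself a reminder of how recent this addition is compared with #include.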

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow