Question

When I first saw the upcoming C++0x standard I was delighted. It's not that I'm pessimistic, but thinking about it now I feel somewhat less hopeful.

Mainly because of three reasons:

  1. a lot of boost bloat (which must cause hopeless compile times?),
  2. the syntax seems lengthy (not as Pythonic as I initially might have hoped), and
  3. I'm very interested in portability and other platforms (iPhone, Xbox, Wii, Mac); isn't there a very real risk that the "standard" will take a long time to become portable enough?

I suppose #3 is less of a risk, given the lessons learned from templates in the previous decade; however, the Devil's in the details.

Edit 2 (trying to be less whimsical): Would you say it's safe for a company to transition to C++0x in the first effective year of the standard, or will that be associated with great risk?

Solution

Edit: do I (and others like me) have to keep a very close eye on build times, unreadable code, and lack of portability, and do massive prototyping to ensure that it's safe to move on with the new standard?

Yes. But you have to do all these things with the current standard as well. I don't see that it is getting any worse with C++0x.

C++ build times have always sucked. There's no reason why C++0x should compile more slowly than today's C++, though. As always, you only include the headers you need, and no header has grown noticeably bigger, as far as I can tell.

Of course, Concepts were one of the big unknowns here, and it was feared that they would slow down compile times dramatically, which was one of the many reasons why they were cut.

C++ easily becomes unreadable if you're not careful. Again, nothing new there. And again, C++0x offers a lot of tools to help minimize this problem. Lambdas aren't quite as concise as in, say, Python or SML, but they're a hell of a lot more readable than the functors we have to define today.
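
To illustrate (a sketch with names of my own choosing, sort_98/sort_0x; nothing here is prescribed by the standard):

#include <algorithm>
#include <string>
#include <vector>

// C++98: a separate named functor, defined away from its single use
struct ByLength {
    bool operator()(const std::string& a, const std::string& b) const {
        return a.size() < b.size();
    }
};

void sort_98(std::vector<std::string>& words) {
    std::sort(words.begin(), words.end(), ByLength());
}

// C++0x: the comparison lives right where it is used
void sort_0x(std::vector<std::string>& words) {
    std::sort(words.begin(), words.end(),
              [](const std::string& a, const std::string& b) {
                  return a.size() < b.size();
              });
}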

As for portability, C++ is a minefield already. There are no guarantees given for integer type sizes, nor for string encodings. In both cases, C++0x offers the tools to fix this (with Unicode-specific char types, and integers of guaranteed fixed sizes).
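
A minimal sketch of both tools (the variable names are mine; the types and literal prefixes are the ones the draft standardizes):

#include <cstdint>

std::int32_t  counter = 0;   // exactly 32 bits on every conforming implementation
std::uint64_t big     = 0;   // exactly 64 bits, unsigned

const char16_t* utf16 = u"UTF-16 encoded";   // new Unicode character type
const char32_t* utf32 = U"UTF-32 encoded";   // new Unicode character type
const char*     utf8  = u8"UTF-8 encoded";   // u8 literal is guaranteed UTF-8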

The upcoming standard nails down a number of issues that currently hinder portability.

So overall, yes, the issues you mention are real. They exist today, and they will exist in C++0x. But as far as I can see, C++0x lessens the impact of these problems. It won't make them worse.

You're right, it'll take a while for compliant compilers to become available on all platforms. But I think it'll be a quicker process than it was with C++98.

All the major compiler vendors seem very keen on C++0x support, which wasn't really the case last time around. (Probably because back then it was mostly a matter of adjusting and fixing the pre-standard features they had already implemented, so it was easier to claim that your pre-standard compiler was "sort of almost nearly C++98-compliant".)

I think on the whole, the C++ community is much more standard-focused and forward-looking than a decade ago. If you want to sell your compiler, you're going to have to take C++0x seriously.

But there's definitely going to be a period of several years from when the standard is released until fully (or mostly) compliant compilers are available.

OTHER TIPS

  1. You pay for only what you use. If you don't need a complex template feature, don't #include the headers it's defined in, and you won't have to deal with it.
  2. Lambda functions should reduce the verbosity of STL algorithms a good bit, and auto variables will help with code like std::map<foo, std::shared_ptr<std::vector<bar> > >::const_iterator... (see the sketch after this list).
  3. Yes, it will take a while. Many of the new features are indeed in boost, and if you want portability that's what you should be using for at least a few years after the standard is implemented. Fortunately there are only two compilers that cover those platforms you mentioned: g++ and Microsoft's C++ compiler. Once they get support, it's just a matter of time before the embedded toolchains get rebuilt with the new versions. Unfortunately, possibly a lot of time...
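
For instance, with the map type from item 2 (foo and bar are hypothetical placeholder types):

#include <map>
#include <memory>
#include <vector>

struct foo { int id; };
struct bar { };

// hypothetical key comparison so foo can be used as a map key
bool operator<(const foo& a, const foo& b) { return a.id < b.id; }

void walk(const std::map<foo, std::shared_ptr<std::vector<bar> > >& m) {
    // C++98: the full type has to be spelled out:
    // std::map<foo, std::shared_ptr<std::vector<bar> > >::const_iterator it = m.begin();

    // C++0x: the compiler deduces the same type
    for (auto it = m.begin(); it != m.end(); ++it) {
        // use it->first and it->second here
    }
}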

Like most of C++, you pay only for what you use. So if you don't want the "boost bloat" of useful smart pointers, thread libraries, etc., then you don't have to pay the compilation cost for them.

I'm very sure that portability will be addressed in the design, especially since a lot of it is based on existing portable code from projects like Boost. Both GCC and Microsoft VC already implement much of C++0x, as you'll see from their respective current prototype versions.

  1. C++ is always going to have hopeless compile times; its whole philosophy is to do things once (i.e., at compile time) so you don't have to repeat them at runtime and lose performance. And as others have said, don't include a library if you don't need it!
  2. C++ will never be very Pythonic, because of its aim of being backwards compatible. The verbosity comes from being an old language that has had many things added on as it evolved. As others have said, lambdas and auto variables will greatly reduce the verbosity as well.
  3. This is a problem with any big change to a language, but I think it's widely agreed that the changes will make the language a lot easier to use, so it should get adopted quickly.

I would actually consider #3 the biggest risk short term. AFAIK, the standard introduces new syntax in a couple of areas (lambdas) and changes the meaning of existing keywords (auto, for instance). Code using these features can only be portable if the compiler for every platform you deploy to supports them.
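
To make that concrete with auto: the same keyword is valid in both dialects but means different things, and the two halves of this sketch won't even compile under the same standard, which is exactly the portability problem:

// C++98: 'auto' is a (redundant) storage-class specifier
auto int x = 5;    // legal C++98, same as plain 'int x = 5;'

// C++0x: 'auto' deduces the type from the initializer;
// the C++98 spelling above is no longer valid
auto y = 5;        // y is an int
auto z = 5.0;      // z is a double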

Sure, this will happen at some point. But adding a new feature to a compiler is no small feat and takes quite a bit of time. I would be afraid that it will take too long for these features to be supported in the main compilers, which would keep a programmer from being both an early adopter and portable.

I find that C++0x makes code a lot clearer in many cases.

Build times can be greatly shortened when dealing with heavily templated code, thanks to variadic templates. Boost uses a really ugly template-overloading scheme to emulate "variadic templates" in C++98, which makes compile times blow through the roof.
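
Roughly, the difference looks like this (a sketch; Boost's real machinery is considerably more involved):

// C++98 emulation: one overload per arity, usually generated by
// preprocessor macros up to some fixed maximum
template <typename T1>                           void f(T1);
template <typename T1, typename T2>              void f(T1, T2);
template <typename T1, typename T2, typename T3> void f(T1, T2, T3);
// ... repeated up to the library's arity limit

// C++0x: a single variadic template covers every arity
template <typename... Args> void f(Args... args);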

Range-based for-loops are great as well for readability.

Consider this C++98-style loop:

for (std::vector<int>::const_iterator itr = vec.begin(); itr != vec.end(); ++itr)
   foo(*itr);

It can now be written (implemented in G++ 4.6) as:

for (auto elem : vec)
   foo(elem);

The auto keyword reduces syntactic noise as well.

Lambdas are great for use with STL algorithms as well, and they make code more readable when you don't have to separately create C-style callback functions or even a whole class.
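
For example (count_above is a hypothetical helper of my own): a lambda can capture local state, which a plain C-style callback cannot do without globals or void* plumbing:

#include <algorithm>
#include <cstddef>
#include <vector>

std::size_t count_above(const std::vector<int>& v, int threshold) {
    // the lambda captures the local 'threshold' by value; a C-style
    // callback would need a global or a void* side channel for it
    return std::count_if(v.begin(), v.end(),
                         [threshold](int x) { return x > threshold; });
}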

One of the advantages of standardizing these things is that compilers can take shortcuts. E.g., the Boost template circus needed to implement is_X<U> disappears if the library can simply hand off to a __compiler__is_X<U> intrinsic. This could easily save two orders of magnitude, sometimes three.
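
For instance, GCC and MSVC both expose an __is_pod intrinsic; a trait that takes a pile of slow-to-compile SFINAE in portable C++98 collapses to one line when the compiler answers it directly (a sketch, with the detection logic elided):

// C++98 library emulation: pages of SFINAE machinery, all of which the
// compiler must instantiate (imagine the real detection logic here)
template <typename T>
struct is_pod_emulated {
    static const bool value = false; // placeholder for the elaborate machinery
};

// With a compiler intrinsic, the whole trait becomes a single built-in query:
template <typename T>
struct is_pod_intrinsic {
    static const bool value = __is_pod(T);
};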

What's the question here? Of course it's going to take years before compilers on esoteric platforms implement these features. Don't count on being able to use the new features for another three, maybe five years. "He who lives then, worries then", as we say in Dutch.
