Question

I'm not talking about countryside colleges/unis.

  • Are there any thought-out procedures at top-ranking institutions based on formal criteria, actual statistics, or demand from industry?
  • If so, what are the criteria used?
  • Are the decisions to drop certain languages and introduce others made systematically or in a haphazard manner?

Solution

If we're talking about, say, the top 10 ranked US colleges and universities (other countries will likely have different traditions, and people will have wildly different definitions of a "countryside college or university"), no. A community college will generally choose which languages to teach based on what languages employers in the geographic area served by the college use. The top-ranked universities, on the other hand, generally have much more idiosyncratic criteria. For example, when I went to MIT, the primary languages used were Scheme (because professors Abelson and Sussman wrote Structure and Interpretation of Computer Programs) and CLU (because Prof. Liskov designed it). Neither of these has ever been a particularly useful language for professionals, though LISP, at least, has some users in academic fields like AI.

The top-ranked universities see themselves as teaching computer science, not programming, and since they generally assume that you'll use many different languages over your career, the particular languages that get taught are not particularly important. Community colleges are in the business of teaching programming, so they generally prefer languages that are in more common use.

From time to time, universities do change the languages they teach. That generally involves a lot of faculty meetings and discussions, but it's hard to know whether that would qualify as "systematic" or "haphazard" by your definitions. In general, the criteria will be pedagogical, not practical. That is, the professors are much more likely to care about which languages make teaching the underlying concepts easier, and which languages they view as promoting beautiful code, than about which languages employers demand.

OTHER TIPS

Most "higher-ranked" universities don't focus on particular technologies or languages, because they are teaching concepts of Computer Science. Their primary goal is not to prepare graduates to do enterprise software development.

So while their introductory courses might all use the same language for consistency's sake, it's not like they chose it based on how popular it is in the software industry.

As for courses later in their programs, my experience has been that teachers just use whatever they're familiar with. They don't care that you learn Python, for example, as much as they care that you learn the theoretical concepts they're teaching. Programming languages are generally just a vehicle professors use to hammer home what they're really interested in teaching.

Most colleges that I know of teach Java as the introductory language. When I talked to one of my professors about how our department chose that language (and why it can't be replaced by something else), he said:

  • Java is portable: write once, run everywhere.
  • It's strongly typed.
  • It's a managed language (most first-year CS students get very confused by pointers).
  • It's old, which means there are plenty of resources for it.
  • It shares a C-like syntax, making it broadly applicable.
  • Getting a basic working program in Java takes very little time and code (see the sketch after this list).
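
To illustrate that last point, here is a minimal sketch of the kind of first program such an introductory course might start with. The class name, file name, and variable contents are illustrative assumptions, not something from the original answer.

    // HelloCourse.java -- a minimal, self-contained first program (illustrative example).
    // Compile with `javac HelloCourse.java`, run with `java HelloCourse`.
    public class HelloCourse {
        public static void main(String[] args) {
            int week = 1;                              // statically checked: the compiler rejects type errors
            String greeting = "Hello, CS 101, week " + week;
            System.out.println(greeting);              // no pointers or manual memory management involved
        }
    }

The same few lines also touch the managed-language point: objects such as the String above are garbage-collected, so first-year students never deal with pointers or freeing memory by hand.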

He said that there is no other language that provides those benefits in a way that would prompt the department to change. I argued for C#, but he contended that it isn't truly portable, and that even though .NET has far better support for things like generics, that alone isn't enough to prompt a change.

Licensed under: CC-BY-SA with attribution