Question

Why do many software developers violate the open/closed principle by making changes such as renaming functions, which break applications after an upgrade?

This question came to mind after the rapid, continuous stream of new versions of the React library.

Every few months I notice many changes in syntax, component names, etc.

An example from the upcoming version of React:

New Deprecation Warnings

The biggest change is that we've extracted React.PropTypes and React.createClass into their own packages. Both are still accessible via the main React object, but using either will log a one-time deprecation warning to the console when in development mode. This will enable future code size optimizations.

These warnings will not affect the behavior of your application. However, we realize they may cause some frustration, particularly if you use a testing framework that treats console.error as a failure.


  • Are these changes considered a violation of that principle?
  • As a beginner to something like React, how do I learn it with these fast changes in the library (it's so frustrating)?

Solution

IMHO JacquesB's answer, though containing a lot of truth, shows a fundamental misunderstanding of the OCP. To be fair, your question already expresses this misunderstanding, too - renaming functions breaks backwards compatibility, but it does not break the OCP. If breaking compatibility seems necessary (or if you have to maintain two versions of the same component to avoid breaking compatibility), the OCP was already broken beforehand!

As Jörg W Mittag already mentioned in his comments, the principle does not say "you can't modify the behavior of a component" - it says one should try to design components in a way that leaves them open to being reused (or extended) in several ways, without the need for modification. This can be done by providing the right "extension points", or, as mentioned by @AntP, "by decomposing a class/function structure to the point where every natural extension point is there by default." IMHO following the OCP has nothing in common with "keeping the old version around unchanged for backwards compatibility"! Or, quoting @DerekElkin's comment below:

The OCP is advice on how to write a module [...], not about implementing a change management process that never allows modules to change.

Good programmers use their experience to design components with the "right" extension points in mind (or - even better - in a way that needs no artificial extension points). However, to do this correctly and without unnecessary overengineering, you need to know beforehand what future use cases of your component might look like. Even experienced programmers can't look into the future and know all upcoming requirements in advance. And that is why backwards compatibility sometimes needs to be broken - no matter how many extension points your component has, or how well it follows the OCP with respect to certain types of requirements, there will always be a requirement which cannot be implemented easily without modifying the component.
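To make "extension points" concrete, here is a minimal sketch in plain JavaScript (all names are hypothetical, not from any real library): a renderer that supports new output formats through registration, so a consumer can extend it without ever touching its source.

```javascript
// A hypothetical report renderer with an explicit extension point:
// new output formats are added by registering a formatter function,
// not by editing the renderer's source code.
class ReportRenderer {
  constructor() {
    this.formatters = new Map();
  }
  register(format, formatter) {
    this.formatters.set(format, formatter);
  }
  render(report, format) {
    const formatter = this.formatters.get(format);
    if (!formatter) throw new Error(`No formatter registered for "${format}"`);
    return formatter(report);
  }
}

const renderer = new ReportRenderer();
renderer.register('text', (r) => `${r.title}: ${r.total}`);

// Extension: a consumer adds JSON output without modifying ReportRenderer.
renderer.register('json', (r) => JSON.stringify(r));

const report = { title: 'Sales', total: 42 };
console.log(renderer.render(report, 'text')); // "Sales: 42"
```

The class is "closed" in the OCP sense - its code never changes - yet "open", because the formatter registry is a deliberate extension point. The catch, as argued above, is that the designer had to anticipate that *output format* was the axis of variation worth opening up.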

OTHER TIPS

The open/closed principle has benefits, but it also has some serious drawbacks.

In theory the principle solves the problem of backwards compatibility by creating code which is "open for extension but closed for modification". If a class has some new requirements, you never modify the source code of the class itself but instead create a subclass which overrides just the members necessary to change the behavior. All code written against the original version of the class is therefore unaffected, so you can be confident your change did not break existing code.
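A small sketch of that textbook mechanism, using hypothetical names (a 10% discount is assumed purely for illustration):

```javascript
// Base class whose per-item pricing is a single overridable member.
class PriceCalculator {
  total(items) {
    return items.reduce((sum, item) => sum + this.price(item), 0);
  }
  price(item) {
    return item.unitPrice * item.quantity;
  }
}

// A new requirement (a discount) is handled by extension, not by
// modifying PriceCalculator: only the one relevant member is overridden.
class DiscountedPriceCalculator extends PriceCalculator {
  price(item) {
    return super.price(item) * 0.9; // hypothetical 10% discount
  }
}

const items = [
  { unitPrice: 10, quantity: 2 },
  { unitPrice: 5, quantity: 1 },
];
console.log(new PriceCalculator().total(items));           // 25
console.log(new DiscountedPriceCalculator().total(items)); // 22.5
```

Code written against `PriceCalculator` keeps working unchanged, which is exactly the guarantee the principle promises - and, as the following paragraphs argue, exactly where the trouble starts when the flaw lives somewhere no override can reach.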

In reality you easily end up with code bloat and a confusing mess of obsolete classes. If it is not possible to modify some behavior of a component through extension, then you have to provide a new variant of the component with the desired behavior, and keep the old version around unchanged for backwards compatibility.

Say you discover a fundamental design flaw in a base class which lots of classes inherit from - say, a private field of the wrong type. You cannot fix this by overriding a member. Basically you have to override the whole class, which means you end up extending Object to provide an alternative base class - and now you also have to provide alternatives to all the subclasses, thereby ending up with a duplicated object hierarchy: one flawed, one improved. But you cannot remove the flawed hierarchy (since deleting code is a modification), so all future clients will be exposed to both hierarchies.

Now the theoretical answer to this problem is "just design it correctly the first time". If the code is perfectly decomposed, without any flaws or mistakes, and designed with extension points prepared for all possible future requirement changes, then you avoid the mess. But in reality everyone makes mistakes, and nobody can predict the future perfectly.

Take something like the .NET framework - it still carries around the set of collection classes which were designed before generics were introduced more than a decade ago. This is certainly a boon for backwards compatibility (you can upgrade the framework without having to rewrite anything), but it also bloats the framework and presents developers with a large set of options, many of which are simply obsolete.

Apparently the developers of React felt that strictly following the open/closed principle was not worth the cost in complexity and code bloat.

The pragmatic alternative to open/closed is controlled deprecation. Rather than breaking backwards compatibility in a single release, old components are kept around for a release cycle, but clients are informed via compiler warnings that the old approach will be removed in a later release. This gives clients time to modify their code. This seems to be React's approach in this case.
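A controlled-deprecation shim often looks something like the following sketch (the names are illustrative, not React's actual implementation): the old entry point keeps working, warns once, and delegates to the new one.

```javascript
// New home of the functionality (hypothetical).
function createClass(spec) {
  return { ...spec, isComponent: true };
}

// Old entry point, kept around for one release cycle. It still works,
// but logs a one-time deprecation warning so clients have time to migrate.
let warned = false;
function deprecatedCreateClass(spec) {
  if (!warned) {
    warned = true;
    console.warn(
      'Warning: createClass has moved to its own package; ' +
      'it will be removed in a future release.'
    );
  }
  return createClass(spec);
}

const widget = deprecatedCreateClass({ render: () => 'hello' });
```

Existing callers see identical behavior plus a single console warning - which is also why, as the release notes quoted above admit, test setups that treat `console.error`/`console.warn` as failures get hit hardest by this strategy.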

(My interpretation of the principle is based on The Open-Closed Principle by Robert C. Martin)

I would call the open/closed principle an ideal. Like all ideals, it gives little consideration to the realities of software development. Also like all ideals, it is impossible to actually attain it in practice -- one merely strives to approach that ideal as best as one can.

The other side of the story is known as the Golden Handcuffs. Golden Handcuffs are what you get when you shackle yourself too tightly to the open/closed principle. They are what occur when your product, which never breaks backwards compatibility, can't grow because too many past mistakes have been made.

A famous example of this is found in the Windows 95 memory manager. As part of the marketing for Windows 95, it was stated that all Windows 3.1 applications would work in Windows 95. Microsoft actually acquired licenses for thousands of programs to test them in Windows 95. One of the problem cases was Sim City. Sim City actually had a bug which caused it to write to unallocated memory. In Windows 3.1, without a "proper" memory manager, this was a minor faux pas. However, in Windows 95, the memory manager would catch this and cause a segmentation fault. The solution? In Windows 95, if your application name is simcity.exe, the OS will actually relax the constraints of the memory manager to prevent the segmentation fault!

The real issue behind this ideal is the paired concepts of products and services. Nobody really does one or the other; everything lines up somewhere in the grey region between the two. If you think from a product-oriented approach, open/closed sounds like a great ideal: your products are reliable. However, when it comes to services, the story changes. It's easy to show that under the open/closed principle, the amount of functionality your team must support asymptotically approaches infinity, because you can never clean up old functionality. This means your development team must support more and more code every year. Eventually you reach a breaking point.

Most software today, especially open source, follows a common relaxed version of the open/closed principle. It's very common to see open/closed followed slavishly for minor releases, but abandoned for major releases. For example, Python 2.7 contains many "bad choices" from the Python 2.0 and 2.1 days, but Python 3.0 swept all of them away. (Also, the shift from the Windows 95 codebase to the Windows NT codebase when they released Windows 2000 broke all sorts of things, but it did mean we never have to deal with a memory manager checking the application name to decide behavior!)

Doc Brown's answer is the closest to accurate; the other answers illustrate misunderstandings of the Open/Closed Principle.

To explicitly articulate the misunderstanding: there seems to be a belief that the OCP means you should not make backwards-incompatible changes (or even any changes at all, or something along these lines). The OCP is about designing components so that you don't need to make changes to them to extend their functionality, regardless of whether those changes are backwards compatible or not. There are many reasons besides adding functionality that you may change a component, whether backwards compatible (e.g. refactoring or optimization) or backwards incompatible (e.g. deprecating and removing functionality). That you may make these changes doesn't mean that your component violated the OCP (and definitely doesn't mean that you are violating the OCP).

Really, it's not about source code at all. A more abstract and relevant statement of the OCP is: "a component should allow for extension without need of violating its abstraction boundaries". I would go further and say a more modern rendition is: "a component should enforce its abstraction boundaries but allow for extension". Even in the article on the OCP by Bob Martin while he "describes" "closed to modification" as "the source code is inviolate", he later starts talking about encapsulation which has nothing to do with modifying source code and everything to do with abstraction boundaries.

So, the faulty premise in the question is that the OCP is (intended as) a guideline about evolutions of a codebase. The OCP is typically sloganized as "a component should be open to extensions and closed to modifications by consumers". Basically, if a consumer of a component wants to add functionality to the component they should be able to extend the old component into a new one with the additional functionality, but they should not be able to change the old component.

The OCP says nothing about the creator of a component changing or removing functionality. The OCP is not advocating maintaining bug compatibility forevermore. You, as the creator, are not violating the OCP by changing or even removing a component. You, or rather the components you've written, are violating the OCP if the only way consumers can add functionality to your components is by mutating it e.g. by monkey patching or having access to the source code and recompiling. In many cases, neither of these are options for the consumer which means if your component isn't "open for extension" they are out of luck. They simply can't use your component for their needs. The OCP argues to not put the consumers of your library into this position, at least with respect to some identifiable class of "extensions". Even when modifications can be made to the source code or even the primary copy of the source code, it's best to "pretend" that you can't modify it as there are many potential negative consequences to doing so.
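The contrast between mutation and extension can be made concrete with a small sketch (the logger and wrapper names are hypothetical):

```javascript
// A stand-in for a third-party component the consumer cannot recompile.
const baseLogger = { log: (msg) => msg };

// Monkey patching would mutate the shared object - the kind of
// "extension by modification" the OCP warns against, since every
// other consumer of baseLogger would silently see the change:
//   baseLogger.log = (msg) => `!! ${msg}`;

// Extension instead wraps the component, leaving the original intact.
function withTimestamps(logger) {
  return {
    ...logger,
    log(msg) {
      return logger.log(`[${new Date().toISOString()}] ${msg}`);
    },
  };
}

const stamped = withTimestamps(baseLogger);
// baseLogger.log is unchanged; other consumers are unaffected.
```

A component that can only be adapted by the commented-out mutation fails the OCP; one that exposes enough surface to be wrapped or composed, as here, satisfies it.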

So to answer your questions: No, these are not violations of the OCP. No change an author makes can be a violation of the OCP, because the OCP is not a property of changes. The changes, however, can create violations of the OCP, and they can be motivated by failures of the OCP in prior versions of the codebase. The OCP is a property of a particular piece of code, not of the evolutionary history of a codebase.

For contrast, backwards compatibility is a property of a change of code. It makes no sense to say some piece of code is or is not backwards compatible. It only makes sense to talk about the backwards compatibility of some code with respect to some older code. Therefore, it never makes sense to talk about the first cut of some code being backwards compatible or not. The first cut of code can satisfy or fail to satisfy the OCP, and in general we can determine whether some code satisfies the OCP without referring to any historical versions of the code.

As to your last question, it's arguably off-topic for StackExchange in general as being primarily opinion-based, but the short of it is: welcome to tech, and particularly JavaScript, where over the last few years the phenomenon you describe has come to be called JavaScript fatigue. (Feel free to google for a variety of other articles, some satirical, discussing this from multiple perspectives.)

Licensed under: CC-BY-SA with attribution