Question

A friend of mine is working for a small company on a project every developer would hate: he's pressured to release as quickly as possible, he's the only one who seems to care about technical debt, the customer has no technical background, and so on.

He told me a story which made me think about the appropriateness of design patterns in projects like this one. Here's the story.

We had to display products at different places on the website. For example, content managers could view the products, but also the end users or the partners through the API.

Sometimes, information was missing from the products: for example, a bunch of them had no price because the product had just been created and the price hadn't been specified yet. Some didn't have a description (the description being a complex object with modification histories, localized content, etc.). Some were lacking shipment information.

Inspired by my recent readings about design patterns, I thought this was an excellent opportunity to use the magical Null Object pattern. So I did, and everything was smooth and clean. One just had to call product.Price.ToString("c") to display the price, or product.Description.Current to show the description; no conditional stuff required. Until, one day, the stakeholder asked to display it differently in the API, by returning a null in JSON, and differently again for content managers, by showing "Price unspecified [Change]". And I had to murder my beloved Null Object pattern, because there was no need for it any longer.

In the same way, I had to remove a few abstract factories and a few builders, and I ended up replacing my beautiful Facade pattern with direct and ugly calls, because the underlying interfaces changed twice per day for three months. Even the Singleton left me when the requirements dictated that the object in question had to differ depending on the context.

More than three weeks of work consisted of adding design patterns, then tearing them apart one month later, and my code finally became spaghetti-like enough to be impossible for anyone to maintain, including myself. Wouldn't it have been better to never use those patterns in the first place?

Indeed, I have worked myself on those types of projects where the requirements change constantly and are dictated by people who don't really have the cohesion or coherence of the product in mind. In this context, it doesn't matter how agile you are: you'll come up with an elegant solution to a problem, and by the time you finally implement it, you learn that the requirements have changed so drastically that your elegant solution no longer fits.

What would be the solution in this case?

  • Not using any design patterns, not thinking, and writing code directly?

    It would be interesting to run an experiment where one team writes code directly, while another thinks twice before typing, taking the risk of having to throw away the original design a few days later: who knows, maybe both teams would end up with the same technical debt. In the absence of such data, I would only assert that it doesn't feel right to type code without prior thinking when working on a 20 man-month project.

  • Keeping the design patterns which no longer make sense, and trying to add more patterns for the newly created situation?

    This doesn't seem right either. Patterns exist to simplify the understanding of the code; use too many of them, and the code becomes a mess.

  • Starting to think of a new design which encompasses the new requirements, then slowly refactoring the old design into the new one?

    As a theoretician and an advocate of Agile, I'm totally into it. In practice, when you know that you'll have to get back to the whiteboard every week to redo a large part of the previous design, and that the customer just doesn't have enough funds to pay you for that, nor enough time to wait, this probably won't work.

So, any suggestions?

Solution

I see some wrong assumptions in this question:

  • code with design patterns, even when applied correctly, takes more time to implement than code without those patterns.

Design patterns are not an end in themselves; they should serve you, not vice versa. If a design pattern does not make the code easier to implement, or at least more evolvable (that is, easier to adapt to changing requirements), then the pattern misses its purpose. Don't apply patterns when they don't make "life" easier for the team. If the Null Object pattern served your friend for the time he used it, then everything was fine. If it had to be eliminated later, that could also be fine. If the Null Object pattern slowed down the (correct) implementation, then its usage was wrong. Note that from this part of the story alone, one cannot conclude any cause of "spaghetti code".

  • the customer is to blame because he has no technical background and does not care about the cohesion or coherence of the product

That is neither his job nor his fault! Your job is to care about cohesion and coherence. When the requirements change twice a day, your solution should not be to sacrifice code quality. Just tell the customer how long things take, and if you think you need more time to get the design "right", add a big enough safety margin to every estimate. Especially when you have a customer trying to pressure you, use the "Scotty Principle". And when arguing with a non-technical customer about effort, avoid terms like "refactoring", "unit tests", "design patterns" or "code documentation" - those are things he does not understand and probably regards as "unnecessary nonsense" because he sees no value in them. Always talk about things which are visible or at least understandable to the customer (features, sub-features, behaviour changes, user docs, error fixes, performance optimization, and so on).

  • the solution for rapidly changing requirements is to rapidly change the code

Honestly, if "underlying interfaces change twice per day for three months", then the solution should not be to react by changing the code twice a day. The real solution is to ask why the requirements change so often and whether it is possible to change that part of the process. Maybe some more up-front analysis would help. Maybe the interface is too broad because the boundary between components was drawn in the wrong place. Sometimes it helps to ask which parts of the requirements are stable and which are still under discussion (and to actually postpone implementing the things under discussion). And sometimes some people just have to be "kicked in their asses" so they stop changing their minds twice a day.

OTHER TIPS

My humble opinion is that you shouldn't set out either to avoid design patterns or to use them.

Design patterns are simply well-known and trusted solutions to general problems that have been given names. Technically, they are no different from any other solution or design you can think of.

I think the root of the problem might be that your friend thinks in terms of "applying or not applying a design pattern", instead of thinking in terms of "what is the best solution I can think of, including but not limited to the patterns I know".

Maybe this approach leads him to use patterns in partly artificial or forced ways, in places where they don't belong. And this is what results in a mess.

In your example of using the Null Object pattern, I believe it eventually failed because it met the programmer's needs and not the customer's needs. The customer needed to display the price in a form appropriate to the context. The programmer needed to simplify some of the display code.

So, when a design pattern doesn't meet the requirements, do we say that all design patterns are a waste of time or do we say that we need a different design pattern?

It would appear the mistake was more in removing the pattern objects than in using them. In the initial design, the Null Object appears to have provided a solution to a problem, though it may not have been the best solution.

Being the only person working on a project gives you a chance to experience the whole development process. The big disadvantage is not having someone to be your mentor. Taking time to learn and apply best or better practices is likely to pay off quickly. The trick is identifying which practice to learn when.

Chaining references in the form product.Price.ToString("c") violates the Law of Demeter. I've seen all sorts of issues with this practice, many of which relate to nulls. A method like product.DisplayPrice("c") could handle null prices internally. Likewise, product.Description.Current could be handled by product.DisplayDescription(), product.DisplayCurrentDescription(), or product.Display("Current").
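A minimal sketch of that suggestion, assuming a hypothetical DisplayPrice method (none of these names come from the original code): the product handles its own missing price, so callers never chain through a possibly-null Price.

class Product {
    private decimal? price; // null until a price has been specified

    public string DisplayPrice(string format) {
        // The null handling lives in one place instead of in every view
        return price.HasValue ? price.Value.ToString(format) : string.Empty;
    }
}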

The new requirement for managers and content providers needs to be handled by responding to context. There are a variety of approaches that could be used. Factory methods could produce different product classes depending on the class of user they will be displayed to, as sketched below. Another approach would be for the product class's display methods to construct different output for different users.
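Here is one possible shape of that factory approach, strictly as a sketch: it extends the hypothetical Product above with a HasPrice flag, and every other name (Audience, IProductView, the view classes) is likewise illustrative rather than taken from the project.

enum Audience { Api, ContentManager }

interface IProductView {
    string RenderPrice(Product product);
}

class ApiProductView : IProductView {
    // The API contract wants a literal null when no price exists
    public string RenderPrice(Product product) {
        return product.HasPrice ? product.DisplayPrice("c") : null;
    }
}

class ManagerProductView : IProductView {
    // Content managers get an actionable placeholder instead
    public string RenderPrice(Product product) {
        return product.HasPrice ? product.DisplayPrice("c") : "Price unspecified [Change]";
    }
}

static class ProductViewFactory {
    public static IProductView For(Audience audience) {
        if (audience == Audience.Api) {
            return new ApiProductView();
        }
        return new ManagerProductView();
    }
}

Each audience gets its own rendering of the same domain object, so a new display rule means a new view class rather than edits scattered through the domain code.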

The good news is your friend realizes that things are getting out of hand. Hopefully, he has the code in revision control. This will allow him to back out the bad decisions he will invariably make. Part of learning is trying different approaches, some of which will fail. If he can get through the next few months, he may find approaches that simplify his life and clean up the spaghetti. He could try fixing one thing each week.

The question seems to be wrong on many points, but the blatant ones are:

  • For the Null Object pattern you mentioned, after the requirements changed, you changed a bit of the code. That's fine, but it doesn't mean you 'murdered' the Null Object pattern (by the way, be careful with your wording; it sounds too extreme, and some touchy people won't find it funny at all).

As many people have correctly said, design patterns are very much about labeling and naming a common practice. Think about a shirt: a shirt has a collar, and for some reason you remove the collar or part of it. The naming and labeling change, but it is still a shirt in essence. This is exactly the case here: minor changes in detail don't mean you have 'murdered' the pattern. (Again, mind the extreme wording.)

  • The design you talked about is bad because when the requirement changes are minor, you make massive strategic design changes all over the place. Unless the high-level business problem changes, you can't justify massive design changes.

In my experience, when minor requirement changes come in, you only need to change a small part of the codebase. Some changes might be a bit hacky, but nothing serious enough to substantially affect maintainability or readability, and often a few lines of comment explaining the hacky part will suffice. This is a very common practice as well.

Let's pause for a moment and look at the fundamental issue here - architecting a system whose architecture model is too coupled to low-level features, causing the architecture to break frequently during development.

I think we have to remember that architecture, and the design patterns related to it, have to be applied at an appropriate level, and that analyzing what the right level is, is not trivial. On one hand, you might easily keep the architecture of your system at too high a level, with only very basic constraints like "MVC" or the like, which can lead to missed opportunities in terms of clear guidelines and code leverage, and spaghetti code can easily flourish in all that free space.

On the other hand, you might just as well over-architect your system, setting the constraints at too detailed a level, where you assume you can rely on constraints that are in reality more volatile than you expect, constantly breaking your constraints and forcing you to remodel and rebuild over and over, until you start to despair.

Changes in the requirements for a system will always happen, to a lesser or greater extent. And the potential benefits of using architecture and design patterns will always be there, so it is not really a question of whether to use design patterns, but of the level at which you should use them.

This requires you not only to understand the current requirements of the proposed system, but also to identify which aspects of it can be seen as stable core properties, and which properties are likely to change during the course of development.

If you find that you constantly have to fight unorganized spaghetti code, you are probably not doing enough architecture, or doing it at too high a level. If you find that your architecture frequently breaks, you are probably doing too detailed an architecture.

Architecture and design patterns are not something you can just "coat" a system in, as if you were painting a desk. They are techniques that should be applied thoughtfully, at a level where the constraints you need to rely on have a high probability of being stable, and where these techniques are actually worth the trouble of modelling the architecture and implementing the constraints/architecture/patterns as code.

Related to the issue of too detailed an architecture: you can just as well put too much effort into architecture where it does not provide much value. See risk-driven architecture for reference; I like the book Just Enough Software Architecture, and maybe you will too.

Edit

I clarified my answer since I realized I had often written "too much architecture" where I really meant "too detailed architecture", which is not exactly the same. Too detailed an architecture can probably often be seen as "too much" architecture, but even if you keep the architecture at a good level and create the most beautiful system mankind has ever seen, it might still be too much effort spent on architecture if the priorities are features and time to market.

Your friend seems to be facing numerous headwinds based on his anecdote. That is unfortunate, and can be a very hard environment to work in. Despite the difficulty, he was on the correct path of using patterns to make his life easier, and it is a shame that he left that path. The spaghetti code is the ultimate result.

Since there are two different problem areas, technical and interpersonal, I'll address each separately.

Interpersonal

The struggle your friend is having is with rapidly changing requirements and how that affects his ability to write maintainable code. I would first say that requirements changing twice a day, every day, for such a long period of time is the bigger problem, and it carries an unrealistic implicit expectation: the requirements are changing faster than the code can change. We cannot expect the code, or the programmer, to keep up. This rapid pace of change is symptomatic of an incomplete conception of the desired product higher up. This is a problem. If they don't know what they really want, they'll waste a lot of time and money and never get it.

It might be good to set boundaries for changes. Group changes into sets every two weeks, then freeze them for the two weeks while they are implemented, and build up a new list for the next two-week period. I have a feeling some of these changes overlap or contradict each other (e.g., waffling back and forth between two options). When changes come fast and furious, they all have top priority. If you let them accumulate into a list, you can work with the stakeholders to organize and prioritize what is most important, maximizing effort and productivity. They may see that some of their changes are silly or less important, giving your friend some breathing room.

These problems should not stop you from writing good code, though. Bad code leads to worse problems. It may take time to refactor from one solution to another, but the very fact that it is possible shows the benefits of good coding practices through patterns and principles.

In an environment of frequent changes, technical debt will come due at some point. It is far better to make payments toward it rather than throw in the towel and wait until it becomes too big to overcome. If a pattern is no longer useful, refactor it away, but don't go back to cowboy coding ways.

Technical

Your friend seems to have a good grasp of the basic design patterns. Null Object is a good approach to the problem he was facing; in truth, it still is. Where he seems to have challenges is in understanding the principles behind the patterns, the why of what they are. Otherwise, I don't believe he would have abandoned his approach.

(What follows is a set of technical solutions that were not asked for in the original question, but that show how we could adhere to patterns for illustrative purposes.)

The principle behind the Null Object is the idea of encapsulating what varies. We hide what changes so we don't have to deal with it everywhere else. Here, the null object was encapsulating variance in the product.Price instance (I'll call it a Price object, and a null object price will be NullPrice). Price is a domain object, a business concept. Sometimes, in our business logic, we don't yet know the price. This happens; it's a perfect use case for the null object. Prices have a ToString method that outputs the price, or an empty string if it's not known (that is, NullPrice#ToString returns an empty string). This is reasonable behavior.
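To ground the discussion, here is a minimal sketch of what that Price/NullPrice pair might look like; the exact class shapes are an assumption, not the original code:

class Price {
    private readonly decimal amount;

    public Price(decimal amount) {
        this.amount = amount;
    }

    // "c" is the standard .NET currency format specifier
    public virtual string ToString(string format) {
        return amount.ToString(format);
    }
}

class NullPrice : Price {
    public NullPrice() : base(0m) {}

    // No price is known yet, so render nothing
    public override string ToString(string format) {
        return string.Empty;
    }
}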

Then the requirements change: we have to output a null to the API view, or a different string to the managers' view. How does this affect our business logic? Well, it doesn't. In the statement above, I used the word 'view' twice. That word was probably never spoken explicitly, but we must train ourselves to hear the hidden words in requirements. So why does 'view' matter so much? Because it tells us where the change really has to happen: in our view.

Aside: whether or not we are using an MVC framework is irrelevant here. While MVC has a very specific meaning for 'View', I am using it in the more general (and perhaps more applicable) meaning of a piece of presentation code.

So we really need to fix this in the view. How could we do that? The easiest way to do this would be an if statement. I know the null object was intended to get rid of all the ifs, but we have to be pragmatic. We can check the string to see if it's empty, and switch:

if(product.Price.ToString("c").Length == 0) { // one way of many
    writer.write("Price unspecified [Change]");
} else {
    writer.write(product.Price.ToString("c"));
}

How does this affect encapsulation? The most important part here is that view logic is encapsulated in the view. We can keep our business logic/domain objects completely isolated from changes in the view logic this way. It's ugly, but it works. This is not the only option, though.

We could say that our business logic has changed just a bit, in that we want to output a default string if no price is set. We can make a minor tweak to our Price#ToString method (actually, create an overload) that accepts a default return value and returns it when no price is set:

class Price {
    ...
    // Overload: a real price is always set, so the default is ignored
    public virtual string ToString(string format, string defaultValue) {
        return ToString(format);
    }
    ...
}

class NullPrice : Price {
    ...
    // Overload: no price is set, so return the caller-supplied default
    public override string ToString(string format, string defaultValue) {
        return defaultValue;
    }
    ...
}

And now our view code becomes:

writer.write(product.Price.ToString("c", "Price unspecified [Change]"));

The conditional is gone. Doing this too often, though, could proliferate special-case methods in your domain objects, so it only makes sense if there will be just a few instances of this.

We could instead create an IsSet method on Price that returns a boolean:

class Price {
    ...
    // A real price is always set
    public virtual bool IsSet() {
        return true;
    }
    ...
}

class NullPrice : Price {
    ...
    // The null object represents a missing price
    public override bool IsSet() {
        return false;
    }
    ...
}

View logic:

if(product.Price.IsSet()) {
    writer.write(product.Price.ToString("c"));
} else {
    writer.write("Price unspecified [Change]");
}

We see the return of the conditional in the view, but now the business logic itself tells us whether the price is set. We can use Price#IsSet elsewhere now that we have it available.

Finally, we can encapsulate the idea of presenting a price entirely in a helper for the view. This hides the conditional while preserving the domain object as much as we would like:

class PriceStringHelper {
    public PriceStringHelper() {}

    public string PriceToString(Price price, string defaultValue) {
        if(price.IsSet()) { // or check the string length to avoid changing the Price class at all
            return price.ToString("c");
        } else {
            return defaultValue;
        }
    }
}

View logic:

writer.write(new PriceStringHelper().PriceToString(product.Price, "Price unspecified [Change]"));

There are many more ways to make the change (we could generalize PriceStringHelper into an object that returns a default whenever a string is empty, as sketched below), but these are a few quick ones that preserve, for the most part, both the patterns and the principles, as well as the pragmatic aspect of making such a change.
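For instance, the generalization might look something like this (a sketch; the helper name is made up): any empty string falls back to a caller-supplied default, and the price-specific knowledge stays at the call site.

class EmptyStringDefaulter {
    public string ValueOr(string value, string defaultValue) {
        return string.IsNullOrEmpty(value) ? defaultValue : value;
    }
}

View logic:

writer.write(new EmptyStringDefaulter().ValueOr(product.Price.ToString("c"), "Price unspecified [Change]"));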

A design pattern's complexity can bite you if the problem it was supposed to solve suddenly disappears. Sadly, given the enthusiasm for and popularity of design patterns, this risk is rarely made explicit. Your friend's anecdote does a good job of showing how patterns can fail to pay off. Jeff Atwood has some choice words about the topic.

Document variation points (they are risks) in requirements

Many of the more complex design patterns (Null Object not so much) embody the concept of Protected Variations, that is: "Identify points of predicted variation or instability; assign responsibilities to create a stable interface around them." Adapter, Visitor, Facade, Layers, Observer, Strategy, Decorator, etc. all exploit this principle. They "pay off" when the software needs to be extended along the dimension of the expected variability, and the "stable" assumptions remain stable.

If your requirements are so unstable that your "predicted variations" are always wrong, then the patterns you apply will be unneeded complexity at best, and will cause you pain at worst.

Craig Larman speaks of two opportunities to apply Protected variations:

  • variation points - in the existing, current system or requirements, such as multiple interfaces that must be supported, and
  • evolution points - speculative points of variation which are not present in existing requirements.

Both are supposed to be documented by developers, but you should probably have customer commitment to the variation points.

To manage risk, you could say that any design pattern applying PV should be traced to a variation point in the requirements signed off by the customer. If a customer changes a variation point in the requirements, your design may have to change radically (because you likely invested in design [patterns] to support that variation). No need to explain cohesion, coupling, etc.

For example, your customer wants the software to work with three different legacy inventory systems. That's a variation point you design around. If the customer drops that requirement, then of course you have a bunch of useless design infrastructure. The customer needs to know that variation points cost something.
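A sketch of designing around that variation point (all names are illustrative): one stable interface that the rest of the code depends on, and one adapter per legacy system behind it.

interface IInventorySystem {
    int QuantityOnHand(string sku);
}

class AcmeInventoryAdapter : IInventorySystem {
    public int QuantityOnHand(string sku) {
        // translate the call into the Acme system's own API here
        return 0; // placeholder for the real lookup
    }
}

// ...plus one adapter for each of the other legacy systems.

If the customer drops the requirement, the interface and its adapters are exactly the "useless design infrastructure" described above.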

CONSTANTS in source code are a simple form of PV

Another analogy for your question would be to ask whether using CONSTANTS in source code is a good idea. Referring to this example, let's say the customer dropped the need for passwords. The MAX_PASSWORD_SIZE constant spread throughout your code would then become useless, and even a hindrance to maintenance and legibility. Would you blame the use of CONSTANTS as the reason?
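A minimal illustration (the value is made up): the constant names a single point of change for the password-size rule, and every use of it assumes passwords exist at all.

static class PasswordPolicy {
    public const int MAX_PASSWORD_SIZE = 64; // illustrative value

    public static bool IsValid(string candidate) {
        return candidate.Length <= MAX_PASSWORD_SIZE;
    }
}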

I think it at least partly depends on the nature of your situation.

You mentioned constantly changing requirements. If the customer says "I want this bee-keeping application to also work with wasps" then that seems like the kind of situation in which careful design would help progress, not hinder it (especially when you consider that in the future she might want to keep fruit-flies too).

On the other hand, if the nature of the change is more like "I want this bee-keeping application to manage the payroll of my laundromat conglomerate", no amount of code will dig you out of your hole.

There's nothing inherently good about design patterns. They're tools like any other - we only use them to make our jobs easier in the medium-to-long run. If a different tool (such as communication or research) is more useful then we use that.

Licensed under: CC-BY-SA with attribution