Question

I have a bunch of algorithms and collections, and I am using a Policy Based design (see the book Modern C++ Design) to deal with the arbitrary combinatorial complexity. This is great, but to prevent destruction of the Host class through a pointer to one of its policies, the book suggests making the destructors of the policies protected. However, if I make the Algorithm and Collection destructors protected, I cannot use those classes on their own, but only as policies. Also, I don't see the benefit of the Policy Based design compared to a Generic Factory Pattern...
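
To make the destructor issue concrete, the protected destructor idiom I mean looks roughly like this (a minimal sketch with placeholder names SomePolicy and policyOperation, not my real code):

class SomePolicy
{
    public: 
        void policyOperation() {}

    protected: 
        // Protected, non-virtual destructor: deleting through a
        // SomePolicy* is a compile error, so the policy cannot be
        // used to destroy the Host, without virtual call overhead.
        ~SomePolicy() {}
};

class Host : public SomePolicy
{
};

int main()
{
    Host host;            // fine: Host's own destructor is public
    SomePolicy* p = &host;
    p->policyOperation(); // using the policy interface is fine
    // delete p;          // error: ~SomePolicy() is protected
    return 0;
}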

Here is the model of the code:

#include <iostream>

template<class Collection>
class AlgorithmOne
{
    public: 
        void doSomethingWithData(Collection& data) 
        {
            // Use Collection iterators to build up an algorithm. 
        }
};

template<class Collection>
class AlgorithmTwo
{
    public: 
        void doSomethingWithData(Collection& data) 
        {
            // Use Collection iterators to build up an algorithm. 
        }
};

template<class Collection>
class AlgorithmThree
{
    public: 
        void doSomethingWithData(Collection& data) 
        {
            // Use Collection iterators to build up an algorithm. 
        }
};


template<class Element>
class CollectionOne
{
    public: 
        typedef Element ElementType;

};

template<class Element>
class CollectionTwo
{
    public: 
        typedef Element ElementType;

};

template<class Element>
class CollectionThree
{
    public: 
        typedef Element ElementType;

};

template<typename HostTraits>
class HostInheritsData
:
    public HostTraits::Collection, 
    public HostTraits::Algorithm
{
    public: 
        typedef HostTraits Traits;

        using Traits::Algorithm::doSomethingWithData;

        void doSomethingWithData() 
        {
            // The Host is-a Collection here, so it passes itself
            // to the Algorithm policy.
            doSomethingWithData(*this);
        }
};

template<typename HostTraits>
class HostCompositsData 
:
    public HostTraits::Algorithm
{
    typename HostTraits::Collection data_;

    public: 
        typedef HostTraits Traits;

        using Traits::Algorithm::doSomethingWithData; 

        void doSomethingWithData() 
        {
            // The Host has-a Collection here, so it passes its member
            // to the Algorithm policy.
            doSomethingWithData(data_);
        }

        // Clumsy and breaking encapsulation
        typename HostTraits::Collection& data()
        {
            return data_;
        }
};

template<typename HostTraits>
class GenericStrategy
{
    typename HostTraits::Collection data_; 
    typename HostTraits::Algorithm algorithm_; 

    public: 

        void doSomethingWithData() 
        {
            algorithm_.doSomethingWithData(data_);
        }
};

class ElementOne {}; 
class ElementTwo {}; 
class ElementThree {}; 

struct MyConfig
{
    typedef ElementOne               Element;
    typedef CollectionThree<Element> Collection;
    typedef AlgorithmOne<Collection> Algorithm;
};

int main(int argc, const char *argv[])
{
    HostInheritsData<MyConfig> hostInherits;
    hostInherits.doSomethingWithData(); 

    // This must be a mistake, are policies meant to be used this way? 
    hostInherits.doSomethingWithData(hostInherits); 

    HostCompositsData<MyConfig> hostComposits;
    hostComposits.doSomethingWithData(); 

    // Clumsy to use, not intuitive and breaking encapsulation.
    hostComposits.doSomethingWithData(hostComposits.data()); 

    // Combinatorics are there, I can combine whatever I want in MyConfig as for
    // policies, but I can also have global Algorithm and Collection objects 
    // (no protected destructors).
    GenericStrategy<MyConfig> strategy; 
    strategy.doSomethingWithData(); 

    return 0;
}

Here are my questions:

I am customizing the structure of a Host class using policies. How can I truly enrich the interface of the Host class when every realistic Algorithm requires a Collection to work on, and the Collection is encapsulated in the Host?

When I compare the Policy Based Design with a Generic Factory, doesn't the Generic Factory give me the same combinatorial flexibility? It seems that the Generic Factory is the better choice: I can interchange Element, Collection and Algorithm in all possible combinations as well, and I can still have public destructors for all the policies, which lets me combine them in any way I want, e.g. in a global combination of Element, Collection and Algorithm.

It seems to me that Enriched Policies become a problem as soon as the structure is customized. Even if I later add a member function to an algorithm, it will probably have parameters related to the Collection (e.g. Collection iterators): if the Host encapsulates the Collection using composition, I need to ask the Host for the argument to its own member function:

// Clumsy to use, not intuitive and breaking encapsulation.
hostComposits.doSomethingWithData(hostComposits.data());

and if the Host encapsulates the Collection using inheritance, it gets (to me at least) even weirder:

// This must be a mistake, are policies meant to be used this way?
hostInherits.doSomethingWithData(hostInherits);

Did I completely misunderstand Policy Based design (yet again)? Am I using the traits properly? Is the Generic Strategy pattern a better choice in this case?

Solution

You might want to think carefully about the amount of coupling in your design. E.g. make sure that you really want your algorithms to take a Collection as a template parameter: that couples each algorithm to the container it operates on. Take a look at the Standard Library: its algorithms are function templates taking iterators as template parameters, and iterators do not know anything about the container (Collection in your vocabulary) they point into.
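
To illustrate (a minimal sketch; printAll is a made-up stand-in for your doSomethingWithData):

#include <iostream>
#include <list>
#include <vector>

// The algorithm is a function template over iterators; it knows
// nothing about the container behind them.
template<typename InputIterator>
void printAll(InputIterator first, InputIterator last)
{
    for (; first != last; ++first)
        std::cout << *first << ' ';
    std::cout << '\n';
}

int main()
{
    std::vector<int> v(3, 1); // 1 1 1
    std::list<int>   l(2, 7); // 7 7

    printAll(v.begin(), v.end()); // the same algorithm works
    printAll(l.begin(), l.end()); // on different containers
    return 0;
}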

For things other than iteration and element access, algorithms take slightly richer types as parameters, e.g. back insert iterators (obtained from std::back_inserter) to access the push_back() member of a container. But in general, without prior knowledge, there is no need to pass the entire container interface to every algorithm. For truly container-specific operations, embedding the algorithm as a container member function (e.g. the sort() member function of std::list) is more appropriate.
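
A small example of both points:

#include <algorithm>
#include <iterator>
#include <list>
#include <vector>

int main()
{
    std::vector<int> source(5, 42);
    std::list<int>   target;

    // std::back_inserter exposes just enough of the container
    // (its push_back()) to let the algorithm append elements.
    std::copy(source.begin(), source.end(), std::back_inserter(target));

    // Container-specific algorithm as a member function.
    target.sort();
    return 0;
}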

For doing slightly different things, there are several overloads of the same function name (e.g. of std::transform). Only if your algorithm needs to maintain state is it really necessary to make it a class, and then the preferable way is a function object, i.e. a class with an overloaded operator(), rather than a doSomethingWithData() member function.
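
A stateful algorithm as a function object could look like this (a sketch; Accumulator is a made-up example):

#include <algorithm>
#include <iostream>
#include <vector>

// The state (a running total) lives in the function object itself;
// no class template coupled to a Collection is needed.
class Accumulator
{
    int sum_;

    public: 
        Accumulator() : sum_(0) {}

        void operator()(int value) { sum_ += value; }

        int sum() const { return sum_; }
};

int main()
{
    std::vector<int> data(4, 5);

    // std::for_each returns a copy of the function object,
    // together with its accumulated state.
    Accumulator result = std::for_each(data.begin(), data.end(), Accumulator());
    std::cout << result.sum() << '\n'; // prints 20
    return 0;
}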

Your data is parameterized in the same way as in the Standard Library: a class template with an Element as template parameter. The way to feed such data to the algorithms is to provide iterator access to it in the form of begin() and end() member functions. The Standard containers (vector, map, unordered_map, etc.) also take policies as template parameters, e.g. an Allocator, Compare or Hash class through which you can customize your data's behavior, and the Standard smart pointers take a Deleter policy to customize theirs.
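
For instance (a sketch; FileCloser is a made-up deleter and data.txt a placeholder file name):

#include <cstdio>
#include <functional>
#include <memory>
#include <set>

// A Deleter policy for std::unique_ptr.
struct FileCloser
{
    void operator()(std::FILE* file) const
    {
        if (file)
            std::fclose(file);
    }
};

int main()
{
    // Compare policy: iterate in descending instead of ascending order.
    std::set<int, std::greater<int> > descending;
    descending.insert(1);
    descending.insert(2); // iteration yields 2, then 1

    // Deleter policy customizes how the pointer releases its resource.
    std::unique_ptr<std::FILE, FileCloser> file(std::fopen("data.txt", "r"));
    return 0;
}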

In short: carefully check your design. What is it that you want to do? What does each component (algorithm, data structure) need to know about the others? And can you get there with proper use of the Standard Library? It's a safe bet that most of the things you might want to do are already coded in there, so you can concentrate on writing your application's logic rather than the algorithmic or data structure details.

Licensed under: CC-BY-SA with attribution