Problem

Sometimes the number of kwargs of a method increases to a point where I think the method should be refactored.

Example:

def foo(important=False, debug=False, dry_run=False, ...):
    ....
    sub_foo(important=important, debug=debug, dry_run=dry_run, ...)

My current preferred solution:

class Args(object):
    ...

def foo(args):
    sub_foo(args)

First question: What should Args be called? Is there a well-known name or design pattern for this?

Second question: Does Python have something I could use as a base class for Args?

Update

I have been using Python daily for 13 years. I have used methods with many kwargs and written methods with many kwargs. Over the last few weeks I read the book "Clean Code" and liked it. Somehow it is like wearing a different pair of glasses now. My old code works, but it is not nice to look at. Splitting long methods into several smaller ones is easy, but I am not sure how to handle methods with kwargs bloat.

Solution

I think what you've described is an example of the "Context" design pattern.

I usually call your "Args" a "Context" (or a "FooContext" if it's foo-specific enough).
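For illustration, here is a minimal sketch of what such a context object could look like (FooContext and its fields are hypothetical, echoing the question's example; a plain class would work just as well as a dataclass):

```python
from dataclasses import dataclass

# Hypothetical FooContext: bundles the flags that would otherwise be
# threaded through every call as separate kwargs.
@dataclass
class FooContext:
    important: bool = False
    debug: bool = False
    dry_run: bool = False

def sub_foo(ctx: FooContext):
    if ctx.debug:
        print("sub_foo running, dry_run =", ctx.dry_run)

def foo(ctx: FooContext):
    # Pass the whole context down instead of repeating each kwarg.
    sub_foo(ctx)

foo(FooContext(debug=True, dry_run=True))
```

The point is that adding a new flag later means touching the context class once, not every signature in the call chain.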

I think the best explanation I saw is "The Encapsulate Context Pattern" by Allan Kelly in Overload Journal #63 (Oct 2004): http://accu.org/index.php/journals/246, which I found via another SO answer: https://stackoverflow.com/a/9458244/3427357.

There are also some decent papers that elaborate further if you want an in-depth exploration: http://www.two-sdg.demon.co.uk/curbralan/papers/europlop/ContextEncapsulation.pdf https://www.dre.vanderbilt.edu/~schmidt/PDF/Context-Object-Pattern.pdf

As pointed out by yet another SO answer (https://stackoverflow.com/a/1135454/3427357), the Context pattern is considered dangerous by some (c.f. http://misko.hevery.com/2008/07/18/breaking-the-law-of-demeter-is-like-looking-for-a-needle-in-the-haystack/).

But I think the "Law of Demeter" warnings are about not over-complicating your early design more than they're about cleaning up the cruft that accidentally grew while you were solving other problems. If you're passing an "important" boolean through multiple function call layers you're already going to testing hell, and in that situation the refactor you've described is generally a pure win in my experience.

I don't think there's a standard base class for this in Python, unless perhaps you're lazy enough to pass an argparse.Namespace as your context object just because you already had your parameter values there.
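To illustrate that lazy option: argparse.Namespace is just a bag of attributes, so it behaves like a bare-bones context object even when you build it by hand rather than from a parser (the field names below are from the question's example):

```python
import argparse

# argparse.Namespace as an ad-hoc context object.
ctx = argparse.Namespace(important=True, debug=False, dry_run=False)

def sub_foo(ctx):
    # Plain attribute access, like any context object.
    print(ctx.important, ctx.debug, ctx.dry_run)

sub_foo(ctx)
print(vars(ctx))  # vars() turns it back into a dict if needed
```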

Other tips

def foo(*args, **kwargs):
    sub_foo(*args, **kwargs)

Better would be to use introspection to call through to the subfunction.

You just need a way to get information on the function. You could do something like this:

import inspect

def passthru(func):
    # Look up the caller's local variables, then forward the ones whose
    # names match the target function's parameters.
    caller_locals = inspect.stack()[1][0].f_locals
    params = inspect.getfullargspec(func).args
    kwargs = {name: caller_locals[name] for name in params if name in caller_locals}
    return func(**kwargs)

def f(x=1, y=2):
    print(x, y)

def g(x=4):
    passthru(f)

f()   # prints: 1 2
g()   # prints: 4 2
g(6)  # prints: 6 2

It seems to have some overhead, though.

I'm not exactly sure what you are looking for, so perhaps an edit adding some additional information would help (e.g. what do you mean by clean code, why doesn't *args, **kwargs satisfy that, and what is the ultimate goal you are trying to accomplish).

I'll throw out one additional idea not mentioned yet: you could create a dictionary and pass it in as the keyword arguments using **.

def foo(important=False, debug=False, dry_run=False):
    print(important, debug, dry_run)

args = dict()
args['important'] = True
args['debug'] = True
args['dry_run'] = False
foo(**args)

Or since you wanted to involve OOP, you could perhaps use an object.

class Args(object):
    pass

def foo(important=False, debug=False, dry_run=False):
    print(important, debug, dry_run)

args = Args()
args.important = True
args.debug = True
args.dry_run = False
foo(**vars(args))

I don't understand why you would do that. Generally, if a method has that many arguments, the problem is that the method is doing too much, not that you need to wrap the arguments up in some object. If you just want to pass the arguments around, you can use **kwargs.

That said, if you have some strange use-case and really need this, you could use collections.namedtuple.

import collections

def foo(a=1, b=2, c=3, d=4, e=5, f=6, g=7):  # kwarg way
    do_things(a, 7, b, 12, c, 3, d, e, f, g)  # or whatever

FooArgs = collections.namedtuple('FooArgs', ['a', 'b', 'c', 'd', 'e', 'f', 'g'])
foo_args = FooArgs(1, 2, 3, 4, 5, 6, 7)
foo_args.a  # 1
foo_args.e  # 5

def foo(args):  # namedtuple way
    do_things(args.a, 7, args.b, 12, args.c, 3, args.d, args.e, args.f, args.g)

I see a few ways out:

  • Automagic: thread-local storage or some other context the values can be fetched from. Web frameworks often do this, e.g. https://stackoverflow.com/a/19484699/705086. I find this most Pythonic, in the sense that it's easier to read. Call it poor man's context-oriented programming. It's similar to giving direct access to sys.argv, but more precise. It's best for cross-cutting concerns: authorization, logging, usage limits, retries…

  • collections.namedtuple: especially useful if the same set of arguments is often repeated exactly, or if multiple instances of this kind are common, for example:

    job = collections.namedtuple("job", "id started foo bar")
    todo = [job(*record) for record in db.select(…)]

  • **kwargs: anonymous, and bug-prone when an unexpected keyword argument is passed in.

  • self: if you keep passing arguments from one function to the next, perhaps these should be class/object members.
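The last option (promoting arguments to self) might look like this sketch, where the class and method names are invented and the flags are the question's:

```python
# Sketch: arguments that travel together through several functions
# become attributes set once in __init__, so the internal methods
# need no kwargs at all.
class Deployment:
    def __init__(self, important=False, debug=False, dry_run=False):
        self.important = important
        self.debug = debug
        self.dry_run = dry_run

    def run(self):
        self._prepare()
        self._execute()

    def _prepare(self):
        if self.debug:
            print("preparing")

    def _execute(self):
        if not self.dry_run:
            print("executing")

Deployment(debug=True, dry_run=True).run()
```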

You can also mix and match these, in your example:

  • debug ⇾ automagic context
  • dry_run ⇾ automagic context
  • important ⇾ keep a named kwarg, for explicit is better than implicit
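The "automagic context" option above can be sketched with the standard library's contextvars module (the variable names here are mine, not from any framework):

```python
import contextvars

# Module-level context variables: callees read them directly instead of
# receiving debug/dry_run through every intermediate signature.
debug_var = contextvars.ContextVar("debug", default=False)
dry_run_var = contextvars.ContextVar("dry_run", default=False)

def sub_foo():
    if debug_var.get():
        print("debugging; dry_run =", dry_run_var.get())

def foo(important=False):
    # Only the "explicit" argument stays in the signature.
    sub_foo()

debug_var.set(True)
foo(important=True)
```

Unlike plain globals or thread-locals, contextvars also behave correctly across threads and asyncio tasks.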

I believe that you're just wasting your time and making your code more complex. As a Python developer I'd rather see a function with 20 arguments than a function that takes a complex Args object.

License: CC BY-SA with attribution
Not affiliated with StackOverflow