Question

I have created a function which has a reasonably large number of parameters (all vectors of the same length) and I thought it would make life easier to be able to bundle up the parameters in a data frame. I have managed to achieve this using S3 methods, but reading a bit more about S3 methods, I am starting to wonder whether the way I have coded my functions is a bit of an abuse of S3 conventions.

I'd like to know whether what I have done is a bad idea or not. If it is, alternative approaches would be welcome.

Here is a simplified example of what I have done:

myfunc <- function(x, ...) UseMethod("myfunc")

myfunc.default <- function(time, money, age, weight) {
  # a silly calculation
  return(money/(age + weight) - time)
}

myfunc.data.frame <- function(params, ...) {
  # normalise the column names, then pass through only the columns
  # that match myfunc.default's formal arguments
  names(params) <- tolower(names(params))
  pass.args <- intersect(names(params), names(formals(myfunc.default)))
  res <- do.call(myfunc.default, c(params[pass.args], list(...)))
  return(res)
}

Then, given a data frame mydata with column names Money, time, AGE, weight and name, a call like myfunc(mydata) passes the relevant columns through to myfunc.default. It all works well, but is it wise?
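
For example, a minimal sketch of the kind of call I mean (the data values here are made up purely to illustrate the dispatch):

mydata <- data.frame(Money = c(100, 200), time = c(1, 2),
                     AGE = c(30, 40), weight = c(70, 80),
                     name = c("a", "b"))

# dispatches to myfunc.data.frame, which lowercases the names,
# drops the unmatched "name" column and calls myfunc.default
myfunc(mydata)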


Solution

Thanks for the comments. I have concluded that my strategy of using S3 methods here was a bad idea, and have switched to a pair of plain functions, myfunc and myfunc_df. I have also written a helper that does the heavy lifting of turning a function with individual arguments into one that accepts a data frame:

df_call <- function(.function, .parameters, .case = tolower, ...) {
  # normalise the parameter names; wrapped in try() so a failing
  # .case transform simply leaves the names untouched
  try(names(.parameters) <- match.fun(.case)(names(.parameters)))
  # keep only the parameters matching .function's formal arguments
  pass.args <- intersect(names(.parameters), names(formals(.function)))
  do.call(.function, c(.parameters[pass.args], list(...)))
}
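
Used together, the pair might look like this (a sketch; the exact body of the wrapper is an assumption, but it reduces to a one-liner on top of the helper):

# plain function taking individual vector arguments
myfunc <- function(time, money, age, weight) {
  money/(age + weight) - time
}

# thin data-frame wrapper built on df_call
myfunc_df <- function(params, ...) df_call(myfunc, params, ...)

mydata <- data.frame(Money = 100, time = 1, AGE = 30,
                     weight = 70, name = "a")
myfunc_df(mydata)  # 100/(30 + 70) - 1 = 0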