It is really quite simple (two live examples, gcc and clang):

#include <iostream>

template<class... T> void foo(T&&...) { std::cout << "...T\n"; }
template<class T> void foo(T&&) { std::cout << "T\n"; }

int main() {
    foo(3); // prints "T"
}
Overloads that do not take a parameter pack are preferred over those that do when the choice is between a pack and a single template parameter. The class = std::enable_if_t<...> default template argument does not change this. So both of your functions f are candidates, and the compiler prefers the one without the variadic pack.
14.8.2.4 Deducing template arguments during partial ordering [temp.deduct.partial]/8:
If A was transformed from a function parameter pack and P is not a parameter pack, type deduction fails. Otherwise, using the resulting types P and A, the deduction is then done as described in 14.8.2.5. If P is a function parameter pack, the type A of each remaining parameter type of the argument template is compared with the type P of the declarator-id of the function parameter pack. Each comparison deduces template arguments for subsequent positions in the template parameter packs expanded by the function parameter pack. If deduction succeeds for a given type, the type from the argument template is considered to be at least as specialized as the type from the parameter template. [ Example:
template<class... Args> void f(Args... args); // #1
template<class T1, class... Args> void f(T1 a1, Args... args); // #2
template<class T1, class T2> void f(T1 a1, T2 a2); // #3
f(); // calls #1
f(1, 2, 3); // calls #2
f(1, 2); // calls #3; non-variadic template #3 is more
         // specialized than the variadic templates #1 and #2
— end example ]
Note in particular the f(1, 2) case: the non-variadic template is considered more specialized, so it wins overload resolution.
All the enable_if_t clause does is remove the one-argument version from consideration when you pass a std::string as T.