Why does the Scala compiler disallow overloaded methods with default arguments?

While there might be valid cases where such method overloadings could become ambiguous, why does the compiler disallow code which is neither ambiguous at compile time nor at run time?

Example:

// This fails:
def foo(a: String)(b: Int = 42) = a + b
def foo(a: Int)   (b: Int = 42) = a + b

// This fails, too, even if there is no position in the argument list
// where the types are the same.
def foo(a: Int)   (b: Int = 42) = a + b
def foo(a: String)(b: String = "Foo") = a + b

// This is OK:
def foo(a: String)(b: Int) = a + b
def foo(a: Int)   (b: Int = 42) = a + b    

// Even this is OK.
def foo(a: Int)(b: Int) = a + b
def foo(a: Int)(b: String = "Foo") = a + b

val bar = foo(42)_ // This complains obviously ...

Are there any reasons why these restrictions can't be loosened a bit?

Especially when converting heavily overloaded Java code to Scala, default arguments are very important, and it isn't nice to find out, after replacing plenty of Java methods with one Scala method, that the spec/compiler imposes arbitrary restrictions.

"arbitrary restrictions" :-)
It looks like you can get around the issue using type arguments. This compiles: object Test { def a[A](b: Int, c: Int, d: Int = 7): Unit = {}; def a[A](a:String, b: String = ""): Unit = {}; a(2,3,4); a("a");}
@user1609012: Your trick did not work for me. I tried it out using Scala 2.12.0 and Scala 2.11.8.
IMHO this is one of the strongest pain-points in Scala. Whenever I try to provide a flexible API, I often run into this issue, in particular when overloading the companion object's apply(). Although I slightly prefer Scala over Kotlin, in Kotlin you can do this kind of overloading...
The ticket of record on this is github.com/scala/bug/issues/8161

Eugen Labun

I'd like to cite Lukas Rytz (from here):

The reason is that we wanted a deterministic naming-scheme for the generated methods which return default arguments. If you write def f(a: Int = 1), the compiler generates def f$default$1 = 1. If you have two overloads with defaults on the same parameter position, we would need a different naming scheme. But we want to keep the generated byte-code stable over multiple compiler runs.
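
A minimal sketch of what that naming scheme means in practice (conceptual only; the exact synthetic members the compiler emits differ in detail):

def f(a: Int = 1) = a * 2

// Conceptually, the compiler also generates an accessor for the default value:
//   def f$default$1: Int = 1
// and rewrites call sites that omit the argument:
//   f()   becomes   f(f$default$1)
// Two overloads with a default at the same parameter position would both
// want the name f$default$1, hence the restriction.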

A solution for a future Scala version could be to incorporate the type names of the non-default arguments (those at the beginning of a method, which disambiguate the overloaded versions) into the naming scheme, e.g. in this case:

def foo(a: String)(b: Int = 42) = a + b
def foo(a: Int)   (b: Int = 42) = a + b

it would be something like:

def foo$String$default$2 = 42
def foo$Int$default$2 = 42

Someone willing to write a SIP proposal?


I think your proposal here makes a lot of sense, and I don't see what would be so complex about specifying/implementing it. Essentially, the parameter types are part of the function's ID. What does the compiler currently do with foo(String) and foo(Int) (i.e., overloaded methods WITHOUT a default)?
Wouldn't this effectively introduce mandatory Hungarian Notation when accessing Scala methods from Java? It seems like it would make the interfaces extremely fragile, forcing users to take care whenever the types of a function's parameters change.
Also, what about complex types? A with B, for instance?
Martin Odersky

It would be very hard to get a readable and precise spec for the interactions of overloading resolution with default arguments. Of course, for many individual cases, like the one presented here, it's easy to say what should happen. But that is not enough. We'd need a spec that decides all possible corner cases. Overloading resolution is already very hard to specify. Adding default arguments in the mix would make it harder still. That's why we have opted to separate the two.
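
To illustrate the kind of corner case such a spec would have to pin down, here is a hedged, hypothetical example (this pair is accepted today, because only one alternative has a default):

def bar(a: Int, b: Int = 1) = a + b
def bar(a: Int) = a

// Overload resolution still has to decide what bar(5) means: the one-argument
// alternative, or the two-argument one with its default filled in? Every such
// interaction needs an unambiguous rule in the spec.
bar(5)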


Thanks for your answer. What probably confused me is that basically everywhere else the compiler only complains if there actually is some ambiguity, but here it complains because there might be similar cases where ambiguity could arise. So in the first case the compiler only complains about a proven problem, whereas in the second case its behaviour is much less precise and triggers errors for seemingly valid code. Measured against the principle of least astonishment, this is a bit unfortunate.
Does "It would be very hard to get a readable and precise spec [...]" mean that there is an actual chance that the current situation might be improved if someone steps up with a good specification and/or implementation? The current situation imho limits the usability of named/default parameters quite a bit ...
There is a process for proposing changes to the spec. scala-lang.org/node/233
I have some comments (see my comments below the linked answer) about Scala making overloading frowned upon and a second-class citizen. If we continue to purposely weaken overloading in Scala, we are replacing typing with names, which IMO is a regressive direction.
If Python can do it, I don't see any good reason why Scala couldn't. The complexity argument is a good one: implementing this feature would make Scala less complex from the user's perspective. Read the other answers and you will see people inventing very complex things just to solve a problem which shouldn't even exist from the user's perspective.
Landei

I can't answer your question, but here is a workaround:

import scala.language.implicitConversions

implicit def left2Either[A, B](a: A): Either[A, B] = Left(a)
implicit def right2Either[A, B](b: B): Either[A, B] = Right(b)

def foo(a: Either[Int, String], b: Int = 42) = a match {
  case Left(i) => i + b
  case Right(s) => s + b
}

If you have two very long arg lists which differ in only one arg, it might be worth the trouble...
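
A quick call-site sketch (assuming the definitions above are in scope), showing how the conversions kick in:

foo(1)         // lifted to Left(1):      1 + 42     == 43
foo("bar")     // lifted to Right("bar"): "bar" + 42 == "bar42"
foo("bar", 7)  // default overridden:     "bar7"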


Well, I tried to use default arguments to make my code more concise and readable ... actually I added an implicit conversion to the class in one case which did just convert the alternative type to the type accepted. It just feels ugly. And the approach with the default args should just work!
You should be careful with such conversions, since they apply to all uses of Either and not just to foo - this way, whenever an Either[A, B] value is requested, both A and B are accepted. If you want to go in this direction, you should instead define a type which is only accepted by functions having default arguments (like foo here); of course, it then becomes even less clear whether this is a convenient solution.
belka

What worked for me is to redefine the overloaded methods Java-style, without default arguments.

def foo(a: Int, b: Int) = a + b
def foo(a: Int, b: String) = a + b
def foo(a: Int) = a + "42"
def foo(a: String) = a + "42"

This makes it clear to the compiler which overload you want, based on the parameters present at the call site.
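
For illustration, a few call sites and the overloads they resolve to (assuming the four definitions above):

foo(1, 2)    // (Int, Int)    -> 3
foo(1, "x")  // (Int, String) -> "1x"
foo(1)       // (Int)         -> "142"
foo("y")     // (String)      -> "y42"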


Guillaume Massé

Here is a generalization of @Landei's answer:

What you really want:

def pretty(tree: Tree, showFields: Boolean = false): String = // ...
def pretty(tree: List[Tree], showFields: Boolean = false): String = // ...
def pretty(tree: Option[Tree], showFields: Boolean = false): String = // ...

Workaround:

def pretty(input: CanPretty, showFields: Boolean = false): String = {
  input match {
    case TreeCanPretty(tree)       => prettyTree(tree, showFields)
    case ListTreeCanPretty(tree)   => prettyList(tree, showFields)
    case OptionTreeCanPretty(tree) => prettyOption(tree, showFields)
  }
}

sealed trait CanPretty
case class TreeCanPretty(tree: Tree) extends CanPretty
case class ListTreeCanPretty(tree: List[Tree]) extends CanPretty
case class OptionTreeCanPretty(tree: Option[Tree]) extends CanPretty

import scala.language.implicitConversions
implicit def treeCanPretty(tree: Tree): CanPretty = TreeCanPretty(tree)
implicit def listTreeCanPretty(tree: List[Tree]): CanPretty = ListTreeCanPretty(tree)
implicit def optionTreeCanPretty(tree: Option[Tree]): CanPretty = OptionTreeCanPretty(tree)

private def prettyTree(tree: Tree, showFields: Boolean): String = "fun ..."
private def prettyList(tree: List[Tree], showFields: Boolean): String = "fun ..."
private def prettyOption(tree: Option[Tree], showFields: Boolean): String = "fun ..."
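
A call-site sketch (the tree value below is assumed, not part of the answer); each argument is lifted into CanPretty by the matching implicit conversion, so the single default on showFields still applies:

val tree: Tree = ???                     // some Tree value, assumed to exist
pretty(tree)                             // lifted via treeCanPretty
pretty(List(tree), showFields = true)    // lifted via listTreeCanPretty
pretty(Option(tree))                     // lifted via optionTreeCanPretty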

Shiva Wu

One possible scenario is:


def foo(a: Int)(b: Int = 10)(c: String = "10") = a + b + c
def foo(a: Int)(b: String = "10")(c: Int = 10) = a + b + c

The compiler would be confused about which one to call. To prevent this and other possible dangers, the compiler allows at most one of the overloaded alternatives to have default arguments.
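
For example, a call that relies on the defaults could not be resolved if both definitions were allowed (hypothetical, since the compiler rejects the pair before any call site is checked):

// foo(1)()()   // both alternatives apply once their defaults are filled in,
//              // so there is no way to pick one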

Just my guess:-)


Janx

My understanding is that there can be name collisions in the compiled classes with default argument values. I've seen something along these lines mentioned in several threads.

The named argument spec is here: http://www.scala-lang.org/sites/default/files/sids/rytz/Mon,%202009-11-09,%2017:29/named-args.pdf

It states:

 Overloading: If there are multiple overloaded alternatives of a method, at most one is allowed to specify default arguments.

So, for the time being at any rate, it's not going to work.

You could do something like what you might do in Java, e.g.:

def foo(a: String)(b: Int) =  a + (if (b > 0) b else 42)
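
A quick sketch of how that sentinel-style fallback behaves at the call site:

foo("x")(7)   // "x7"  -- caller supplied a usable value
foo("x")(0)   // "x42" -- non-positive b falls back to 42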
