UnboundLocalError: Why does it occur when a global variable is reassigned inside a function but not when it is merely referenced? [duplicate]

StackOverflow https://stackoverflow.com/questions/21924919

  •  14-10-2022

Question

This question stems from curiosity rather than a need to solve some particular problem. Given the two snippets of code below, why does the first raise an UnboundLocalError while the second does not? To my mind they have the same critical sequence of operations, so there must be some under-the-hood magic at work that I am missing.

    x = 5

    def fun1():
        x = x + 2        # or x += 2
        print(x)

    def fun2():
        z = x + 2
        print(z)

I understand that to make the first function work, x must be declared global inside the function. I just want to know what the interpreter does that makes those two statements so much more different than they seem. Thanks.

  • This question is different from similar questions in that it asks for the underlying cause of this behavior and not a solution for mitigating it.
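For reference, a minimal sketch of the fix alluded to in the question: declaring the name `global` so that the assignment inside `fun1` targets the module-level `x` instead of creating a local one.

```python
x = 5

def fun1():
    global x        # assignments to x inside fun1 now target the module-level x
    x = x + 2
    print(x)

fun1()  # prints 7, and the module-level x is now 7
```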

Solution

There's actually a good reason for this, and it's because you specifically wrote x = x + 2.

Take these three functions as an example:

x = 5

def f1(): # Runs with no error when called
    x = 5

def f2(): # Raises UnboundLocalError when called
    x = x + 5

def f3(): # Runs with no error when called
    y = x + 5

This happens because whenever a statement assigns to x anywhere in a function, Python uses the local version of x throughout that function. A plain assignment such as x = 5 is therefore fine: you are simply binding the local version of x to 5.

However, when you try x = x + 5, or x += 5 (which is the same thing), then Python uses the local version of x for the right hand side x as well, because you are also assigning the local version on the left hand side. But it hasn't been created yet! Which is why you get the error.

The issue is that scope is decided when the function is compiled, not when it runs: if a name is assigned anywhere in a function body, Python treats every use of that name in the function as local, including reads that happen before the assignment.
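This compile-time decision is visible in the bytecode. A small sketch using the standard `dis` module (the exact opcode names vary slightly between Python versions, e.g. LOAD_FAST vs. LOAD_FAST_CHECK, so the checks below compare prefixes):

```python
import dis

x = 5

def reads_only():
    return x + 5        # x is never assigned here -> resolved as a global

def assigns():
    x = x + 5           # x is assigned somewhere -> local everywhere in the body
    return x

ops_read = [ins.opname for ins in dis.get_instructions(reads_only)]
ops_assign = [ins.opname for ins in dis.get_instructions(assigns)]

# reads_only looks x up globally; assigns reads and writes a local slot
print(any(op.startswith("LOAD_GLOBAL") for op in ops_read))   # True
print(any(op.startswith("LOAD_FAST") for op in ops_assign))   # True
```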

In the function f1, Python assigns 5 to the local variable x, with no problems.

In the function f2, Python marks x as local (because the function assigns to it) and then tries to evaluate x + 5 using that local x. Since the local x has not been bound to anything yet, the read fails and raises UnboundLocalError.

In the function f3, since you never assign to x, Python resolves x to the global version and reads its value.
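To see the failing case concretely: the error is raised only when f2 is called, and it can be caught like any other exception (a small demonstration, not part of the original answer):

```python
x = 5

def f2():
    x = x + 5   # x is local to f2, but unbound at the point of the read

try:
    f2()
    caught = None
except UnboundLocalError as exc:
    caught = type(exc).__name__

print(caught)  # UnboundLocalError
```

Note that `UnboundLocalError` is a subclass of `NameError`, so an `except NameError` handler would also catch it.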

There is probably someone who can explain the internal workings of this better than me.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow