What is the use of torch.no_grad in pytorch?
31-10-2019
Question
I am new to PyTorch and started with this GitHub code. I do not understand the comment at lines 60-61 in the code: "because weights have requires_grad=True, but we don't need to track this in autograd". I understand that we set requires_grad=True on the variables whose gradients we want autograd to compute, but what does it mean to be "tracked by autograd"?
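A minimal sketch of the distinction the comment is drawing (not the linked repository's code, just an illustration): a tensor with requires_grad=True causes autograd to record, or "track", every operation performed on it so backward() can compute gradients later; wrapping an operation in torch.no_grad() suppresses that recording even though the tensor itself still has requires_grad=True.

```python
import torch

# A weight tensor that autograd should compute gradients for.
w = torch.randn(3, requires_grad=True)

# Outside no_grad: the multiply is recorded in the autograd graph,
# so the result itself participates in gradient computation.
y = w * 2
print(y.requires_grad)  # True: operation was tracked (y has a grad_fn)

# Inside no_grad: the same operation is NOT recorded, even though
# w still has requires_grad=True. This is what "we don't need to
# track this in autograd" means, e.g. for a manual weight-update
# step that should not become part of the computation graph.
with torch.no_grad():
    z = w * 2
print(z.requires_grad)  # False: operation was not tracked
```

In training loops that update weights by hand (w -= lr * w.grad), the update is done under torch.no_grad() precisely so the update arithmetic is not itself recorded as part of the graph for the next backward pass.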
No correct solution
Licensed under: CC-BY-SA with attribution
Not affiliated with datascience.stackexchange