According to the source code, Scrapy
uses the following XPath expression (not a CSS selector) to parse the inputs out of the form:
descendant::textarea|descendant::select|descendant::input[@type!="submit" and @type!="image" and @type!="reset" and ((@type!="checkbox" and @type!="radio") or @checked)]
In other words, all of your hidden inputs are successfully parsed (and later sent with the request), with values taken from their value
attributes. So Scrapy does what it should here.
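The selection rule above can be sketched in plain Python (no Scrapy or lxml needed). This is an approximation of the XPath, not Scrapy's actual code, and the sample fields are illustrative; a missing type attribute is treated as "text" here, the way browsers do:

```python
def is_form_input(tag, attrs):
    """Mirror the XPath: keep <textarea>, <select>, and every <input>
    except submit/image/reset; keep checkbox/radio only when checked."""
    if tag in ("textarea", "select"):
        return True
    if tag != "input":
        return False
    itype = attrs.get("type", "text")  # browsers default a missing type to "text"
    if itype in ("submit", "image", "reset"):
        return False
    if itype in ("checkbox", "radio"):
        return "checked" in attrs
    return True

# Illustrative fields resembling an ASP.NET login form
fields = [
    ("input", {"type": "hidden", "name": "__EVENTTARGET", "value": ""}),
    ("input", {"type": "hidden", "name": "__VIEWSTATE", "value": "dDw..."}),
    ("input", {"type": "text", "name": "user"}),
    ("input", {"type": "submit", "name": "b_Login"}),
    ("input", {"type": "checkbox", "name": "remember"}),  # not checked
]
picked = [attrs["name"] for tag, attrs in fields if is_form_input(tag, attrs)]
print(picked)  # the submit button and the unchecked checkbox are skipped
```

Note that the hidden inputs (including __EVENTTARGET with its empty value) all pass the filter, which is why they show up in the request payload.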
The login using from_response()
doesn't work because __EVENTTARGET
has an empty value
attribute. If you log in using a real browser, the __EVENTTARGET
parameter is set to b_Login
via a JavaScript __doPostBack()
function call. Since Scrapy cannot handle JavaScript (it cannot call JS functions), __EVENTTARGET
is sent with an empty value, which causes the login to fail.
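The usual workaround is to supply __EVENTTARGET yourself via the formdata argument of FormRequest.from_response(); values given there override the ones parsed from the form. Here is a plain-Python sketch of that override step (the credential field names are hypothetical, and the real Scrapy request machinery is omitted):

```python
# Values as parsed out of the form's hidden inputs (illustrative)
parsed_from_form = {
    "__EVENTTARGET": "",    # empty in the HTML; JS would fill it in a browser
    "__EVENTARGUMENT": "",  # empty, and __doPostBack() keeps it empty anyway
    "__VIEWSTATE": "dDw...",  # placeholder value
}

# Explicit formdata, mimicking what __doPostBack('b_Login', '') would set,
# plus hypothetical credential fields
formdata = {
    "__EVENTTARGET": "b_Login",
    "user": "me",
    "password": "secret",
}

# from_response() effectively merges formdata over the parsed values
payload = {**parsed_from_form, **formdata}
print(payload["__EVENTTARGET"])  # "b_Login"
```

In a spider this corresponds to something like FormRequest.from_response(response, formdata={'__EVENTTARGET': 'b_Login', 'user': '...', 'password': '...'}, callback=self.after_login).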
__EVENTARGUMENT
also has an empty value
attribute, but the __doPostBack()
call sets it to the empty string anyway, so it makes no difference here.
Hope that helps.