So... Theano builds graphs for the expressions it computes before evaluating them. By passing a Theano variable such as x in the example to the constructor of the LogisticRegression object, you create a number of expressions inside the object, such as p_y_given_x, which are Theano expressions that depend on x. These are later used for symbolic gradient calculation.
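For reference, the relevant part of the tutorial's LogisticRegression class looks roughly like this (a stripped-down sketch using the same names as the DL tutorials):

import numpy
import theano
import theano.tensor as T

class LogisticRegression(object):
    def __init__(self, input, n_in, n_out):
        # the parameters live in shared variables, so Theano can keep
        # them on the device and update them in place during training
        self.W = theano.shared(numpy.zeros((n_in, n_out), dtype=theano.config.floatX), name='W')
        self.b = theano.shared(numpy.zeros((n_out,), dtype=theano.config.floatX), name='b')
        # symbolic expressions that depend on `input` -- nothing is computed yet
        self.p_y_given_x = T.nnet.softmax(T.dot(input, self.W) + self.b)
        self.y_pred = T.argmax(self.p_y_given_x, axis=1)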
To get a better feel for it you can do the following:
import theano
import theano.tensor as T
from theano import pp  # pp pretty-prints a symbolic expression

x = T.dmatrix('x')  # naming your variables is a good idea; the names show up in the printed graph
lr = LogisticRegression(x, n_in=28*28, n_out=10)
print pp(lr.p_y_given_x)
This should give you output such as
softmax((x \dot W) + b)
And while you're at it, go ahead and try
print pp(T.grad(lr.p_y_given_x.sum(), x))  # T.grad needs a scalar cost, hence the .sum()
The printed output is how Theano stores the expression internally. You can then use these expressions to create callable functions in Theano, such as
import numpy
mydata = numpy.random.rand(5, 28*28)  # placeholder data; substitute your own examples
values = theano.shared(value=mydata, name='values')
f = theano.function([], lr.p_y_given_x, givens={x: values})
print f()
Calling f then gives you the predicted class probabilities for the data stored in mydata. The way to do this in Theano (and the way it's done in the DL tutorials) is to pass a "dummy" Theano variable when you build the model and then use the givens keyword to substitute a shared variable containing your actual data. That matters because keeping your data in a shared variable allows Theano to hold it on the GPU, where the matrix operations run.
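To make that concrete, here is roughly the pattern the tutorials use to slice minibatches out of a shared dataset with givens (index, batch_size and train_set_x are illustrative names, not anything defined above):

import numpy
import theano
import theano.tensor as T

# the whole training set sits in one shared variable, so it is
# transferred to the device once rather than on every function call
train_set_x = theano.shared(numpy.random.rand(500, 28*28), name='train_set_x')

index = T.lscalar('index')  # symbolic minibatch index
batch_size = 20

predict_batch = theano.function(
    [index],
    lr.p_y_given_x,
    givens={x: train_set_x[index * batch_size:(index + 1) * batch_size]})

print predict_batch(0)  # class probabilities for the first minibatch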