Question

I want to implement logistic regression from scratch in Python. These are the functions in it:

  1. sigmoid
  2. cost
  3. fminunc
  4. Evaluating logistic regression

I would like to know what would be a good way to start writing this from scratch in Python. Any guidance on how to approach it would be appreciated. I know the theory behind these functions, but I am looking for a more Pythonic answer.

I used Octave and got it all working there, but I don't know how to start in Python, as Octave already has those functions set up to do the work.

Solution

You may want to try translating your Octave code to Python and see what's going on. You can also use a Python package to do this for you: check out scikit-learn's LogisticRegression. There is also an easy example in this blog.
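
If you just want a working baseline before rolling your own, a minimal scikit-learn sketch might look like the following. The random toy data here is only a stand-in for your own feature matrix and 0/1 labels:

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    # toy data standing in for your own: 200 samples, 2 features, 0/1 labels
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2))
    y = (X[:, 0] + X[:, 1] > 0).astype(int)

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=0)

    clf = LogisticRegression()        # its solver plays the role of Octave's fminunc
    clf.fit(X_train, y_train)
    print(clf.score(X_test, y_test))  # mean accuracy on held-out data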

OTHER TIPS

In order to implement logistic regression, you may consider the following two approaches:

  1. Consider how linear regression works: apply the sigmoid function to the hypothesis of linear regression and run gradient descent until convergence.
  2. Apply the exponential-based softmax function, which generalizes the sigmoid to several classes and rules out the less likely ones.

A gradient-descent implementation of the first approach (with optional L2 regularization):

    import numpy as np

    def sigmoid(z):
        '''Logistic function 1 / (1 + exp(-z)).'''
        return 1.0 / (1.0 + np.exp(-z))

    def accuracy(theta, x, y):
        '''Fraction of examples whose 0.5-thresholded prediction matches y.'''
        predictions = sigmoid(np.dot(x, theta)) >= 0.5
        return np.mean(predictions == y)

    def logistic_regression(x, y, alpha=0.05, lamda=0.0):
        '''
        Binary logistic regression via batch gradient descent;
        lamda > 0 adds L2 regularization.
        '''
        m, n = np.shape(x)
        theta = np.ones(n)
        xTrans = x.transpose()
        oldcost = 0.0
        while True:
            hypothesis = sigmoid(np.dot(x, theta))
            loss = hypothesis - y
            # average cost per example, used only as a convergence check
            cost = np.sum(loss ** 2) / (2 * m)
            if lamda:
                # note lamda / (2 * m), not lamda / 2 * m
                cost += (lamda / (2 * m)) * np.sum(np.power(theta, 2))
            # average gradient per example, plus the regularization term
            gradient = np.dot(xTrans, loss) / m
            if lamda:
                gradient += (lamda / m) * theta
            # update
            theta = theta - alpha * gradient
            # stop once the cost change is negligible
            if abs(oldcost - cost) < 1e-9:
                break
            oldcost = cost
        print(accuracy(theta, x, y))
        return theta, accuracy(theta, x, y)
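
If you would rather keep the Octave structure from the question (a sigmoid, a cost function, and a general-purpose minimizer), scipy.optimize.minimize plays the role of fminunc. The sketch below is a minimal illustration under that assumption; the toy data and the hand-added bias column are not from the original post:

    import numpy as np
    from scipy.optimize import minimize

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def cost(theta, x, y):
        # average cross-entropy (log-loss); clipped to avoid log(0)
        h = np.clip(sigmoid(np.dot(x, theta)), 1e-12, 1 - 1e-12)
        return -np.mean(y * np.log(h) + (1 - y) * np.log(1 - h))

    def gradient(theta, x, y):
        # gradient of the log-loss: x^T (h - y) / m
        return np.dot(x.T, sigmoid(np.dot(x, theta)) - y) / len(y)

    # toy data standing in for your own; the first column is the bias term
    rng = np.random.default_rng(0)
    x = np.hstack([np.ones((100, 1)), rng.normal(size=(100, 2))])
    y = (x[:, 1] + x[:, 2] > 0).astype(float)

    result = minimize(cost, np.zeros(x.shape[1]), args=(x, y),
                      jac=gradient, method='BFGS')
    theta = result.x  # fitted parameters, much like fminunc's return value

Passing the analytic gradient via jac= keeps BFGS from estimating it numerically, which mirrors how fminunc can use a user-supplied gradient in Octave.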
    
Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow