Logistic Regression in Machine Learning.
“Logistic regression measures the relationship between the categorical dependent variable and one or more independent variables by estimating probabilities using a logistic function.”
Let’s understand the above logistic regression definition word by word. Logistic regression uses a black box function to model the relationship between the categorical dependent variable and the independent variables. This black box function is popularly known as the Softmax function.
Logistic regression can be viewed as a special case of the generalized linear model for a categorical outcome variable, where the log of the odds is used as the dependent variable. In simple words, it predicts the probability of occurrence of an event by fitting data to a logit function.
Odds = P(event happens) / P(event does not happen)
or, equivalently,
Odds = P(event happens) / (1 − P(event happens))
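As a quick illustration (a minimal sketch, not from the original article), the odds and the log of the odds for an event probability p can be computed as:

```python
# Compute the odds and the log-odds (logit) for an event probability.
import math

def odds(p):
    """Odds = P(event happens) / (1 - P(event happens))."""
    return p / (1.0 - p)

def log_odds(p):
    """The logit: the log of the odds, used as the dependent variable."""
    return math.log(odds(p))

# An event with probability 0.8 has odds of roughly 4 to 1.
print(odds(0.8))
# Even odds (p = 0.5) give a logit of exactly zero.
print(log_odds(0.5))
```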
The dependent variable is the target class we are going to predict, while the independent variables are the features or attributes we are going to use to predict the target class.
Based on the number of categories in the target variable, logistic regression can be classified as:
1. Binomial: the target variable can have only 2 possible types.
2. Multinomial: the target variable can have 3 or more possible types which are not ordered.
3. Ordinal: the target variable has ordered categories. For example, a test score can be categorized as 1st grade, 2nd grade, or 3rd grade based on the student’s score.
Binary
logistic regression is estimated using Maximum Likelihood Estimation (MLE).
MLE is an iterative procedure: it starts with a guess at the best weight for each predictor variable (that is, each coefficient in the model) and then adjusts these coefficients repeatedly until there is no additional improvement in the ability to predict the value of the output variable (either 0 or 1) for each case.
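The iterative procedure described above can be sketched as gradient ascent on the log-likelihood. This is an illustrative toy fit on hypothetical 1-D data, not a production MLE solver:

```python
# Minimal sketch: fit binary logistic regression by maximizing the
# log-likelihood with gradient ascent (illustrative only).
import math

def sigmoid(z):
    # Numerically stable form: avoid computing exp of a large positive value.
    if z >= 0:
        return 1.0 / (1.0 + math.exp(-z))
    ez = math.exp(z)
    return ez / (1.0 + ez)

def fit_logistic(xs, ys, lr=0.1, n_iters=2000):
    """Fit y ~ sigmoid(w*x + b) on 1-D data by gradient ascent."""
    w, b = 0.0, 0.0  # start from a guess (all-zero coefficients)
    for _ in range(n_iters):
        # Gradient of the log-likelihood: sum of (y - p) * x over all cases.
        gw = sum((y - sigmoid(w * x + b)) * x for x, y in zip(xs, ys))
        gb = sum(y - sigmoid(w * x + b) for x, y in zip(xs, ys))
        w += lr * gw  # adjust the coefficients repeatedly
        b += lr * gb
    return w, b

# Toy data: the class flips from 0 to 1 around x = 0.
xs = [-2.0, -1.0, -0.5, 0.5, 1.0, 2.0]
ys = [0, 0, 0, 1, 1, 1]
w, b = fit_logistic(xs, ys)
print(sigmoid(w * 2.0 + b) > 0.5)  # the fitted model predicts class 1 at x = 2
```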
What is the Softmax function?
The Softmax function is used in:
- Naive Bayes Classifier
- Multinomial Logistic Classifier
- Deep Learning (while building neural networks)
Special cases of the Softmax function inputs:
- Multiplying the Softmax function inputs (multiplying the logits by any value)
- Dividing the Softmax function inputs (dividing the logits by any value)
Softmax function
The Softmax function is a popular function for calculating the probabilities of events. Another mathematical advantage of the Softmax function is its output range: the output values are always in the range (0, 1), and they always sum to 1. The Softmax function is also known as the normalized exponential function.
The softmax formula is:
softmax(z_i) = e^(z_i) / Σ_j e^(z_j)
It takes each value (logit) z_i and finds its probability: the numerator is the exponential of that logit, and the denominator is the sum of the exponentials of all the logits.
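A minimal implementation of that formula (a sketch using only the standard library) exponentiates each logit and normalizes by the sum of all the exponentials:

```python
# Minimal Softmax implementation.
import math

def softmax(logits):
    # Subtracting the max logit is a standard numerical-stability trick;
    # it does not change the output.
    m = max(logits)
    exps = [math.exp(z - m) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

probs = softmax([2.0, 1.0, 0.1])
print(probs)       # each value is in (0, 1); the largest logit gets the largest probability
print(sum(probs))  # the probabilities sum to 1
```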
Before implementing the Softmax function, let’s study the special cases of its inputs. We need to consider how the Softmax output changes if we modify the inputs in the two ways below.
If we multiply the Softmax function inputs, the input values become larger, so the logistic regression model will be more confident (higher probability) about the predicted target class.
If we divide the Softmax function inputs, the input values become smaller, so the logistic regression model will be less confident (lower probability) about the predicted target class.
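Both special cases can be demonstrated directly (a sketch with made-up logits): scaling the logits up sharpens the output distribution, while scaling them down flattens it toward uniform.

```python
# Scaling the logits changes the model's confidence.
import math

def softmax(logits):
    m = max(logits)  # stability shift
    exps = [math.exp(z - m) for z in logits]
    s = sum(exps)
    return [e / s for e in exps]

logits = [2.0, 1.0, 0.1]
base = softmax(logits)
scaled_up = softmax([z * 10 for z in logits])    # multiplied inputs
scaled_down = softmax([z / 10 for z in logits])  # divided inputs

# Multiplying the logits raises the top probability (more confident);
# dividing lowers it (less confident).
print(max(scaled_up) > max(base) > max(scaled_down))  # True
```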
A logistic function or logistic curve is a common "S"-shaped curve (sigmoid curve), with equation:
F(x) = L / (1 + e^(−k(x − x0)))
where:
- e = the natural logarithm base (also known as Euler's number),
- x0 = the x-value of the sigmoid's midpoint,
- L = the curve's maximum value, and
- k = the logistic growth rate or steepness of the curve.[1]
For values of x in the domain of real numbers from −∞ to +∞, an S-curve is obtained: the graph of F approaches L as x approaches +∞ and approaches zero as x approaches −∞.
The standard logistic function, called the sigmoid function (k = 1, x0 = 0, L = 1), is:
S(x) = 1 / (1 + e^(−x))
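The general logistic function and its sigmoid special case can be written out directly (a minimal sketch of the two formulas above):

```python
# The general logistic function and the standard sigmoid special case.
import math

def logistic(x, L=1.0, k=1.0, x0=0.0):
    """F(x) = L / (1 + e^(-k * (x - x0)))."""
    return L / (1.0 + math.exp(-k * (x - x0)))

def sigmoid(x):
    """Standard logistic function: k = 1, x0 = 0, L = 1."""
    return 1.0 / (1.0 + math.exp(-x))

print(logistic(0.0))                  # 0.5: the value at the midpoint is L / 2
print(logistic(3.0) == sigmoid(3.0))  # True: the defaults reproduce the sigmoid
print(logistic(50.0, L=7.0))          # approaches the maximum L = 7 for large x
```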
The logistic function finds applications in a range
of fields, including artificial neural
networks, biology (especially ecology), biomathematics, chemistry, demography, economics, geoscience, mathematical psychology, probability, sociology, political science, linguistics,
and statistics.



