3.1 Using Logistic Regression for Classification (Parts 1-3)
What we covered in this video lecture
In this lecture, we drew a general version of a single-layer neural network. We then applied it to different models: linear regression, the perceptron from Unit 2, and logistic regression. Logistic regression, like the perceptron, is a model for binary classification. Many of its concepts, such as the sigmoid activation and the logistic loss, are also used in deep neural networks, so it's an important model that we will look at closely in Unit 3.
Additional resources if you want to learn more
In machine learning, we typically refer to the logistic function as the sigmoid function due to its sigmoidal (S) shape. However, other sigmoid functions exist as well. If you are interested in learning about them, check out this Wikipedia page.
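As a minimal sketch (not from the lecture itself), the logistic sigmoid can be written in a few lines of plain Python. It maps any real-valued input to the interval (0, 1), which is why logistic regression can interpret its output as a probability:

```python
import math

def sigmoid(z):
    # Logistic sigmoid: 1 / (1 + e^{-z}), squashes z into (0, 1)
    return 1.0 / (1.0 + math.exp(-z))

print(sigmoid(0.0))   # 0.5, the decision boundary
print(sigmoid(4.0))   # close to 1
print(sigmoid(-4.0))  # close to 0
```

Note the symmetry: sigmoid(-z) = 1 - sigmoid(z), which is what makes the two class probabilities sum to one.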
We briefly introduced the logistic loss function, which is also referred to as negative log-likelihood loss or binary cross-entropy. If you are interested, I have written about it in more detail here. (Certain parts of this article involve topics we still need to cover, such as multi-layer neural networks. So, feel free to bookmark this article and revisit it after completing Unit 4.)
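To make the logistic loss concrete, here is a small illustrative implementation of binary cross-entropy (the function name and the clamping constant `eps` are my own choices, not from the lecture). It averages the negative log-likelihood over binary labels and predicted probabilities:

```python
import math

def binary_cross_entropy(y_true, y_prob, eps=1e-12):
    # Average negative log-likelihood for binary labels (0 or 1)
    # and predicted probabilities in (0, 1).
    total = 0.0
    for y, p in zip(y_true, y_prob):
        p = min(max(p, eps), 1.0 - eps)  # clamp to avoid log(0)
        total += -(y * math.log(p) + (1 - y) * math.log(1 - p))
    return total / len(y_true)

# Confident, correct predictions give a small loss;
# a prediction of 0.5 gives -log(0.5) ≈ 0.693 per example.
print(binary_cross_entropy([1, 0], [0.9, 0.1]))
print(binary_cross_entropy([1], [0.5]))
```

Confidently wrong predictions (e.g. a probability near 0 for a true label of 1) are penalized heavily, which is exactly the behavior that drives the model toward well-calibrated probabilities.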