Commit 7f9ff7a

update
1 parent 4237f03 commit 7f9ff7a

File tree

2 files changed: +1 −1 lines changed


doc/src/week47/Latexslides/answersweek47.tex

Lines changed: 1 addition & 1 deletion
@@ -53,7 +53,7 @@ \section{Logistic Regression}
3. True. Unlike linear regression with its normal equation, logistic regression has no analytical closed-form solution for its parameters; they must be solved for numerically with an iterative method.

4. True. Its loss function (log loss) is convex, which guarantees that any minimum reached during training is a global optimum.
Logistic regression is a probabilistic classifier for binary outcomes, learned via maximum likelihood. Unlike linear regression, it does not have a closed-form coefficient solver and must be fit with iterative methods such as gradient descent or Newton's method. Its negative log-likelihood (cross-entropy) cost is convex, so it has no suboptimal local minima.
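The iterative-fitting point above can be sketched as follows: a minimal gradient-descent minimization of the (convex) mean negative log-likelihood in NumPy. The toy data, learning rate, and iteration count are illustrative assumptions, not part of the original answer sheet.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_logreg(X, y, lr=0.1, n_iter=2000):
    """Fit logistic regression by gradient descent on the mean NLL.

    There is no normal-equation analogue here: we iterate until the
    convex cost is (approximately) minimized.
    """
    n, p = X.shape
    w = np.zeros(p)
    b = 0.0
    for _ in range(n_iter):
        p_hat = sigmoid(X @ w + b)       # predicted P(y = 1 | x)
        grad_w = X.T @ (p_hat - y) / n   # gradient of mean NLL w.r.t. w
        grad_b = np.mean(p_hat - y)      # gradient w.r.t. the bias b
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Tiny illustrative 1-D dataset: class 1 for positive x.
X = np.array([[-2.0], [-1.0], [-0.5], [0.5], [1.0], [2.0]])
y = np.array([0, 0, 0, 1, 1, 1])
w, b = fit_logreg(X, y)
print(w, b)  # w[0] ends up positive: larger x raises P(y = 1)
```

Because the cost is convex, any reasonable step size converges toward the same global minimizer; there is no dependence on a lucky initialization.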
If you don't use the negative log-likelihood but instead work with the log-likelihood itself, the function is concave and you have a maximization problem.
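The sign flip between the two formulations can be written out explicitly (a standard derivation sketch; the notation $\sigma$, $\mathbf{w}$, $\mathbf{x}_i$ is assumed from the usual logistic regression setup, not taken from this file):

```latex
% Log-likelihood: concave in w, so we maximize it
\ell(\mathbf{w}) = \sum_{i=1}^{n} \Big[ y_i \log \sigma(\mathbf{w}^T \mathbf{x}_i)
                 + (1 - y_i) \log\big(1 - \sigma(\mathbf{w}^T \mathbf{x}_i)\big) \Big]
% Negative log-likelihood (cross-entropy): convex in w, so we minimize it
C(\mathbf{w}) = -\ell(\mathbf{w})
```

Maximizing a concave $\ell(\mathbf{w})$ and minimizing the convex $C(\mathbf{w}) = -\ell(\mathbf{w})$ are the same problem; the negative sign is just the convention that lets us speak of gradient *descent* on a convex cost.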
\item[\textbf{6.}]\textbf{(True/False)}
Logistic regression produces a linear decision boundary in the input feature space.

Answer: True. Logistic regression (with no feature transformations) produces a linear decision boundary: the model is $\sigma(w^T x + b)$, and the decision boundary $\sigma(w^T x + b) = 0.5$ occurs exactly at $w^T x + b = 0$, which is a hyperplane, i.e. a linear boundary in the input feature space.
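The equivalence between thresholding the probability at $0.5$ and checking which side of the hyperplane a point lies on can be verified directly. The fitted parameters and test points below are illustrative assumptions:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical fitted parameters for a 2-D problem (illustrative values).
w = np.array([2.0, -1.0])
b = 0.5

points = np.array([[0.0, 0.0], [1.0, 3.0], [-1.0, 0.0], [0.25, 1.0]])
scores = points @ w + b        # w^T x + b: which side of the hyperplane
probs = sigmoid(scores)        # sigma(w^T x + b): P(y = 1 | x)

# Predicting class 1 when sigma(w^T x + b) >= 0.5 is exactly the same as
# asking whether w^T x + b >= 0, because sigmoid is monotone with
# sigmoid(0) = 0.5. The boundary is therefore the hyperplane w^T x + b = 0.
assert np.array_equal(probs >= 0.5, scores >= 0.0)
```

Since the sigmoid only rescales the score monotonically, the geometry of the classifier is entirely determined by the linear score $w^T x + b$; a curved boundary requires transforming the features first.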
File renamed without changes.
