
Commit 763dbb4: update the past 26 days

1 parent df71bff · 26 files changed · +719 -0 lines changed

posts/010425.md

---
title: 'how logistic regression works'
tags: 'journal'
date: 'Apr 1, 2025'
---

sketching out logistic regression for my interview, because i need to get the fundamentals down.

the problem is we want to predict binary outcomes (0, 1), but we can't do that with linear regression, which predicts continuous values.

how do we get a model that outputs probabilities between 0 and 1? and how can we make a decision boundary to produce binary outcomes?

the answer is the sigmoid function.

$p(x) = \frac{1}{1 + e^{-z}}$ where $z = \beta_0 + \beta_1X_1 + \beta_2X_2 + ... + \beta_nX_n$

this bounds the output between 0 and 1, and we can place a decision boundary at $p(x) = 0.5$, which is where $z = 0$.
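
a tiny numpy sketch of the sigmoid and the 0.5 decision rule (the toy $z$ values and function name are mine):

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

z = np.array([-2.0, 0.0, 3.0])   # z = beta_0 + beta_1*X_1 + ... for three examples
p = sigmoid(z)                   # [0.119, 0.5, 0.953] -- all squashed into (0, 1)
y_hat = (p >= 0.5).astype(int)   # threshold at p = 0.5 (i.e. z = 0) -> [0, 1, 1]
```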

but what are we modeling here? the probabilities?

the key insight is that logistic regression doesn't directly model probabilities in a linear way - it models the log-odds.

why log-odds? because:

1. probability constraints: probabilities must be between 0 and 1, which isn't compatible with linear modeling (which produces unbounded values)

2. log-odds transformation: when we take $\log\left(\frac{p}{1-p}\right)$, we transform the bounded 0-1 range into an unbounded range ($-\infty$ to $+\infty$)

3. linear relationship: this allows us to model the log-odds as a linear function of the features:

$\log\left(\frac{p}{1-p}\right) = z = \beta_0 + \beta_1X_1 + \beta_2X_2 + ... + \beta_nX_n$
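
a quick sanity check that the sigmoid is exactly the inverse of this log-odds transform (a minimal sketch; the function names are mine):

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def log_odds(p):
    return np.log(p / (1 - p))

p = np.array([0.1, 0.5, 0.9])
z = log_odds(p)      # bounded (0, 1) -> unbounded reals: [-2.197, 0., 2.197]
print(sigmoid(z))    # back to [0.1, 0.5, 0.9]
```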

the magic happens in this transformation. consider:

- if $p = 0.5$, log-odds $= 0$
- if $p > 0.5$, log-odds $> 0$
- if $p < 0.5$, log-odds $< 0$
- as $p$ approaches $1$, log-odds approaches $+\infty$
- as $p$ approaches $0$, log-odds approaches $-\infty$

so we're essentially saying:

1. we want to model probability $p$
2. but we can't directly use linear regression on $p$ (bounded)
3. so we transform $p$ to log-odds (unbounded)
4. model log-odds linearly
5. transform back to probability using sigmoid

this is why the coefficients in logistic regression represent changes in log-odds, and we can exponentiate them ($e^{\beta}$) to get odds ratios.
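
for example, if $\beta_1 = 0.7$, then $e^{0.7} \approx 2.01$: holding the other features fixed, a one-unit increase in $X_1$ roughly doubles the odds $\frac{p}{1-p}$ of the positive class.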

but how do we estimate the coefficients ($\beta_0, \beta_1, ..., \beta_n$) that maximize the probability of observing our training data?

to do that, we need MLE (maximum likelihood estimation).

we use MLE to find the coefficients that make our observed data most likely.

first we need the likelihood function. to compute it, we take the probability the model assigns to the actual outcome of each data point, and multiply these together.

for binary classification, the likelihood is

$L(\beta) = \prod_i p(x_i)^{y_i} \cdot (1-p(x_i))^{(1-y_i)}$

where $y_i$ is the true label (0 or 1)

to make optimization easier, we take the log, which converts the product into a sum. this is the log-likelihood:

$$\log L(\beta) = \sum_i \left[ y_i \cdot \log(p(x_i)) + (1-y_i) \cdot \log(1-p(x_i)) \right]$$

unlike linear regression, which has a closed-form solution, logistic regression relies on iterative methods like gradient descent.

the goal is to find the $\beta$ values that maximize this log-likelihood, essentially finding the most probable model given the data.
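
a minimal numpy sketch of that fitting loop - plain gradient ascent on the mean log-likelihood (the learning rate, iteration count, and toy data are arbitrary choices of mine, not a production recipe):

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def fit_logistic(X, y, lr=0.5, n_iters=5000):
    """X: (n, d) feature matrix, y: (n,) array of 0/1 labels."""
    Xb = np.hstack([np.ones((len(X), 1)), X])  # prepend an intercept column
    beta = np.zeros(Xb.shape[1])
    for _ in range(n_iters):
        p = sigmoid(Xb @ beta)                 # current predicted probabilities
        grad = Xb.T @ (y - p) / len(y)         # gradient of the mean log-likelihood
        beta += lr * grad                      # step uphill on the log-likelihood
    return beta

# toy 1-d data: points with larger x are labeled 1
X = np.array([[0.0], [0.5], [1.0], [1.5], [2.0], [2.5]])
y = np.array([0, 0, 0, 1, 1, 1])
print(fit_logistic(X, y))  # [beta_0, beta_1]; beta_1 comes out positive
```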

---

more resources

- visualization by [MLU explain](https://mlu-explain.github.io/logistic-regression/)
- [Concise Implementation of Softmax Regression — Dive into Deep Learning 1.0.3 documentation](https://d2l.ai/chapter_linear-classification/softmax-regression-concise.html)

posts/020425.md

---
title: 'cross encoders and colbert'
tags: 'journal'
date: 'Apr 2, 2025'
---

TIL about [cross encoders](https://osanseviero.github.io/hackerllama/blog/posts/sentence_embeddings2/) and [colbert](https://huggingface.co/colbert-ir/colbertv2.0) while reading about how to introduce joint attention between a query and its candidates, to prepare for my interview.

Cross Encoders:

- process query and document pairs together through a single transformer
- compute relevance scores directly without creating separate embeddings
- highly accurate but computationally expensive for large document collections
- cannot pre-compute document representations, requiring full processing for each query
- use case: re-ranking top-k search results from a first-pass retrieval system (usage sketch below)
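
for intuition, re-ranking with a cross encoder looks roughly like this using the sentence-transformers library (a sketch; assumes that package and this public ms-marco checkpoint, swap in whatever you actually use):

```python
from sentence_transformers import CrossEncoder

model = CrossEncoder("cross-encoder/ms-marco-MiniLM-L-6-v2")

query = "how does logistic regression work"
candidates = [
    "logistic regression models the log-odds as a linear function of features",
    "quicksort is a divide-and-conquer sorting algorithm",
]

# each (query, doc) pair goes through the transformer together -> one score per pair
scores = model.predict([(query, doc) for doc in candidates])
reranked = [doc for _, doc in sorted(zip(scores, candidates), reverse=True)]
```

note that nothing here can be cached per document, which is exactly why cross encoders are used as a second-stage re-ranker rather than for first-pass retrieval.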

ColBERT:

- uses late interaction architecture with separate encodings for queries and documents
- creates contextualized embeddings for **each token** rather than a single vector
- performs efficient token-level interactions between query and document representations
- enables both pre-computation of document representations and fast retrieval
- use case: semantic search over millions of documents with better accuracy than bi-encoders (scoring sketch below)
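
and a minimal numpy sketch of ColBERT-style late interaction - just the MaxSim scoring step, assuming per-token embeddings already exist (shapes and inputs are toy values):

```python
import numpy as np

def maxsim_score(query_emb, doc_emb):
    """query_emb: (n_q, dim), doc_emb: (n_d, dim), rows L2-normalized."""
    sim = query_emb @ doc_emb.T   # cosine similarity for every token pair
    return sim.max(axis=1).sum()  # best doc token per query token, summed

rng = np.random.default_rng(0)
q = rng.normal(size=(4, 128));  q /= np.linalg.norm(q, axis=1, keepdims=True)
d = rng.normal(size=(10, 128)); d /= np.linalg.norm(d, axis=1, keepdims=True)
print(maxsim_score(q, d))  # doc embeddings can be pre-computed and indexed
```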

---

some herbs that made me very heaty today, which i will avoid in the future, especially when i'm stressed:

1. 淮山 (huái shān) - Chinese Yam
2. 党参 (dǎng shēn) - Codonopsis Root
3. 北祈 (běi qí) - probably a typo for 北芪; Astragalus Root
4. 龙眼肉 (lóng yǎn ròu) - Longan Fruit
5. 首乌 (shòu wū) - Fo-Ti Root
6. 当归 (dāng guī) - Angelica Sinensis
7. 熟地 (shú dì) - Prepared Rehmannia Root
8. 黄精 (huáng jīng) - Polygonatum
9. 川芎 (chuān xiōng) - Ligusticum Wallichii
10. 构纪子 (gòu jì zǐ) - possibly 枸杞子, Goji Berries
11. 大枣 (dà zǎo) - Red Dates/Jujube

posts/030425.md

---
title: 'interview day'
tags: 'journal'
date: 'Apr 3, 2025'
---

i went to my ethics class because a missed class = 10% loss in my final grade.

and i had to miss my other class, which was a 7% loss in grade.

i had to choose one.

i left ethics class 30 min early and walked home.

i stayed at transamerica park and went through all the past LC questions i've studied, cramming them into my brain, hoping i would hit the jackpot on one of them.

i went home at 1 pm when i started to get hungry.

i heated up my chinese herbal chicken soup and ate it with some rice.

i spent the next hour and a half revising ML concepts and some DP recurrence relations.

the clock was inching closer to 3 pm. i was growing more and more anxious. this was my first ever big tech interview.

it started out with introductions, then the leetcode question dropped in the coderpad. i panicked at first, then felt relief when i realized i'd seen this question before, then felt my heart drop when i was asked to prove the solution. i failed to produce the proof and only started writing my code 20 min in. before i could finish it, i was asked to implement quicksort, and i froze, because the last time i learned that was 3 years ago in my DSA class in iowa. after that was over, i talked about my research for the rest of the interview and asked a few questions about their semantic search system.

i felt like i blew it after the interview. i was immediately frustrated that i could not produce the proof and did not even finish writing my code. i could've done better at explaining my research and answering their questions, too. i went to A's apartment to chill and relax. after that we went home and i finished watching past lives with T. i also claimed the chatgpt free offer for students and generated 20 ghibli images of her and us.

this was another major defining moment of my life. hopefully the interviews that come after feel easier, not technically, but in trusting myself and in God. there was really no reason to feel like this interview was life or death. i don't define my future. i have no control. i just do my best and let God handle the rest.

i can go back to my usual life now. no more leetcoding and ml design studying every waking second of my life. the past 7 days were intense, and it can only go up from here.

posts/040425.md

---
title: 'post interview low'
tags: 'journal'
date: 'Apr 4, 2025'
---

i went to bed with my mind still on the questions i couldn't answer. i woke up at 6:30 am and couldn't fall back asleep thinking about them. i eventually got up and took the call with J and mafia to help with her commencement speech. the questions started circling in my head again and i started looking into the answers. i also got my interviewer's email during this time, and i thought of sending an email thanking them, along with the solutions. more for my own peace of mind than an attempt to redeem myself.

i got the rejection email soon after sending that email. i was 85% certain that i wouldn't be passing, but i still felt the sting. 7 days of non-stop prep and full focus on leetcode and ml system design, all for nothing. i couldn't even answer basic questions. i went back to nap because of the lack of sleep, then dragged myself out of bed again to go to the seminar on campus. i'd be late if i walked, so i took a lime scooter for the first time. they're so expensive: it's 55c/min, but with tax and other fees a 10 minute ride came to about $8. after the seminar i stayed in class, working on homework. the lecture was about U-Nets and word2vec, and bayes' theorem in language modeling and markov models.

when i walked back from class, i felt the low of the rejection. i felt like an idiot. in hindsight, if i had just given the problem more thought, if i had better mathematical foundations, if i had the ability to grok a concept, and if i didn't breeze through details and actually spent time understanding things at a deeper level, i wouldn't have had trouble in that interview. i kept blaming myself. i blew my only opportunity to work in big tech, a cushy job with high pay, working with some of the smartest people in the bay.

that was the hardest interview i've had in my life, and this past week has been intense. i acknowledge that maybe this isn't what God wants for me now. He has something greater planned for me. i have to remember i'm merely following where God leads me. there are more doors ahead of me.

posts/050425.md

---
title: 'do not worry'
tags: 'journal'
date: 'Apr 5, 2025'
---

> 25 “Therefore I tell you, do not worry about your life, what you will eat or drink; or about your body, what you will wear. Is not life more than food, and the body more than clothes? 26 Look at the birds of the air; they do not sow or reap or store away in barns, and yet your heavenly Father feeds them. Are you not much more valuable than they? 27 Can any one of you by worrying add a single hour to your life?
>
> 28 “And why do you worry about clothes? See how the flowers of the field grow. They do not labor or spin. 29 Yet I tell you that not even Solomon in all his splendor was dressed like one of these. 30 If that is how God clothes the grass of the field, which is here today and tomorrow is thrown into the fire, will he not much more clothe you—you of little faith? 31 So do not worry, saying, ‘What shall we eat?’ or ‘What shall we drink?’ or ‘What shall we wear?’ 32 For the pagans run after all these things, and your heavenly Father knows that you need them. 33 But seek first his kingdom and his righteousness, and all these things will be given to you as well. 34 Therefore do not worry about tomorrow, for tomorrow will worry about itself. Each day has enough trouble of its own.
>
> Matthew 6:25-34 NIV

i spent the morning in bed on instagram and twitter and youtube. the rejection got me feeling low. i watched the bible project video on [Jesus’ Perspective on Wealth](https://www.youtube.com/watch?v=GpqOdHV3dmU). it was an important reminder not to seek wealth or to collect and hoard, because we take nothing with us once we leave this world.

after lunch, we went to la promenade cafe at 3 pm. i spent the whole time from 4 till 8 pm working on the sesame ai contrary research report and my research paper.

i bought 葱油饼 (scallion pancakes) from the shanghai place beside because i got so hungry. it was not worth $10. we ate sushi at TERUAKI near the bus stop. i felt carsick on the #1 bus on the way back to chinatown. i am not a fan of taking the bus.

started watching ocean waves with T. the movie is not making sense to me so far; the storyline does not feel connected. i can't figure out what the protagonist's main goal is. are we just witnessing a love triangle, or is there more? it's going a bit slow so far. i guess it'll become more interesting later.
