qwertyboss

The only difference between ordinary and extraordinary is just that little "extra".

Atomic Habits, chapter “The Role of Family and Friends in Shaping Your Habits”

• “We don’t choose our earliest habits, we imitate them. We follow the script handed down by our friends and family, our church or school, our local community and society at large. In many ways, these social norms are the invisible rules that guide your behaviour each day.”

• “We imitate the habits of three groups in particular:
The close.
The many.
The powerful.”

• The human mind knows how to get along with others. It wants to get along with others. This is our natural mode. You can override it—you can choose to ignore the group or to stop caring what other people think—but it takes work. Running against your culture requires extra effort.

• Once we fit in, we start looking for ways to stand out. This is one reason we care so much about the habits of highly effective people. Many of our daily habits are imitations of people we admire or envy.
qwertyboss Author

Late post: I read this yesterday but didn't post it here.


Maybe Makerlog is NOT FOR ME 😓

Does anyone know of a platform for an accountability group like MakerLog, but for posting your learning/habits/side projects in public?

A development analogy would be
- logging level "INFO" is Twitter/Linkedin
- I am looking for a logging level "DEBUG" platform
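To stretch the analogy, a quick Python sketch of what I mean by log levels (the logger name and messages are just made-up illustrations):

```python
import logging

logging.basicConfig(level=logging.DEBUG, format="%(levelname)s: %(message)s")
log = logging.getLogger("buildlog")

# INFO: the polished milestone you'd post on Twitter/LinkedIn
log.info("Shipped v1.0 of my side project!")

# DEBUG: the messy day-to-day detail I want a platform for
log.debug("Tried two validation splits today; both leaked. Retrying tomorrow.")
```

With the level set to `DEBUG`, both messages come through; most public platforms effectively filter at `INFO`.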

You can literally put anything here; there is no restriction that you should only write about indie projects.

qwertyboss Author

Thanks a ton Kenny! Because of your words I think I have better clarity now. 🤝


Landmark recognition: 1 hour 🗺

4 / 20 hours
Training on 100 GB of data is turning out to be more difficult than expected.

ML from Scratch: Linear Regression

Found an interesting resource for a few algorithms -> https://dafriedman97.github.io/mlbook/content/c1/construction.html
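In that spirit, here is a minimal from-scratch sketch of my own (closed-form ordinary least squares via the normal equations; this is not code from the linked book):

```python
import numpy as np

def fit_linear_regression(X, y):
    """Ordinary least squares: prepend a bias column, then solve
    the least-squares system X beta ~= y."""
    X = np.column_stack([np.ones(len(X)), X])  # add intercept term
    # lstsq is numerically safer than explicitly inverting X^T X
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta  # [intercept, slope_1, slope_2, ...]

# Tiny usage example on exact data generated by y = 2x + 1
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([1.0, 3.0, 5.0, 7.0])
print(fit_linear_regression(X, y))  # ~[1. 2.]
```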

Landmark recognition 🌍: 1 hour

3 / 20 hours
Current: Baseline model
Next: CV strategy

Read Atomic Habits Chapter of 20: How to Make Habits Irresistible

> Temptation bundling
• Desire is the engine that drives behaviour. Every action is taken because of the anticipation that precedes it. It is the craving that leads to the response. It is as if the brain is saying, “See! I knew I was right. Don’t forget to repeat this action next time.

• “Habits are a dopamine-driven feedback loop. Every behaviour that is highly habit-forming (taking drugs, eating junk food, playing video games, watching porn, browsing social media) is associated with a high level of dopamine. And whenever dopamine rises, so does your motivation to act.”

• Example: Use The habit stacking + temptation bundling formula.

“After [CURRENT HABIT WHICH I DO], I will [HABIT I NEED TO DO].
After [HABIT I NEED TO DO], I will [HABIT I WANT TO DO].”

“If you want to watch sports, but you need to make sales calls:
After I get back from my lunch break, I will call three potential clients (need).
After I call three potential clients, I will check ESPN (want).”

Workout: Push day 💪🏼

I had late meetings, so I did a cheat workout between the calls: only 1 set per muscle group.
Carl Poppa 🛸

that's more than most people do. good work!

qwertyboss Author

Thanks for the support Carl 😅


Landmark recognition: 1 hour

2 / 20 hours done

Setting up the data pipeline and experimenting to find the right validation strategy.

Read Atomic Habits, Chapter 7 of 20: "The Secret to Self-Control"

> Cue-induced wanting

“The people with the best self-control are typically the ones who need to use it the least.”

“It turns out that people who appear to have tremendous self-control aren’t all that different from those who are struggling. Instead, ‘disciplined’ people are better at structuring their lives in a way that does not require heroic willpower and self-discipline.”

ML from Scratch: Multi-Layer Perceptron (1/2)

Note: It is important to break the symmetry during weight initialization, or else all the neurons in a layer compute the same activation, receive identical gradients, and never learn different features.
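A minimal NumPy sketch of that note (a made-up 2-input, 3-hidden-unit layer, not the actual course code): with identical initial weights, every hidden unit produces the same activations, so their gradients match and they can never differentiate; random init breaks this.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 2))  # 4 samples, 2 features

# Symmetric init: every hidden unit starts with the same weight column...
W_sym = np.full((2, 3), 0.5)
h_sym = np.tanh(x @ W_sym)
# ...so all hidden activations (and therefore gradients) are identical.
print(np.allclose(h_sym[:, 0], h_sym[:, 1]))  # True

# Random init breaks the symmetry: each unit gets different weights,
# hence different activations and different gradient updates.
W_rnd = rng.normal(scale=0.1, size=(2, 3))
h_rnd = np.tanh(x @ W_rnd)
print(np.allclose(h_rnd[:, 0], h_rnd[:, 1]))  # False
```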

ML from Scratch: Logistic Regression

- Data cleanup: encode features, standardise features, etc.
This is necessary because the sigmoid saturates for very large and very small inputs, so unscaled features flatten its gradient.

- Weight initialization
If the weights are too small you run into the vanishing gradient problem; if they are too large, the exploding gradient problem. Instead, initialize the weights within a suitable range.
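The two points above can be sketched together (my own NumPy illustration with made-up data, not from any particular resource): standardize the features so the sigmoid isn't saturated, and draw initial weights from a small range.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(42)
X_raw = rng.normal(loc=1000.0, scale=300.0, size=(5, 3))  # large, unscaled features

# Data cleanup: standardize so the inputs to the sigmoid stay in a moderate range
X = (X_raw - X_raw.mean(axis=0)) / X_raw.std(axis=0)

# Weight initialization: small random values in a specific range, e.g. [-0.1, 0.1],
# so z = X @ w neither vanishes nor pushes the sigmoid into its flat tails
w = rng.uniform(-0.1, 0.1, size=3)

p = sigmoid(X @ w)
# The sigmoid's gradient p*(1-p) is largest near p = 0.5; saturation kills it
print(p.min(), p.max())  # probabilities stay in a moderate band, away from 0 and 1
```

Feeding `X_raw` straight in instead would drive `z` into the hundreds and the outputs to exactly 0 or 1, where the gradient is effectively zero.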
qwertyboss Author

Breaking the Weight initialization symmetry and why it matters: https://towardsdatascience.com/neural-network-breaking-the-symmetry-e04f963395dd


Workout: Push day 💪🏼

I didn't have enough energy, so I did a cheat workout with only 1 set per muscle group.

Started with new kaggle competition

This is an experimental challenge: I will work only a total of 20 hours on this project (excluding model training), and let's see how much progress I can make.

Competition link: https://www.kaggle.com/c/landmark-recognition-2021/overview/evaluation

Last Date: 2nd October

Revise basic stuff

Create a list of basic ML topics for which you want to learn the theory, code, and maths.