The last 2 years have taught me to basically never believe:
— The media
— The government
— Pharma companies
— Any woke companies
— Anyone who got a needle in their arm
I simply refuse to believe them.
And why would you?
They’ve proven time and time again that they do not have your best interests in mind. And this extends to other parts of life, too.
Namely, college…
Now, I’m not saying that college is good or bad.
Simply put, college is not required to succeed in life. You do not need it. You only “need” it to get your foot in the door at most jobs, and even then, you can probably find a way around it if you’re clever enough (i.e. work an internship for a summer; it’s far cheaper, more fun, and better work experience than college).
It got me thinking:
Why do colleges rarely offer sales degrees?
It makes no sense.
You can study almost anything else that’s connected to business, but it’s near-impossible to find a full-on Sales degree (it’s always lumped in with something else, like “Sales & Marketing”).
Naturally, colleges don’t want people to learn skills like sales. They want you to learn a skill that puts you into a box.
Society does, too.
They want you in the box.
Listening to everything they say.
Unwilling or unable to look outside it.
And they want you in that box until the end of time.
That box becomes a cubicle and then eventually becomes a grave.
Life is too short to not do what you want to do with it.
That’s why I write daily emails. To help the everyday person have a shot at achieving all of it.
You’ll learn how to start, scale, and systemize a business that can easily provide the life that you desire.
Get those daily emails here: