Here in the U.S., there's almost an unspoken culture of looking down on people who try to better themselves. I believe you've mentioned it before, and this is definitely an extension of that.
With education, the topic gets even nastier, because society simultaneously pushes the narrative that you should educate yourself. So... some people will devalue your merits and knowledge no matter how learned you become, while others will never even look in your direction if you don't have a degree. It's such a toxic dynamic.
In my opinion, education will always be our most powerful tool against the dog-water nonsense we have to deal with as Americans. Purely in terms of knowledge, no matter what you major in, higher education is life-changing. So yes... when it comes to going to college, it's worth weathering the storm of disrespect to gain a clearer, more informed perspective on the world and all its moving parts.
Another excellent, thought-provoking piece.