College was an interesting experience for me, because I really hated it.
Every so often I meet teenage women of color (I give back and that crap, so I volunteer at places). They think I’m so neat, they’re impressed that I went to college, and they want to know what school I went to. I get conflicted about what to tell them. Should I tell them the truth? My truth seems so harsh.
Most people of color say something like this: “I loved college, it made me such a better person and blah, blah, I was so happy to have the opportunity to learn to kiss ass properly.”
I hate the taste of ass.