Over the past few years, I’ve noticed friends and acquaintances picking up golf (and pickleball, but that’s another story). As someone who has no interest in that sport whatsoever, I’m OK with just watching them enjoy it because they like playing. But when I hear that they learned golf solely so they could rub elbows with their company’s C-suite (and am told that I should too if I want to move up), it rubs me the wrong way.
I get it. In movies, we see the stereotypical portrayal of multi-million-dollar deals getting closed on golf courses. Maybe that’s where this idea comes from. But the notion that I need to subject myself to this ritual just to prove I’m capable of climbing whatever hypothetical ladder irks me. It feels antiquated, exclusionary, and downright insulting.
As a woman who has worked (and kind of still does) in a mostly male-dominated industry, this hit a raw nerve. It’s not just about being told to learn golf… it’s about the underlying assumption that my hard work and skills are not enough. Instead of being recognized for what I bring to the table, I have to perfect my hole-in-one just to earn a seat at it.