Outside of work, when’s it ever beneficial to tell someone you’re a doctor?

Kind of a serious question. Is it ever beneficial to disclose that you're a doctor to people outside of work? Do you find people treating you better or worse because of it? (And I know I'm about to get an onslaught of "dating apps," but I'm not talking about that lol)