Hi,
This seems like a dumb question, but does anyone know of any healthcare jobs/career paths I could go into that deal with caring for women and educating women on women's health (reproductive health, etc.) but specifically DO NOT involve pregnancy of any kind? Once a patient becomes pregnant, I want that to be absolutely outside my territory unless they're looking to not be pregnant anymore. This is nothing against pregnancy or pregnant women or babies (I love babies), it's just super not my specialty.
Is that just what gynecologists do? Where I'm from, OB is always combined with GYN, and I have absolutely no interest in obstetrics. I do, however, have a strong interest in women's clinics and women's healthcare.
I worked for many years as a CNA and then as an ER tech. I loved working as a tech, and I loved the ER, but I ultimately left healthcare because of toxic hospital environments. I think the only job I'd go back for is one where I can serve female patients. I understand that many patient groups are underserved, but I specifically saw a lot of doctors and nurses not believing women or dismissing their symptoms. I'm motivated by the whole "be the change you want to see" idea. I can't do a lot, but I'd like to create a safe space for female patients. School is no object; I'll go back to any school for any number of years if I can find the right end goal. Any suggestions?
Thank you for reading🥹❤️