There's this trend going on and I don't like it. Women are trying to eliminate manhood! They say: what a man can do, a woman can do better. They have taken over families, women now marry men, etc. The only feat they are yet to achieve is impregnating their fellow women. I hear some extremist feminists are imploring science to explore 'self-fertilization', or better still, they can implore the baboons, or other mammals, to reproduce. And their quest is even aided by a growing band of 'blind' men, in the guise of woman eMANcipation.
In the end, they will find a reason to eliminate men from the surface of the earth, God forbid! It all started with Eve, who preferred to listen to a serpent rather than her husband, which brought us all to this sorry state. WOMEN, what have MEN done?