How do you come to this conclusion? Walk into any bookstore (at least if you live in a state where they still exist) and you'll find that 90% of the "relationships and self-help" section consists of books by women telling men how to be.
What?
How do you come to the conclusion that I think men should read books written by women? In general, one should read books by successful people.