Many people see a culture war taking place right now. What is considered good and wholesome is open to heated debate. What one side calls traditional family values, the other side calls hate; what one side calls a woman's right to choose, the other calls the murder of the defenseless. What does this mean? Is one side right and the other wrong? Are both sides wrong, or part right and part wrong? And what of society itself? Does the fact that so many people now embrace what was considered seriously disordered, if not outright evil, not so long ago mean that society is "evolving," or that it is degenerating? What does history tell us about this?