I believe one of the reasons our society is so degenerate is that the Western Church (and here I mean the different churches and denominations as a whole) has lost its moral compass and no longer teaches what is right and wrong. "Do whatever thou wilt shall be the foundation of your morality" is the first Satanist commandment, from what I've heard, and that's exactly the position many churches take nowadays.
Say what you want about Islam, but their religious leaders have no problem teaching that nudity in entertainment, immodest clothing, fornication, etc. are grave sins. Yet for modern Christians, everything is a grey area. Like, yeah, living together before marriage is kind of bad, but we should really look at the heart, and their hearts are in the right place, so... Or yeah, divorce is bad, but Jesus would want you to be happy, and if you fell out of love, and this loser wasn't nice enough to you and always forgot your wedding anniversary, then it's OK to divorce him, and so on.
Of course, there are some preachers who attack this status quo, but they are an extreme minority. Most Christians will always talk about not being judgemental and letting the one without sin cast the first stone. This attitude of moral relativism isn't new, either. I believe it was Leo Tolstoy in Anna Karenina who remarked that the only thing people remember out of the Gospel is the woman taken in adultery; all the rest is conveniently forgotten.
All the admonitions against sin, that is. Yet the New Testament mentions hellfire countless times (of course, modern Bible translations try to get rid of the concept altogether by using words like Sheol, which don't carry the same negative connotations). But modern churches mostly avoid talking about it; they are too busy sucking up to politicians so as not to lose their privileges, and inventing reasons why the latest round of liberal degeneracy being pushed is actually OK and shouldn't bother Christians at all.
That is at best; at worst, they will push outright worship of the foreign power and support for suicidal immigration policies as being "Christlike". Is it any wonder young people, especially men, don't want anything to do with it anymore? Women of a certain type often thrive there, though. Not only does the modern church fulfill a social function for them, but it also gives them a perfect opportunity to bring out their inner Karen, or should I say their inner Pharisee, and to virtue-signal.
There cannot be a moral revival of our society without spiritual guidance. In the West, that guidance used to come from Christianity, but unless the Church starts confronting sin again and replaces feminine pearl-clutching with masculine, spiritual-warfare-type activity, I'm afraid some other, more aggressive culture will win in the end.