I know I haven't posted in a while, but I thought that this little revelation I had was kinda important.
Please, no one take offense at this; I'm not directing it at any of my friends, since very few people I know (if any) are like this. I'm not trying to start any fights, just trying to talk about how creeped out I am right now.
I am now genuinely scared of certain hardcore Christians. I just watched bits and pieces of "Jesus Camp," and it really freaked me out how the adults seemed to be manipulating the children for their own benefit. I'm sure part of it is just tradition, parents passing on what they believe are good morals, but it scares me nonetheless.
I feel like, as a father, my job would be to lead my children toward a path that's appropriate for them, right? So how would I know that Christianity has the appropriate morals for them?
Even scarier is the feeling that perhaps I was somewhat brainwashed from birth, y'know? What if I was the product of Evangelicals trying to gain a majority in the US government? Just the things the big-name pastor says in the middle of the movie, plus what's on the tapes the woman listens to at the close of the movie, scare me. It's obvious that there are political motives in the churches (maybe only among these people, but these people are REALLY influential). That completely contradicts and compromises our long-standing doctrine of "Separation of Church and State." The whole idea of that was to maintain the ideals our nation was built on, a nation they claim every second of the movie to believe in wholeheartedly. So... why try to reverse it?
The whole thing just really freaks me out, and I'm seriously considering moving to a different country as soon as I get my degree.