After starting, commenting on, and just reading hundreds of threads on this forum, I've noticed something. A few of you who are American seem to blame America for everything. It doesn't matter what subject is brought up... "America brought this on" is a response I'm seeing more and more frequently. Do most of you believe this and just don't speak up, or is it a minority that's just very vocal? If it's the latter, what do you suggest America should do politically to make us perfect, if that's even attainable? I am keeping an open mind here.