I am a little unsure about writing this, as I do not want to upset anyone and I know this is a sensitive subject for many. My question is: why do the American people feel it necessary to carry, or to have the right to carry, guns? It seems so strange that in what is supposed to be a sophisticated society, people feel they need to carry a weapon like that. Every time there is a shooting incident, like there has been this week, it's all over the news here in the UK — how easy it is to get a gun in America and the right to carry one. I just don't get it. It's not the Wild West anymore, is it? Why do they feel so threatened? What's the need?