I live in the US, where circumcision is the norm and is usually done before a male can give consent. It's often cited as a good thing because many believe that the circumcised are cleaner than those who are not, and that by doing it when boys are so young, they won't remember the pain and will heal faster. The question is: is it moral to alter someone's body without their consent when it is not a medical emergency, and if not, why is it such a universally accepted practice here, to the point that many parents must specifically tell doctors NOT to do it (and even then it could be done anyway)? Is there a positive side to circumcision without consent, or a negative side to leaving it untouched? Are the motivations religiously driven or driven by society? Should it be allowed as an option for anyone, only for those of specific faiths which require it, or not at all?
Originally I was on the side saying it should be done. Then I talked to men from both groups (friends, online friends, the two men I've dated, and even my dad!) and found that it had a negative impact on them. The older ones found they were more likely to have bedroom difficulties if they had been circumcised than those who hadn't, and they were less sensitive in general. I even spoke to a man who was angry at his parents for allowing it to happen to him without his consent and who hoped he would one day be able to have it restored, in a sense. Some men said it was probably cleaner and that women liked how it looked because the natural ones looked kind of weird (societal norms, I guess), but the negatives outweighed the positives, especially for the older ones. The only person I talked to who believed it should be mandatory was a female nurse who said that in the nursing home environment the circumcised ones were easier to care for, and you didn't have to worry about forgetting to roll the foreskin back in place and cutting off circulation. I will admit that all I've said is anecdotal, but it bothers me as a woman who wants to have kids, and I would love to hear valid points from both sides.