Violence against women in the United States encompasses domestic abuse, murder, sex trafficking, rape, and assault directed at women in the United States. It has been recognized as a public health concern.[1][2] Cultural attitudes in the United States have contributed to the trivialization of violence against women, and media coverage may further lead the public to regard such violence as unimportant.[3]