Center-right politics is a set of political views that usually, but not always, agrees with right-wing politics; where it differs, it is still generally closer to the right than to the left. Center-rightists include Rockefeller Republicans, Red Tories, and conservatives.