They don't, because the west banned slavery on moral grounds and then enforced that ban on the rest of the world. That advance is actually just western power in effect.
That ban on slavery is not being enforced on the rest of the world. The west had no problem playing football in Qatar in stadiums built by slave labour.
That seems like a simplistic take, given that slavery in practice still exists and we just decide not to call it slavery due to technical loopholes. The countries most closely tied to the global oil supply, for example, are largely run on slave labour.
"The west" is no longer a well defined thing. America is its own thing now, and I don't think it fits in with any traditional notion of "The West" anymore, outside of historical inclusion. And without America the term just means Europe, so you might as well just refer to things directly instead of coming up with a new term: America, Europe, Canada, etc.
It provides no analytical value anymore to talk about "the west" as a shared family of identities or cultures. That concept was more an ephemeral artifact of colonial history combined with the post-WW2 global landscape and the fact that the US was the last industrialized country left whose industrial base hadn't been bombed to smithereens.
I mentioned Canada in my comments, but only out of vanity as I'm a Canadian. Really, when most people talk about "the west," what they have in their mind's eye is the US and Europe. The other countries are largely considered lesser auxiliaries, including mine (although Canada has gained greater prominence in recent years).
What I don't understand is what analytical value the term "the west" holds anymore, OUTSIDE of that historical artifact. What meaningful statement can you make about "the west," as you define it, these days?