Ehh, among American academic mathematicians, including 0 is the fringe position. It’s not a “debate”; it’s just a different convention. There are numerous conventions in ISO standards that would be highly unusual in American academia (ISO 80000-2, for one, defines ℕ to include 0).
FWIW, I was taught that the inclusion of 0 is a French tradition (it’s the Bourbaki convention).
This isn’t strictly true. I went to school for math in America, and I don’t think I’ve ever encountered a zero-exclusive definition of the natural numbers.
I’m an American mathematician, and I’ve never experienced a situation where 0 being an element of the naturals was called out as wrong. The convention is less ubiquitous than I’d like it to be, but at worst the two are treated as equally viable notational conventions, or the question is left undecided.
I’ve always used ℕ to indicate the naturals including 0, and that’s what was taught to me in my foundations class.
Of course they’re considered equally viable conventions; it’s just that one is prevalent among Americans and the other isn’t.
I have yet to meet a single logician, American or otherwise, who would use the definition without 0.
That said, it seems to depend on the field. I think I’ve had this discussion with a friend working in analysis.
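FWIW, the logicians’ preference is baked into the tools they use. Here’s a minimal Lean 4 sketch of the Peano-style definition (MyNat is a hypothetical stand-in, just so it doesn’t clash with Lean’s built-in Nat, which is defined the same way with zero as the base constructor):

    -- Peano-style naturals: zero is the base case, so 0 is a natural
    -- number by construction. Lean's built-in `Nat` is defined this way.
    inductive MyNat where
      | zero : MyNat                 -- 0
      | succ : MyNat → MyNat         -- n + 1

    -- Addition recurses on the second argument; `zero` gives the clean
    -- base case m + 0 = m. Starting the type at 1 instead would make
    -- every such base case clumsier to state.
    def MyNat.add : MyNat → MyNat → MyNat
      | m, .zero   => m
      | m, .succ n => .succ (MyNat.add m n)

The same zero-based definition shows up in Coq’s nat and in the von Neumann construction (0 = ∅), which is probably why set theorists and type theorists treat 0 ∈ ℕ as the default.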