I just found out about this debate, and it’s patently absurd. The ISO 80000-2 standard defines ℕ as including 0, and that convention is foundational in basically all of mathematics and computer science. Excluding 0 is a fringe position and shouldn’t be taken seriously.
I could be completely wrong, but I doubt any of my (US) professors would reference an ISO definition, and they may not even know it exists. Mathematicians in my experience are far less concerned about the terminology or symbols used to describe something as long as they’re clearly defined. In fact, they’ll probably make up their own symbology just because it’s slightly more convenient for their proof.
My experience (bachelor’s in math and physics, but I went into physics) is that if you want to be clear about including zero or not, you add a subscript or superscript to specify. For the non-negative integers you add a subscript zero (ℕ_0). For the strictly positive naturals you write either ℕ_1 or ℕ^+.
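For instance, a quick LaTeX sketch of that notation (using `amssymb`’s `\mathbb`; the set-builder lists are just for illustration):

```latex
% Zero included (non-negative integers):
\mathbb{N}_0 = \{0, 1, 2, 3, \dots\}
% Zero excluded (strictly positive integers):
\mathbb{N}_1 = \mathbb{N}^{+} = \{1, 2, 3, \dots\}
```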
From what I understand, you can pay ISO to standardise anything. So it’s only useful for interoperability.
Yeah, interoperability. Like every software implementation of the natural numbers, all of which include 0.
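To make that concrete, here’s a minimal Lean sketch of the standard Peano-style encoding (it mirrors how Lean defines its own `Nat`; `MyNat` and `add` are just illustrative names):

```lean
-- Peano-style naturals: zero is the base case of the type,
-- so 0 exists before any other number can be constructed.
inductive MyNat where
  | zero : MyNat
  | succ : MyNat → MyNat

-- Addition by recursion on the second argument;
-- zero acts as the identity: add n zero = n.
def MyNat.add : MyNat → MyNat → MyNat
  | n, .zero   => n
  | n, .succ m => .succ (MyNat.add n m)
```

Dropping zero would cost you both the additive identity and the natural base case for recursion, which is why implementations keep it.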
Ehh, among American academic mathematicians, including 0 is the fringe position. It’s not a “debate”; it’s just a different convention. There are numerous ISO conventions that would look highly unusual in American academia.
FWIW I was taught that the inclusion of 0 is a French tradition.
I’m an American mathematician, and I’ve never experienced a situation where 0 being an element of the naturals was called out as wrong. It’s less ubiquitous than I’d like it to be, but at worst the two are treated as equally viable notational conventions, or the question is left undecided.
I’ve always used ℕ to indicate the naturals including 0, and that’s what was taught to me in my foundations class.
Of course they’re considered equally viable conventions; it’s just that one is prevalent among Americans and the other isn’t.
This isn’t strictly true. I went to school for math in America, and I don’t think I’ve ever encountered a zero-exclusive definition of the natural numbers.
I have yet to meet a single logician, American or otherwise, who would use the definition without 0.
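Which makes sense: in the standard set-theoretic construction (the von Neumann ordinals), 0 is the empty set, and everything else is built from it by the successor operation:

```latex
0 = \varnothing, \qquad 1 = \{0\}, \qquad 2 = \{0, 1\}, \qquad
s(n) = n \cup \{n\}
```

Excluding 0 would mean throwing away the very first object the construction produces.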
That said, it seems to depend on the field. I think I’ve had this discussion with a friend working in analysis.