History of Zero-based Months?

JavaScript is a language of many silly things, and one of them is:

> new Date()
Wed Aug 24 2022 …
> new Date().getFullYear()
2022
> new Date().getMonth()
7
> new Date().getDate()
24

It represents 2022-08-24 as (2022, 7, 24): the year and the day of the month are the values you'd expect, but the month is zero-based.
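In practice this means anyone assembling a date string by hand has to remember to add one to the month but not to the day. A minimal sketch of the resulting boilerplate, using nothing beyond the standard Date methods (formatIsoDate is a made-up name, not a built-in):

// getMonth() is zero-based, so add 1 before padding;
// getFullYear() and getDate() can be used as-is.
function formatIsoDate(d) {
  const year = d.getFullYear();
  const month = String(d.getMonth() + 1).padStart(2, '0');
  const day = String(d.getDate()).padStart(2, '0');
  return `${year}-${month}-${day}`;
}

> formatIsoDate(new Date(2022, 7, 24))  // the constructor's month is zero-based too
'2022-08-24'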

In this case, however, the problem was copied from Java:

getMonth: The value returned is between 0 and 11, with the value 0 representing January.

getDate: The value returned is between 1 and 31 representing the day of the month.

I’d love to blame Java, but they seem to have copied the problem from C:

localtime(3):
tm_mday: The day of the month, in the range 1 to 31.
tm_mon:  The number of months since January, in the range 0 to 11.

Looking at the Unix History repo, the first mention of “month (0-11)” is in 1973's Research Unix V4:

The value is a pointer to an array whose components are

0      seconds
1      minutes
2      hours
3      day of the month (1-31)
4      month (0-11)
5      year - 1900
6      day of the week (Sunday = 0)

While this may have been an original decision by Dennis Ritchie, it’s also possible it was copied from an even earlier system. Does anyone know?
