A leap year is a year that is divisible by 4, except for years that are divisible by 100 but not by 400.
In simpler terms: if a year is divisible by 4, it is a leap year, unless it is also divisible by 100. However, if it is divisible by 400, it is still a leap year.
This means that the year 2000 was a leap year, even though it was divisible by 100, because it was also divisible by 400.
To check whether a given year is a leap year in JavaScript, you can use the following code:
function isLeapYear(year) {
  if (year % 4 === 0) {
    if (year % 100 === 0) {
      if (year % 400 === 0) {
        return true;
      } else {
        return false;
      }
    } else {
      return true;
    }
  } else {
    return false;
  }
}
This function takes a year as its argument and returns true if the year is a leap year, and false otherwise.
It first checks whether the year is divisible by 4 using the remainder operator (%).
If it is not divisible by 4, then it is not a leap year and the function returns false.
If the year is divisible by 4, then it checks whether it is divisible by 100. If it is divisible by 100, then it checks whether it is also divisible by 400.
If it is divisible by 400, then it is a leap year and the function returns true. If it is not divisible by 400, then it is not a leap year and the function returns false.
If the year is divisible by 4 but not by 100, then it is a leap year and the function returns true.
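The nested conditionals above can also be collapsed into a single boolean expression. The following sketch is logically equivalent to the function above:

```javascript
// Condensed version of the nested checks:
// divisible by 4, and either not a century year or divisible by 400.
function isLeapYear(year) {
  return year % 4 === 0 && (year % 100 !== 0 || year % 400 === 0);
}

console.log(isLeapYear(2024)); // true  (divisible by 4, not by 100)
console.log(isLeapYear(2100)); // false (divisible by 100, not by 400)
```

Both versions return the same result for every year; the condensed form simply encodes the three rules in one expression.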
To use this function, you can call it with a year as its argument, like this:
console.log(isLeapYear(2000)); // true
console.log(isLeapYear(2004)); // true
console.log(isLeapYear(2100)); // false
console.log(isLeapYear(1900)); // false
In this example, the console.log statements call the isLeapYear function with various years as arguments and output whether each year is a leap year or not.
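As a sketch of an alternative approach that avoids the divisibility arithmetic entirely, the built-in Date object can be asked whether February 29 exists in a given year: when constructed with an out-of-range day, Date rolls the date forward, so February 29 of a non-leap year becomes March 1:

```javascript
// Alternative sketch: let the Date object decide.
// new Date(year, 1, 29) is February 29 (months are zero-based);
// in a non-leap year it rolls over to March 1, so getMonth()
// returns 2 (March) instead of 1 (February).
// Note: this constructor interprets years 0-99 as 1900 + year.
function isLeapYearViaDate(year) {
  return new Date(year, 1, 29).getMonth() === 1;
}

console.log(isLeapYearViaDate(2000)); // true
console.log(isLeapYearViaDate(1900)); // false
```

The name isLeapYearViaDate is only illustrative; this variant trades a small amount of clarity for not having to remember the century rule.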
In conclusion, the JavaScript function provided above can be used to determine whether a given year is a leap year or not.