arr = Array(10).fill(false)
arr[-2] = true
console.log(arr)
// [false, false, false, false, false, false, false, false, false, false, -2: true]
console.log(arr.length) // 10
I'm surprised that assigning to a negative index of an array adds a key-value pair to it. And why is the length of the array not incremented?
Here is the definition from MDN's Array.length:
The length property of an object which is an instance of type Array sets or returns the number of elements in that array. The value is an unsigned, 32-bit integer that is always numerically greater than the highest index in the array.
It only counts the 0-based indexes up to the highest numerical one. Anything invalid or negative is ignored. So the length is not necessarily the number of elements that you visually see.
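A quick sketch of this: in a sparse array, length is one more than the highest index, not the count of elements actually present.

```javascript
// Only indexes 0 and 99 are set, yet length is highest index + 1,
// not the number of elements actually stored.
const sparse = [];
sparse[0] = "a";
sparse[99] = "b";
console.log(sparse.length);              // 100
console.log(Object.keys(sparse).length); // 2 entries actually exist
```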
Relationship between length and numerical properties (MDN):
Arrays cannot use strings as element indexes (as in an associative array) but must use integers. Setting or accessing via non-integers using bracket notation (or dot notation) will not set or retrieve an element from the array list itself, but will set or access a variable associated with that array's object property collection. The array's object properties and list of array elements are separate, and the array's traversal and mutation operations cannot be applied to these named properties.
This passage describes the difference between setting an element and setting a property. Anything set outside the valid index range of the array is treated as a property, just as on any other object.
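To illustrate the separation the MDN quote describes, here is a small sketch showing that traversal and serialization only see the element list, while the named properties stay reachable on the object:

```javascript
const list = [1, 2, 3];
list[-2] = "prop";    // stored as property "-2", not as an element
list.hello = "world"; // same for any non-index key

// Array traversal and mutation operations only see the element list:
list.forEach(x => console.log(x));  // 1, 2, 3
console.log(JSON.stringify(list)); // [1,2,3]
console.log(list.length);          // 3

// But the properties are still there, on the object:
console.log(list["-2"], list.hello); // prop world
```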
Arrays are numerically indexed, but the tricky thing is that they also are objects that can have string keys/properties added to them (but which don't count toward the length of the array):
Indexes in an array start from 0 onwards, i.e. they must be non-negative integers. When you access a negative index like a[-1], it acts as a key, and -1 gets stored as the string property "-1":
let a = [];
a[0] = 9;
a[-1] = 10; // acts as key "-1" added to object a with value 10
console.log(a); // [9, -1: 10]
console.log(a["-1"]); // 10
console.log("Length of array " + a.length); // 1, key "-1" does not contribute to the length of the array
Just to make it clear that this is standard behavior and not lazy developer work, here is the relevant part of the official spec for the Array object ([[DefineOwnProperty]] on Array exotic objects):
1. Assert: IsPropertyKey(P) is true.
2. If P is "length", then
   a. Return ? ArraySetLength(A, Desc).
3. Else if P is an array index, then
   a. Let oldLenDesc be OrdinaryGetOwnProperty(A, "length").
   b. Assert: oldLenDesc will never be undefined or an accessor descriptor because Array objects are created with a length data property that cannot be deleted or reconfigured.
   c. Let oldLen be oldLenDesc.[[Value]].
   d. Let index be ! ToUint32(P).
   e. If index ≥ oldLen and oldLenDesc.[[Writable]] is false, return false.
   f. Let succeeded be ! OrdinaryDefineOwnProperty(A, P, Desc).
   g. If succeeded is false, return false.
   h. If index ≥ oldLen, then
      i. Set oldLenDesc.[[Value]] to index + 1.
      ii. Let succeeded be OrdinaryDefineOwnProperty(A, "length", oldLenDesc).
      iii. Assert: succeeded is true.
   i. Return true.
4. Return OrdinaryDefineOwnProperty(A, P, Desc).
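A sketch of the two behaviors these steps describe, using only plain JavaScript: a write at an index ≥ length bumps length to index + 1 (step 3.h), and the write is rejected when length is not writable (step 3.e):

```javascript
const grow = [1, 2];
grow[5] = "z";            // index 5 ≥ old length 2, so length is set to 5 + 1
console.log(grow.length); // 6

// Make length non-writable; per step 3.e, a write at index ≥ length must now fail
Object.defineProperty(grow, "length", { writable: false });
try {
  grow[10] = "nope";      // rejected: silently in sloppy mode, TypeError in strict mode
} catch (e) {}
console.log(10 in grow);  // false, the element was never created
console.log(grow.length); // still 6
```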
So the array will treat the key as an array index if it matches the array index definition, which is :
An integer index is a String-valued property key that is a canonical numeric String (see 7.1.16) and whose numeric value is either +0 or a positive integer ≤ 2^53 − 1. An array index is an integer index whose numeric value i is in the range +0 ≤ i < 2^32 − 1.
Otherwise it will treat the key as an ordinary property.
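The "canonical numeric String" part matters: a key is only an array index if it is exactly the string a Number would produce. A sketch of the distinction:

```javascript
const idx = [];
idx["2"] = "x";  // "2" === String(2), a canonical numeric string, so a real array index
idx["02"] = "y"; // "02" !== String(2), not canonical, so an ordinary property

console.log(idx.length);       // 3, the write at index 2 bumped the length
console.log(idx[2]);           // x
console.log(idx["02"]);        // y, stored as a plain property
console.log(Object.keys(idx)); // [ '2', '02' ]
```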