Consider the following:
let x = 0;
console.log(x)
// Prints 0
console.log(x++)
// Prints 0
console.log(x)
// Prints 1

But on the other hand:
let y = 0;
console.log(y)
// Prints 0
console.log(++y)
// Prints 1
console.log(y)
// Prints 1

And sure. But this one works as one would expect:
let z = 0;
console.log(z)
// Prints 0
console.log(z+1)
// Prints 1
console.log(z)
// Prints 0 (z + 1 doesn't mutate z)

Comparison. Consider this scenario:
let i = 0;
console.log(++i == i++)
// Prints true

let x = 0;
console.log(x++)
// Prints 0
let y = 0;
console.log(++y)
// Prints 1

And also (this one broke me):
let x = 0;
console.log(x++ == ++x)
// Prints false
x = 0;
console.log(++x == x++)
// Prints true

I think it's because ++x is more like what one usually intends: add 1 to x and return the new value. Whereas x++ is saying: return x as it is now, then add one.
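One way to picture that model is with a pair of helper functions (the names preIncrement and postIncrement are hypothetical, purely for illustration; this is not how JavaScript actually implements the operators):

let n = { value: 0 };

// Stand-in for ++x: bump first, then hand back the new value.
function preIncrement(obj) {
  obj.value = obj.value + 1;
  return obj.value;
}

// Stand-in for x++: hand back the old value, bumping on the way out.
function postIncrement(obj) {
  const old = obj.value;
  obj.value = obj.value + 1;
  return old;
}

console.log(preIncrement(n));  // Prints 1 (n.value is 1)
console.log(postIncrement(n)); // Prints 1 (n.value is now 2)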
So x++ == ++x returns false because it returns x first:
let x = 0;
x++ == ++x
// x++ yields 0 (and bumps x to 1), then ++x bumps x to 2 and yields 2
0 == 2 // false

The other one is true because:
let x = 0;
++x == x++
// ++x bumps x to 1 and yields 1, then x++ yields 1 (and bumps x to 2 on its way out)
1 == 1 // true
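Here are both comparisons desugared with explicit temporaries, which is roughly how I read the order of evaluation (the left/right variables are mine, just for illustration):

let x = 0;

// x++ == ++x, step by step:
const left = x;  // x++ yields the old value, 0...
x = x + 1;       // ...and bumps x to 1
x = x + 1;       // ++x bumps x to 2...
const right = x; // ...and yields the new value, 2
console.log(left == right); // Prints false (0 == 2)

x = 0;

// ++x == x++, step by step:
x = x + 1;        // ++x bumps x to 1...
const left2 = x;  // ...and yields 1
const right2 = x; // x++ yields the current value, 1...
x = x + 1;        // ...and bumps x to 2
console.log(left2 == right2); // Prints true (1 == 1)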
Finally:

let x = 0;
x = x++
// x is still 0
x = 0
x = x++
// In other words...
x = 0
x = 0 // x++ does bump x to 1, but it yields the old 0, and the assignment then overwrites x with that 0, so the increment is lost
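Or, desugared with a temporary (temp is mine, just for illustration):

let x = 0;
const temp = x; // x++ yields the current value, 0...
x = x + 1;      // ...and bumps x to 1...
x = temp;       // ...but the assignment overwrites x with the yielded 0
console.log(x); // Prints 0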
Whereas with the prefix version:

x = 0
x = ++x
// x is 1
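Desugared the same way:

let x = 0;
x = x + 1;      // ++x bumps x to 1 first...
const temp = x; // ...and yields the new value, 1
x = temp;       // so the assignment stores 1 and nothing is lost
console.log(x); // Prints 1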