Here's a snippet that shows validity information:
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="utf-8" />
  <title>input with decimal step</title>
</head>
<body>
  <input id="inp" type="number" min="1.2" max="1.3" step="0.010000000000000009" />
  <ul id="eventsList"></ul>
  <script>
    const inp = document.getElementById('inp');
    const ul = document.getElementById('eventsList');

    inp.addEventListener( 'input', logInfo );
    inp.addEventListener( 'change', logInfo );
    inp.addEventListener( 'click', logInfo );

    function logInfo( e ) {
      const li = document.createElement('li');
      li.textContent = e.type + ". Value: " + inp.value +
        ", valid: " + inp.validity.valid +
        ", stepMismatch: " + inp.validity.stepMismatch;
      ul.appendChild( li );
    }
  </script>
</body>
</html>
1.2, 1.21, 1.22, 1.23, ..., 1.28, to 1.29) before it stops, though it never reports stepMismatch: true.
The other browser only steps through the first couple of values (1.2 and 1.21) before it sets stepMismatch: true.
My money's on Chrome doing their own thing, given their interpretation (or at least, their implementation) of the spec is more forgiving and is less likely to result in end-user frustration when step is set to a strange value.
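For reference, the spec's rule is that a value suffers from a step mismatch when the value minus the step base (here the min attribute, 1.2) is not an integral multiple of the allowed value step. Below is a naive sketch of mine of that check done directly in double-precision arithmetic; it is not what either browser actually ships (engines apply their own decimal handling and tolerances), which is presumably where the room for disagreement comes from:

```js
// Naive sketch (my own, not any browser's actual code) of the spec's
// "suffering from a step mismatch" check, done directly in double-precision
// arithmetic on the attribute values from the snippet above.
const stepBase = 1.2;                  // step base comes from the min attribute
const step     = 0.010000000000000009; // allowed value step

function naiveStepMismatch( value ) {
  // The rule: value minus the step base must be an integral multiple of
  // the step. A plain remainder test is the obvious (naive) translation.
  return ( value - stepBase ) % step !== 0;
}

// Print the remainder and the naive verdict for each value the stepper visits.
for ( const value of [ 1.2, 1.21, 1.22, 1.23, 1.24, 1.25, 1.26, 1.27, 1.28, 1.29 ] ) {
  console.log( value, ( value - stepBase ) % step, naiveStepMismatch( value ) );
}
```

Depending on how an engine handles the intermediate values (raw doubles, scaled integers, or decimals parsed from the attribute strings), that remainder can presumably land on exactly zero or on something microscopically off it, which would account for the two behaviours above.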
What's interesting is that the Number type in browsers seems able to accurately represent 0.010000000000000009 without any tell-tale IEEE 754 rounding (ditto Python), whereas C#/.NET's Single (32-bit) and Double (64-bit) floating-point types both choke and give me rather bad approximations.
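That said, Number is the same 64-bit IEEE 754 double that .NET's Double is, so my guess is the difference comes down to default string formatting rather than representation: JavaScript (and modern Python) print the shortest decimal string that round-trips to the same double, older .NET ToString() rounds to roughly 15 significant digits, and Single only carries about 7 significant digits to begin with. A small sketch of what the browser is actually storing (the toPrecision call and the 1.21 - 1.2 comparison are just my own illustration):

```js
// What Number actually stores for the step value, and how it gets printed.
const step = 0.010000000000000009;

console.log( step.toString() );        // the shortest decimal string that round-trips to this double
console.log( step.toPrecision( 21 ) ); // a few more digits of the value that is really stored
console.log( step === 1.21 - 1.2 );    // is the step exactly the double that 1.21 - 1.2 produces?
```

If that last comparison comes out true, the "strange" step value is simply the natural artefact of computing 1.21 - 1.2 in doubles, which would also explain where it came from in the first place.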