The W3C validator has not been updated in more than 10 years, and it didn’t work correctly even back then.
Could bring that up with the devs, but it hasn’t been the early 2000s for a long time; nobody seems to care as long as it works.
Which is a shame. Browsers should be strict when rendering.
- Multiple IDs with the same name? Jail!
- Open tags? Jail!
- Invalid order of tags? Believe it or not: Jail!
XHTML.
Oh my gods I wish. I was working on a webapp some months ago and was having the weirdest issue with how things were updating. There was a hard-to-discern pattern to it, but eventually, when I really dug into it, I realised it was generating elements with duplicate IDs, and then it all made sense.
If the browser had yelled at me, at least in the dev tools, it would’ve saved me a lot of work trying to figure things out.
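For anyone curious, the failure mode above is easy to reproduce: `document.getElementById` only ever returns the first matching element in tree order, so updates silently land on one node while its duplicates stay stale. A minimal sketch (the `status` id is made up for illustration):

```html
<!-- Two elements accidentally sharing the same id -->
<div id="status">first</div>
<div id="status">second</div>

<script>
  // getElementById returns only the FIRST match in tree order,
  // so this update never reaches the second element.
  document.getElementById("status").textContent = "updated";

  // querySelectorAll reveals the duplicates the browser silently tolerated:
  console.log(document.querySelectorAll("#status").length); // 2
</script>
```

The browser renders this without a single warning, which is exactly the complaint: the invalid document “works”, just not the way you meant.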
Interesting fact: Firefox (or Gecko, to be accurate, because there was no single “Firefox” browser back then; there were Netscape Navigator and the Mozilla Application Suite) had such a rendering mode, but it was quickly abandoned.[1]
[1] https://hsivonen.fi/doctype/: “In the summer of 2000 before Netscape 6 was released, Gecko actually had parser modes that enforced HTML syntax rules and one of these modes was called the “Strict DTD”. These modes were incompatible with existing Web content and were abandoned.”
I choose to imagine that these offenses result in developers cooling their heels in the slammer rather than a browser being a picky eater.
“Works” is different from “works consistently across all browsers”, or even across versions of the same browser. I know most web developers have gotten it into their heads that only Chrome (and maybe Firefox, if they’re feeling generous) matters, but open source projects shouldn’t incentivise this.
Because HTML rules are the tools of the bourgeois. /s