Nobody needs to learn all of the frameworks and libraries that exist, and nobody needs to keep up with all of them. Each of them is a tool designed to solve a particular problem with a particular approach.
Even Java, which has one of the stronger object-oriented static type systems, includes the ability to coerce one type into another.
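Java spells that coercion with an explicit cast; on the dynamic side of the web stack, JavaScript performs the same kind of coercion implicitly and silently. A small illustration, for contrast:

```javascript
// JavaScript coerces implicitly; no cast syntax is needed at all.
const sum = "2" + 3;        // string concatenation: "23"
const diff = "5" - 1;       // numeric coercion: 4
const truthy = Boolean(""); // explicit coercion: false

console.log(sum, diff, truthy);
```

The difference is not whether coercion exists, but whether the programmer has to ask for it.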
"Inconsistent semantics" is too subjective to hold the claim alone.
The World Wide Web was meant for hyperlinked documents, not web applications. NCSA repurposed it, using CGI to run command-line programs through the browser. After the commercialization of the web, various companies repurposed CGI again to expose their business interfaces through the browser.
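The CGI model being described is simple enough to sketch: the server runs an ordinary program and relays its standard output back to the browser. A minimal CGI-style program in Node (a hypothetical illustration; era-appropriate CGI scripts were usually shell, C, or Perl):

```javascript
// Build a CGI response: an HTTP header block, a blank line, then the body.
function cgiResponse(body) {
  return "Content-Type: text/html\r\n\r\n" + body;
}

// A CGI program simply prints the response on standard output;
// the web server relays it verbatim to the browser.
process.stdout.write(cgiResponse("<h1>Hello from a command-line program</h1>\n"));
```

That entire contract, "print headers, blank line, body", is what let command-line software be repurposed as a web interface.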
The web failed to standardize on low-level primitives. Instead of multiple derived architectures competing on their merits, we've standardized on a monolithic architecture that tries to support all use cases and fossilizes bad design decisions for fear of "breaking the web".
Software development is immature, unprofessional, and driven by fads. It's the geek equivalent of the evolution of a peacock's tail: an opportunity for a class of developers to show "prowess", even if that means adding unneeded and costly complexity, and to be rewarded for it, creating a feedback loop.
Some of the 'best practices' agreed upon by most developers are questionable: using CDNs, avoiding page reloads when following links, rendering on the client, using rocket-science build tools, using NoSQL, using async I/O on the server, and blindly copying the approaches of social-media giants (Facebook, Twitter).
This is indeed how technology moves forward. A lot of web projects don't need advanced technologies, but some of them do, and those wouldn't be possible without the "over-complicated" programming.
Lots of businesses add popular clichés to their website requirements: an a-la-'Silicon Valley' look, page-transition animations, infinite scroll, various popups, lots of icons, fancy fonts, a Material/iOS look on mobile. Undecorated static pages with text and photos are sufficient for virtually no one.
Programming on any popular platform is getting complicated. Just look at Gradle (cue raging Android devs in the background): it allows a one-click build, but only after you've coded down a hundred lines of configuration. The modular project structure is OK, though, I think.
The web is getting complicated because it is growing; the complexity is not unnecessary. People who have trouble keeping up should perhaps try to specialize more, because as the field grows it becomes increasingly hard to grasp all of it. The situation is similar in biology, which branched into biotech, molecular biology, etc.
Web development started with just one language (HTML). So much has been tacked on that it has become a Frankenstein environment. A modern web page needs at the very least three languages (HTML, JS, CSS), together with some image-editing skills, and it grows from there for dynamic content (a database plus backend languages).
The level of complexity is the prerogative of the developer who chooses the tools, not a property of the tools themselves. If a tool or its complexity is "not necessary", it is the programmer's fault for choosing the wrong tool for the job.
Years ago it was a simple script tag and you were up and running, but now it takes too much time to set up a project structure that must include transpilers, module systems, and build tools. This also complicates deployment and file handling at the end of the development process.
Frameworks exist to let us create more powerful apps without building everything from scratch each time; after many years we therefore have lots to choose from, lots of learning curves, and lots of overhead.
WebAssembly, if executed correctly, could provide a generic bytecode that any platform can target and use as the interface between HTML and the browser, the very layer that makes front-end web development so difficult to begin with. It also opens up more opportunities for isomorphic platforms.
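The "generic bytecode" idea is already concrete enough to demonstrate. The bytes below are a hand-encoded Wasm module exporting a single add function; any compiler targeting WebAssembly would emit something equivalent, and the host (here Node, but equally a browser) instantiates it through the same API:

```javascript
// A minimal WebAssembly module, hand-encoded: it exports one
// function, add(a, b), returning the i32 sum of its arguments.
const bytes = new Uint8Array([
  0x00, 0x61, 0x73, 0x6d, 0x01, 0x00, 0x00, 0x00,       // magic "\0asm" + version 1
  0x01, 0x07, 0x01, 0x60, 0x02, 0x7f, 0x7f, 0x01, 0x7f, // type: (i32, i32) -> i32
  0x03, 0x02, 0x01, 0x00,                               // function section: one func, type 0
  0x07, 0x07, 0x01, 0x03, 0x61, 0x64, 0x64, 0x00, 0x00, // export "add"
  0x0a, 0x09, 0x01, 0x07, 0x00,                         // code section header
  0x20, 0x00, 0x20, 0x01, 0x6a, 0x0b,                   // local.get 0, local.get 1, i32.add, end
]);

// Synchronous instantiation is fine for a tiny module like this.
const { add } = new WebAssembly.Instance(new WebAssembly.Module(bytes)).exports;
console.log(add(2, 3)); // 5
```

The point is that nothing in the module is tied to JavaScript; the bytecode is the interface.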
Common web-development languages (namely JS and PHP) lack any kind of static typing, making traditional OO design impractical and necessitating large frameworks to make up for the deficiencies of the language.
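A one-line illustration of the kind of deficiency meant here: without static types, passing the wrong type doesn't fail at compile time; it silently changes the program's meaning (hypothetical function, for illustration):

```javascript
// Intended to add sales tax to a numeric price; nothing stops
// a caller from passing the price as a string.
function addTax(price, rate) {
  return price + price * rate;
}

console.log(addTax(100, 0.2));   // 120 — as intended
console.log(addTax("100", 0.2)); // "10020" — string concatenation, no error raised
```

Catching this class of bug is a large part of what framework conventions, validators, and nowadays TypeScript are brought in to do.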