The virtue of “Not Invented Here Syndrome”

Not Invented Here Syndrome (NIHS) is usually talked about in derogatory terms when software development is discussed. There are some good reasons for this, but I would argue that it is not a bad thing in moderation. If you tell a fellow developer about a project you are about to embark on, one of the first questions you get is invariably, “What stack and frameworks are you using?”. This is not strange, since most of us rely heavily on high abstractions to get things done. In the JS world of NPM, libraries are heavily relied upon, even ridiculous ones such as left-pad, which ended up breaking a good portion of NPM packages when it was removed.

Web 2.0 with V8, ES5, CSS3 and more brought us a lot of great things in terms of what the web could be used for. Abstractions like Ember, Angular, React and more were quickly built to realize the potential of building applications mimicking things previously only really possible in a desktop environment. This was great, and in a few short years most of us moved from writing backend-rendered web pages with some JS sugar to writing frontend-rendered web applications with frameworks.

The hope is that the progress in hardware will cure all software ills. However, a critical observer may observe that software manages to outgrow hardware in size and sluggishness.
- Wirth’s law

The speed of software halves every 18 months
- Gates’s law

This certainly feels true when it comes to the web. In a few years we have gone from a “slow internet” with poorly performing JS VMs using HTTP/1.1 to a “high-speed internet” with great JS VMs using HTTP/2, and in that process somehow made the experience of the web much slower.

These days it is not uncommon to see regular web pages, without any real aspect of an application, transfer more than 10 MB of data and take more than 3-4 seconds for the first elements to start rendering. A recurring example is a WordPress blog with a lot of plugins and themes combined with a CSS framework, a JS framework, and way too many icons, fonts and images. Things render at different times as XHR requests resolve, and when you try to click something, it moves because something else suddenly renders above it.

So let us look at a normal page, not particularly bad, not particularly good.
Loading nytimes.com from a desktop consists of:

  • ~140 HTTP requests
  • ~3 MB of data transferred
  • DOM starting to render at 1.3 s
  • ~9 s of total loading time

Is this really reasonable for a page that, more or less, only shows text and images with very little additional functionality?

Are we sacrificing performance and usability on the altar of abstraction?

Circling back to NIHS: instead of relying on other people’s code, which most of us do not bother to read, writing things yourself gives you a deeper understanding of how things work and where problems occur, and in general just makes you a better developer. I’m not advocating writing web apps or web pages in assembly, but rather reflecting on whether you need the abstraction you are reaching for. If you are doing form validation, do you really need Angular, React or even jQuery? Do you need Bootstrap to make things look good on mobile? CSS Grid and flexbox are not magic. Equally important is to realize when you do need the abstractions, but I would argue that much of the web would be far better off with less JS, CSS and server-side libraries.
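To make the form-validation point concrete, here is a minimal sketch of what framework-free validation can look like: a couple of plain functions plus a few lines of vanilla DOM wiring. The field names, rules, and element IDs are hypothetical examples, not from any particular project.

```javascript
// Framework-free validation: plain functions you can unit test anywhere.

// Simple structural check: something@something.tld (not RFC-complete,
// just a sanity check before the server validates for real).
function validateEmail(value) {
  return /^[^\s@]+@[^\s@]+\.[^\s@]+$/.test(value);
}

// A required field must contain something other than whitespace.
function validateRequired(value) {
  return value.trim().length > 0;
}

// In the browser, wiring this up takes only a few lines of vanilla JS
// (element IDs "#email" and "#email-error" are assumed for illustration):
//
//   document.querySelector("form").addEventListener("submit", (e) => {
//     const email = document.querySelector("#email").value;
//     if (!validateEmail(email)) {
//       e.preventDefault();
//       document.querySelector("#email-error").textContent = "Invalid email";
//     }
//   });
```

The whole thing is a few dozen lines with zero dependencies, and the validation logic itself has no DOM coupling, so it can be tested without a browser at all.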

Some similar thoughts on the topic
Preventing the Collapse of Civilization https://youtu.be/pW-SOdj4Kkk
30 million line problem https://youtu.be/kZRE7HIO3vk


Author:
Rasmus Holm, CTO/Developer