Question

So after a long day at work, I sit down and see an alert for Windows SkyDrive in the system tray:

Files can't be uploaded because the path of this file or folder is too long. Move the item to a different location or shorten its name.

C:\Users\Matthew\SkyDrive\Documents\Projects\Programming\angular-app\server\node_modules\grunt-contrib-nodeunit\node_modules\nodeunit\node_modules\tap\node_modules\runforcover\node_modules\bunker\node_modules\burrito\node_modules\traverse\example\stringify.js 

... and for a while, I laughed at that technological limitation.

But then, I wondered: is that amount of directory recursion within a Node project really necessary? It would appear that the paths beyond "angular-app\server\node_modules" are simply dependencies of the project as a whole and might be better expressed as:

C:\Users\Matthew\SkyDrive\Documents\Projects\Programming\angular-app\server\node_modules\grunt-contrib-nodeunit\
C:\Users\Matthew\SkyDrive\Documents\Projects\Programming\angular-app\server\node_modules\nodeunit\
C:\Users\Matthew\SkyDrive\Documents\Projects\Programming\angular-app\server\node_modules\tap\
C:\Users\Matthew\SkyDrive\Documents\Projects\Programming\angular-app\server\node_modules\runforcover\
C:\Users\Matthew\SkyDrive\Documents\Projects\Programming\angular-app\server\node_modules\bunker\
C:\Users\Matthew\SkyDrive\Documents\Projects\Programming\angular-app\server\node_modules\burrito\
C:\Users\Matthew\SkyDrive\Documents\Projects\Programming\angular-app\server\node_modules\traverse\

I hadn't really given it much thought before, as package management in Node seems like magic compared to many platforms.

I would imagine that some large-scale Node.js projects contain many duplicate modules (of the same or similar versions) that could be consolidated into a smaller number. It could be argued that:

  • Duplicate dependencies increase the amount of data stored and transmitted, which adds to the cost of developing software.

  • Shallower directory structures (especially in this context) are often easier to navigate and understand.

  • Excessively long path names can cause problems in some computing environments.

What I am proposing (if such a thing does not exist) is a Node module that:

  • Recursively scans a Node project, collecting a list of the nested node_modules folders and how deeply they are buried in relation to the root of the project.

  • Moves the contents of each nested node_modules folder to the main node_modules folder, editing the require() calls of each .js file such that no references are broken.

  • Handles duplicate dependencies that exist in multiple, conflicting versions.

If nothing else, it would make for an interesting experiment. What do you guys think? What potential problems might I encounter?


Solution 2

See fenestrate, npm-flatten, flatten-packages, npm dedupe, and multi-stage-installs.

Quoting Sam Mikes from a related Stack Overflow question:

npm will add dedupe-at-install-time by default. This is significantly more feasible than Node's module system changing, but it is still not exactly trivial, and involves a lot of reworking of some long-entrenched patterns.

This is (finally) currently in the works at npm, going by the name multi-stage-install, and is targeted for npm@3. npm development lead Forrest Norvell is going to spend some time running on Windows in the new year, so please do create windows-related issues on the npm issue tracker < https://github.com/npm/npm/issues >

OTHER TIPS

See if

npm dedupe

sets you right.

See the npm CLI documentation for npm dedupe.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow