If they do it, just send them a link to pnpm.
pnpm is a Node.js package manager that keeps one global store and hard links packages from that store into each project.
>One version of a package is saved only ever once on a disk. So you save dozens of gigabytes of disk space!
It has existed for several years, is well maintained, and is used by some big projects.
The promise of Yarn was great, but I've had problems with it since day 1 because the offline feature didn't work with GitHub dependencies (as opposed to published npm packages).
I opened 2 issues with them about this and even eventually had a member of my team submit a pull request to fix it. Curiously, the very naive pull request we sent (it didn't have any unit tests) was ignored for almost a month, then merged without any guidance on unit testing. A look through their git history around that time suggests they don't really believe in unit testing.
After that I switched to pnpm and never looked back. For my use case, it's better than Yarn in every way. Though I have to say, npm v5 looks like it merits a further look.
It does look much improved since the days when I was struggling with it. However, I can't just keep switching package managers all the time for no reason; I have to actually write business logic. pnpm is great and I see no reason to go back and re-try Yarn. pnpm is faster for me, and I get much quicker replies when I open issues on its GitHub than I do on Yarn's, despite Yarn's larger community.
Or you could use pnpm, which hard links the packages in node_modules
to the global store. This is more backwards-compatible than Plug'n'Play, which TypeScript doesn't support natively.
Don't do this; it would cause more issues than it's worth. npm and Yarn both do caching, but if you need to save on disk usage, look at pnpm or the new (experimental) Yarn PnP API.
I am not an npm contributor, so I can only guess. Probably they were in a rush. They needed to solve all the issues that npm 2 had with its nested node_modules structure; it was basically unusable on Windows. So maybe they did not know how to make a workable node_modules without flattening the packages in it. There was ied at that time, but it did not work with all packages; for instance, it did not support peer dependencies. I spent more than 6 months making pnpm usable in all cases. Peer dependencies are actually a more complex scenario, but they work with pnpm too.
The only downside that I know about is that some packages might not work with the non-flat node_modules. Since both npm and Yarn create a flat node_modules, packages can require other packages that are not in their package.json. This is really a bug in those packages, not in pnpm. In those cases, I would recommend fixing the issue in those packages. If that is hard, pnpm has a --shamefully-flatten flag that creates a flat node_modules.
Yes, npm v2 used a nested node_modules; the flat node_modules was introduced in npm v3. There were some issues they could not solve with a nested node_modules. However, Alexander Gugel came up with a great solution in his package manager called ied: using symlinks inside node_modules to create the nesting. This approach was adopted by pnpm and it works great.
Here's an article about how pnpm uses symlinks to create the nesting: https://pnpm.js.org/docs/en/symlinked-node-modules-structure.html
pnpm uses a content-addressable filesystem to store all files from all module directories on disk.
When using npm or Yarn, if you have 100 projects using lodash, you will have 100 copies of lodash on disk. With pnpm, lodash will be stored once in the content-addressable store.
As a result, you save gigabytes of disk space and installations are a lot faster.
This is my fear. While I find JavaScript package resolution appallingly confusing (not least because of all the different module schemes that are out there and their cryptic names: which one is "CommonJS" again? Node? Webpack? Who knows...), I don't really want to import a URL.
I feel like Maven, or Ruby's Bundler get it exactly right. (I don't understand wtf is going on with "easy_install pip" so I can't speak to Python.)
The only thing that needs fixing is transitive dependency hell, and that might be more a developer mindset issue than anything else. (Why does webpack have 1k+ dependencies? That is just so stupid I don't even know where to begin.)
I also want a package manager with a central cache and maximum re-use, not everything vendored with maximal non-reuse. Something like pnpm (https://pnpm.js.org/en/). So that is one obvious good thing about Deno.
pnpm is both a package manager and monorepo manager (npm and lerna in one).
The repository of pnpm itself is a monorepo; it has local scripts that differ from package to package.
I personally also have a monorepo in TypeScript that uses pnpm. That repo has global scripts that run once in the root directory. It is also super duper strict.
pnpm has support for multi-package repositories as well. It has a set of commands that run concurrently in each package of a monorepo (https://pnpm.js.org/docs/en/pnpm-recursive.html)
pnpm recursive install|update|link|dislink|outdated|list
and more are coming.
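The recursive commands operate on whatever packages the workspace file declares. A minimal sketch of that file (the `packages/*` glob is an assumption about the repo layout):

```yaml
# pnpm-workspace.yaml at the monorepo root
packages:
  - 'packages/*'
```

With this in place, something like `pnpm recursive install` runs in every matched package.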
Switch from npm to pnpm. npm redownloads each package each time you install it, which is highly inefficient. pnpm caches every package so it's downloaded once, stored there, and "linked" to every time.
Use Deno because it's like NodeJS but is much safer and faster (no node_modules, caches packages, etc.)
npm also caches installs; you just get a new copy in the project rather than links to a global cache, which you can get in Node with pnpm.
So let me see if I follow because there are a lot of different issues here.
1) Your company has a badly configured proxy to npm's repository that keeps timing out when you do npm install inside a project.
2) Your packages are installing globally without you running npm -g whatever
3) You want some sort of global local cache for the packages so that every time you run npm install, the proxy your company set up doesn't get overloaded.
The right thing to do would be to talk to someone about fixing 1 first, but knowing how the corporate world works, most likely no one knows what's up with that.
The second issue is not possible unless you are explicitly passing the -g option to npm, by default npm will always install packages in a local node_modules folder.
To sort of fix number 3 and avoid overwhelming the proxy, you could run npm install --prefer-offline, which skips the registry check for packages already in the npm cache (not the global packages). There is also pnpm, which works more like what you were expecting: it uses a global store and hard-links packages into each project, thus avoiding redownloading and duplication.
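If you'd rather not type the flag every time, `prefer-offline` is also a supported key in `.npmrc` (a minimal fragment; put it in the project root or your home directory):

```ini
# .npmrc — make npm use its cache without re-checking the registry first
prefer-offline=true
```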
So, I assume that is milligram you are talking about?
Actually I think that is a mistake on milligram's part. They have an ignore field inside the package.json. If they think that field will prevent those files from being published in the npm package, they are wrong. The correct way is to use a .npmignore file (and that is what they should do, frankly). In the end, they should only publish the dist and example directories plus the license and package.json files. You should file a bug report.
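Rather than a `.npmignore`, a package can also opt in to an allow-list via the `files` field in `package.json`. A sketch of what that might look like for milligram (the entries are assumptions based on the directories mentioned above):

```json
{
  "name": "milligram",
  "files": ["dist", "example"]
}
```

npm always includes `package.json` and license files in the published tarball regardless, so listing the two directories is enough.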
Now, onto your other points.
> First installing node+npm, then bower/yarn
That is, only if you want to install the npm package. You can alternatively just use their Cloudflare CDN version or (like others have said) the UNPKG CDN and bypass the whole npm package. milligram just doesn't have great docs in that regard.
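For completeness, the CDN drop-in looks something like this (the exact unpkg path and file name are assumptions about milligram's published layout; check the package's dist directory for the real name):

```html
<!-- no node/npm needed; unpkg serves files straight out of the npm package -->
<link rel="stylesheet" href="https://unpkg.com/milligram/dist/milligram.min.css">
```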
On the other hand, if you do any serious web development, you will need node and npm at some point eventually. bower is no longer recommended for new projects (even by its maintainers) and yarn is just an alternative to npm (a great one, but an alternative nonetheless).
> dig through and maintain this project structure
No. They are your dependencies. You may need to read them once in a while for debugging, but generally you treat them as black boxes. You don't go around messing with your node_modules, in the same sense that you don't extract the jars you download from Maven or decompile your .dlls.
> Its 57 items totalling over 1.3 MB
Putting aside milligram's bug, the npm ecosystem is often bashed for massive node_modules directories. Generally my response to that is: if disk space is your concern, then use pnpm instead of npm. But for your use case, just use a CDN.
This may be worth looking into:
https://pnpm.js.org/
A while ago, some of my co-workers were trying to have only 1 node_modules folder for all projects, but ran into some issues. Unfortunately, I don't know the details. Something along these lines may help with the amount of hard drive space and the time it takes to do an npm install.
I saw this somewhere, like the Glitch console?, and was like, did npm do something bad and another challenger is entering the market? Now that I know what this is, I might check it out on my local machine.
There are some tools to mitigate this; the main problem is that npm projects don't share dependencies. You can try https://pnpm.js.org/, where all your dependencies are linked from one place in the filesystem. Or some experimental feature of Yarn (not tried myself yet): https://www.freecodecamp.org/news/getting-rid-of-node-modules-with-yarn-plugn-play-a490e5e747d7/
Aside from this... Gatsby needs a lot of tools to build your page, and they are all in node_modules.
Check this out, this is why I prefer pnpm over npm:
>"One version of a package is saved only ever once on a disk. So you save dozens of gigabytes of disk space!"
>pnpm uses hard links and symlinks to save one version of a module only ever once on a disk. When using npm or Yarn for example, if you have 100 projects using the same version of lodash, you will have 100 copies of lodash on disk. With pnpm, lodash will be saved in a single place on the disk and a hard link will put it into the node_modules where it should be installed.
pnpm uses a strict (non-flat) node_modules and some packages in the ecosystem are "broken" as they rely on packages that are not declared in their package.json. You can read about this here.
I would suggest submitting PRs to such packages to add the missing dependencies.
Alternatively, you can use the shamefully-flatten config of pnpm.
To add a third candidate to the list, https://pnpm.js.org/ has the very nice feature of only storing a [particular version of a] package once, no matter how many different packages use it.
I haven't seen that ad. pnpm is a package manager, so it is not "writing code w/o a package manager".
To try out pnpm, you basically install it via npm (a standalone script is also available): npm i -g pnpm. The commands are the same as in npm, e.g. pnpm install. There are also additional commands, like the recursive commands.