[–] [email protected] 4 points 9 months ago (2 children)

The standard library thing is a really valid point, but how do you avoid recursive dependencies? Do you just not allow library packages to depend on anything?

pip is saner

Is it? It's very bare-bones in my experience; I could never bring myself to use it until they make it a more fully fledged tool, like the cargo you mentioned.

[–] [email protected] 5 points 9 months ago (1 children)

Other package managers, like NuGet, throw errors if all the dependencies on a package cannot be met by a single version.

This is probably the result of it copying all libraries into the same output directory, and of .NET not being able to load two different versions of the same library, so it's more of an application restriction.

The downside of this is that packages often can't use newer features if they don't want to block users of that library, and utility libraries have to maintain strong backwards compatibility so applications can use the latest version while dependent libraries target an older one. Often, applications keep using older versions with known security issues.
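
A rough sketch of that flat, single-version resolution style (toy TypeScript, not NuGet's actual algorithm; the package names, version lists, and use of the npm semver library are just for illustration):

```typescript
// Toy flat resolver: every requester of a package must be satisfied
// by one single version, or resolution fails (NuGet-like behavior).
import semver from "semver"; // real npm package, used here for range checks

// Hypothetical registry: versions available for each package.
const available: Record<string, string[]> = {
  utils: ["1.1.2", "1.2.3", "2.0.0"],
};

function resolveFlat(pkg: string, ranges: string[]): string {
  // Pick the highest version that satisfies *all* requesters at once.
  const winner = available[pkg]
    .filter((v) => ranges.every((r) => semver.satisfies(v, r)))
    .sort(semver.rcompare)[0];
  if (!winner) {
    throw new Error(`no single version of ${pkg} satisfies: ${ranges.join(", ")}`);
  }
  return winner;
}

console.log(resolveFlat("utils", ["^1.1.0", "^1.2.0"])); // "1.2.3"
resolveFlat("utils", ["^1.2.0", "^2.0.0"]); // throws: no single version fits
```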

[–] [email protected] 1 points 9 months ago

Damn, sounds like a big headache x.x

[–] [email protected] 5 points 9 months ago (1 children)

npm downloads every dependency recursively. If a depends on d (= 1.2.3) and b depends on d (= 1.2.4), then both versions of d get downloaded into a and b's respective node_modules.

All other package managers I'm aware of resolve dependencies into a flat list and then download them, and you can only have one version of the same package on your system.
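
To make that concrete, here's roughly what npm's nested layout looks like for the a/b/d example above (a sketch; a real tree would also contain transitive dependencies):

```
node_modules/
├── a/
│   └── node_modules/
│       └── d/        <- d 1.2.3, a's private copy
└── b/
    └── node_modules/
        └── d/        <- d 1.2.4, b's private copy
```

A flat resolver would instead have to pick a single d for both, and fail (or force an upgrade) if the two ranges can't agree on one.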

[–] [email protected] 1 points 9 months ago (1 children)

You mean npm duplicates even if the two dependency versions are compatible?

you can only have one version of the same package on your system.

That couldn't be, right? Otherwise, if you installed two packages that rely on different, incompatible versions of another package, one of the two would break. Reading a bit, it seems they check for "satisfiability"; I found some really interesting things on the topic while looking around.

[–] [email protected] 3 points 9 months ago (1 children)

You mean npm duplicates even if the two dependency versions are compatible?

By default yes, unless you explicitly use the "peer dependency" system, which isn't the default. The "default" naive implementation is for every package in your node_modules to have a node_modules of its own, all the way down recursively. There are tricks nowadays to deduplicate packages with the exact same version, but not to automatically detect "compatible" versions and use those instead (in my experience nothing would work if that were the case; deleting package-lock.json causes way too many issues due to the... uh, let's call it "brave" approach of JS devs to stability).

That couldn't be, right? Otherwise, if you installed two packages that rely on different, incompatible versions of another package, one of the two would break

Correct. This is intended behavior which is solved in several ways:

  1. Correctly declaring your dependencies. If newer versions of a dependency break your package, disallow them, but that is not normally needed for minor version changes.
  2. Focus on quality. Semver exists for a reason, and 1.2.3 should not break something built against 1.1.2 (see the sketch after this list). JS and NPM's cascade of stupid implementations bred a culture of "move fast and break things", but that's not the norm in any other commonly used ecosystem.
  3. Linux distros almost exclusively use curated repositories, so they are (mostly) internally consistent and incompatibilities are rare and quickly fixed. A good package manager will resolve dependencies and automatically detect incompatibilities, proposing several fixes (typically abort the upgrade or uninstall one of the problematic packages)
  4. Not breaking down packages into a constellation of smaller packages. glibc6 is glibc6, not glibc_string (1.2.3) + glibc_memory (2.6.5) + glibc_fs (1.5.3) + glibc_stdio (1.9.2) + glibc_threads (6.1.0) + ...
    Internally glibc6 is a bunch of modules, but they get bundled into one package specifically to simplify dependency management.
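
As a rough illustration of how those semver ranges behave in practice (TypeScript using the real semver npm package; the version numbers are the ones from point 2):

```typescript
// Demo of semver range semantics with the `semver` package.
import semver from "semver";

// A caret range accepts any newer version within the same major.
console.log(semver.satisfies("1.2.3", "^1.1.2")); // true  - minor bump, same major
console.log(semver.satisfies("2.0.0", "^1.1.2")); // false - a major bump may break

// If a specific newer version is known to break you (point 1),
// tighten the declared range, e.g. with a tilde:
console.log(semver.satisfies("1.2.3", "~1.1.2")); // false - tilde pins the minor
```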

Not being able to install two versions of the same package sounds restrictive, but it's a HUGE security benefit: if glibc6 (1.2.3) is vulnerable to CVE-2024-1, then updating to glibc6 (1.2.4) secures your entire system at once. With NPM, though, you have to either wait for every. single. dependency on that vulnerable package down your tree to recursively update, or patch those versions yourself (at your own risk, because again, small version changes often break things since developers think NPM's dependency model means they don't have to actually provide stability guarantees).
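
A sketch of why that's painful (toy TypeScript with made-up packages and a hypothetical CVE range; in practice npm audit does this kind of scan, and npm's overrides / Yarn's resolutions fields can force-patch nested copies):

```typescript
// Walk a toy nested dependency tree and report every copy of a package
// whose version falls inside a (made-up) vulnerable range.
import semver from "semver";

interface Node {
  name: string;
  version: string;
  deps: Node[];
}

// Toy tree: two vulnerable copies of "utils" hiding at different depths.
const tree: Node = {
  name: "app", version: "1.0.0", deps: [
    { name: "a", version: "2.1.0", deps: [
      { name: "utils", version: "1.2.3", deps: [] },
    ] },
    { name: "b", version: "0.4.0", deps: [
      { name: "c", version: "3.0.1", deps: [
        { name: "utils", version: "1.2.3", deps: [] },
      ] },
    ] },
    { name: "utils", version: "1.2.4", deps: [] }, // top level already fixed
  ],
};

const VULNERABLE = "<=1.2.3"; // pretend CVE range; 1.2.4 is the fix

function findVulnerable(node: Node, path: string[] = []): void {
  const here = [...path, `${node.name}@${node.version}`];
  if (node.name === "utils" && semver.satisfies(node.version, VULNERABLE)) {
    console.log("vulnerable copy:", here.join(" > "));
  }
  node.deps.forEach((d) => findVulnerable(d, here));
}

findVulnerable(tree);
// vulnerable copy: app@1.0.0 > a@2.1.0 > utils@1.2.3
// vulnerable copy: app@1.0.0 > b@0.4.0 > c@3.0.1 > utils@1.2.3
```

Each of those nested copies only goes away when its direct parent publishes a release that allows utils 1.2.4, all the way up the chain.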

[–] [email protected] 2 points 9 months ago

Wow, awesome explanation! I think I understand now