[nlohmann] download instead of bundle #21916
ferdymercury wants to merge 3 commits into root-project:master
Conversation
```cmake
# **PLEASE UPDATE ALSO THE FOLLOWING LINE WHEN UPDATING THE VERSION**
# 11 Nov 2025, https://github.com/nlohmann/json/releases/tag/v3.12.0
set(ROOT_NLOHMANN_VERSION 3.12.0)
```
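For context, a sketch of how a pinned version and hash of this kind could drive the download; the `ExternalProject` wiring and the `ROOT_NLOHMANN_HASH` value below are assumptions for illustration, not the PR's actual code:

```cmake
# Hypothetical sketch only: pinned version + pinned hash feed ExternalProject_Add.
set(ROOT_NLOHMANN_VERSION 3.12.0)
set(ROOT_NLOHMANN_HASH "0000000000000000000000000000000000000000000000000000000000000000") # placeholder

include(ExternalProject)
ExternalProject_Add(builtin_nlohmann_json
  URL https://github.com/nlohmann/json/archive/refs/tags/v${ROOT_NLOHMANN_VERSION}.tar.gz
  URL_HASH SHA256=${ROOT_NLOHMANN_HASH} # CMake aborts if the downloaded archive's digest differs
)
```

Bumping the version then only means updating the two `set()` lines.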
I don't get it: what's the advantage of "unvendoring" this library if we're still pinning its version and have to manually update it? I get it for most other cases, but here we're only getting rid of one file.
Hmm some reasons:
- main one: if there is a security vulnerability found, you can update (or take down) the LCG package and fix it for everyone at the same time, without having to force users to git pull new code.
- that single header file is 1 MB, so it is effectively many files in one (saving 10 files x 100 kB or 1 file x 1000 kB looks the same to me, and in both cases you need to maintain a separate CMakeLists building it rather than a call to ExternalProject). If nlohmann changes e.g. the indentation or applies clang-format in 3.13, the whole 1 MB file has to be rewritten and you get extra noise in the ROOT git history. So to me it seems cleaner to just change the version and the hash than to update the bundled file.
- reducing the AI-bot spam surface, like the reports we got recently for libgif, etc.
- enabling us in the future to set -DJSON_MultipleHeaders=ON, so that one can be more granular about which parts of JSON to include instead of one huge 1 MB header. Most probably the effect on build time is negligible, but still, it looks cleaner to me if one could be more granular.
- it's clearer which version you are using, and you do not need the ugly hack of parsing the header file to retrieve the version.
I agree with you that there is no single decisive advantage and that none of these are very convincing on their own. But I'd still propose this PR to be consistent with the rest of the builtins, for symmetry. In my opinion, only clang/llvm and afterimage need to be built in, for clear reasons; the others should be real externals.
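The "ugly hack" mentioned above is scraping the version out of the bundled header. Roughly, it amounts to the following sketch; the header path is hypothetical, and it assumes the `NLOHMANN_JSON_VERSION_MAJOR/MINOR/PATCH` defines that nlohmann/json places in `json.hpp`:

```cmake
# Sketch: recover the bundled nlohmann/json version by grepping its macros.
set(_hdr "${CMAKE_SOURCE_DIR}/builtins/nlohmann/json.hpp") # hypothetical path
foreach(_part MAJOR MINOR PATCH)
  file(STRINGS "${_hdr}" _line REGEX "#define NLOHMANN_JSON_VERSION_${_part}")
  string(REGEX MATCH "[0-9]+" _v_${_part} "${_line}")
endforeach()
set(nlohmann_json_VERSION "${_v_MAJOR}.${_v_MINOR}.${_v_PATCH}")
```

With an external download, this whole dance collapses to the single `set(ROOT_NLOHMANN_VERSION ...)` line.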
> main one: if there is a security vulnerability found, you can update (or take down) the LCG package and fix it for everyone at the same time, without having to force users to git pull new code.

That does not work: we put the hash sum into the cmake file, so we cannot swap out the downloaded package.
Right. But at least it would be an instant way of communicating to the whole user base that builds from source.
> Right. But at least it would be an instant way of communicating to the whole user base that builds from source.
The only way would be the deletion of compromised packages from the LCG servers. But we should foresee such a situation in the current cmake files, and clearly signal it to users.
Basically, having cmake ask for the latest version through a well-known URL, or something like that.
The local hash guarantees - to a large extent - that nobody can manipulate the download procedure.
Otherwise we open the door to man-in-the-middle attacks.
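As an illustration of that guarantee, this is roughly what the hash check amounts to. A stand-alone sketch with a made-up archive name, where the pinned value happens to be the SHA-256 digest of empty input:

```cmake
# verify_download.cmake -- run with: cmake -P verify_download.cmake
# Archive path and pinned hash are illustrative, not ROOT's real values.
set(_pinned "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855") # sha256 of empty input
file(WRITE "${CMAKE_CURRENT_BINARY_DIR}/download.tar.gz" "") # stand-in "download"
file(SHA256 "${CMAKE_CURRENT_BINARY_DIR}/download.tar.gz" _actual)
if(NOT _actual STREQUAL _pinned)
  message(FATAL_ERROR "SHA256 mismatch: got ${_actual}, expected ${_pinned}")
endif()
message(STATUS "download verified")
```

A man-in-the-middle can tamper with the bytes in transit, but not make the tampered archive match the hash pinned in the local source tree.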
> Otherwise we open the door to man-in-the-middle attacks.
ExternalProject_Add supports TLS: couldn't we rather use that?
> ExternalProject_Add supports TLS: couldn't we rather use that?
But you propose to build some extra functionality on top of this. That means you would request the latest version of some package from the server, and at that point you have to fully trust a reply obtained via the network.
Maybe I'm wrong, but if the first TLS request to get the hash from the server succeeds (which, with certificates, should "guarantee" no MITM), then all other requests to that server will be secure, no?
Though I can see it's a more complicated setup and maybe not worth the gain.
Nothing to object to in the discussion above. Maybe just that ROOT has been moving for a while towards depending on packages of the system where it is built, e.g. installed via a package manager. Seen from that perspective, the old "built-in" mechanism (de facto a distribution channel) becomes nothing more than a fallback in case nothing else can be done for a particular package on which (some functionality of) ROOT depends.
Test Results: 21 files, 21 suites, 3d 2h 12m 57s ⏱️. For more details on these failures, see this check. Results for commit b3f84a1. ♻️ This comment has been updated with latest results.
```cmake
URL https://github.com/nlohmann/json/archive/refs/tags/v3.12.0.tar.gz # TODO move to LCG
URL_HASH SHA256=${ROOT_NLOHMANN_HASH}
```
How about directly taking the header from nlohmann's release?
```cmake
URL https://github.com/nlohmann/json/releases/download/v3.12.0/json.hpp
URL_HASH SHA256=aaf127c04cb31c406e5b04a63f1ae89369fccde6d8fa7cdda1ed4f32dfc5de63
```
This way, one could stop after the download step.
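The "stop after the download step" idea could also be expressed without ExternalProject at all, e.g. with `file(DOWNLOAD)`. A sketch using the URL and hash quoted above; whether ROOT would want this instead of the ExternalProject call is exactly the open question here:

```cmake
# Sketch: fetch only the single header, verify it, and stop -- no build step.
file(DOWNLOAD
  https://github.com/nlohmann/json/releases/download/v3.12.0/json.hpp
  "${CMAKE_BINARY_DIR}/include/nlohmann/json.hpp"
  EXPECTED_HASH SHA256=aaf127c04cb31c406e5b04a63f1ae89369fccde6d8fa7cdda1ed4f32dfc5de63
  SHOW_PROGRESS)
```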
Does LCG also accept header files? If I remember correctly, downloading from GitHub is sometimes less reliable than LCG (pull failures / timeouts).
In general, I think it's better to be consistent with tar packages; plus this single header is huge, 1 MB, so it's better to download something compressed :)
Plus this would prevent an easy switch in the future to enabling -DJSON_MultipleHeaders=ON so that one can be more granular.
It is always possible to get packages from several sources; however, in the past that led to unstable contexts, where the failure of one of the sources made ROOT not compilable.
No description provided.