It has recently been reported that GitHub changed the way the auto-generated ".tar.gz" and ".tgz" archives on release pages are produced.
This change altered the archives' checksums and caused massive breakage in automated build systems that verify the integrity of files downloaded from GitHub against previously stored checksums, such as those recorded in package metadata or build scripts.
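In practice, such verification amounts to computing a digest of the downloaded bytes and comparing it with the pinned value. A minimal sketch in Python (the function name and the simulated archive bytes are illustrative, not any particular build system's code):

```python
import hashlib

def verify(data: bytes, pinned_sha256: str) -> bool:
    """Compare the SHA-256 of downloaded bytes against a pinned digest."""
    return hashlib.sha256(data).hexdigest() == pinned_sha256

# Simulated archive bytes; in a real build system these come from the download
archive = b"pretend tarball bytes"
pinned = hashlib.sha256(archive).hexdigest()  # what the metadata recorded earlier

print(verify(archive, pinned))                    # True: bytes unchanged
print(verify(archive + b" regenerated", pinned))  # False: regenerated bytes fail
```

Note that the comparison is over the compressed bytes, not the extracted contents, which is exactly why regenerating an archive with a different compressor breaks verification even when the contents are identical.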
As of version 2.38, Git ships with a built-in gzip implementation enabled by default, which unified support for this compression method across all operating systems and improved archive creation performance. GitHub picked up the change after upgrading the Git version on its infrastructure.
The default compression for Git archives has recently changed. As a result, files downloaded from GitHub may have different checksums even though their contents have not changed at all.
GitHub does not guarantee checksum stability for automatically generated files, which are labeled "Source code (zip)" and "Source code (tar.gz)" on the Releases tab. If you need to rely on a consistent checksum, you can upload assets directly to GitHub Releases; these are guaranteed not to change.
The problem is that files produced by the zlib-based built-in gzip implementation are not byte-for-byte identical to files produced by the external gzip utility, which results in different checksums for archives created by different versions of Git when the "git archive" command is executed.
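The divergence is easy to reproduce outside Git: the gzip container admits many valid encodings of the same payload (header timestamp, flags, compression choices), so byte-identical input does not guarantee byte-identical output. A small illustration using Python's zlib-backed gzip module, varying only the header's mtime field (this stands in for the zlib-vs-gzip-utility case; it is not GitHub's actual pipeline):

```python
import gzip
import hashlib

payload = b"identical tar stream contents"

# Same payload, different gzip header mtime fields -> different compressed bytes
a = gzip.compress(payload, mtime=0)
b = gzip.compress(payload, mtime=1)

print(hashlib.sha256(a).hexdigest() == hashlib.sha256(b).hexdigest())  # False
print(gzip.decompress(a) == gzip.decompress(b))                        # True
```

Both archives decompress to identical contents, yet any checksum taken over the compressed bytes differs.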
Consequently, after Git was updated on GitHub, slightly different files started appearing on the release pages and failed verification against previously stored checksums.
The issue manifested itself in various build systems, continuous integration systems, and toolkits for building packages from source. For example, about 5,800 FreeBSD ports whose sources are downloaded from GitHub were broken.
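FreeBSD ports pin these digests in a per-port distinfo file, and the fetch step fails on any byte-level change in the upstream archive. A hypothetical entry (the project name, timestamp, digest, and size are all illustrative):

```
TIMESTAMP = 1674770000
SHA256 (project-1.2.3.tar.gz) = e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855
SIZE (project-1.2.3.tar.gz) = 456789
```

When GitHub regenerated the archives, the recorded SHA256 values for thousands of such entries no longer matched, even though the extracted sources were unchanged.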
In response to the first complaints about failures, GitHub representatives initially noted that checksum stability had never been guaranteed for these files.
After it was shown that fixing the affected build systems would require a significant amount of work to update metadata across the various ecosystems, GitHub had a change of heart and reverted to the old file generation method.
As expected, people started complaining. The initial response from GitHub employee (and top Git contributor) brian m. carlson was less than completely understanding:
I'm saying that the policy has never been different and we have never guaranteed stable checksums for archives, just as Git has never guaranteed it. I apologize that things aren't working here and that there hasn't been clearer communication about this in the past, but our policy hasn't changed in over 4 years.
The Git developers have not made a decision yet and are only discussing possible courses of action. The options under consideration include: reverting to the external gzip utility by default; adding a "--stable" flag to preserve compatibility with older archives; tying the built-in implementation to a separate file format; using the gzip utility for old commits and the built-in implementation for commits after a certain date; and guaranteeing format stability only for uncompressed archives.
The complexity of the decision stems from the fact that reverting to the external utility does not completely solve the checksum invariance problem, since a change in the external gzip program can also alter the resulting archive.
Currently, a patch set is under review that reverts to the previous default behavior (invoking an external gzip utility) and falls back to the built-in implementation when the gzip utility is not present on the system. The patches also add a note to the documentation that the output of "git archive" is not guaranteed to be stable and that the format may change in the future.
Finally, if you are interested in learning more about it, you can check the details at the following link.