Quote Originally Posted by beaky
Fair enough, you win, although I'm amazed xdelta did so badly.
The filesystem is compressed. A fairly small change to the uncompressed filesystem shifts the alignment of the data relative to the compressed blocks, so every compressed block from that point on comes out different. The compressed representation of slightly different input tends to be substantially different, so there are very few similarities left for xdelta to take advantage of.
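To make that concrete, here is a minimal sketch. zlib stands in for whatever compressor the real image uses, and a crude fixed-window overlap measure stands in for xdelta's matching; the data and numbers are purely illustrative. A few inserted bytes leave the uncompressed data almost entirely reusable, while the independently compressed streams share almost nothing:

Code:
import zlib

def window_overlap(a, b, n=16):
    """Fraction of n-byte windows of a that occur anywhere in b: a crude,
    offset-tolerant proxy for the matches a delta tool could reuse."""
    wa = {a[i:i + n] for i in range(len(a) - n + 1)}
    wb = {b[i:i + n] for i in range(len(b) - n + 1)}
    return len(wa & wb) / max(len(wa), 1)

# Stand-in for an uncompressed filesystem image: repetitive, compressible data.
image = b"".join(b"file-%05d: some repeated payload data\n" % i for i in range(3000))

# A "small change": a few bytes inserted near the start shift everything after them.
patched = image[:100] + b"NEW" + image[100:]

print("uncompressed overlap:", round(window_overlap(image, patched), 3))
print("compressed overlap:  ",
      round(window_overlap(zlib.compress(image), zlib.compress(patched)), 3))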

My system uncompresses the filesystem, but leaves the structure untouched. The whole filesystem is chopped into manageable chunks, and xdelta is then used to create a patch for each chunk.
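A rough sketch of that chunk-and-diff step is below. It assumes the xdelta3 command-line tool is on the PATH; the 64 MiB chunk size, file names, and helper functions are illustrative choices, not the actual implementation:

Code:
import subprocess
from pathlib import Path

CHUNK_SIZE = 64 * 1024 * 1024  # 64 MiB; the real chunk size is a tuning choice

def split_into_chunks(image_path, out_dir, chunk_size=CHUNK_SIZE):
    """Chop an uncompressed image into fixed-size chunk files and return their paths."""
    out_dir = Path(out_dir)
    out_dir.mkdir(parents=True, exist_ok=True)
    chunks = []
    with open(image_path, "rb") as f:
        index = 0
        while True:
            data = f.read(chunk_size)
            if not data:
                break
            chunk = out_dir / f"chunk{index:04d}"
            chunk.write_bytes(data)
            chunks.append(chunk)
            index += 1
    return chunks

def diff_chunks(old_chunks, new_chunks, patch_dir):
    """Run xdelta3 over each corresponding pair of chunks, one small patch per chunk.
    Assumes both images split into the same number of chunks."""
    patch_dir = Path(patch_dir)
    patch_dir.mkdir(parents=True, exist_ok=True)
    for i, (old, new) in enumerate(zip(old_chunks, new_chunks)):
        patch = patch_dir / f"patch{i:04d}.vcdiff"
        # xdelta3 -e encodes a delta of the new chunk against the old chunk.
        subprocess.run(["xdelta3", "-e", "-f", "-s", str(old), str(new), str(patch)],
                       check=True)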

On reconstruction, the patched chunks are joined back together and the result is re-compressed.
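The matching reconstruction step might look like the sketch below, again with xdelta3 assumed and gzip standing in for whatever compressor the real image format uses; reproducing the original image exactly would also require the same compressor and settings:

Code:
import gzip
import subprocess
from pathlib import Path

def rebuild_image(old_chunks, patch_dir, output_path):
    """Apply each per-chunk patch, join the patched chunks in order, and recompress."""
    patch_dir = Path(patch_dir)
    with gzip.open(output_path, "wb") as out:
        for i, old in enumerate(old_chunks):
            patch = patch_dir / f"patch{i:04d}.vcdiff"
            rebuilt = patch.with_suffix(".chunk")
            # xdelta3 -d applies the patch against the old chunk to recreate the new one.
            subprocess.run(["xdelta3", "-d", "-f", "-s", str(old), str(patch), str(rebuilt)],
                           check=True)
            out.write(rebuilt.read_bytes())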

My system will only work if the order of the files remains substantially the same and the alignment does not shift by more than a few megabytes.

I have made a proposal for xdelta: give the rolling checksum a rolling window, so that large files can be diffed with a small memory footprint. xdelta does not currently work on most systems when the target files are gigabytes in size.
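For reference, the weak rolling checksum this kind of scheme is built on (shown here as the generic rsync/Adler-style sum, not xdelta's actual internals) can be slid one byte at a time in constant work, which is the property a bounded-memory, windowed implementation would rely on:

Code:
MOD = 1 << 16  # 16-bit halves, as in the rsync-style weak checksum

def weak_checksum(window):
    """Checksum of a full window; computed once, then maintained by roll()."""
    a = sum(window) % MOD
    b = sum((len(window) - i) * byte for i, byte in enumerate(window)) % MOD
    return a, b

def roll(a, b, out_byte, in_byte, window_len):
    """Slide the window one byte: drop out_byte, take in in_byte, in O(1) work."""
    a = (a - out_byte + in_byte) % MOD
    b = (b - window_len * out_byte + a) % MOD
    return a, b

# Sanity check: rolling the checksum matches recomputing it from scratch.
data = bytes(range(256)) * 16
W = 64
a, b = weak_checksum(data[:W])
for pos in range(1, len(data) - W + 1):
    a, b = roll(a, b, data[pos - 1], data[pos - 1 + W], W)
    assert (a, b) == weak_checksum(data[pos:pos + W])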