Here it is:
https://github.com/Norbyte/lslib/tree/master/LSLib

The problematic part starts here:
https://github.com/Norbyte/lslib/blob/master/LSLib/LS/LSFReader.cs#L483

The problem is that the LSFv2 format introduced chunked compression, where the LZ4 stream is split into multiple parts. However, simply putting the chunks back together corrupts the output stream subtly, i.e. some output bytes become consistently different while most of the others stay the same. Because of this, streams over 0x8000 bytes (which are the ones that get chunked) will probably decompress incorrectly. This is what causes the out-of-memory error: the decompressed length values get corrupted, so the array sizes computed from them become impossibly large.
I don't really know why that happens yet, but I'll experiment with it a bit when I have some time.
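One possibility worth testing (just a guess at this point, not something I've confirmed against the LSFv2 files): the chunks might be "linked" LZ4 blocks, where a block's matches can reference the previous block's decompressed output. Decoding each chunk in isolation would then produce output that is mostly right but wrong exactly where a match crosses a chunk boundary, which matches the "some bytes consistently different" symptom. Here is a minimal sketch of linked-block decoding using the reference liblz4 C API rather than LSLib's C# code; the `chunks`/`chunk_sizes` layout is invented for illustration and is not the actual LSFv2 framing:

```c
/* Sketch: decoding chunked LZ4 where blocks may be "linked", i.e. a block's
 * matches can reference the previous block's decompressed output.
 * Chunk layout here is hypothetical; this is NOT the LSFv2 on-disk format.
 * Requires liblz4 (lz4.h). */
#include <lz4.h>
#include <stdint.h>

/* Decode `count` compressed chunks into one contiguous output buffer.
 * Because each chunk is decoded directly after the previous chunk's output,
 * LZ4_decompress_safe_continue() can resolve matches that reach back into
 * earlier chunks. Decoding each chunk with plain LZ4_decompress_safe()
 * instead would leave those cross-chunk matches unresolved and corrupt
 * scattered bytes while most of the output still looks correct. */
static int decode_linked_chunks(const uint8_t *const *chunks,
                                const int *chunk_sizes, int count,
                                uint8_t *out, int out_capacity)
{
    LZ4_streamDecode_t stream;
    int total = 0;

    LZ4_setStreamDecode(&stream, NULL, 0);   /* fresh stream, no dictionary */

    for (int i = 0; i < count; i++) {
        int n = LZ4_decompress_safe_continue(
            &stream,
            (const char *)chunks[i],
            (char *)(out + total),
            chunk_sizes[i],
            out_capacity - total);
        if (n < 0)
            return -1;                       /* malformed chunk */
        total += n;
    }
    return total;                            /* total decompressed size */
}
```

Decoding into one contiguous buffer is the simplest way to make the cross-chunk back-references resolvable; the alternative is to hand the previous chunk's decompressed bytes to `LZ4_setStreamDecode()` as an explicit dictionary before each chunk. If the LSFv2 chunks turn out to be fully independent blocks instead, none of this applies and the corruption has some other cause.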