Value too large for defined data type -can't read compressed 

Joined: Sun Jan 30, 2011 18:54
Posts: 14
Post Value too large for defined data type -can't read compressed
ntfs-3g-2013.1.13 is incompatible with Windows 8 compressed files.

I have dozens of files which cannot be read; ntfs-3g spews this error message when I try to access them.

jpa wrote:
Hi,

Code:
ntfs_attr_pread error reading '/Users/Administrator/AppData/Local/Microsoft/Windows/Themes/Bing's Be/DesktopBackground/hawaii.jpg' at offset 2555904: 3913 <> -1: Value too large for defined data type

This probably means the file is corrupt.

First, what is its apparent size? Is it greater than 2555904?

Then, is it really a compressed file? There is no point in compressing a jpeg file... What are the outputs of:
Code:
# mind the quotes
getfattr -e hex -n system.ntfs_attrib_be "/Users/Administrator/AppData/Local/Microsoft/Windows/Themes/Bing's Be/DesktopBackground/hawaii.jpg"
du "/Users/Administrator/AppData/Local/Microsoft/Windows/Themes/Bing's Be/DesktopBackground/hawaii.jpg"
du --apparent-size "/Users/Administrator/AppData/Local/Microsoft/Windows/Themes/Bing's Be/DesktopBackground/hawaii.jpg"
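
For reference, the value returned by getfattr is the 32-bit NTFS attribute word; FILE_ATTRIBUTE_COMPRESSED is the 0x800 bit, so a value such as 0x00000820 decodes as ARCHIVE | COMPRESSED. As a minimal sketch (illustrative only, not part of ntfs-3g), the same check can be done from C through the system.ntfs_attrib_be extended attribute:
Code:
#include <stdio.h>
#include <stdint.h>
#include <sys/xattr.h>

int main(int argc, char **argv)
{
	uint8_t buf[4];
	uint32_t attrib;

	if (argc != 2) {
		fprintf(stderr, "usage: %s file-on-ntfs\n", argv[0]);
		return (1);
	}
	/* ntfs-3g returns the attribute word as 4 big-endian bytes */
	if (getxattr(argv[1], "system.ntfs_attrib_be", buf, sizeof(buf)) != 4) {
		perror("getxattr");
		return (1);
	}
	attrib = ((uint32_t)buf[0] << 24) | ((uint32_t)buf[1] << 16)
			| ((uint32_t)buf[2] << 8) | (uint32_t)buf[3];
	printf("attributes 0x%08lx, compressed: %s\n",
		(unsigned long)attrib, (attrib & 0x800) ? "yes" : "no");
	return (0);
}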

Finally, can you read the file on Windows? If so, rename it within the same directory (in order to keep it unchanged for further examination), then copy it to an uncompressed directory on the same partition, then rename (move) the copy back to its original name in the original directory (you obviously have to do that on Windows, and do not confuse renaming with copying).

Regards

Jean-Pierre


I have many files of different sizes. What difference does it make? My entire Windows Documents folder is compressed - I don't care which files are compressible and which are not.

Yes, all those files can be read perfectly from Windows 8 and even from a Windows XP PE environment.

What's the point of renaming those files? It makes zero sense to me. I can rename them all just fine - rename() doesn't read the file, it only changes its name in the MFT. But I cannot copy them anywhere - they cannot be read.


Thu May 09, 2013 14:12

Joined: Sun Jan 30, 2011 18:54
Posts: 14
Post Re: Value too large for defined data type -can't read compressed
Here's one such file:

Code:
dd if=nvidiaInspector.zip of=/tmp/nvidiaInspector.zip bs=4K
dd: reading `nvidiaInspector.zip': Value too large for defined data type
48+0 records in
48+0 records out
196608 bytes (197 kB) copied, 0.000662981 s, 297 MB/s


ntfs-3g cannot read the 4096 bytes starting at offset 196608 (right after the first 48 records).

You can download it here:

Code:
$ ls -la
-rw-rw-r--   1 root   root   237310 Apr  1 14:18 nvidiaInspector.zip

$ md5sum nvidiaInspector.zip
6c579337b00ad4de438b1047c3ee52be  nvidiaInspector.zip


Thu May 09, 2013 14:25

Joined: Sun Jan 30, 2011 18:54
Posts: 14
Post Re: Value too large for defined data type -can't read compressed
Code:
unique: 266, opcode: LOOKUP (1), nodeid: 34, insize: 60
LOOKUP /Users/User/Downloads/nvidiaInspector.zip
   NODEID: 70
   unique: 266, error: 0 (Success), outsize: 136
unique: 267, opcode: OPEN (14), nodeid: 70, insize: 48
   unique: 267, error: 0 (Success), outsize: 32
OPEN[0] flags: 0x8000 /Users/User/Downloads/nvidiaInspector.zip
unique: 268, opcode: FLUSH (25), nodeid: 70, insize: 64
FLUSH[0]
   unique: 268, error: -38 (Function not implemented), outsize: 16
unique: 269, opcode: READ (15), nodeid: 70, insize: 80
READ[0] 16384 bytes from 0
   READ[0] 16384 bytes
   unique: 269, error: 0 (Success), outsize: 16400
unique: 270, opcode: READ (15), nodeid: 70, insize: 80
READ[0] 32768 bytes from 16384
   READ[0] 32768 bytes
   unique: 270, error: 0 (Success), outsize: 32784
unique: 271, opcode: READ (15), nodeid: 70, insize: 80
READ[0] 65536 bytes from 49152
   READ[0] 65536 bytes
   unique: 271, error: 0 (Success), outsize: 65552
unique: 272, opcode: READ (15), nodeid: 70, insize: 80
READ[0] 122880 bytes from 114688
Failed to decompress file: Value too large for defined data type
ntfs_attr_pread error reading '/Users/User/Downloads/nvidiaInspector.zip' at offset 114688: 122622 <> 81920: Value too large for defined data type
Failed to decompress file: Value too large for defined data type
ntfs_attr_pread error reading '/Users/User/Downloads/nvidiaInspector.zip' at offset 196608: 40702 <> -1: Value too large for defined data type
   unique: 272, error: -75 (Value too large for defined data type), outsize: 16


Thu May 09, 2013 14:54

Joined: Sun Jan 30, 2011 18:54
Posts: 14
Post Re: Value too large for defined data type -can't read compressed
Other known files which cannot be decompressed:

HxDSetupEN.zip
md5sum: 18df5e00110513f15882709d06947f95
Failure at byte 851967.

http://download1.msi.com/files/downloads/uti_exe/vga/MSIAfterburnerSetup231.zip
md5sum: 8cca8339dd40860171e4ddcc81022f4d
Failure at byte 22937601.


Thu May 09, 2013 15:06

Joined: Sun Jan 30, 2011 18:54
Posts: 14
Post Re: Value too large for defined data type -can't read compressed
OK, here's the i686 binary which I use.

ntfs-3g.tar.gz (along with debug information).

SHA1: 9896458f59f41991134505f290852c690a5f0820

Maybe it's my platform (CentOS 6.4, i686), compiler (GCC 4.5.4, vanilla) or compilation flags (-m32 -O2 -march=pentium2), who knows?


Thu May 09, 2013 15:19
NTFS-3G Lead Developer

Joined: Tue Sep 04, 2007 17:22
Posts: 1286
Post Re: Value too large for defined data type -can't read compressed
Hi,

Quote:
ntfs-3g-2013.1.13 is incompatible with Windows 8 compressed files.

That could well be the case, though I have doubts about it, as you only mention files which are unlikely to be compressible, and a new algorithm would have impacted all files.
Quote:
Here's one such file:

I have processed this file using your copy of ntfs-3g. It cannot be compressed (with the usual ntfs compression algorithm), but its contents can.

Nothing wrong was found: your copy of ntfs-3g is correctly compiled and sane.

To check for a possible change of compression algorithm in Windows 8, can you post the file metadata (please post the metadata for your unreadable copy of the file I have downloaded):

a) determine its inode: ls -li nvidiaInspector.zip
b) display the metadata: ntfsinfo -fvi inode-number device

Also please post your mount options.

Regards

Jean-Pierre


Thu May 09, 2013 22:30

Joined: Sun Jan 30, 2011 18:54
Posts: 14
Post Re: Value too large for defined data type -can't read compressed
ntfsinfo:
Code:
Forced to continue.
Dumping Inode 98008 (0x17ed8)
Upd. Seq. Array Off.:    48 (0x30)
Upd. Seq. Array Count:   3 (0x3)
Upd. Seq. Number:        9 (0x9)
LogFile Seq. Number:     0x3c118c7b
MFT Record Seq. Numb.:   4 (0x4)
Number of Hard Links:    1 (0x1)
Attribute Offset:        56 (0x38)
MFT Record Flags:        IN_USE
Bytes Used:              368 (0x170) bytes
Bytes Allocated:         1024 (0x400) bytes
Next Attribute Instance: 7 (0x7)
MFT Padding:    00 00
Dumping attribute $STANDARD_INFORMATION (0x10) from mft record 98008 (0x17ed8)
        Attribute length:        96 (0x60)
        Resident:                Yes
        Name length:             0 (0x0)
        Name offset:             0 (0x0)
        Attribute flags:         0x0000
        Attribute instance:      0 (0x0)
        Data size:               72 (0x48)
        Data offset:             24 (0x18)
        Resident flags:          0x00
        ReservedR:               0 (0x0)
        File Creation Time:      Wed Apr 10 18:06:59 2013 UTC
        File Altered Time:       Mon Apr  1 08:18:07 2013 UTC
        MFT Changed Time:        Thu May  9 12:14:27 2013 UTC
        Last Accessed Time:      Thu May  9 12:29:17 2013 UTC
        File attributes:         ARCHIVE COMPRESSED (0x00000820)
        Maximum versions:        0
        Version number:          0
        Class ID:                0
        User ID:                 0 (0x0)
        Security ID:             913 (0x391)
        Quota charged:           0 (0x0)
        Update Sequence Number:  255845648 (0xf3fe510)
Dumping attribute $FILE_NAME (0x30) from mft record 98008 (0x17ed8)
        Attribute length:        128 (0x80)
        Resident:                Yes
        Name length:             0 (0x0)
        Name offset:             0 (0x0)
        Attribute flags:         0x0000
        Attribute instance:      6 (0x6)
        Data size:               104 (0x68)
        Data offset:             24 (0x18)
        Resident flags:          0x01
        ReservedR:               0 (0x0)
        Parent directory:        79485 (0x1367d)
        File Creation Time:      Wed Apr 10 18:06:59 2013 UTC
        File Altered Time:       Mon Apr  1 08:18:07 2013 UTC
        MFT Changed Time:        Thu May  9 12:10:58 2013 UTC
        Last Accessed Time:      Wed Apr 10 18:06:59 2013 UTC
        Allocated Size:          241664 (0x3b000)
        Data Size:               237310 (0x39efe)
        Filename Length:         19 (0x13)
        File attributes:         ARCHIVE COMPRESSED (0x00000820)
        Namespace:               POSIX
        Filename:                'nvidiaInspector.zip'
Dumping attribute $DATA (0x80) from mft record 98008 (0x17ed8)
        Attribute length:        80 (0x50)
        Resident:                No
        Name length:             0 (0x0)
        Name offset:             0 (0x0)
        Attribute flags:         0x0001
        Attribute instance:      4 (0x4)
        Lowest VCN               0 (0x0)
        Highest VCN:             63 (0x3f)
        Mapping pairs offset:    72 (0x48)
        Compression unit:        4 (0x4)
        Data size:               237310 (0x39efe)
        Allocated size:          262144 (0x40000)
        Initialized size:        237310 (0x39efe)
        Compressed size:         241664 (0x3b000)
        Runlist:        VCN             LCN             Length
                        0x0             0xa617c         0x3b
                        0x3b            <HOLE>          0x5
End of inode reached
Total runs: 2 (fragments: 1)


Mount options:
Code:
/proc/mounts: rw,relatime,user_id=0,group_id=0,default_permissions,allow_other,blksize=4096
/etc/fstab: defaults,fmask=0133,dmask=0022,nls=utf8,rw,compression


Fri May 10, 2013 01:32
NTFS-3G Lead Developer

Joined: Tue Sep 04, 2007 17:22
Posts: 1286
Post Re: Value too large for defined data type -can't read compressed
Hi,

I do not see anything special in the metadata or mount options. The error appears when reading the last compression block. In your case compression blocks are 65536 bytes long (16 clusters of 4096 bytes), and reading 122880 bytes from position 114688 has to be decomposed into three parts whose sizes are 16384, 65536 and 45056.

This last part is the one which fails on your computer (but not on mine, using your build of ntfs-3g). Note that it extends slightly beyond the actual file size, as is customary for the last block of a file.
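
To make the arithmetic concrete, here is a minimal sketch (illustrative only, not ntfs-3g's actual code) which splits a read request on 64 KiB compression block boundaries. Clipping the last chunk at the apparent file size yields the 40702 uncompressed bytes reported for the last block in your trace, while on disk that block still occupies 45056 compressed bytes (11 clusters):
Code:
#include <stdio.h>

#define CB_SIZE 65536UL	/* compression block: 16 clusters of 4096 bytes */

/* Split a read of "count" bytes at "offset" on compression block
 * boundaries, clipping at the apparent file size. */
static void split_read(unsigned long offset, unsigned long count,
			unsigned long file_size)
{
	unsigned long end, block_end, chunk;

	end = offset + count;
	if (end > file_size)
		end = file_size;
	while (offset < end) {
		block_end = (offset/CB_SIZE + 1)*CB_SIZE;
		chunk = (block_end < end ? block_end : end) - offset;
		printf("block %lu : %lu bytes at offset %lu\n",
			offset/CB_SIZE, chunk, offset);
		offset += chunk;
	}
}

int main(void)
{
	/* the failing request from the trace: READ 122880 bytes from 114688 */
	split_read(114688, 122880, 237310);
	return (0);
}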

I can imagine two possibilities:

1) You are using some special device (an SSD?) which recognizes that the last requested cluster was not fully written, and therefore refuses to read it. This is unlikely, but possible if the physical storage unit (the sector) is smaller than a cluster (4096 bytes in your case).

2) There is a special compression pattern in the last block.

By reading the physical block, we can know whether either of these possibilities applies. According to the metadata, the last block is at cluster 0xa61ac (0xa617c + 0x30, since the first three 16-cluster compression blocks occupy the first 0x30 clusters of the run), which is 680364 in decimal, and has an actual size of 11 clusters (0x3b - 0x30 = 0xb; 11*4096 = 45056 bytes), so please do:
Code:
dd if=device of=lastblock bs=4096 skip=680364 count=11

If you get a read error, there is something wrong related to your device. If you can read it, there could be some special pattern in it; in that case, please post the "lastblock" file for examination (note: to post an attachment to this forum, you have to compress it first).

Regards

Jean-Pierre


Fri May 10, 2013 10:49

Joined: Sun Jan 30, 2011 18:54
Posts: 14
Post Re: Value too large for defined data type -can't read compressed
1) My device is not special in any way - it's a normal 4K-sector 1 TB 7200 RPM HDD.
2) Here's the resulting file. I get no read errors on my HDD - it's brand new.

Thank you for your help. I hope the attached file gives you some insight into what's going on here (it's not the only file I cannot read using ntfs-3g).


Attachments:
lastblock.zip [42.2 KiB]
Downloaded 979 times
Sun May 12, 2013 22:34
NTFS-3G Lead Developer

Joined: Tue Sep 04, 2007 17:22
Posts: 1286
Post Re: Value too large for defined data type -can't read compressed
Hi,

Quote:
I hope the attached file will give you insight on what's going on here

It did!

There is garbage after the meaningful end of the file, and the usual end marker is not present. With a compressed file, the problem is that you cannot know in advance how much compressed data has to be read to produce the expected uncompressed size.
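
To illustrate (a sketch of the LZNT1 layout as specified in [MS-XCA], not ntfs-3g's actual code): inside a compression block, each chunk is preceded by a 2-byte header whose low 12 bits encode the chunk size, and a header word of zero is the end marker. A reader that never finds the marker cannot tell real compressed data from trailing garbage:
Code:
#include <stdint.h>
#include <stddef.h>

/* Walk the chunk headers of one compression block and return the
 * offset just past the end marker, or (size_t)-1 if the buffer is
 * exhausted before a zero header is found (i.e. trailing garbage
 * with no end marker). */
static size_t find_end_marker(const uint8_t *cb, size_t cb_size)
{
	size_t pos = 0;
	uint16_t hdr;

	while (pos + 2 <= cb_size) {
		hdr = (uint16_t)(cb[pos] | (cb[pos + 1] << 8)); /* little-endian */
		if (!hdr)
			return (pos + 2);	/* end marker found */
		/* low 12 bits: total chunk size minus 3, header included */
		pos += (size_t)(hdr & 0x0fff) + 3;
	}
	return ((size_t)-1);		/* buffer ran out without an end marker */
}

int main(void)
{
	/* fabricated example: one 4-byte raw chunk (header 0x3003 is
	 * illustrative only), followed by the zero end marker */
	static const uint8_t block[] =
		{ 0x03, 0x30, 'a', 'b', 'c', 'd', 0x00, 0x00 };

	return (find_end_marker(block, sizeof(block)) == sizeof(block) ? 0 : 1);
}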

Can you please try applying the attached patch, and check your compressed files?

Thank you for reporting the issue.

Regards

Jean-Pierre


Attachments:
compress-lastblock.patch.gz [1.58 KiB]
Downloaded 973 times
Mon May 13, 2013 12:10
NTFS-3G Lead Developer

Joined: Tue Sep 04, 2007 17:22
Posts: 1286
Post Re: Value too large for defined data type -can't read compressed
Hi again,

I found an easier (and probably safer) way to fix the issue. Please apply it to the original source and try.

Regards

Jean-Pierre


Attachments:
compress-lastblock-v2.patch.gz [736 Bytes]
Downloaded 987 times
Mon May 13, 2013 16:47

Joined: Sun Jan 30, 2011 18:54
Posts: 14
Post Re: Value too large for defined data type -can't read compressed
jpa wrote:
Hi again,

I found an easier (and probably safer) way to fix the issue. Please apply it to the original source and try.

Regards

Jean-Pierre


Thank you! This patch has fixed the problem for me. I hope the next ntfs-3g release will contain this fix (and if you mention my name in the Changelog as the person who reported the bug, I'll be even happier ;-) ).

// Artem S. Tashkinov


Wed May 15, 2013 08:28
NTFS-3G Lead Developer

Joined: Tue Sep 04, 2007 17:22
Posts: 1286
Post Re: Value too large for defined data type -can't read compressed
Hi,

Quote:
This patch has fixed the problem for me.

Good news!
Quote:
I hope the next ntfs-3g release will contain this fix

It should. However, I made an advanced release last week, and the next one will be in several months.

Thank you for reporting the issue and helping to solve it.

Regards

Jean-Pierre


Wed May 15, 2013 16:59