FileGetSize 2 GB limit



Guest frank_que

Is it possible to modify the FileGetSize function to get the size of files greater than 2 GB?

Thanks

Frank


Hi,

You're right, strange result for a 2.89 GB file: -1190395904 under XP with NTFS.

I think it's a bug, but I have a workaround:

$before = DriveSpaceFree("C:\")

;This takes time
$FC = FileCopy("myvideo.vob", "myvideo.tmp")

$after = DriveSpaceFree("C:\")

If $FC = 1 Then
    ;DriveSpaceFree() returns megabytes, so the difference is the file size in MB
    MsgBox(4096, "Filesize is:", $before - $after)
    ;This is faster ;-)
    FileDelete("myvideo.tmp")
EndIf

I wonder if this is a Windows problem: when I search for files at least 20,971,520 KB (20 GB) in size, I get back a ton of files, all of them under that size. Same for a search of at least 2,097,152 KB, which is exactly 2 GB...

So, that's winXP...

"I'm not even supposed to be here today!" -Dante (Hicks)


Guest frank_que

I've looked into the AutoIt sources: it uses the GetFileSize() function.

From the MSDN Library documentation ( http://msdn.microsoft.com/library/default....getfilesize.asp ):

"The GetFileSize function retrieves the size of the specified file.

The file size that can be reported by this function is limited to a DWORD value.

To retrieve a file size that is larger than a DWORD value, use the GetFileSizeEx function."

I'm not a C++ programmer; can someone help me implement it?

Thanks

Frank
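To make that MSDN note concrete, here is a minimal sketch of a GetFileSizeEx call (illustration only, not the AutoIt source; the file name is just borrowed from the earlier example, and GetFileSizeEx is only available on NT-based Windows):

// Illustration only, not the AutoIt source.
#include <windows.h>
#include <stdio.h>

int main(void)
{
    HANDLE hFile = CreateFileA("myvideo.vob", GENERIC_READ, FILE_SHARE_READ,
                               NULL, OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, NULL);
    if (hFile == INVALID_HANDLE_VALUE)
        return 1;

    LARGE_INTEGER size;                           // signed 64-bit value
    if (GetFileSizeEx(hFile, &size))
        printf("%I64d bytes\n", size.QuadPart);   // no 2 GB / 4 GB truncation
    CloseHandle(hFile);
    return 0;
}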


Yeah, I think the problem is not "GetFileSize" itself but the conversion of the result:

vResult = (int)GetFileSize(fileIn, NULL);

You see, it's cast to an integer, so a file greater than 2 GB is a problem, because the limit of "int" is 2,147,483,647.

The most this function can return is a DWORD (4,294,967,295).

So if someone has files bigger than that, the problem comes back, but for sizes up to 4 GB why not use this instead (C++):

...
char *dwMaxSize = "0";
...
_ultoa(GetFileSize(fileIn, NULL),dwMaxSize,10);
vResult = dwMaxSize;
...

Only an idea...

Holger


  • Administrators


We can use a double instead to get to 4 GB. Maybe use GetFileSizeEx when running under NT OSes.

Holger, you do know that code you posted has a buffer overrun, yeah? It's just that one of your submissions had a similar thing in it that I changed :ph34r:
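A rough sketch of that double idea, reusing the variable names from the snippet above (hypothetical helper, not the actual AutoIt source):

// Sketch only: combine GetFileSize's two 32-bit halves and return the result
// as a double, which is exact up to 2^53 bytes -- far past 4 GB.
#include <windows.h>

static double MyFileSize(HANDLE fileIn)           // hypothetical helper name
{
    ULARGE_INTEGER size;
    size.LowPart = GetFileSize(fileIn, &size.HighPart);
    if (size.LowPart == INVALID_FILE_SIZE && GetLastError() != NO_ERROR)
        return -1.0;                              // real code would report the error
    return (double)(__int64)size.QuadPart;        // cast via __int64 for old MSVC
}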


@Jon: I hope I understand :ph34r:

I think you mean the line:

char *dwMaxSize = "0";

You're absolutely right.

I read a bit about buffer overruns and security problems a while ago...

So I read it again, and now I understand it.

Better (I hope) would be:

char dwMaxSize[10];

I will install NT 4.0 later on my test PC as another partition and check what it says for a 10 GB file with "GetFileSizeEx"...


  • Administrators


Yeah, that's the one.

char *dwMaxSize = "0";

just allocates 2 bytes ("0" and "\0"), so _ultoa writes past the end of it.
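One small follow-up for anyone reusing the stack-buffer version: the largest DWORD, 4294967295, is ten digits, and _ultoa also writes a terminating '\0', so the buffer needs eleven bytes:

char dwMaxSize[11];   // 10 digits for 4294967295, plus one byte for the '\0'
_ultoa(GetFileSize(fileIn, NULL), dwMaxSize, 10);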


Why? Do you have files bigger than 4GB?

I don't, but that is not a reason for keeping a buggy function in AutoIt.

For example, a .rar archive can be at most 8,589,934,591 GB, so the 4 GB limit can get tight.
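For illustration, a sketch of how the above-4-GB case could be covered on NT-based systems: read the 64-bit size with GetFileSizeEx and format it as a decimal string, so neither the int (2 GB) nor the DWORD (4 GB) limit applies (hypothetical helper, not the actual AutoIt code):

// Hypothetical helper, not the actual AutoIt code.
#include <windows.h>
#include <stdio.h>

static void FileSizeToString(HANDLE hFile, char *buf, size_t bufLen)
{
    LARGE_INTEGER size;                                    // signed 64-bit
    if (GetFileSizeEx(hFile, &size))
        _snprintf(buf, bufLen, "%I64d", size.QuadPart);    // buf needs ~21 chars
    else
        buf[0] = '\0';
}

A double would also work up to 2^53 bytes, but a string keeps full precision for anything a filesystem can actually hold.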
