
> 2GB file


Rizzet

Hi, I need to read a file that is larger than 2 GB as binary. I don't need to read it all at once (it's mostly the last part I need to read).

But with FileOpen() and FileRead() I have to start reading from the beginning, and the whole file doesn't fit in my RAM.

Edited by Rizzet

There have been recent discussions about a related topic involving APIFileOpen or APIFileRead/Write and whatnot... the key is using APIFileSetPos() or the like... read the file where it sits... don't load it into memory...

I will try to find the link.

Lar.
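To illustrate that approach, here is an untested sketch that goes straight at the Win32 API with DllCall rather than the APIFile* UDFs Larry mentions (the function name _ReadChunk and the example path are made up for the example):

Func _ReadChunk($sFile, $iSize, $iOffset)
    ; Open the file read-only: GENERIC_READ = 0x80000000, FILE_SHARE_READ = 1, OPEN_EXISTING = 3
    Local $aOpen = DllCall("kernel32.dll", "handle", "CreateFileW", "wstr", $sFile, "dword", 0x80000000, _
            "dword", 1, "ptr", 0, "dword", 3, "dword", 0, "ptr", 0)
    If @error Or $aOpen[0] = Ptr(-1) Then Return SetError(1, 0, "")

    ; Move the file pointer with a full 64-bit offset (FILE_BEGIN = 0), so positions beyond 2 GB are fine
    DllCall("kernel32.dll", "bool", "SetFilePointerEx", "handle", $aOpen[0], "int64", $iOffset, "ptr", 0, "dword", 0)

    ; Read only the requested slice into a local buffer
    Local $tBuffer = DllStructCreate("byte[" & $iSize & "]")
    Local $aRead = DllCall("kernel32.dll", "bool", "ReadFile", "handle", $aOpen[0], "ptr", DllStructGetPtr($tBuffer), _
            "dword", $iSize, "dword*", 0, "ptr", 0)
    Local $iRead = $aRead[4] ; number of bytes actually read

    DllCall("kernel32.dll", "bool", "CloseHandle", "handle", $aOpen[0])
    Return SetExtended($iRead, DllStructGetData($tBuffer, 1))
EndFunc

; e.g. _ReadChunk("D:\path\to\huge.bin", 64, 1234567890) would return 64 bytes starting at that offset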



You can read some data at a desired offset like this:

Global $file = @AutoItExe
Global $size = 67
Global $offset = 428568 

MsgBox(0, "Reading some data at offset", "67 bytes of data inside @AutoItExe at offset 428568:" _
         & @CRLF & @CRLF & _ReadFile($file, $size, $offset))



Func _ReadFile($file, $size, $offset = 0, $mode = 0)

    $mode = Not ($mode = 0) ; 0 or omitted = return text (char buffer), anything else = return binary (byte buffer)

    Local $aCall_h = DllCall("msvcrt.dll", "ptr:cdecl", "fopen", "str", $file, "str", "r")
    If @error Or Not $aCall_h[0] Then
        Return SetError(1) ; error opening file
    EndIf

    ; fpos_t is a 64-bit value in the MS CRT, so pass the offset as int64
    Local $aCall_pos = DllCall("msvcrt.dll", "int:cdecl", "fsetpos", "ptr", $aCall_h[0], "int64*", $offset)
    If @error Or $aCall_pos[0] Then
        DllCall("msvcrt.dll", "int:cdecl", "fclose", "ptr", $aCall_h[0])
        Return SetError(2) ; error setting position
    EndIf

    Local $tBuffer
    If $mode Then
        $tBuffer = DllStructCreate("byte[" & $size & "]")
    Else
        $tBuffer = DllStructCreate("char[" & $size & "]")
    EndIf

    If @error Then
        DllCall("msvcrt.dll", "int:cdecl", "fclose", "ptr", $aCall_h[0])
        Return SetError(3) ; error creating local buffer
    EndIf

    Local $aRet = DllCall("msvcrt.dll", "uint:cdecl", "fread", "ptr", DllStructGetPtr($tBuffer), "int", 1, "int", $size, "ptr", $aCall_h[0])
    If @error Then
        DllCall("msvcrt.dll", "int:cdecl", "fclose", "ptr", $aCall_h[0])
        Return SetError(4) ; error reading
    EndIf

    Local $aClose = DllCall("msvcrt.dll", "int:cdecl", "fclose", "ptr", $aCall_h[0]) ; closing file

    Return SetError(0, $aClose[0], DllStructGetData($tBuffer, 1))

EndFunc
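
Since the first post says it's mostly the end of the file that is needed, one way (untested) to pull, say, the last 1024 bytes as binary with the function above is to work out the offset from FileGetSize():

Global $sBigFile = "D:\path\to\huge.bin" ; placeholder path
Global $iTail = 1024
Global $dLast = _ReadFile($sBigFile, $iTail, FileGetSize($sBigFile) - $iTail, 1) ; mode 1 = binary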
Edited by trancexx



Sorry for hijacking this topic, but let's say I want to generate a hash of a large file, would this be the best method to use?

Because Larry said "read it where it sits", I assume this method would be the best way to get the data without reading the whole file into memory, but is it possible to work with the data too, like passing it to another function, still without reading it all into memory?

Splitting the file up and reading it in smaller parts wouldn't work if I want to generate a hash.

I can't try it myself right now because I'm not at home, but the idea just hit me.



I don't get the part about not reading it into memory. Every variable that is assigned is in memory.

You will need to read all of the file to get its hash. That's the whole point of hashes.

Unless you mean something like Hash(Hash(part1), Hash(part2), Hash(part3), ...) = Hash(File) - that would work, but that's a different story.

Edited by trancexx



Every hash function I've seen ends up going through all the data in a file to hash it, just in chunks. I was thinking of making a file hasher using the AutoIt functions, but from this discussion, I'd probably be better off using the API calls to open the file and only read in a section of it at a time. Not hard to do, just more difficult than I thought.

@Pain, if you want some help, I'd be glad to give you some pointers with what you're trying to do.
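
For what it's worth, here is a rough sketch of that chunked approach (untested; it assumes the Crypt.au3 UDF is available and that the incremental form of _Crypt_HashData with $bFinal = False behaves as documented; chunk size and algorithm are arbitrary):

#include <Crypt.au3>

; Hash a file of any size in 1 MB chunks, so the whole thing is never in memory at once
Func _HashLargeFile($sFile, $iChunk = 1048576)
    Local $hFile = FileOpen($sFile, 16) ; 16 = binary mode
    If $hFile = -1 Then Return SetError(1, 0, "")

    _Crypt_Startup() ; keep the crypto provider open across the incremental calls
    Local $hHash = 0, $dChunk
    While 1
        $dChunk = FileRead($hFile, $iChunk)
        If @error Then ExitLoop ; end of file reached
        $hHash = _Crypt_HashData($dChunk, $CALG_MD5, False, $hHash) ; feed this chunk, keep the hash handle
    WEnd
    Local $dHash = _Crypt_HashData(Binary(""), $CALG_MD5, True, $hHash) ; finalise and fetch the digest

    _Crypt_Shutdown()
    FileClose($hFile)
    Return $dHash
EndFunc

Newer AutoIt versions also ship _Crypt_HashFile(), which does this kind of chunked reading for you, so that may already cover the "hash a huge file" case.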


@trancexx

I know I have to read all the data to generate a hash, but if the file is several GB I won't be able to.

(I'm using FileOpen and FileRead atm)

So I was thinking that maybe I could read it from "where it sits" (the hdd).

Hash(Hash(part1), Hash(part2), Hash(part3), ...) wouldn't give the correct hash, or I've misunderstood what the "," does.

; MD5 hashes
$part1 = "hello " ; f814893777bcc2295fff05f00e508da6
$part2 = "world"  ; 7d793037a0760186574b0282f2f435e7

Hash(Hash($part1) & Hash($part2)) ; a9241ba5acd28b215123d94a556f0dcc
Hash("hello world")               ; 5eb63bbbe01eeed093cb22bb8f5acdc3

@SkinnyWhiteGuy

I appreciate it; I might send you a PM if I don't manage to make it read in blocks myself.


  • 5 weeks later...

Hey guys, could you help me a bit?

I am trying to retrieve a portion of a big file.

I can't use FileRead because it would take too long.

I tried using trancexx's code and it worked great... except when I encounter 0x1A.

When I encounter 0x1A, _ReadFile returns 0x00 instead.

Thanks in advance!




Look here:

http://www.autoitscript.com/forum/index.php?showtopic=76829
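
Assuming the linked topic covers the same issue: the usual cause of 0x1A trouble with the CRT is that fopen() in text mode ("r") treats Ctrl-Z (0x1A) as end-of-file on Windows, so the read stops there. Opening the file in binary mode avoids that; in the _ReadFile() snippet above, the fopen line would become:

Local $aCall_h = DllCall("msvcrt.dll", "ptr:cdecl", "fopen", "str", $file, "str", "rb") ; "rb" = binary mode, no 0x1A translation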

