
Failed download....



Firstly, excuse me for asking anyone to sort through these 214 lines of code. This is probably the first script I have written that is over 80 lines, and it is certainly the first to serve a tangible purpose. So please excuse me if the code appears sloppy, and if you have comments, please share them, as I am looking to learn.

Now for my question: this script just goes online and looks for images on these servers. The naming scheme is fixed, and the script steps through the files in a set order. It saves every attempt to a log file, and that is how I have been debugging. Everything seems to work great: according to my log file I am requesting the file properly, and I can even copy and paste the request and get an image, but it doesn't download! Could anyone give me a hint as to why? Is it because I call InetGet($var1, $var2)?

;~ ***************************************************
;~ *Multi-Downloader
;~ *v1.0.0
;~ *3-1-10
;~ *
;~ *DoubleMcLovin
;~ *Project log ATI001
;~ *
;~ ***************************************************

;define server parameter variables
$date = @MDAY & "/" & @MON & "/" & @YEAR
$time = @HOUR & "." & @MIN & "." & @SEC
dim $x          ;primary loop counter
dim $y          ;secondary loop counter
dim $cover      ;downloads cover (set page = to cover)
dim $dl         ;counts the number of successful downloads
dim $success    ;records the most recent successful page number
dim $server[3]
    $server[0]="img07.nj.us."
    $server[1]="img27.tx.us."
    $server[2]="img28.nj.us."
$chapter = 001      ;chapters are book episodes
$page = 001         ;the page in the volume
$volume = 01        ;volume is the next book
$s1 = 0             ;used to change server
$file = FileOpen("log.txt",1)   ;makes new file for writing
;kill script
HotKeySet("{PAUSE}", "EndScript")
;set date and time in logfile
FileWriteLine($file, "Download started on: " & $date & " at: " & $time)
;main
While $x < 20       ;provides room for the program to search helplessly twenty times
    $volume = StringFormat("%02s",$volume)          ;These three lines format the numeric sequence
    $page = StringFormat("%03s",$page)              ;properly so that they number as "001" or "01"
    $chapter = StringFormat("%03s",$chapter)        ;appropriately.
    ;the url for downloading
    $url = "http://"
        $url &= $server[$s1] 
        $url &= "mangafox.com/store/manga/63/" 
        $url &= $volume 
        $url &= "-" 
        $url &= $chapter 
        $url &= ".0/compressed/Rurouni_Kenshin_v" 
        $url &= $volume 
        $url &= "_c" 
        $url &= $chapter 
        $url &= "_" 
        $url &= $page 
        $url &= ".jpg"
    ;the path for saving
    $path = "E:\system66\media\manga\rouroni.kenshin\v." 
        $path &= $volume 
        $path &= "\Rurouni_Kenshin_v" 
        $path &= $volume 
        $path &= "_c" 
        $path &= $chapter 
        $path &= "_" 
        $path &= $page 
        $path &= ".jpg"

    ;attempts to get the image
    InetGet($url,$path,1,0)
    $jj = InetGet($url,$path)
    ;debug log entry
    FileWriteLine($file,$url)
    FileWriteLine($file,$path)
    FileWriteLine($file,$jj)
    FileWriteLine($file,@error)
    FileWriteLine($file,":::::::::::::::")
    ;if the file fails to download
    If @error = 0 Then
        ;if the loop tried every server for this image
        If $x >= 2 Then
            ;has the loop tried to go to the next few pages?
            If $y >= 1 Then ;If server and page checks failed we will change the chapter and roll the page back to the last good page
                $chapter += 1
                $x += 1
                $y += 1
                $page = $success    ; This lets us change the chapter and continue the page count.
                                    ; Remember: Pages are given room for 10 page changes before the chapter changes
            Else
                If $y >= 10 Then    ;Has the loop checked server and pages and chapter?
                    $page = 0
                    $volume += 1    ;Since the server, page, and chapter are not to blame, let's go to the next
                                    ;volume and reset the server, the $y and the $x
                    $x = 0
                    $y = 0
                    $s1 = 0         ;sets the first server
                    ;Download the new cover image
                    While 1
                        $volume = StringFormat("%02s",$volume)
                        $page = StringFormat("%03s",$page)
                        $chapter = StringFormat("%03s",$chapter)
                        ;the url for the cover
                        $curl = "http://" 
                            $curl &= $server[$s1] 
                            $curl &= "mangafox.com/store/manga/63/" 
                            $curl &= $volume
                            $curl &= "-"
                            $curl &= $chapter 
                            $curl &= ".0/compressed/Rurouni_Kenshin_v" 
                            $curl &= $volume 
                            $curl &= "_000_Cover.jpg"
                        ;the save file path for the url
                        $cpath = "E:\system66\media\manga\rouroni.kenshin\v." 
                            $cpath &= $volume 
                            $cpath &= "\Rurouni_Kenshin_v" 
                            $cpath &= $volume 
                            $cpath &= "_000_Cover.jpg"
                        InetGet($curl,$cpath,1,0)
                        ;debug log entry
                        FileWriteLine($file,$curl)
                        FileWriteLine($file,$cpath)
                        FileWriteLine($file,@error)
                        FileWriteLine($file,":::::::::::::::")
                        If @error = 0 Then      ;Did the download fail?
                            $x += 1             ;counts the loop
                            If $x >= 4 Then     ;Have we tried all the servers?
                                FileWriteLine($file, "Download finished on: " & $date & " at: " & $time)
                                Exit ;we're done here!
                            Else
                                ;changes the server until the pack is downloaded
                                If $s1 < 2 Then
                                    $s1 += 1
                                ElseIf $s1 >= 2 Then
                                    $s1 = 0
                                EndIf
                            EndIf
                        Else                    ;If the download was good, $x=0
                            $x = 0
                            ExitLoop
                        EndIf
                    WEnd
                Else
                    ;If $y is not greater than or equal to 10
                    $page += 1                  ;change the page then start the DL loop
                    $x = 0                      ;allows us to count the server checks
                    While 1
                        $volume = StringFormat("%02s",$volume)
                        $page = StringFormat("%03s",$page)
                        $chapter = StringFormat("%03s",$chapter)
                        ;page url
                        $url = "http://" 
                            $url &= $server[$s1] 
                            $url &= "mangafox.com/store/manga/63/" 
                            $url &= $volume 
                            $url &= "-" 
                            $url &= $chapter 
                            $url &= ".0/compressed/Rurouni_Kenshin_v" 
                            $url &= $volume 
                            $url &= "_c" 
                            $url &= $chapter 
                            $url &= "_" 
                            $url &= $page 
                            $url &= ".jpg"
                        ;page savefile path
                        $path = "E:\system66\media\manga\rouroni.kenshin\v." 
                            $path &= $volume 
                            $path &= "\Rurouni_Kenshin_v" 
                            $path &= $volume 
                            $path &= "_c" 
                            $path &= $chapter 
                            $path &= "_" 
                            $path &= $page 
                            $path &= ".jpg"
                        InetGet($url,$path,1,0)
                        $jj = InetGet($url,$path)
                        ;debug log entry
                        FileWriteLine($file,$url)
                        FileWriteLine($file,$path)
                        FileWriteLine($file,$jj)
                        FileWriteLine($file,@error)
                        FileWriteLine($file,":::::::::::::::")
                        If @error = 0 Then          ;Did the download fail?
                            $x += 1                 ;counts how many times we have tried this page
                            If $x >= 3 Then         ;have we tried all the servers?
                                $page += 1          ;Change the page and reset the server count
                                $x = 0
                            Else
                                ;changes the server until the pack is downloaded
                                If $s1 < 2 Then
                                    $s1 += 1
                                ElseIf $s1 >= 2 Then
                                    $s1 = 0
                                EndIf
                            EndIf
                        Else                        ;did the pack download?
                            $x = 0
                            ExitLoop                ;back to main loop
                        EndIf
                    WEnd
                EndIf
            EndIf
        Else    ;has the image checked all servers?
            $x += 1
            If $s1 < 2 Then ;server rotation scheme
                $s1 += 1
            ElseIf $s1 >= 2 Then
                $s1 = 0
            EndIf
        EndIf
    Else            ;the file successfully downloaded
        $page += 1  ;change the page
        $x = 0      ;reset all loop counters
        $y = 0      ;
    EndIf
WEnd
FileWriteLine($file, "Download finished successfully on: " & $date & " at: " & $time)

Func EndScript()
    FileWriteLine($file, "Download stopped on: " & $date & " at: " & $time)
    FileClose($file)
    Exit
EndFunc

DoubleMcLovin,

Use InetGetInfo to test status of background downloads (example in doc for InetGet)...also the first instance of InetGet (foreground) is immediately followed by a background download (using the same parameters)...pretty sure that this is NOT what you want...
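Something along these lines, if you go the background route (a rough, untested sketch of the pattern from the help file, plugged into your $url / $path / $file variables; $hDL is just a name I made up):

$hDL = InetGet($url, $path, 1, 1)       ; background download - returns a handle
Do
    Sleep(250)                          ; wait without hammering the CPU
Until InetGetInfo($hDL, 2)              ; index 2 = "download complete" flag
If InetGetInfo($hDL, 3) Then            ; index 3 = "download successful" flag
    FileWriteLine($file, "OK, " & InetGetInfo($hDL, 0) & " bytes")      ; index 0 = bytes read
Else
    FileWriteLine($file, "Failed, error " & InetGetInfo($hDL, 4))       ; index 4 = @error from the download
EndIf
InetClose($hDL)                         ; release the handle when done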

Kylomas


You should check whether InetGet fails, or look at how many bytes it downloads. These lines

If @error = 0 Then ;Did the download fail?

are well past the InetGet line and only tell you whether the previous function (FileWriteLine) failed or not.
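For example (a rough, untested sketch using your own variables; the key is to grab the return value and @error immediately after the call, before any other function can overwrite them):

$nBytes = InetGet($url, $path, 1, 0)    ; foreground download - returns the number of bytes written
$iErr = @error                          ; capture @error right away, before any other call resets it
FileWriteLine($file, $url)
FileWriteLine($file, $path)
FileWriteLine($file, "Bytes: " & $nBytes & "  Error: " & $iErr)
FileWriteLine($file, ":::::::::::::::")
If $iErr <> 0 Or $nBytes = 0 Then
    ; the download failed - rotate the server / page here
Else
    ; the download succeeded
EndIf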

@kylomas.

The OP is not using the background option.


DoubleMcLovin,

My mistake about foreground vs. background load... the issues with successive downloads of the same file and with not checking the status remain, however... sorry.

I have a working example of this if you're interested...

kylomas


Even after correcting that error, the download still won't go through... InetGetInfo is only for background downloads if I am not mistaken, so that doesn't help. But I do know that InetGet is setting @error = 6 both when there is no file on the server and when there is one and it still isn't downloading... Not sure where else to go with this one; InetGetInfo should just return an empty array each time, since the download fails every time, even if I were to set it as a background download and check with that.

new code:

;~ ***************************************************
;~ *Multi-Downloader
;~ *v1.0.0
;~ *3-1-10
;~ *
;~ *DoubleMcLovin
;~ *Project log ATI001
;~ *
;~ ***************************************************

;define server parameter variables
$date = @MDAY & "/" & @MON & "/" & @YEAR
$time = @HOUR & "." & @MIN & "." & @SEC
dim $x          ;primary loop counter
dim $y          ;secondary loop counter
dim $cover      ;downloads cover (set page = to cover)
dim $dl         ;counts the number of successful downloads
dim $success    ;records the most recent successful page number
dim $server[3]
    $server[0]="img07.nj.us."
    $server[1]="img27.tx.us."
    $server[2]="img28.nj.us."
$chapter = 001      ;chapters are book episodes
$page = 001         ;the page in the volume
$volume = 01        ;volume is the next book
$s1 = 0             ;used to change server
$file = FileOpen("log.txt",1)   ;makes new file for writing
;kill script
HotKeySet("{PAUSE}", "EndScript")
;set date and time in logfile
FileWriteLine($file, "Download started on: " & $date & " at: " & $time & @CRLF & @CRLF)
;main
While $x < 20       ;provides room for the program to search helplessly twenty times
    $volume = StringFormat("%02s",$volume)          ;These three lines format the numeric sequence
    $page = StringFormat("%03s",$page)              ;properly so that they number as "001" or "01"
    $chapter = StringFormat("%03s",$chapter)        ;appropriately.
    ;the url for downloading
    $url = "http://"
        $url &= $server[$s1] 
        $url &= "mangafox.com/store/manga/63/" 
        $url &= $volume 
        $url &= "-" 
        $url &= $chapter 
        $url &= ".0/compressed/Rurouni_Kenshin_v" 
        $url &= $volume 
        $url &= "_c" 
        $url &= $chapter 
        $url &= "_" 
        $url &= $page 
        $url &= ".jpg"
    ;the path for saving
    $path = "e:\system66\media\manga\rouroni.kenshin\v." 
        $path &= $volume 
        $path &= "\Rurouni_Kenshin_v" 
        $path &= $volume 
        $path &= "_c" 
        $path &= $chapter 
        $path &= "_" 
        $path &= $page 
        $path &= ".jpg"

    ;debug log entry
    FileWriteLine($file,$url)
    FileWriteLine($file,$path)
    FileWrite($file,"Size: " & InetGetSize($url) & @CRLF)
    ;attempts to get the image
    InetGet($url,$path,1,0)
    ;if the file fails to download
    If @error <> 0 Then
        FileWrite($file,"Error: " & @error & @CRLF)
        FileWriteLine($file,":::::::::::::::")
        ;if the loop tried every server for this image
        If $x >= 2 Then
            ;has the loop tried to go to the next few pages?
            If $y >= 1 Then ;If server and page checks failed we will change the chapter and roll the page back to the last good page
                $chapter += 1
                $x += 1
                $y += 1
                $page = $success    ; This lets us change the chapter and continue the page count.
                                    ; Remember: Pages are given room for 10 page changes before the chapter changes
            Else
                If $y >= 10 Then    ;Has the loop checked server and pages and chapter?
                    $page = 0
                    $volume += 1    ;Since the server, page, and chapter are not to blame, let's go to the next
                                    ;volume and reset the server, the $y and the $x
                    $x = 0
                    $y = 0
                    $s1 = 0         ;sets the first server
                    ;Download the new cover image
                    While 1
                        $volume = StringFormat("%02s",$volume)
                        $page = StringFormat("%03s",$page)
                        $chapter = StringFormat("%03s",$chapter)
                        ;the url for the cover
                        $curl = "http://" 
                            $curl &= $server[$s1] 
                            $curl &= "mangafox.com/store/manga/63/" 
                            $curl &= $volume
                            $curl &= "-"
                            $curl &= $chapter 
                            $curl &= ".0/compressed/Rurouni_Kenshin_v" 
                            $curl &= $volume 
                            $curl &= "_000_Cover.jpg"
                        ;the save file path for the url
                        $cpath = "e:\system66\media\manga\rouroni.kenshin\v." 
                            $cpath &= $volume 
                            $cpath &= "\Rurouni_Kenshin_v" 
                            $cpath &= $volume 
                            $cpath &= "_000_Cover.jpg"
                        ;debug log entry
                        FileWriteLine($file,$curl)
                        FileWriteLine($file,$cpath)
                        FileWrite($file,"Size: " & InetGetSize($curl) & @CRLF)
                        InetGet($curl,$cpath,1,0)
                        If @error <> 0 Then     ;Did the download fail?
                            FileWrite($file,"Error: " & @error & @CRLF)
                            FileWriteLine($file,":::::::::::::::")
                            $x += 1             ;counts the loop
                            If $x >= 4 Then     ;Have we tried all the servers?
                                FileWriteLine($file, "Download finished on: " & $date & " at: " & $time)
                                Exit ;we're done here!
                            Else
                                ;changes the server until the pack is downloaded
                                If $s1 < 2 Then
                                    $s1 += 1
                                ElseIf $s1 >= 2 Then
                                    $s1 = 0
                                EndIf
                            EndIf
                        Else                    ;If the download was good, $x=0
                            $x = 0
                            ExitLoop
                        EndIf
                    WEnd
                Else
                    ;If $y is not greater than or equal to 10
                    $page += 1                  ;change the page then start the DL loop
                    $x = 0                      ;allows us to count the server checks
                    While 1
                        $volume = StringFormat("%02s",$volume)
                        $page = StringFormat("%03s",$page)
                        $chapter = StringFormat("%03s",$chapter)
                        ;page url
                        $url = "http://" 
                            $url &= $server[$s1] 
                            $url &= "mangafox.com/store/manga/63/" 
                            $url &= $volume 
                            $url &= "-" 
                            $url &= $chapter 
                            $url &= ".0/compressed/Rurouni_Kenshin_v" 
                            $url &= $volume 
                            $url &= "_c" 
                            $url &= $chapter 
                            $url &= "_" 
                            $url &= $page 
                            $url &= ".jpg"
                        ;page savefile path
                        $path = "e:\system66\media\manga\rouroni.kenshin\v." 
                            $path &= $volume 
                            $path &= "\Rurouni_Kenshin_v" 
                            $path &= $volume 
                            $path &= "_c" 
                            $path &= $chapter 
                            $path &= "_" 
                            $path &= $page 
                            $path &= ".jpg"
                            
                        ;debug log entry
                        FileWriteLine($file,$url)
                        FileWriteLine($file,$path)
                        FileWrite($file,"Size: " & InetGetSize($url) & @CRLF)
                        InetGet($url,$path,1,0)
                        If @error <> 0 Then         ;Did the download fail?
                            FileWrite($file,"Error: " & @error & @CRLF)
                            FileWriteLine($file,":::::::::::::::")
                            $x += 1                 ;counts how many times we have tried this page
                            If $x >= 3 Then         ;have we tried all the servers?
                                $page += 1          ;Change the page and reset the server count
                                $x = 0
                            Else
                                ;changes the server until the pack is downloaded
                                If $s1 < 2 Then
                                    $s1 += 1
                                ElseIf $s1 >= 2 Then
                                    $s1 = 0
                                EndIf
                            EndIf
                        Else                        ;did the pack download?
                            $x = 0
                            ExitLoop                ;back to main loop
                        EndIf
                    WEnd
                EndIf
            EndIf
        Else    ;has the image checked all servers?
            $x += 1
            If $s1 < 2 Then ;server rotation scheme
                $s1 += 1
            ElseIf $s1 >= 2 Then
                $s1 = 0
            EndIf
        EndIf
    Else            ;the file successfully downloaded
        $page += 1  ;change the page
        $x = 0      ;reset all loop counters
        $y = 0      ;
    EndIf
WEnd
FileWriteLine($file, "Download finished successfully on: " & $date & " at: " & $time)

Func EndScript()
    FileWriteLine($file, "Download stopped on: " & $date & " at: " & $time)
    FileClose($file)
    Exit
EndFunc

EDIT::

I have adjusted the script to use InetGetSize, with the same variables as InetGet, to determine the size (or existence) of the file. The files that do exist show up properly in the log file with a size, but InetGet still gives @error 6 for the download and won't get it... Anyone have ideas?


Can you try using option 9 (1 + 8) to force a binary transfer?


Can you post an example (innocent) complete url that fails, so that we can try from our end?


According to my log file, here is a url that has size and doesn't download:

http://img28.nj.us.mangafox.com/store/manga/63/01-001.0/compressed/Rurouni_Kenshin_v01_c001_005.jpg

e:\system66\media\manga\rouroni.kenshin\v.01\Rurouni_Kenshin_v01_c001_005.jpg

Size: 193067

Error: 6

The first field is the URL, the second is the destination (which ends up without the file).


I used this:

Local $url = "http://img28.nj.us.mangafox.com/store/manga/63/01-001.0/compressed/Rurouni_Kenshin_v01_c001_005.jpg"
Local $size = InetGet($url, @ScriptDir & "\img.jpg", 9)     ; <-- reload + binary
ConsoleWrite($size & @LF)       ; <-- 193067 and img.jpg is fine

and it worked a number of times without any failure. Sorry for that, as I suspect it would have been more interesting for you if I could say it failed flatly here.

Does it _always_ fail for you if you run it, say, ten times?


I'm afraid I've no good explanation. Provided the URL in your example is the actual content of the URL string in the InetGet call, I'm clueless.


Fine.

Good downloads!

