Speed up the Copying of all files in a folder over the network

Docfxit

I would like to speed up copying all of the files in a folder on a LAN-connected PC to a folder on the local PC.

In the code below, the comments that say ";>>>> Reads 1800 files over the network" mark where the script cycles through the array of 1800 files, one file at a time.

Is there a way to change the code so it doesn't read all 1800 elements of the array?

Thank you,

Docfxit

#include <Array.au3>
#include <File.au3>
#include <Date.au3>

_Au3RecordSetup()
    ;AutoIt_Debugger_Command:Disable_Debug
Local $SourcePath, $DestPath
If @OSArch = "X86" Then
    $SourcePath = "\Wsusoffline32bit"
    $DestPath = "C:\Dnload\wsusoffline32bit"
EndIf
If @OSArch = "X64" Then
    $SourcePath = "\Wsusoffline64bit"
    $DestPath = "C:\Dnload\wsusoffline64bit"
EndIf
    ;AutoIt_Debugger_Command:Enable_Debug

; Map O: to \\DocfxitLT\Dnload using the user "Gary" from "DocfxitLT"
DriveMapAdd("O:", "\\DocfxitLT\Dnload", 0, "DocfxitLT\Gary", "password")
Sleep(10000)
_CopyWithProgress($SourcePath, $DestPath, 1)

Func _CopyWithProgress($SourcePath, $DestPath, $Replace = 0)
If Not FileExists("O:" & $SourcePath) Then
    MsgBox(0, "Error", "Source Path O:" & $SourcePath & " not found")
    Return SetError(1, 0, "")
EndIf

If Not StringInStr(FileGetAttrib($DestPath), "D") And Not DirCreate($DestPath) Then Return SetError(2, 0, "")
If $Replace <> 0 And $Replace <> 1 Then Return SetError(3, 0, "")

Local $PathName = StringRegExpReplace($SourcePath, "^.*\\", "")
Local $Progress=0, $Counter, $ReadySize, $MidlePath, $Ready, $TimeRemained
Local $CurrentFilePath, $CurrentFileName, $CurrentFilePathName, $CurrentParentDirName

ProgressOn("Copy Files...", "Copy: " & $PathName, "Getting dir structure" & @LF & "Please wait...", 50, 15)

Local $TotalDirSize = DirGetSize("o:" & $SourcePath)
Local $FilesArr = _FileListToArrayEx("o:" & $SourcePath)
Local $FilesCount = UBound($FilesArr)-1
Local $ProgressStep = 100 / $FilesCount

If IsArray($FilesArr) Then
For $i = 1 To UBound($FilesArr)-1           ;>>>> Reads 1800 files over the network
$CurrentFilePath = $FilesArr[$i]            ;>>>> Reads 1800 files over the network
$CurrentFileName = StringRegExpReplace($CurrentFilePath, "^.*\\", "")
$CurrentFilePathName = StringReplace($CurrentFilePath, $SourcePath & "\", "")

$CurrentParentDirName = _GetParentDirName($CurrentFilePath)

$Progress += $ProgressStep
$Counter += 1

$ReadySize = FileGetSize($CurrentFilePath)

$MidlePath = _GetMidlePath($CurrentFilePath)
$Ready = $Counter & "/" & $FilesCount
$TimeRemained = _GetTimeRemained($TotalDirSize, $ReadySize, $FilesCount, $Counter)

ProgressSet($Progress, 'Copy... from "' & $CurrentParentDirName & '" to "' & $DestPath & '"' & @LF & _
$MidlePath & @LF & "Approximate Time Remaining: " & $TimeRemained, "Ready: " & $Ready)
FileCopy($CurrentFilePath, $DestPath & "\" & $CurrentFilePathName, 8+$Replace)
Next        ;>>>> Reads 1800 files over the network
EndIf
ProgressOff()
EndFunc

Func _FileListToArrayEx($sPath, $sMask='*')
    ;AutoIt_Debugger_Command:Disable_Debug
Local $i, $j, $blist, $rlist[1]=[0], $dlist = _DirListToArray($sPath)
_ArrayAdd ($dlist, $sPath)
For $i=1 To $dlist [0] +1
$blist = _FileListToArray ($dlist [$i], $sMask, 1)
If Not @error Then
For $j=1 To $blist [0]
_ArrayAdd ($rlist, $dlist[$i] & "\" & $blist [$j])
Next
EndIf
Next
$rlist [0] = UBound ($rlist) - 1
Return $rlist
    ;AutoIt_Debugger_Command:Enable_Debug
EndFunc

Func _DirListToArray($sPath)
    ;AutoIt_Debugger_Command:Disable_Debug
Local $rlist[2]=[1, $sPath], $blist, $alist=_FileListToArray ($sPath, '*', 2)
If IsArray ($alist) Then
For $i=1 To $alist [0]
_ArrayAdd ($rlist, $sPath & "\" & $alist [$i])
$blist = _DirListToArray ($sPath & "\" & $alist [$i])
If $blist[0]>0 Then
For $j=1 To $blist [0]
_ArrayAdd ($rlist, $blist [$j])
Next
EndIf
Next
EndIf
$rlist[0] = UBound($rlist) - 1
Return $rlist
    ;AutoIt_Debugger_Command:Enable_Debug
EndFunc

Func _GetMidlePath($sPath)
If StringLen($sPath) <= 50 Then Return $sPath
Local $StartPath = StringLeft($sPath, 25)
Local $EndPath = StringTrimLeft($sPath, StringInStr($sPath, "\", 0, -2)-1)
Return $StartPath & "..." & $EndPath
EndFunc

Func _GetParentDirName($FullName)
Local $LastSlashPos = StringInStr($FullName, "\", 0, -1)
Local $SecondLastSlashPos = StringInStr($FullName, "\", 0, -2)
Return StringMid($FullName, $SecondLastSlashPos+1, $LastSlashPos-$SecondLastSlashPos-1)
EndFunc

Func _GetTimeRemained($TotalSize, $CurrentSize, $FilesCount, $CurrentFilesCount)
Local $NumLevl = 0.5

If $TotalSize <= $CurrentSize Then Return _SecsToTime(0)

Switch $FilesCount - $CurrentFilesCount
Case 0 To 100
$NumLevl = 0.1
Case 101 To 1000
$NumLevl = 0.5
Case 1001 To 2000
$NumLevl = 1
Case Else
$NumLevl = 2
EndSwitch

Local $Secs = ($TotalSize - $CurrentSize) * $NumLevl / (3600 * $CurrentFilesCount)
Return _SecsToTime($Secs)
EndFunc

Func _SecsToTime($iTicks, $Delim=":")
If Number($iTicks) >= 0 Then
Local $iHours = Int($iTicks / 3600)
$iTicks = Mod($iTicks, 3600)
Local $iMins = Int($iTicks / 60)
Local $iSecs = Round(Mod($iTicks, 60))
If StringLen($iHours) = 1 Then $iHours = "0" & $iHours
If StringLen($iMins) = 1 Then $iMins = "0" & $iMins
If StringLen($iSecs) = 1 Then $iSecs = "0" & $iSecs
Return $iHours & $Delim & $iMins & $Delim & $iSecs
EndIf
Return SetError(1, 0, 0)
EndFunc

Func _Au3RecordSetup()
    ;AutoIt_Debugger_Command:Disable_Debug
    Opt('WinWaitDelay', 100)
    Opt("WinTitleMatchMode", 4)
    Opt('WinDetectHiddenText', 1)
    Opt('MouseCoordMode', 0)
    Opt("TrayIconDebug", 1) ;0-off
    ; Set so that tray displays current line number
    HotKeySet("{ESC}", "_Exit") ;If you press ESC the script will stop
    ;AutoIt_Debugger_Command:Enable_Debug
EndFunc   ;==>_Au3RecordSetup

Func _Exit()
Sleep(100)
Exit
EndFunc ;==> _Exit()

gruntydatsun

Yes, you can change the step on the For loop:

For $i = 1 To UBound($FilesArr)-1

Just add Step 2 to the end to increase $i by 2 every loop.

spudw2k

Skipping indexes in the array probably isn't a good idea if you are copying files; you wouldn't want to skip any. 

You might consider taking periodic updates of the copy progress of the entire directory versus each individual file.  You could also split or offload the copy process into multiple, simultaneous copy jobs.  I'd recommend perhaps spawning robocopy to do the copy job and periodically checking how much data (bytes) has been copied while the robocopy process is running.
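A minimal sketch of that robocopy-plus-polling approach (paths, thread count, and the poll interval are placeholders for illustration, not from the original script; DirGetSize on the destination will also count any files that were already there):

```autoit
; Spawn robocopy to do the copy, then poll the destination size for progress.
Local $sSrc = "O:\Wsusoffline64bit"          ; hypothetical source
Local $sDst = "C:\Dnload\wsusoffline64bit"   ; hypothetical destination
Local $iTotal = DirGetSize($sSrc)            ; total bytes to copy
If $iTotal = 0 Then $iTotal = 1              ; avoid divide-by-zero on an empty source

; /E = include subdirectories, /MT:8 = 8 copy threads
Local $iPid = Run(@ComSpec & ' /c robocopy "' & $sSrc & '" "' & $sDst & '" /E /MT:8', "", @SW_HIDE)

ProgressOn("Copying", $sSrc)
While ProcessExists($iPid)
    ProgressSet(Round(DirGetSize($sDst) / $iTotal * 100))  ; periodic progress check
    Sleep(500)
WEnd
ProgressOff()
```

The copy itself runs in the external process, so the AutoIt loop only does a cheap size check every half second instead of per-file bookkeeping.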

gruntydatsun

nah spud!!  skip a few indexes... what's the worst that could happen :shifty:

spudw2k
gruntydatsun

You could speed it up even more if we made it quarter-assed :)

SlackerAl

At the risk of spoiling the fun...

You don't give much information about the actual problem: Are the files large, small, or mixed? Have you checked your network, network storage, processor, and local disk performance to see which is the limiting factor? Is any of the network side being loaded by other users? What is the network file store? A single disk? Is it mechanical or SSD? You really need to understand more about the problem, or whether there is even a problem.

Your code always overwrites. Depending on the type of data, file size, security, and how critical the copy is, you could check file size, checksum, or timestamp to see whether you already have a copy of the file and skip it. Or perhaps you already know for sure everything is always new... but you need to tell us if you want help.
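A small sketch of that skip-if-identical check (the function name and parameters are hypothetical; a size plus modified-timestamp comparison is the cheap variant, a checksum would be more thorough but slower):

```autoit
; Return True when the destination file is missing or looks different
; from the source (size or modified time differ).
Func _NeedsCopy($sSrcFile, $sDstFile)
    If Not FileExists($sDstFile) Then Return True
    If FileGetSize($sSrcFile) <> FileGetSize($sDstFile) Then Return True
    ; FileGetTime with option 0 = modified time, format 1 = "YYYYMMDDHHMMSS" string
    Return FileGetTime($sSrcFile, 0, 1) <> FileGetTime($sDstFile, 0, 1)
EndFunc
```

Calling this before FileCopy in the loop would let repeat runs skip files that are already in place.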

 

Edit: Or do you really just want to copy part of the array or alternate elements as your original post asks?

Docfxit
On 10/30/2017 at 3:59 AM, SlackerAl said:

You don't give much information about the actual problem: Are the files large, small or mixed? Have you checked your network, network storage, processor and local disk performance to see which is the limiting factor? Is any of the network side being loaded by other users? What is the network file store? A single disk? Is it mechanical or SSD? You really need to understand more about the problem, or if there is even a problem.

Your code always overwrites. Depending on the type of data, file size, security, and how critical the copy is, you could check file size, checksum, or timestamp to see whether you already have a copy of the file and skip it. Or perhaps you already know for sure everything is always new... but you need to tell us if you want help.

 

Edit: Or do you really just want to copy part of the array or alternate elements as your original post asks?

To help clarify the problem: there is no problem. I would just like to speed up the copy process if I can.

The files are mixed.  I have not checked my network. How would a person check the network?  The network storage is on a laptop that has an SSD.  I'm not concerned about the performance of the laptop.

The code won't be overwriting files because it will only be run once so all files will always be new.

I definitely want to copy ALL files.  I don't see how my original post suggests anything less.

I would like to concentrate on optimizing the code to make it run faster.

I don't understand why AutoIt takes the time to read through the array of 1800 files three times every time it copies one file.

There are three places in the code that I marked with: ;>>>> Reads 1800 files over the network

where the code reads through the array of 1800 files.  Is that really necessary?

 

I hope that clarifies what I am interested in resolving.

Let me know if it doesn't.

Thank you,

Docfxit

BrewManNH
2 hours ago, Docfxit said:

I don't understand why AutoIt takes the time to read through the array of 1800 files three times every time it copies one file.

Because you tell it to, and not where you indicated it does. You use _FileListToArray twice, in two different functions.

If you want to speed it up, get rid of the wrapper functions you're using (_FileListToArrayEx and _DirListToArray) and just use the standard UDFs. _ArrayAdd slows down the script when you're adding one item at a time, because it has to call ReDim every time you use it, and ReDim is slow.
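For example, the standard _FileListToArrayRec UDF in File.au3 builds the whole recursive file list in one call, which replaces both custom functions and avoids the per-file _ArrayAdd/ReDim cost (a sketch; the function name _FastFileList and the path are placeholders):

```autoit
#include <File.au3>

; One call builds the complete recursive list instead of growing it
; element by element with _ArrayAdd.
Func _FastFileList($sPath)
    ; parameters: files only (1), recursive (1), no sort (0), full paths (2)
    Local $aFiles = _FileListToArrayRec($sPath, "*", 1, 1, 0, 2)
    If @error Then Return SetError(1, 0, 0)
    Return $aFiles   ; $aFiles[0] holds the count, like the original helpers
EndFunc
```

The returned array has the same count-in-element-zero convention the original helpers used, so it could drop into _CopyWithProgress with little change.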

 

Earthshine

I would recommend using ROBOCOPY, which is built into your Windows installation. It's VERY robust and tenacious in case of network problems.
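For reference, a typical invocation for this kind of copy might look like the following (the paths are the ones from the script above; the flags are standard robocopy options):

```batch
robocopy "O:\Wsusoffline64bit" "C:\Dnload\wsusoffline64bit" /E /R:3 /W:5 /MT:8
```

/E copies subdirectories including empty ones, /R:3 and /W:5 limit retries and the wait between them on network errors, and /MT:8 enables multithreaded copying.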


SlackerAl

You should re-read your first post and ask yourself how we are supposed to understand your problem from that description.

Your subject and body text give no information about your actual problem, which is that you build a dynamic array of your data 3 times, and this is time consuming.

You don't post a simple replicator of your problem (which could have been easily done), so there is nothing for us to test.

Then this:

On 10/29/2017 at 2:02 AM, Docfxit said:

Is there a way to change the code so it doesn't read all 1800 elements of the array?

16 hours ago, Docfxit said:

I definitely want to copy ALL files.  I don't see how my original post suggests anything less.

Seems the first 3 people to look at this all read it the same way.


Danyfirex

Hello. You probably could use _WinAPI_ShellFileOperation.
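A minimal sketch of that approach, assuming the WinAPIFiles.au3 UDF (the paths are placeholders; $FO_COPY and the $FOF_* flags come from the UDF's constants includes):

```autoit
#include <WinAPIFiles.au3>

; Hand the whole copy to the Windows shell, which shows its own
; native progress dialog instead of a per-file AutoIt loop.
_WinAPI_ShellFileOperation("O:\Wsusoffline64bit\*", "C:\Dnload\wsusoffline64bit", _
        $FO_COPY, $FOF_NOCONFIRMMKDIR)
```

The shell handles the directory recursion and progress display itself, so the script never has to enumerate the 1800 files at all.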

 

Regards

