
Is there a function for compiling .au3 files? [automated parallel processing]



Right now I am manually building parallel processors for multithreading. Tested: processing time is definitely about 3x faster. But I was wondering: is there an au3 function to automatically output sections of code into separate temporary .au3 files, compile them into .exes, use them for processing data, and then delete the temporary .au3 files and .exes after processing that information?

Func FindProcessor($r, $ID)
    ; Publish the job details, then hand the job to the first idle processor.
    ;Debug("FindProcessor")
    WriteTxt("ProcessID", $ID)
    WriteTxt("ProcessRow", $r)
    WriteTxt("Accepted", 0)
    Sleep(250)
    For $i = 1 To $MaxJobs
        If ReadTxt("P" & $i) = 0 Then ; processor $i is idle
            WriteTxt("P" & $i & "_NewJob", 1) ; signal the new job
            ;Debug("PID: "&$i&", ProcessID: "&$ID&", ProcessRow: "&$r)
            Do ; wait until the processor acknowledges the job
                Sleep(250)
            Until ReadTxt("Accepted") = 1
            Return 1
        EndIf
    Next
    Return 0 ; no idle processor found
EndFunc

Func LoadProcessors()
    Debug("Loading Parallel Processors")
    For $i = 1 To $MaxJobs
        WriteTxt("P" & $i, 0)             ; mark processor as idle
        WriteTxt("P" & $i & "_NewJob", 0) ; clear any pending job flag
        ShellExecute(@ScriptDir & "\Processor_" & $i & ".exe")
        Sleep(2000) ; give each processor time to start up
    Next
EndFunc



You don't need multiple executables with only slight variations; instead, make your script callable multiple times, passing command-line arguments that tell each instance what to do.
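A minimal sketch of that pattern, assuming an uncompiled script run through AutoIt3.exe; the "worker" argument convention, the chunk count, and ProcessChunk are hypothetical:

```autoit
; One script acts as master or worker depending on its arguments.
; $CmdLine[0] holds the argument count, $CmdLine[1..n] the arguments.
If $CmdLine[0] >= 2 And $CmdLine[1] = "worker" Then
    ; Worker instance: process only the chunk it was given.
    ProcessChunk(Number($CmdLine[2]))
Else
    ; Master instance: spawn one worker per chunk of the data.
    For $i = 1 To 4
        Run(@AutoItExe & ' /AutoIt3ExecuteScript "' & @ScriptFullPath & '" worker ' & $i)
    Next
EndIf

Func ProcessChunk($iChunk)
    ; Hypothetical placeholder for the real per-chunk work.
    ConsoleWrite("Processing chunk " & $iChunk & @CRLF)
EndFunc
```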

EasyCodeIt - A cross-platform AutoIt implementation - Fund the development! (GitHub will double your donations for a limited time)

DcodingTheWeb Forum - Follow for updates and Join for discussion


1 hour ago, TheDcoder said:

You don't need multiple executables with only slight variations; instead, make your script callable multiple times, passing command-line arguments that tell each instance what to do.

True.
Unless one of the aims of the script is actually to render novel images/filenames for each execution, so as to avoid signature detection.

Code hard, but don’t hard code...


3 hours ago, TheDcoder said:

You don't need multiple executables with only slight variations; instead, make your script callable multiple times, passing command-line arguments that tell each instance what to do.

Yeah, I run the same .exe with different arguments on my other computer just fine. But for some reason it doesn't work on this particular laptop; I'm not entirely sure why.


3 hours ago, Nine said:

You can set processor affinity if needed. As @TheDcoder suggests, multi-threading is way overrated when you do not really know what you are doing....

It objectively saves time in my particular use case. It has to process 500k-1M rows of .csv data in multiple layers of nested For loops. Though it would be nice if I could build a library for easier and faster execution in the future.


15 hours ago, TheDcoder said:

You don't need to have multiple executable with only slight variations, instead make your script callable multiple times by passing command-line arguments on what to do for each instance.

For my particular use case, command-line arguments wouldn't work because all 4 processors run off of the same .csv file with 500k-1M rows. The CSV load time is around 10-30 seconds per load. It is much faster to issue new arguments and executions through .txt files. The processors maintain global CSV arrays in a While loop while waiting for new commands.

;~ Globals()
Func Globals()
    Global $a
    CSVToArray($a, "E:\Data.csv") ; load the full CSV into a global array once
    Global $lr = UBound($a, 1) - 1 ; last row index
    Global $done[1]
    Global $UniqueCt = 0
    ;<-----------------------SETTINGS
    Global $sd = 0.10
    Global $mrt = 1
    Global $mp = 1
    Global $mpt = 1
EndFunc

;Main(ReadTxt("PID"))
Main(1)
Func Main($Num)
    Globals()
    While 1 ; poll for new jobs dispatched through the .txt files
        If ReadTxt("P" & $Num & "_NewJob") = 1 Then
            WriteTxt("P" & $Num & "_NewJob", 0) ; clear the job flag
            WriteTxt("P" & $Num, 1)             ; mark this processor busy
            WriteTxt("Accepted", 1)             ; acknowledge to the master
            Local $Row = ReadTxt("ProcessRow")
            ;Debug("Processing Row: "&$Row)
            If $Row > 10 Then $Row = $Row - 10
            Tracker(ReadTxt("ProcessID"), $Row)
            WriteTxt("P" & $Num, 0) ; mark idle again
        EndIf
        Sleep(250)
    WEnd
EndFunc

 


Another route would be to load the large .csv into a database, typically SQLite, and process the data from there.
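A minimal sketch of that import step using AutoIt's standard SQLite UDF; the file paths and the three-column layout of Data.csv are assumptions:

```autoit
#include <SQLite.au3>

_SQLite_Startup()
Local $hDB = _SQLite_Open("E:\Data.db") ; file-backed DB that several scripts can share
_SQLite_Exec($hDB, "CREATE TABLE IF NOT EXISTS data (col1, col2, col3);")
_SQLite_Exec($hDB, "BEGIN;") ; wrapping the bulk insert in one transaction is far faster
Local $aLines = FileReadToArray("E:\Data.csv")
For $i = 0 To UBound($aLines) - 1
    Local $aFields = StringSplit($aLines[$i], ",", 2) ; flag 2 = no count element
    _SQLite_Exec($hDB, "INSERT INTO data VALUES (" & _
            _SQLite_FastEscape($aFields[0]) & "," & _
            _SQLite_FastEscape($aFields[1]) & "," & _
            _SQLite_FastEscape($aFields[2]) & ");")
Next
_SQLite_Exec($hDB, "COMMIT;")
_SQLite_Close($hDB)
_SQLite_Shutdown()
```

Once imported, each worker script can open the same .db file and query only the rows it needs, instead of each one parsing the whole CSV.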

This wonderful site allows debugging and testing regular expressions (many flavors available). An absolute must have in your bookmarks.
Another excellent RegExp tutorial. Don't forget downloading your copy of up-to-date pcretest.exe and pcregrep.exe here
RegExp tutorial: enough to get started
PCRE v8.33 regexp documentation latest available release and currently implemented in AutoIt beta.

SQLitespeed is another feature-rich premier SQLite manager (includes import/export). Well worth a try.
SQLite Expert (freeware Personal Edition or payware Pro version) is a very useful SQLite database manager.
An excellent eBook covering almost every aspect of SQLite3: a must-read for anyone doing serious work.
SQL tutorial (covers "generic" SQL, but most of it applies to SQLite as well)
A work-in-progress SQLite3 tutorial. Don't miss other LxyzTHW pages!
SQLite official website with full documentation (may be newer than the SQLite library that comes standard with AutoIt)


To answer the original question:

6 hours ago, Dana86 said:

But I was wondering if there is an au3 function to auto output sections of code into separate temporary .au3 files & then automatically compile them into .exes, use them for processing data then delete the temp .au3 file & .exes

No.   
You have to do it manually, the way you are.
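That said, the manual pipeline can itself be scripted by shelling out to Aut2exe with its /in and /out switches. A hypothetical sketch; the Aut2exe install path and worker file names are assumptions:

```autoit
; Generate a temporary .au3, compile it, run it, then clean up.
Local $sAut2exe = @ProgramFilesDir & "\AutoIt3\Aut2Exe\Aut2exe.exe" ; assumed install path
Local $sSrc = @TempDir & "\worker.au3"
Local $sExe = @TempDir & "\worker.exe"
FileWrite($sSrc, 'ConsoleWrite("hello from worker" & @CRLF)') ; placeholder worker code
RunWait('"' & $sAut2exe & '" /in "' & $sSrc & '" /out "' & $sExe & '"') ; compile
RunWait($sExe)      ; use the compiled worker
FileDelete($sSrc)   ; delete the temp .au3
FileDelete($sExe)   ; delete the temp .exe
```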



You don't need to compile an .au3 file to run it; you can use a line like this instead:

Run(@AutoItExe & " /AutoIt3ExecuteScript " & "YourScript.au3")

I don't know if I understand your goal correctly, but it might be worth taking a look at this UDF by @LarsJ:

https://www.autoitscript.com/forum/topic/202618-implementing-irunningobjecttable-interface/

With that you could load the data of your txt file into an array, share that array via a "ROT object", and access it simultaneously from several scripts running in parallel. In particular, take a look at the script "...\Examples\2) IPC Demo\Array\1) Server.au3" found in the zip attached to the first post of LarsJ's topic.

Or, as already mentioned above by @jchd, you could use SQLite; for example, you could import the data into an SQLite archive and then run several scripts to access and process the data in the SQLite archive at the same time.
P.S. Here is a nice example (also by @jchd) on how to read and also write simultaneously from multiple scripts on an SQLite DB:

https://www.autoitscript.com/forum/topic/154901-two-processes-sharing-a-file/?tab=comments#comment-1119112


 

Chimp

small minds discuss people average minds discuss events great minds discuss ideas.... and use AutoIt....


9 hours ago, JockoDundee said:

To answer the original question:

No.   
You have to do it manually, the way you are.

Thank you, that was actually my main question. I wanted to use this method to create AI logic through .csv and .json files, which are limiting. Parallel processing is a must for AI with Au3, whereas Python has built-in multi-threading capabilities for AI & ML.


10 hours ago, jchd said:

Another route would be to load the large .csv into a database, typically SQLite, and process the data from there.

I could go that route, but things get complicated as the .csv file is then used with another Python app. CSV hasn't failed me yet.


You can access and manage any given SQLite database using virtually any known or unknown platform where a C compiler exists.

There exist several ways to use SQLite from Python.




Wow, look at those cores, bet that machine would perform great in gaming :D

Imagine how much more efficient it would be if it were written in a truly parallel manner?

P.S. I was joking about gaming. I know games don't really gain a lot of benefit from the additional cores; most game engines only use up to 8 cores in a meaningful way.

 



What you can also do is compile your script as a CUI application (master) which calls itself with parameters (slaves).

Let's say you have 100 jobs to run and you want your script to run 16 jobs at once:

MyScript.exe -jobs 16

MyScript.exe will call itself 16 times and thus start 16 processes. MyScript.exe (master) checks those 16 processes, and when one is done (slot free) it starts the next job accordingly.
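The slot-checking loop described here might be sketched like this, assuming a compiled script; the "-job" argument name is hypothetical:

```autoit
; Master loop: keep at most $iMaxJobs slave processes alive
; until all $iTotalJobs have been dispatched.
Local $iMaxJobs = 16, $iTotalJobs = 100, $iNext = 1
Local $aPID[$iMaxJobs + 1] ; one slot per concurrent slave
While $iNext <= $iTotalJobs
    For $i = 1 To $iMaxJobs
        If Not ProcessExists($aPID[$i]) And $iNext <= $iTotalJobs Then
            ; Slot is free: start the next job by calling ourselves.
            $aPID[$i] = Run(@ScriptFullPath & " -job " & $iNext)
            $iNext += 1
        EndIf
    Next
    Sleep(250)
WEnd
```

A final loop waiting for the remaining PIDs to exit is omitted for brevity.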

I did similar things with my very old CMD tool SIC2, and it worked properly at that time.

Please don't send me any personal message and ask for support! I will not reply!

Selection of finest graphical examples at Codepen.io

The own fart smells best!
Her 'sikim hıyar' diyene bir avuç tuz alıp koşma!
¯\_(ツ)_/¯  ٩(●̮̮̃•̃)۶ ٩(-̮̮̃-̃)۶ૐ


On 7/29/2021 at 11:28 PM, UEZ said:

What you can also do is compile your script as a CUI application (master) which calls itself with parameters (slaves).

Let's say you have 100 jobs to run and you want your script to run 16 jobs at once:

MyScript.exe -jobs 16

MyScript.exe will call itself 16 times and thus start 16 processes. MyScript.exe (master) checks those 16 processes, and when one is done (slot free) it starts the next job accordingly.

I did similar things with my very old CMD tool SIC2:

and it worked properly at that time.

I use $CmdLine to keep my Python APIs from crashing. There are a lot of weird, unexpected crashes in other people's wrappers.

#include-once
#include "E:\Autoit_FibFinder\Include\MyFunctions.au3"

Globals()
Func Globals()
    Global $lr = $CmdLine[0]         ; number of command-line arguments
    Global $ProgramDir = $CmdLine[1] ; path of the program to keep alive
    Global $PingDir = $CmdLine[2]    ; path of the ping .txt file
EndFunc

Main()
Func Main()
    While 1
        Sleep(5000)
        AlwaysOn(10)
    WEnd
EndFunc

Func AlwaysOn($MaxWait)
    ; Write a 0 "ping" and wait up to $MaxWait seconds for the watched
    ; program to answer with a 1; restart it if no answer arrives.
    XWriteTxt($PingDir, 0)
    For $i = 1 To $MaxWait
        If Number(XReadTxt($PingDir)) = 1 Then
;~          MsgBox($MB_SYSTEMMODAL,"$PingDir",$PingDir)
;~          MsgBox($MB_SYSTEMMODAL,"Ping",Number(XReadTxt($PingDir)))
            Return 1 ; program answered, it is alive
        EndIf
        Sleep(1000)
    Next
    If Number(XReadTxt($PingDir)) = 0 Then
;~      MsgBox($MB_SYSTEMMODAL,"$PingDir",$PingDir)
;~      MsgBox($MB_SYSTEMMODAL,"Ping",Number(XReadTxt($PingDir)))
        XWriteTxt($PingDir, 1)
        Sleep(500)
        Run($ProgramDir) ; restart the unresponsive program
        Sleep(5000)
        Return 0
    EndIf
EndFunc

As stated above, it wouldn't work for my use case (algo trading) because the data files are too large, and new arguments need to be inserted while the large data .csv is loaded into a global array only once.


On 7/29/2021 at 10:57 PM, TheDcoder said:

Wow, look at those cores, bet that machine would perform great in gaming :D

Imagine how much more efficient it would be if it were written in a truly parallel manner?

P.S. I was joking about gaming. I know games don't really gain a lot of benefit from the additional cores; most game engines only use up to 8 cores in a meaningful way.

 

I wish I had more time to game! But I've been too obsessed with writing new simulations & testing theories with AutoIt and Python.

