
Selective compile preprocessor up and running



I've got a selective preprocessor for AutoIt up and running. It works like a charm and wasn't that hard to do. It can be used to speed up running scripts and also to shave a lot of kilobytes off compiled scripts: the application checks for includes, splits them into constants includes and function includes, and checks which constants and functions are actually used. It then creates a makefile (to be exchanged with the constants-include entries in your script) and writes a used-functions script to replace the includes (this script has to be attached to the end of the mother script). What I'm working on right now is what other preprocess actions to include, and maybe suggesting an AutoIt3 option to overrule the selective preprocess, something like a #IncludeAll directive. Any ideas...
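
To illustrate the idea (the function below is just an ordinary pick from the standard includes, nothing special to the preprocessor), a small script like this:

#include <Date.au3>
$sToday = _NowDate()
MsgBox(0, "Today", $sToday)

would come out of the preprocessor with the #include line gone, any Global Const lines the function needs inserted at the top, and only _NowDate() itself (plus whatever it calls internally) appended after the main code - instead of dragging the whole Date.au3 file into the compiled script.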

I have some questions: would you users prefer that the tool creates/uses a makefile script, or should it just do the job directly? Does anyone need a visual interface to make selective changes, or just a command-line action (for speed)? What about the Call() function (how many relations do I follow there), should it be handled or not? And can I get some information on the CompileIt application so I can integrate with it...

I think the application is going to include both a graphical interface and a non-interactive mode driven through command-line arguments (for integration with SciTE)...

I need some opinions here.

Download at site:

http://www.sitecenter.dk/latenight/nss-fol...3Preprocess.zip

kjactive :(


I'd think that if you passed an argument like 'Aut2EXE.exe /preprocess /in script.au3', that would perform the preprocess or launch the app to do it. Otherwise it would act as it does now.


  • 3 weeks later...

Well, I'm almost done with a nice little application, the 'preprocessor', but I have a question about how deep it has to check. I noticed that some of your cleverly written includes go very far into nested subfunctions that have to be included for the #include to work, and I'd like some feedback on a simple question. A script that uses two includes like Date.au3 and GuiListBox.au3 should save something like 100K of code to read before execution...

1. My question is: how far does the preprocessor have to go in checking for the subfunctions and global constants used? Example:

#include <date.au3>

$a = _JulianToDate($iJDay)

In this function I find a line calling the function '_GetYear(...)', so following it into the function:

Func _JulianToDate($iJDay)

I find a line like this: _DaysInMonth($iYear, _GetYear(...))

Two functions on the same line - it could be three or four functions, included from different UDFs, that each have to be checked for their own subfunctions, etc....

EndFunc

Just an example, but how far does this application have to go? It takes one second to check a 10K script and its subscripts down to the third level, but to keep going deeper would be ....
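
To give an idea of what checking "to the third level" means in practice, here is a minimal sketch of such a scan (the function and variable names are my own, this is not the actual application code): it pulls every _Name( token out of a file and would then recurse into each function body up to a fixed depth.

#include <Array.au3>

Func _ScanCalls($sFile, $iDepth, ByRef $aFound)
    If $iDepth <= 0 Then Return
    Local $sCode = FileRead($sFile)
    Local $aCalls = StringRegExp($sCode, "(_\w+)\s*\(", 3) ; every _Name( token in the file
    If @error Then Return
    For $sName In $aCalls
        If _ArraySearch($aFound, $sName) = -1 Then
            _ArrayAdd($aFound, $sName)
            ; here the real tool would locate the definition of $sName in the
            ; include files and scan that function body with $iDepth - 1
        EndIf
    Next
EndFunc

Called with something like Local $aFound[1] = ["<none>"] and then _ScanCalls("MyScript.au3", 3, $aFound), the array ends up holding every underscore function name seen down to the chosen depth.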

2. Are there any AutoIt'ers who want to help me and run some tests? This is a hard one to test alone...

What does this little fellow do at the moment? It checks for #includes in the main script and makes a list, checks for constants used in the main script and makes a list, checks for subfunctions in the UDFs and makes a list, checks for constants used in those functions and makes a list, and checks for third-level subfunctions but not the constants used there (one place where it stops). It then assembles everything in the right spots, runs it through a character-removal function with options like 'remove blank lines', saves the whole thing to a temporary file and opens Aut2Exe or CompileIt as an option. It needs some air, as it's a complex application.
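
The 'remove blank lines' part of that cleanup pass can be as simple as a couple of regex replaces, roughly like this (only a sketch of the idea, not the actual code - it strips whole-line comments only, since trailing comments would need more care to avoid damaging strings that contain a ';'):

Func _StripScript($sCode)
    $sCode = StringRegExpReplace($sCode, "(?m)^[ \t]*;[^\r\n]*", "") ; whole-line ; comments
    $sCode = StringRegExpReplace($sCode, "(?m)^[ \t]*\r?\n", "") ; lines left blank
    Return $sCode
EndFunc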

The application can write a makefile to carry out this process without any GUI, and with the GUI on it can preprocess several different scripts in one go...

The application saves a simple script 100K or more of weight to load before execution and speeds the process up a lot, and it works on most of my scripts without any complaints. But who wants to help me, and what do I do with this huge fellow...

One of the beauties is that it's possible to force needed subfunctions to be included with a single comment line in the script, just as it's possible to exclude the selective preprocessor from checking a #include and simply pull in the whole 100K with a single character on a line in the script. It's also possible to look up different functions directly in the preprocessor, have a look at them, do a copy and paste, etc....

As far as I'm concerned the selective preprocessor is finished and only needs the help topics written, but if any of you AutoIt'ers want to help me, I'll carry out this hard last step....

AutoIt3 is nice, but carrying 100K or more of extra weight in the execution process - nope. Any comments?

kjactive :(


That looks really nice, I can't wait to see it in action.

As previously discussed, w0uter's point is simply insurmountable; however, if a scripter knows they're going to do that, then they can use an #include-all directive in the include file.

My first concern would be:

include file A requires function X from include file C

include file B requires function Y from include file C

Does it catch that?

I can't imagine that anybody is going to be using nested includes more than 10 levels deep, but then at one point in history someone assumed that a computer would never have more than a meg of ram. And here we are now.

Slightly off topic, but since you're half-way there anyway...

Back in the day (on VAX/VMS anyway, other platforms also I'm sure) there was a program called XREF that took source code and returned a reference of all function and variable usage information. Line numbers for declarations, calls, assignments, references. Extremely useful for debugging large programs.


Au3Preprocess can't handle a function called dynamically with Call(), but then again that function can't take any parameters and cannot be a built-in AutoIt function, and it's seldom used by novices anyway. People who use it probably know what they are doing and can exclude that particular UDF from the preprocess...

At the moment Au3Preprocess scans the main script for included libraries and functions (starting with an underscore, like _bla()), looks these functions up in the includes, writes a list, and checks that list for included subfunctions too, but it won't go deeper than that, as it would take up too much time - there is a lot to check, and how often does any writer go that deep? I have not seen a script that goes deeper than the third level...

My first concern would be:

include file A requires function X from include file C - Preprocessed

include file B requires function Y from include file C - Preprocessed, although at the moment I have some trouble here: two functions in my tests just don't get included. Mostly they do, but this is where the checking ends at the moment...

I just have to write some simple help topics, as that is necessary to use this application at this point, and I won't release the code yet as it's still in a debug mess. Whether I will when it's finished I don't know, as it's a very hard one to read, but maybe...

I know Xref.exe and a lot of different variants for different compiled languages, but none of them has a GUI or is aimed at interpreted languages. Au3Preprocess has a GUI option and can write a makefile to automatically pass arguments on to Aut2Exe or CompileAu3, as you choose...

Thanks for the responses - first release in a day or so...

kjactive :(


Au3Preprocess can't handle a function called dynamically with Call(), but then again that function can't take any parameters and cannot be a built-in AutoIt function, and it's seldom used by novices anyway. People who use it probably know what they are doing and can exclude that particular UDF from the preprocess...

Actually, contrary to logical thought, the inverse is true. The majority of people using Call() are novices incorrectly using it instead of calling the function directly. I can't recall the number of times I've seen people posting scripts which had Call("Function") where just simply using Function() would be the correct way. I've seen it misused far more times by novices than I've seen it correctly used by experienced users. However, with that said, those who are incorrectly using Call() probably aren't going to be waiting in line for a pre-processor to optimize the included functions. That renders it a moot point as in either case they won't be using the pre-processor.
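
A tiny made-up example of the two cases:

Call("MyFunc") ; the novice misuse - the name is a literal string, plain MyFunc() would do
$sName = "My" & "Func"
Call($sName) ; the legitimate dynamic case - the name only exists at run time, so no textual scan of the script can tell which function this reaches

Func MyFunc()
    MsgBox(0, "Example", "called")
EndFunc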

Well, maybe you're right, but I would include these Call() targets if it were in any way possible. It isn't, as those function names only exist in memory at run time, and a preprocessor would have to read AutoIt's memory to catch them. Since I included an easy option to exclude specified UDFs from the preprocess, users of dynamic function calls just have to do that...

kjactive :(


Well, here is the first version of Au3Preprocess - still at a late beta stage as rev. 1.5, but tested and working on most of the examples in AutoIt3. I used the latest AutoIt3 beta revision, if that is of any interest...

Unzip the file and place the directory in the SciTE directory along with all the other applications. Open the topics and go to the 'Special Notes' page, where there is a property example for adding Au3Preprocess to SciTE's au3 properties list - change the command ID to one that is unique in your list. Then load a script with a GUI and some functions used from some #includes into the SciTE editor, activate Au3Preprocess 1.5 with Alt+W (the default - change it to whatever you want) and play around with the application...
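
For orientation, a SciTE tool entry of that kind generally looks something like the lines below - the command number, folder and executable name here are only placeholders, so use the actual lines given on the 'Special Notes' page:

command.name.30.*.au3=Au3Preprocess 1.5
command.30.*.au3="$(SciteDefaultHome)\Au3Preprocess\Au3Preprocess.exe" "$(FilePath)"
command.shortcut.30.*.au3=Alt+W
command.save.before.30.*.au3=1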

Ctrl+T opens the topics in the application, while Esc and Ctrl+Q quit the application.

Remember this is a test application at a late beta revision, so bugs and some troublesome third-level UDF scans could go wrong. Please read the topics before filing a major report, as this is a complex application with lots of options...

Free to download:

http://www.sitecenter.dk/latenight/nss-fol...3Preprocess.zip

Au3Preprocess scans down to the third level to find and include used functions and global constants. That's enough for my kind of programming, as speed also has to be considered here, and a 35K script gets scanned in less than two seconds...

There is no support for dynamically called functions (Call()), as that is just not possible - I would add it if it were. Please read 'Special Notes' in the topics for how to exclude specified UDFs from the preprocess actions...

There is no source script released yet, as there are still a lot of debug lines attached...

kjactive :(


Just a quick thought: how do you handle HotKeySet, GUISetOnEvent and GUICtrlSetOnEvent? Although in theory they are less likely to be dynamically assigned via a variable, it's still technically legal for the function-name parameter of these functions to be a variable and not a literal string.
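
For example (made-up names), this is perfectly legal, and a purely textual scan for literal function names never sees which handler {F1} ends up bound to:

$sHandler = "_Show" & "Help" ; handler name assembled at run time
HotKeySet("{F1}", $sHandler)

Func _ShowHelp()
    MsgBox(0, "Help", "F1 was pressed")
EndFunc

While 1
    Sleep(100)
WEnd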


I don't handle/change those lines/options, as they are built-in AutoIt functions and will still be at the same spot as before the preprocess...

The preprocessor only changes global constants and external functions from UDFs, which start with an underscore (as they all do, and hopefully that stays a kind of standard for them in the future)...

The constants are inserted at the beginning of the script, before the GUI opens, and the functions get built in at the end of the script...

Tell me why I have to take special care with functions like HotKeySet, GUISetOnEvent and GUICtrlSetOnEvent? It's true that any function can be called dynamically, but as I said somewhere before, there is no way to catch those, as the names only exist in memory. It is, however, possible to exclude a UDF from being preprocessed by commenting a marker character after its #include line.

kjactive :(


The point I'm getting at is, if you don't handle those 3 functions, your pre-processor is useless in a large number of situations, or at least rendered annoying to use (To other people besides yourself, that is). There are tons of people writing scripts using one or more of those 3 functions now-a-days. I believe that you've made some method to allow hard-coding of functions or files that should always be included, but this seems to be counter-productive as it adds more work to the user for what I would consider very little gain.

I personally couldn't care less if my scripts use 150kb on disk or 170kb. I have two 80GB hard drives and if I ever fill those up, I could probably go out and get a couple much larger drives for relatively cheap. The same goes for performance. I'm not going to quibble over my processor having to do just a little bit more work processing of unused items. If my processor is slow enough that this becomes a significant performance issue, then I drastically overpaid for my processor.

Now if you want to see bloat, try using MFC or STL with C++. Those libraries are likely to instantly double the size of your C++ application. Right now, though, I'm lucky to get a 40kb increase using an icon or some other (small) FileInstall()'d file in my scripts. 40kb is less than 1/3 the total size of the original application and that's a worst-case scenario. On average, my compiled scripts are roughly 20 - 25 kb larger than the stock AutoIt3.exe. Given that some of these scripts do complex things or have complex GUIs, an equivalent in MFC or using some other libraries in C++ would yield applications around 300kb+. Not to mention time to develop, exponentially more code, and more complex code.

So my observation of this script at the moment as an outsider is that its usage and ease of usage are both inversely proportional to the size of the script. A large script would theoretically benefit the most from a utility like this since it is likely to be using numerous libraries, however, the chances also go up that a large script will use one of the 4 magic functions that aren't supported, meaning the user is going to have to perform extra work both now and in the future ensuring that all required functions are manually included (This becomes a maintenance issue). Smaller scripts, of course, are less likely to use the 4 functions but are also less likely to receive benefit from having unused things stripped.


Fair enough - it's free to use, but why do you use CompileAu3 then, and why optimize one's scripts to the absolute limit? About the three functions you mentioned previously: I did some looking up, and there is absolutely nothing that could break them, as they are treated just like any other scanned function, and if they call up some UDF functions those will be included...

"I believe that you've made some method to allow hard-coding of functions or files that should always be included" - Nope, it's just a hard scanning process, and only what is found gets built in...

"...what I would consider very little gain" - Well, yes, but one thing is missing from your point of view: memory. When the size of a compiled script on disk is 135K, it opens a process that can be more than double that, as the file is packed with a compressor - not to mention the external DLLs that these functions call...

"I personally couldn't care less if my scripts use 150kb on disk or 170kb. I have two 80GB hard drives" - Okay, but say a simple script uses Date.au3, Constants.au3 and GuiConstants.au3 with just a combo and a listview control - that's an extra 166Kb,

and you would probably save 140Kb on this simple script, and on every script of this kind, not to mention the wasted memory...

I can't say that I think of the Microsoft Foundation Class library, or even MS in general, as clever programming; sometimes it even stinks of bad code and wasted memory...

"A large script would theoretically benefit the most from a utility like this since it is likely to be using numerous libraries" - Nope, every script, small or big, takes up the same base space and memory; it's the compiled scripts with a lot of included UDFs, and the hard disks, that would benefit the most...

"...ensuring that all required functions are manually included" - Nope, the application takes care of that without any fuss; that is the whole point of this little fellow. There is no extra work compared to using CompileAu3 or whatever, and it only takes about one second to scan a 35Kb script...

As I understand your opinion, you just don't want to care about execution speed (that's fine, or you buy a faster processor), about hard-disk space (you just buy a bigger one), or about the amount of memory applications use (probably just buy some more). How in hell do you think the wheel ever got invented...

Well, fair enough, but in my opinion a computer language has to be as efficient as it can be, and one has to remember that before any action takes place AutoIt3 has to allocate memory, unpack the archive, read in the file character by character, build function and variable lists, etc., so a small application can easily end up at 500K in memory...

I think the amount of UDFs will grow heavily in the future, as they are a nice feature of AutoIt3, but so will this space problem, which is already beginning to show up in other posts...

kjactive :(



Fair enough - it's free to use, but why do you use CompileAu3 then?


:( ...... lost you on this one...

CompileAU3 will perform the following tasks for you:

Read the command line options like aut2exe supports.

Read any info from the Scriptname.Ini file as defaults if it's available.

Read the input script searching for compiler directives. If found, it will override any other setting.

The information is only saved into an INI file when NO Directives are found inside the script.

A menu is displayed (there is a compiler directive to show/noshow this menu)

Run program(s) defined by the Run_Before directive(s).

Run Tylo's AU3Check program (Optional) to verify the script. When errors are encountered you will be prompted if you want to continue or stop the compile process.

Run RC.exe (optional) return code reported to console.

Run reshack.exe (optional) return code reported to console.

Run aut2exe.exe , return code reported to console.

Run program(s) defined by the Run_After directive(s).

Save all setting to SCRIPTNAME.INI when no directives are found in the script.


Well, I think the compile (execute wrapper) system in SciTE/AutoIt3 is nice and quite functional, with a nice GUI etc., but what I meant previously was: if my application makes extra work for users, then I don't get why 'Valik' uses CompileAu3, as that is just the same. Au3Preprocess sits in the middle between SciTE/AutoIt3 and the compile system and passes the preprocessed script on to Aut2Exe or to CompileAu3 as an option, in less than two seconds for a 35Kb script,

with no need for a GUI, though one is provided as an option. I think this preprocessor setup is about as fast as it can be...

The application is still not fully optimized, as there is a lot of debug code spread around, and the preprocessed script could be more efficient if I knew how the AutoIt 'interpreter' works - is it line-oriented or character-oriented, and how could everything be put onto one long line (skipping linefeeds etc.)...

But I would like some feedback other than 'the wheel is just not here to stay'...

kjactive :(


What would be nice is to reduce the amount of RAM a compiled .exe file uses. Small programs can use 2 or 3 megs of memory. Would this be possible with your preprocessor? If not, is there a way to reduce RAM consumption for the compiled .exe's?

-John


How much the preprocessor reduces memory use I can't say right now, but it definitely does: a simple script of mine shrank by 150Kb on disk, and I think by double that in memory. AutoIt3 has to unpack the archive and hold the 'interpreter', the script itself and all the built-in UDFs; the external libraries that get called it can't do anything about. To be fair to AutoIt3, it does provide a lot of options/controls etc., and that costs memory - some functions are very memory hungry, I know! That's why I made this little fellow: we need all the savings we can get, and as the future rolls on, the UDFs keep growing...

This application checks for constants used internally as well as in external scripts, checks for UDF functions, selects what is needed and throws the rest out. It builds in the needed stuff, writes a preprocessed script with no comments, blank characters removed, etc., and passes it on to CompileAu3, just like the CompileIt option in SciTE, in about a second - some would not even notice it has run. Me, I use it at a late stage, just like CompileAu3/Aut2Exe...

kjactive :(


Fair enough - it's free to use, but why do you use CompileAu3 then?

I don't use CompileAu3. I have my own script which looks for a very basic set of keywords dealing with the compiler but the primary aim of the script is to allow me to (globally) switch between multiple AutoIt installations. I wrote the rudimentary compiler stuff as a bonus. I've never used nor do I ever intend to use CompileAu3.

Well, fair enough, but in my opinion a computer language has to be as efficient as it can be, and one has to remember that before any action takes place AutoIt3 has to allocate memory, unpack the archive, read in the file character by character, build function and variable lists, etc., so a small application can easily end up at 500K in memory...

Yes, a language has to be efficient, but when efficiency comes at the price of breaking things by being overly aggressive on removal, that's too far. Also, as far as performance, I believe that all function details are stored in a binary tree, so the performance gain isn't as significant as you might hope. Technically, yes, it will be slightly faster, however, a binary search is still performed so it's already pretty efficient. If you want to get that technical about things, a better use of time would be to query somebody in the know as to which sorting algorithm is used to sort all the binary-tree data and then figure out how to re-order the appearance of functions in the script so that they appear in the order that will be most efficient for the sort algorithm to process. I suspect that would yield at least just as much performance gain as reducing the number of items appearing.

With regards to the memory footprint, are you really that concerned about a couple hundred KB lying about doing nothing? If so, then you really picked the wrong OS. Windows is wasting several times that. I'd be more concerned about retrieving some memory from the OS than I would worrying about a few hundred KB being wasted in a script.

What I'm trying to get at is, an implementation of this concept, although nice on paper, falls short in the real world. There are major snags to implementing it; specifically, functions like Call(). Until a proper implementation is designed to handle these types of functions in the most elegant way possible, any implementation is nothing short of a flawed implementation at best. The "waste" in space and performance is insignificant even with yesterday's hardware. AutoIt is one of the smallest and fastest languages around with absolutely no external dependencies other than what the OS provides. Complaining about a few hundred wasted kb when compared to languages like VB, VB .NET, VBS, JS, Python, et cetera is like complaining about a mosquito bite when you've just had your leg blown off. All those languages have these large runtime environments that take up space, slow the language down and make it bigger in memory. AutoIt has none of these disadvantages.

In the future, may I suggest that you please be kind enough to the rest of us to use the forum's quoting mechanism for replying to people's posts. I found it quite a chore to wade through your post finding where my quote ended and your response began. Things are much easier to read when the relevant text is placed in quote tags and then commented upon immediately after, as I've done in this post. As your post is written, it's difficult to tell if you're saying everything or quoting someone; in fact, had it not been me you were quoting, I may not have realized you were quoting someone at all.

