Gnat

StdoutRead and multiple reads

6 posts in this topic

I have created a GUI script that will, in part, execute a command at the command prompt and return the output to a GUICtrlCreateEdit box in the GUI. The only problem is that StdoutRead has a limitation, as described in the help file...

"If the function is called with no second argument or a second argument of less than zero, StdoutRead assumes you that you wish to read the maximum number of characters and returns a string of all the characters that are available (up to 64kb in a single read)."

...If the output is more than 64kb, how do you do multiple reads without getting the same data returned?

If someone can give me a little code snippet, it would be appreciated... :lmao:

Instead of using stdout, redirect the output to a file, then read the file into your edit box. You can always delete the file when you've finished with it.

Run(@ComSpec & " /c " & 'commandName >filewithoutput.txt', "", @SW_HIDE)
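A fuller sketch of that approach might look like the following (the command name, the temp-file path, and the edit control ID $idEdit are placeholders, not from the thread):

```autoit
; Hypothetical sketch: run the command with its output redirected to a file,
; wait for it to finish, then load the file into the edit control.
Local $sTmp = @TempDir & "\cmdoutput.txt"
Local $iPid = Run(@ComSpec & ' /c commandName > "' & $sTmp & '"', "", @SW_HIDE)
ProcessWaitClose($iPid)                  ; wait until the command has finished writing
GUICtrlSetData($idEdit, FileRead($sTmp)) ; show the captured output in the edit box
FileDelete($sTmp)                        ; clean up the temporary file
```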




...If the output is more than 64kb, how do you do multiple reads without getting the same data returned?

That is what a loop is for: gather each 64kb read into a variable, then loop again for another read. If you use StdoutRead without a loop, the most you will get is 64kb. In my own use I have found no issue with StdoutRead when it is used in a loop. Scanning HDDs with Dir produces far more output than the 64kb limit, but StdoutRead in a loop handles it.

:lmao:



That is why I asked for a little code snippet... I'm not sure what loop to use, and it is not working for me...


Here you go

Global $data = '', $pid = Run(@ComSpec & ' /c dir c:\*.* /s', '', @SW_HIDE, 2) ; 2 = capture the child's STDOUT
Do
    $data &= StdOutRead($pid) ; append whatever output is currently available
Until @error                  ; @error is set once the stream has closed
FileWrite('c:\test.txt', $data)

I get 2,797Kb of data read. :shocked:
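For the original GUI use case, the collected string could then be pushed into the edit box after the loop finishes (assuming a control created earlier with GUICtrlCreateEdit, here called $idEdit):

```autoit
; Assumes $idEdit was created earlier with GUICtrlCreateEdit().
GUICtrlSetData($idEdit, $data) ; replace the edit control's contents with the captured output
```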

I never did reply and let you know that this worked for me...sorry...but this was the answer to my issue.

Thanks
