Gnat Posted October 12, 2006

I have created a GUI script that will, in part, execute a command at the command prompt and return the output to a GUICtrlCreateEdit box in the GUI. The only problem is that StdoutRead has a limitation as described in the help file:

"If the function is called with no second argument or a second argument of less than zero, StdoutRead assumes that you wish to read the maximum number of characters and returns a string of all the characters that are available (up to 64kb in a single read)."

If the output is more than 64kb, how do you do multiple reads without getting the same data returned? If someone could give me a little code snippet, it would be appreciated.
martin Posted October 12, 2006

Instead of using stdout, redirect the output to a file, then read the file into your edit box. You can always delete the file when you've finished with it.

Run(@ComSpec & " /c " & 'commandName >filewithoutput.txt', "", @SW_HIDE)
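A minimal sketch of martin's redirect-to-file approach, fleshed out for the GUI case. Note that $sTmp, "commandName", and $idEdit are placeholder names for illustration, not anything from the thread:

```
; Run the command hidden, redirecting its output to a temporary file.
; "commandName" is a placeholder for your actual command.
Local $sTmp = @TempDir & '\cmd_output.txt'
RunWait(@ComSpec & ' /c commandName > "' & $sTmp & '"', '', @SW_HIDE)

; Read the whole file (no 64kb limit applies here) into the edit box.
; $idEdit is a placeholder for the ID returned by GUICtrlCreateEdit.
Local $sOutput = FileRead($sTmp)
GUICtrlSetData($idEdit, $sOutput)

; Clean up the temporary file when finished with it.
FileDelete($sTmp)
```

RunWait is used instead of Run so the file is complete before it is read.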
MHz Posted October 13, 2006

...If the output is more than 64kb, how do you do multiple reads without getting the same data returned?

That is what a loop is for: you gather each 64kb read into a variable and loop again for another read. If you use StdoutRead without a loop, then the most you will get is 64kb. In my use, I find no issue with StdoutRead when it is used in a loop. Scanning a HDD with Dir produces far more than 64kb of output, but StdoutRead in a loop reads all of it.
Gnat Posted October 16, 2006 Author

That is what a loop is for so you can gather the 64kb into a variable and loop again for another read.

That is why I asked for a little code snippet... I'm not sure what loop to use, and it is not working for me.
MHz Posted October 17, 2006

Here you go:

Global $data, $pid = Run(@ComSpec & ' /c dir c:\*.* /s', '', @SW_HIDE, 2)
Do
    $data &= StdoutRead($pid)
Until @error
FileWrite('c:\test.txt', $data)

I get 2,797Kb of data read.
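To tie this loop back to the original question, the same pattern can feed the GUI edit box directly instead of a file. This is only a sketch; $idEdit is a hypothetical control ID assumed to come from an earlier GUICtrlCreateEdit call:

```
; Flag 2 in Run tells AutoIt to capture the child's stdout stream.
Global $data = '', $pid = Run(@ComSpec & ' /c dir c:\*.* /s', '', @SW_HIDE, 2)

; Each StdoutRead call returns whatever is currently available (up to 64kb),
; so appending in a loop collects the full output. @error is set once the
; process has closed the stream and everything has been read.
Do
    $data &= StdoutRead($pid)
Until @error

; $idEdit is a placeholder for your GUICtrlCreateEdit control ID.
GUICtrlSetData($idEdit, $data)
```

Because each read consumes the data it returns, the loop never sees the same bytes twice, which answers the original concern about duplicate reads.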
Gnat Posted April 9, 2007 Author

Here you go:

Global $data, $pid = Run(@ComSpec & ' /c dir c:\*.* /s', '', @SW_HIDE, 2)
Do
    $data &= StdoutRead($pid)
Until @error
FileWrite('c:\test.txt', $data)

I get 2,797Kb of data read.

I never did reply and let you know that this worked for me... sorry... but this was the answer to my issue. Thanks.