
Converting console output to progress information



I am working with a console application (avdump) that processes a file and outputs progress info like "H 00.29% @ 16.00 MB/s / 16.00 MB/s".

I am trying to read this console progress into a value I can use to display progress in my AutoIt program that launches the console one.

I got it working (after like 30 tries :D ), but while the actual processing is very smooth and runs at a nearly constant speed, the value I am getting is jumpy and tends to stall at specific points, usually around 27, 57 and 87... No idea why exactly these points, and they are usually the same even when processing different files and different operations. In particular it likes to jump from nothing straight to 27 at the start, and I just can't get that right (after 27 it's still not smooth, but there are no such huge jumps).

$pattern="[PH]\s{2}([\d\.]{5})"
Dim $cout

$run=Run("avdump.exe -d -h -log:link.txt """ & $file & """", @ScriptDir, "", $STDOUT_CHILD)
Do
    $search=StringRegExp($cout, $pattern, 1)
    If Not @error Then 
        GUICtrlSetData($label, $search[0])
        $cout=0
    EndIf
    $cout=$cout&StdoutRead($run,1)
Until @error

TIA


I am working with a console application (avdump) that processes a file and outputs progress info like "H 00.29% @ 16.00 MB/s / 16.00 MB/s".

In the example string you give, I can see only ONE space in front of the number you are searching for. Maybe there are "one OR two" spaces?

$pattern="[PH]\s{1,2}([\d\.]{5})

Regards, Rudi.

Earth is flat, pigs can fly, and Nuclear Power is SAFE!


avdump outputs 1 or 2 spaces in its different modes (the [PH] is related too: the line can start with either of those letters, depending on the mode). I suspected that my initial regexp might be a bit too complex (reading those MB/s values later is not going to hurt either :D ), so I simplified it for the sake of debugging.

It's my first time messing with StdoutRead, so I am not very sure I am doing it right. :D It seems hard to get the timing right: if I query stdout too often I waste a lot of time on StringRegExp, and if I let it read as much as it can, it eats the stream without giving me a chance to process it... Anything in between and I can't properly predict how it works. I also tried sticking a small Sleep in there and that messes things up even more... Why can a tiny Sleep(1) totally change the way it behaves?
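
What I am aiming for is roughly this (a rough, untested sketch, same $file and $label as in the snippet above): read whatever is buffered in one go, show only the newest match, and Sleep briefly between polls so StringRegExp is not run thousands of times per second.

#include <Constants.au3> ; for $STDOUT_CHILD

; Rough, untested sketch of a gentler polling loop.
$pattern = "[PH]\s{1,2}([\d\.]{5})"
$cout = ""
$run = Run("avdump.exe -d -h -log:link.txt """ & $file & """", @ScriptDir, -1, $STDOUT_CHILD)
While 1
    $chunk = StdoutRead($run) ; no count given -> read everything currently available
    If @error Then ExitLoop ; stream closed, avdump has finished
    $cout &= $chunk
    $found = StringRegExp($cout, $pattern, 3) ; 3 = all capture groups of all matches
    If Not @error Then
        GUICtrlSetData($label, $found[UBound($found) - 1]) ; show only the newest value
        $cout = ""
    EndIf
    Sleep(50) ; poll ~20 times per second instead of spinning
WEnd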

Main problems:

1. I can't figure out the huge jump at the start... Why does this thing sit there doing nothing and then rapidly jump to 27-something percent?

2. How can I get unstable, jumpy output from a totally smooth and stable stream?


$pattern="H\s\s(.{5})"
Dim $cout
Dim $progress
Dim $loops=0
Dim $looptime
$exit=False
$run=Run("avdump.exe -d -h -log:link.txt """ & $file & """", @ScriptDir, -1, $STDOUT_CHILD)
$lastvaluetime=TimerInit()
Do
    $loopstart=TimerInit()
    $search=StringRegExp($cout, $pattern, 1)
    If Not @error Then 
        GUICtrlSetData($label, $search[0])
        $progress&=@CRLF&" Value= "&$search[0]&" Time used= "&TimerDiff($lastvaluetime)&" Loops used= "&$loops&" Last loop time= "&$looptime&" String used= "&StringStripCR($cout)
        $lastvaluetime=TimerInit()
        $cout=0
        $loops=0
    EndIf
    $cout=$cout&StdoutRead($run,1)
    If @error Then $exit=True
    $looptime=TimerDiff($loopstart)
    If $looptime>1000 Then $progress&=@CRLF&"Big looptime "&$looptime&" detected!"
    $loops+=1
Until $exit

Same stuff, just full of counters and timers. Here is the problem as seen in the output:

Value= 28.88 Time used= 1.94745421554974 Loops used= 40 Last loop time= 0.0329650835511217 String used= 0%  @  31.75 MB/s  /  20.16 MB/sH  28.88
 Value= 29.17 Time used= 2.48187968023869 Loops used= 40 Last loop time= 0.0346412742401618 String used= 0%  @  25.64 MB/s  /  20.20 MB/sH  29.17
Big looptime 1998.36962518979 detected!
 Value= 29.45 Time used= 2001.63484465204 Loops used= 40 Last loop time= 0.0335238137808017 String used= 0%  @  21.28 MB/s  /  20.21 MB/sH  29.45
 Value= 29.74 Time used= 2.89450195485739 Loops used= 40 Last loop time= 0.0332444486659617 String used= 0%  @  25.64 MB/s  /  20.25 MB/sH  29.74

There are always 1-3 such loops at specific points (roughly 27, 57, 87 as mentioned above) that consume a few seconds instead of a few ms.

And I have no idea why the start is so slow: there is a pause while the file is being opened, but after that the log shows smooth processing, not a 0-to-27 jump.

Big looptime 11017.8640022684 detected!
 Value= 00.29 Time used= 11025.2719270187 Loops used= 28 Last loop time= 0.0706793740545237 String used= f:\The_Gamers.avi
H  00.29
 Value= 00.57 Time used= 7.41937871992111 Loops used= 40 Last loop time= 0.0650920717577234 String used= 0%  @   1.#J MB/s  /   1.#J MB/sH  00.57
 Value= 00.86 Time used= 5.55685149928273 Loops used= 40 Last loop time= 0.0673269926764435 String used= 0%  @ 125.00 MB/s  / 250.00 MB/sH  00.86

Hi.

Big looptime 11017.8640022684 detected!
 Value= 00.29 Time used= 11025.2719270187 Loops used= 28 Last loop time= 0.0706793740545237 String used= f:\The_Gamers.avi
H  00.29
 Value= 00.57 Time used= 7.41937871992111 Loops used= 40 Last loop time= 0.0650920717577234 String used= 0%  @   1.#J MB/s  /   1.#J MB/sH  00.57
 Value= 00.86 Time used= 5.55685149928273 Loops used= 40 Last loop time= 0.0673269926764435 String used= 0%  @ 125.00 MB/s  / 250.00 MB/sH  00.86
I'd try to pipe (">") the output into a "dump.txt" and then have a close look at that file. Look closely (with a hex editor) at the characters in front of what you want to catch.

Are there really exactly two spaces? Chr(255), for example, looks like a {space} but is a different character.

Did you try the mentioned "\s{1,2}"?
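
Something like this should do the dump and the inspection in one go (a rough sketch, not tested with avdump; the dump.txt path and the redirection are assumptions):

; Dump avdump's console output to a file via cmd, then print the character codes
; around the first "H" so any odd whitespace (e.g. Chr(255)) becomes visible.
RunWait(@ComSpec & ' /c avdump.exe -d -h "' & $file & '" > dump.txt 2>&1', @ScriptDir, @SW_HIDE)
$dump = FileRead(@ScriptDir & "\dump.txt")
$pos = StringInStr($dump, "H")
For $i = $pos To $pos + 9
    ConsoleWrite(Asc(StringMid($dump, $i, 1)) & " ") ; 32 = space, 255 = the space look-alike
Next
ConsoleWrite(@CRLF)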

Regards, Rudi.

Earth is flat, pigs can fly, and Nuclear Power is SAFE!


I don't have a problem with the regexp: it catches the numbers properly (the final version, made to work with all operations, would be something like [PH]\s{1,2}(\d\d) ).

I narrowed my problem down to unexplained pauses where StdoutRead cannot read the stream for a few seconds (see the output above and the time wasted on finding the next value), while there are no such pauses in the output itself. I searched the forum and there are posts about a bug in StdoutRead that was fixed in the beta... I tried the beta (it required changing some stuff to work), but the pauses are still there as well.


I am no expert on StdoutRead, and in fact came across your post while searching for answers to my own puzzle with the command... but according to the help file, the Run function for the child process should use the value $STDIN_CHILD rather than $STDOUT_CHILD as in your post... perhaps try changing that and see if it helps.

Looking more closely at the help file for the beta, the remarks and the example seem to contradict each other, so possibly you should ignore my comments above... hopefully someone with more knowledge can clarify.

VW


1 ($STDIN_CHILD) = Provide a handle to the child's STDIN stream

2 ($STDOUT_CHILD) = Provide a handle to the child's STDOUT stream

4 ($STDERR_CHILD) = Provide a handle to the child's STDERR stream

Straight from the AutoIt docs. :D It makes sense to use $STDOUT_CHILD when you want to read stdout, right?
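
If it ever turned out that a tool prints its progress to stderr instead of stdout (I have not checked whether avdump does), the flags can be combined with BitOR() and both streams read; a rough, untested sketch:

#include <Constants.au3> ; for $STDOUT_CHILD and $STDERR_CHILD

; Request handles to both output streams and echo whatever arrives on either one.
$run = Run("avdump.exe -d -h -log:link.txt """ & $file & """", @ScriptDir, -1, BitOR($STDOUT_CHILD, $STDERR_CHILD))
While ProcessExists($run)
    $out = StdoutRead($run)
    $err = StderrRead($run)
    If $out <> "" Or $err <> "" Then ConsoleWrite($out & $err)
    Sleep(50)
WEnd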

By the way, what is your trouble with StdoutRead? I have poked at it so much already that I might be able to help (but I am using the stable release; this command is considerably altered in the beta).


  • 6 months later...

May I jump in on your topic? I think I have a clue.

I was trying to create a progress bar to follow the execution of an external child task. I got nothing until the task exited, and at that moment I got all the output at once. I read your post and tried your code with avdump. Run on a roughly 2 GB avi file it takes quite a while, and I noticed what you describe: erratic output in the progress bar. Then I had the idea to redefine the output of my own task and make it very verbose: I output filler text and, once every second or so, a line with x%, x ranging from 1 to 100, then read the output and extract x. With that, the progress bar is smooth and reflects every step, percent by percent. I think the output (or the read function) buffers data until the volume is large enough, presumably because output going to a pipe rather than a console gets block-buffered; hence the big loop times you noticed.

The solution would be to find a way to force the data not to be buffered like that.

Regards

