toto22

SOLVED: FileWriteFromArray adds new line at the end


Guys,

My script adds a new line at the end of the text file after writing an array. How can I prevent that?

_FileWriteFromArray(@ScriptDir & '\WatchList.txt', $OldSymbols)

Or is the only way to fix it to remove the @CRLF afterwards?

 

Thank you



@toto22

Taking a look at the _FileWriteFromArray() function, it seems that @CRLF is always appended after every element, including the last one, for both one-dimensional and two-dimensional arrays:

FileWrite($hFile, $a_Array[$x] & @CRLF)
FileWrite($hFile, $s_Temp & @CRLF)
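If you stick with _FileWriteFromArray(), one workaround is to rewrite the file without the trailing break afterwards. A minimal sketch, assuming $OldSymbols is the array from the first post:

```autoit
#include <File.au3>

Local $sFile = @ScriptDir & '\WatchList.txt'
_FileWriteFromArray($sFile, $OldSymbols)

; Re-read the file and trim the trailing @CRLF that
; _FileWriteFromArray() appended after the last element.
Local $sData = FileRead($sFile)
$sData = StringTrimRight($sData, StringLen(@CRLF))

; Rewrite the file without the trailing line break.
Local $hFile = FileOpen($sFile, 2) ; 2 = open for write, erase previous contents
FileWrite($hFile, $sData)
FileClose($hFile)
```

This reads the whole file back into memory, so it is only practical for files the size of a watch list, not huge logs.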

 


Click here to see my signature:

Spoiler

Thoughts:

  • I will always thank you for the time you spent for me.
    I'm here to ask, and from your response, I'd like to learn.
    By my knowledge, I can help someone else, and "that someone" could help in turn another, and so on.

/*--------------------------------------------------------------------------------------------------------------------------------------------------------------------------*/

ALWAYS GOOD TO READ:

 


@toto22

Pay attention :)

The @CRLF is not added to your array; it is added while the array is being written to the file by the _FileWriteFromArray() function. So, regardless of any prior "managing" of the array, the @CRLF would be added anyway.

You could remove the @CRLF after using _FileWriteFromArray(), or use another function such as FileWrite() in a loop, writing @CRLF for each line except the last one.

You can find tons of examples of both approaches on the forum :)
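A sketch of the loop approach described above, again assuming $OldSymbols is a one-dimensional array: the @CRLF is written *before* every element except the first, so no trailing line break is ever produced.

```autoit
; Write each element ourselves instead of using _FileWriteFromArray(),
; so we control exactly where the line breaks go.
Local $hFile = FileOpen(@ScriptDir & '\WatchList.txt', 2) ; 2 = open for write, erase previous contents
For $i = 0 To UBound($OldSymbols) - 1
    If $i > 0 Then FileWrite($hFile, @CRLF) ; separator before every line except the first
    FileWrite($hFile, $OldSymbols[$i])
Next
FileClose($hFile)
```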

 



@toto22

Just edit the first post and add "Solved" to the title :)




  • Similar Content

    • By num1g8rfan
      Hi All,
      I have a script whose sole purpose is to walk through about 15,000 - 20,000 files in a single directory structure, making modifications to 5 or 6 lines within each file.  Simple.  I have a small splash screen that is simply counting how many files have been completed out of the overall total.  The script is functioning fine and the splash screams through the initial files, but through time begins to slow to a crawl (It becomes noticeable after about 1,000 to 1,500 files).  
      I have used a combination of methods to write to the file, but none of which appear to make much of any difference:
      Local $FileOpen = FileOpen($FileInList, 1)
      FileWriteLine($FileOpen, @CRLF)
      FileClose($FileOpen)

      OR

      FileWriteLine($FileHandle, $NewLineString & @CRLF)

      OR

      Local $FileHandle = FileOpen($FileInList, 2)
      _FileWriteFromArray($FileHandle, $FileContentArray, 1)
      FileClose($FileHandle)

      I have checked, and the CPU and memory are fine.  My current work-around is to create 5-10 separate directory structures from the original one that contains 20,000 files, run the script, and then combine them again at the end.  It works, but isn't very portable.  Any ideas?
      Cleanup_AutoITPost.au3