Author Topic: Formatting file with very long line slow with recorded macro - v2011  (Read 2345 times)

JimmieC

  • Senior Community Member
  • Posts: 490
  • Hero Points: 17
I am using a recorded macro to format a serial communications file. I just hold Ctrl-F12 until I reach the end of the file. The last line of the file is extremely long, at 7458244 columns. SE runs extremely slowly when formatting this file.

Boxer deals with files like this by limiting line lengths to 32k. When the file is loaded, there is a warning that long lines will be broken into 32k segments. That is, newline characters are inserted. Because of this, the file is marked as modified (* next to the file name). Of course, the file can be closed without the changes being written.

I am not suggesting that limiting line lengths to 32k is the preferred implementation; I mention it only because it may be a factor in operating speed. Boxer is slow on these files, but SE is cripplingly slow.

These files come in spurts, and I have just never taken the time to create a better formatting tool. Perhaps these files are better suited to Perl or Python.
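For what it's worth, the Boxer-style preprocessing step described above (breaking any overlong line into 32k segments before editing) could be sketched in Python along these lines. This is only an illustration of the idea, not anything SE or Boxer actually does; the function name and 32k default are my own:

```python
def split_long_lines(text, max_len=32 * 1024):
    """Break any line longer than max_len into max_len-sized segments
    by inserting newline characters, mirroring Boxer's 32k limit."""
    out = []
    for line in text.split("\n"):
        if len(line) <= max_len:
            out.append(line)
        else:
            # Slice the long line into fixed-size chunks.
            out.extend(line[i:i + max_len]
                       for i in range(0, len(line), max_len))
    return "\n".join(out)
```

Running the file through a filter like this first would keep every line short enough for a line-oriented editor macro to handle at a reasonable speed.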

I have attached the log file and the macro.

Regards,
Jim

Clark

  • SlickEdit Team Member
  • Senior Community Member
  • *
  • Posts: 6875
  • Hero Points: 530
Re: Formatting file with very long line slow with recorded macro - v2011
« Reply #1 on: April 30, 2015, 04:17:22 PM »
If you can manually modify your macro to avoid using split_insert_line(), it will run a lot faster. I'm pretty sure split_insert_line() loads the entire line into memory.

You want to call _split_line(), down(), _begin_line(). I think that will do what you need.
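As a sketch, the substitution in the recorded macro would look something like this (Slick-C; the function names are the ones given above, but the surrounding context is illustrative):

```
// Instead of:
//   split_insert_line();   // loads the entire (very long) line into memory
// use the equivalent three-call sequence:
_split_line();    // split the current line at the cursor position
down();           // move the cursor to the newly created line
_begin_line();    // place the cursor at the start of that line
```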