It appears that performance does not scale linearly with the number of files open. If I have a large number of files open (>50), closing a file or loading a file becomes quite slow; with 20 files open, performance is good -- no noticeable delays. App startup is also quite slow when I have 50 files open. I haven't tested whether it is n log n or n squared (it feels much closer to the latter).

In any event, the total amount of data loaded is on the order of a few megabytes of text (50 C++ text files of under 100 KB each) -- nothing so large that any linear algorithm should take noticeable time. It's at the point where closing a file or opening a new file takes over a second; it should be under 50 ms even if I have 500 files open.
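A rough way to pin down the growth rate would be to time the same close-a-file action at two different open-file counts and estimate the exponent. Here's a small sketch of that arithmetic -- the timings in it are placeholders roughly matching what I described above, not actual measurements:

```cpp
// Estimate how per-operation cost grows with the number of open files,
// from two stopwatch measurements. Placeholder numbers, not real data.
#include <cmath>
#include <iostream>

int main() {
    // Hypothetical measurements of "close one file", in milliseconds.
    double n1 = 20, t1 = 150;   // with 20 files open
    double n2 = 50, t2 = 1000;  // with 50 files open

    // If t grows like n^k, then k = log(t2/t1) / log(n2/n1).
    double k = std::log(t2 / t1) / std::log(n2 / n1);
    std::cout << "estimated exponent k = " << k << "\n";
    // k near 1 would suggest linear (or n log n) behaviour;
    // k near 2 points at something quadratic in the number of open files.
    return 0;
}
```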
Switching the current file remains fast -- it's adding a file to, or removing a file from, the set of open files that is slow, and that scales very badly.
I have a *very* fast SSD (modern Apple NVMe -- over 2 GB/s, > 300k IOPS), plenty of RAM, and a fast CPU (Skylake at over 3 GHz). How could closing a file take billions of cycles?
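For scale, here's a baseline sketch (nothing to do with the editor's internals -- the directory and file names are made up): it writes a 50-file, ~100 KB-per-file working set like the one described above and times reading all of it back. On this class of hardware that completes in a few milliseconds, so raw disk I/O can't be where the second goes.

```cpp
// Baseline: how long does it take to read the entire working set from disk?
// Creates 50 files of ~100 KB each in a temp directory, then times reading them.
#include <chrono>
#include <filesystem>
#include <fstream>
#include <iostream>
#include <iterator>
#include <string>

namespace fs = std::filesystem;

int main() {
    const fs::path dir = fs::temp_directory_path() / "open_files_baseline";
    fs::create_directories(dir);

    // Create 50 files of ~100 KB each (mirrors the working set described above).
    const int kFiles = 50;
    const std::string chunk(100 * 1024, 'x');
    for (int i = 0; i < kFiles; ++i) {
        std::ofstream out(dir / ("file" + std::to_string(i) + ".cpp"));
        out << chunk;
    }

    // Time reading every file back in full.
    auto start = std::chrono::steady_clock::now();
    std::size_t total = 0;
    for (int i = 0; i < kFiles; ++i) {
        std::ifstream in(dir / ("file" + std::to_string(i) + ".cpp"));
        std::string contents((std::istreambuf_iterator<char>(in)),
                             std::istreambuf_iterator<char>());
        total += contents.size();
    }
    auto end = std::chrono::steady_clock::now();

    auto ms = std::chrono::duration_cast<std::chrono::milliseconds>(end - start).count();
    std::cout << "Read " << total / 1024 << " KB across " << kFiles
              << " files in " << ms << " ms\n";
    // On an NVMe SSD (and with these freshly written files likely still in the
    // page cache) this finishes in a few milliseconds at most, so a >1 s delay
    // for a single open/close must be spent somewhere other than file I/O.
    return 0;
}
```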
I've noticed this performance issue for many years. It's still there in VsPro 2018.
-- Ian.