I never had reason to think that SlickEdit was slow at searching across files, at least not until I started using Linux and grep.
My project has a lot of files (http://dev.chromium.org). In SlickEdit I pick "Find in Files", set the root to my project, set Exclude to ".git/" and Include to *.cpp;*.cc;*.h, with regular expressions off, and search for something like "Canvas2DRenderingContext". (PS: there are no matches for Canvas2DRenderingContext, so this measures a full scan of every file.)
Compared to
grep -r --include="*.cc" --include="*.h" --include="*.cpp" CanvasRenderingContext2D .
grep is at least an order of magnitude faster. If the file cache is hot (meaning it's a second search, but for a different word), grep is probably 2 to 3 orders of magnitude faster than SlickEdit. Sometimes it's so fast I'm amazed it actually managed to check all the files. At this point I've mostly stopped using SlickEdit for recursive searches :-(
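For anyone who wants to reproduce the comparison, here's a rough sketch of how I time it. The /tmp/grep_demo tree and its files are just placeholders standing in for a real checkout; substitute your own source root. Running the same command twice shows the cold-cache vs. warm-cache difference.

```shell
# Build a tiny stand-in source tree (placeholder for a real checkout).
mkdir -p /tmp/grep_demo/sub
printf 'int main() { return 0; }\n' > /tmp/grep_demo/main.cc
printf '// header\n' > /tmp/grep_demo/sub/util.h
cd /tmp/grep_demo

# First run may hit disk; run it a second time to see the warm-cache speed.
time grep -r --include="*.cc" --include="*.h" --include="*.cpp" \
    CanvasRenderingContext2D . || true   # no matches expected, exit code 1
```

The `|| true` is only there because grep exits non-zero when nothing matches, which would otherwise look like a failure in a script.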
A couple of ideas, if you guys feel like looking into it:
1) Getting a recursive listing of files like this:
ls -1R | wc -l
takes under 2 seconds on a second run. I don't know what ls does that makes it so fast.
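My guess (not verified against the ls source) is that the second run is fast mostly because the kernel is serving the directory entries from memory rather than disk, so any tool that just walks directories benefits. A quick sketch, using a throwaway tree as a stand-in for a real project:

```shell
# Throwaway demo tree (stand-in for a real source checkout).
mkdir -p /tmp/ls_demo/a/b
touch /tmp/ls_demo/a/one.cc /tmp/ls_demo/a/b/two.h

# ls -1R counts entry names plus per-directory headers and blank lines;
# find gives a files-only count for comparison.
ls -1R /tmp/ls_demo | wc -l
find /tmp/ls_demo -type f | wc -l
```

On a real tree, running either command twice back to back shows the same cold/warm gap as grep does.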
2) Googling 'why is grep so fast' brought up this:
http://lists.freebsd.org/pipermail/freebsd-current/2010-August/019310.html
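If I'm reading that post right, the short version is that GNU grep uses Boyer-Moore-style skipping for fixed strings and avoids doing per-line work, reading the input in large raw chunks instead. You can lean on the fixed-string path explicitly with `grep -F` (a.k.a. fgrep), which skips regex compilation entirely. A rough sketch with a generated haystack file (the file and search term are placeholders):

```shell
# Generate a ~100k-line haystack with no occurrences of the needle.
yes 'a line with no interesting text' | head -100000 > /tmp/haystack.txt

# -F forces a literal (fixed-string) search; no regex engine involved.
time grep -F 'needle' /tmp/haystack.txt || true   # exits 1: no matches
```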