During August 2012 we quietly added a new crash reporting module to FileLocator Pro. Based on CrashRpt (an open source product hosted on Google Code) it’s one of the most useful quality control features we’ve ever added, although we hope it’s a ‘feature’ most of our users will never have cause to see.
Since then you may have noticed an increase in memory-management-related fixes to FileLocator Pro. It’s not a coincidence.
We’ve had a slow trickle of crash reports over the last few months and, while some were odd, quick-to-fix edge cases, the majority related to memory management issues. It didn’t take long to see that FileLocator Pro had a problem on low-spec machines performing searches where the data was in the gigabyte range and involved millions of files. We found a few problems that were simply bugs in the code, e.g. algorithms that reserved more memory than was necessary, but some of the problems were more subtle functional issues.
By default FileLocator Pro will record up to 10,000 lines of text per file, and each line can be up to around 20,000 characters. That’s not usually a problem when searching in a limited set of files. Rarely will a file have 10,000 hits or a line have 20,000 characters. However, when searching over a very large data set with criteria that might not be very selective (e.g. searching for the letter ‘a’ – which was the actual search phrase in one of the crash reports we received) it can be a problem. It can be compounded by searching through file types that may not have EOL (End Of Line) markers, such as EXEs or DLLs. Finally, to make the whole thing just a little bit trickier, what might be a problem on a scrawny 512MB laptop is not necessarily a problem on a sturdy 16GB PC.
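To get a feel for the scale, here is a rough back-of-the-envelope calculation. The per-file and per-line limits come from the defaults above; the two-bytes-per-character storage is an assumption (UTF-16 is typical for Windows applications, but this is not a statement about FileLocator Pro’s internals):

```python
# Worst-case memory for the recorded result text of a single file,
# using the default limits described above.
lines_per_file = 10_000   # default cap on recorded lines per file
chars_per_line = 20_000   # default cap on characters per line
bytes_per_char = 2        # assumption: UTF-16 storage

worst_case_per_file = lines_per_file * chars_per_line * bytes_per_char
print(f"{worst_case_per_file / 1024**2:.0f} MB per file")  # ~381 MB
```

Even a handful of files hitting those limits would exhaust a 512MB machine, long before a search reached a million files.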
The trouble is that FileLocator Pro doesn’t know at the beginning of the search if it’ll find a few hundred files with hits on a few lines (easy), a couple of files with hits on 10,000 lines (not a problem) or a million files with each one reporting hits on 10,000 lines (problem… probably).
FileLocator Pro 6.5 introduces a pre-emptive solution. Based on the amount of memory installed on the machine, FileLocator Pro sets an upper limit for unrestricted results per search (from 20MB up to around 200MB). If that limit is reached during a search, FileLocator Pro starts restricting the search. Results for each file are reduced to around 20 lines, with a maximum of 256 characters per line, and the restriction is retained until the search finishes. If the search still runs out of memory then, rather than crashing as it did previously, it terminates the search.
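The strategy above can be sketched roughly as follows. The 20MB–200MB budget range and the ~20 lines / 256 characters restriction come from the paragraph; the class and function names, and the way installed RAM maps onto the budget, are hypothetical simplifications, not FileLocator Pro’s actual code:

```python
# A minimal sketch of the pre-emptive restriction strategy described above.
# Hypothetical: all names, and the RAM-to-budget mapping.

RESTRICTED_MAX_LINES = 20    # lines kept per file once restricted
RESTRICTED_MAX_CHARS = 256   # characters kept per line once restricted

def results_budget_bytes(installed_ram_bytes: int) -> int:
    """Scale the unrestricted-results budget between 20MB and 200MB."""
    mb = 1024 * 1024
    # Hypothetical mapping: ~4% of installed RAM, clamped to the stated range.
    return max(20 * mb, min(200 * mb, installed_ram_bytes // 25))

class SearchResults:
    def __init__(self, installed_ram_bytes: int):
        self.budget = results_budget_bytes(installed_ram_bytes)
        self.used = 0
        self.restricted = False  # sticky until the search finishes

    def add_file_hits(self, lines: list[str]) -> list[str]:
        """Record the matching lines for one file, trimming if restricted."""
        if self.restricted:
            lines = [ln[:RESTRICTED_MAX_CHARS]
                     for ln in lines[:RESTRICTED_MAX_LINES]]
        self.used += sum(len(ln) for ln in lines)
        if self.used > self.budget:
            self.restricted = True  # restrict every subsequent file
        return lines
```

The key design point is that the restriction is sticky: once the budget is crossed, every subsequent file is stored in trimmed form for the remainder of the search, and a final out-of-memory condition ends the search cleanly instead of crashing.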
Our tests on very low-powered machines with just 512MB have shown a huge improvement in stability for very large searches, and so far we haven’t received any memory-related crash reports. Job done? Not quite, but it’s one more step in cementing FileLocator Pro’s place as the ultimate super fast, rock solid, search and data analysis tool.
In a previous post I talked about ‘pushing a button that I think does nothing’. I hope you can see from our response to these bug reports that when you ‘Push the Button’ and send us a crash report it most certainly does something!
Hi Dave, I really appreciate reading this kind of insight into your development process. As a software developer myself I know how hard it is to justify (to yourself or your CTO…) those uncounted hours of coding that “only” improve the product in terms of stability, performance, debugging capabilities etc., without introducing new features. Keep it up!
Thanks. 6.5 was an odd release. It was big but contained very little in the way of visible changes (at least for non-German/French speakers). As you say, really useful software is not just about shiny buttons, it’s about solid, reliable performance. Something that works all day, not just the first five minutes.