In 2007 Joel Spolsky wrote a blog post about gnarly problems, called Where there’s muck, there’s brass. It argued that the real benefit to consumers comes from solving gnarly problems, not nice, simple, fun ones.
We’ve just had our own ‘mucky’ experience dealing with attachment searching in PST and MSG files. While the MSG format is nowhere near as complicated as the PST format, both have nasty surprises when accessing attachments.
However, once it was all up and running, it was impossible not to have a silly grin watching a demo of FileLocator Pro finding some ‘secret’ text inside a PDF, attached to an MSG file, attached to an email in a PST file, that itself was zipped up and attached to an email in another PST file. How cool is that!
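The essence of that demo is recursion: every container item may itself be a container, so the search has to keep descending until it reaches a leaf it can scan. A minimal sketch of that idea, with entirely hypothetical types and names (nothing here is FileLocator Pro’s actual code):

```cpp
#include <cassert>
#include <string>
#include <vector>

// Hypothetical model: an Item is either a leaf with scannable text,
// or a container (PST, MSG, ZIP, ...) holding child items.
struct Item {
    std::string name;
    std::string text;              // leaf content; empty for containers
    std::vector<Item> children;    // attachments / archive entries
};

// Depth-first search for `needle`, recording the nesting path to the hit.
bool findText(const Item& item, const std::string& needle,
              std::vector<std::string>& path) {
    path.push_back(item.name);
    if (item.text.find(needle) != std::string::npos)
        return true;               // hit: leave the path showing the nesting
    for (const auto& child : item.children)
        if (findText(child, needle, path))
            return true;
    path.pop_back();               // no hit below here; unwind
    return false;
}
```

The real thing is messier, of course — each container format needs its own reader — but the control flow is just this depth-first walk.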
In ‘Other News’ we also have a new Q&A site. It’s the same sort of thing as StackOverflow but just for Mythicsoft products. Check it out: http://qa.mythicsoft.com
How would you answer the following question:
Do software developers write software for Windows because
a) Windows is REALLY cool and hip
b) Microsoft creates a great environment in which to write applications
c) 90% of all PC users use Windows
d) Microsoft understands and looks after 3rd party software developers
e) Windows users understand that paying for software provides much needed support for their favourite tools and utilities
My answer: “f) All of the above (except a)”.
What’s my point? Well, it’s a convoluted answer to numerous, very complimentary, requests to port FileLocator Pro to the Mac. Usually I reply with something along the lines of “We don’t have the resources to support multiple platforms with minimal market share”. But it’s not as simple as that.
Our automated test system has discovered bugs, often just before a big release, more times than I’d care to admit. It’s an invaluable tool and one recommended by most modern development methodologies. However, I’m not a fan of methodologies in general. Let me preface that with a little bit of personal history…
Back in 1994, while working for a bank in London, I was asked to come up with a specification standard that could be given to any old ‘monkey’ and would produce reliable results. Management were fed up with the code quality and productivity disparities throughout the development teams.
Yesterday Jasenko, one of the developers here, noticed that our newest components weren’t working in ‘Safe Mode’. Since Safe Mode simply runs the component in a separate process, by specifying CLSCTX_LOCAL_SERVER instead of CLSCTX_INPROC_SERVER, this had us all quite confused: it used to work seamlessly. On closer inspection it appears that the default settings generated by Visual Studio 2008 are to blame.
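For context, the mechanism is nothing more than swapping one flag in the activation call. A sketch (the helper function is hypothetical; the CLSCTX values match their Windows definitions in combaseapi.h):

```cpp
#include <cassert>

// Reproduced here so the sketch is self-contained; on Windows these
// come from <combaseapi.h>.
enum ClsCtx {
    CLSCTX_INPROC_SERVER = 0x1,  // load the COM server DLL in-process
    CLSCTX_LOCAL_SERVER  = 0x4   // launch the server as a separate process
};

// Hypothetical helper: 'Safe Mode' just changes the activation context;
// nothing else about the component or the call changes.
ClsCtx chooseActivationContext(bool safeMode) {
    return safeMode ? CLSCTX_LOCAL_SERVER : CLSCTX_INPROC_SERVER;
}
```

On Windows the chosen value is passed straight to CoCreateInstance, which is why a failure that only shows up out-of-process is so surprising — the calling code is identical.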
Two extra changes are now required for out-of-process activation to work:
I just found this while debugging an issue:
if ( isConfigured() )
if ( isAllowed() )
I can’t tell you how many times I looked at that before I saw where the problem was. My brain was so used to following styling hints to see control of flow that it didn’t notice that the actual true flow was different from the visual flow. @*&^!
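One plausible reconstruction of the trap (names kept, surrounding code invented): with no braces, the second `if` is the *body* of the first, even though the matching indentation makes them read as two independent checks.

```cpp
#include <cassert>

bool g_ran = false;

// The bug as I read it: isAllowed() is only ever evaluated when
// isConfigured() is true, but the identical indentation suggests
// two sibling statements.
void runTask(bool configured, bool allowed) {
    g_ran = false;
    if ( configured )
    if ( allowed )
        g_ran = true;     // only reached when BOTH are true
}

// The fix is purely mechanical: braces make the real flow visible.
void runTaskBraced(bool configured, bool allowed) {
    g_ran = false;
    if ( configured ) {
        if ( allowed ) {
            g_ran = true;
        }
    }
}
```

The two functions behave identically; the difference is that in the second one the compiler’s view and the reader’s view finally agree.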
Like most other kids in the 1980s, I grew up programming in BASIC. Variables weren’t strongly typed; they just held values. Type mismatches were reported at runtime with the ‘Type mismatch at line xx’ error. If you wanted to start running a different piece of code you could simply GOTO whatever line you wanted. BASIC was a great language to learn programming in: you were in charge and the computer did its best to keep up.
Pascal, on the other hand, was something quite different. Pascal didn’t just run (at least the version we used didn’t); it needed to be compiled first. Variables had to be declared ahead of time, with their type specified, before you even used them! No longer could you just jump around the code; instead you had to split the program up into functions and carefully control how they interacted. I hated Pascal. I preferred 6502 assembly (with an instruction set so limited it had no multiply instruction) to Pascal.
I spend a lot of time moving between C++ and C#. Fortunately the languages are different enough that it’s not too difficult to switch between concepts such as stack-allocated objects in C++ and garbage-collected objects in C#. However, if I’m not concentrating I do run into trouble when coding in Managed C++, since I expect it to behave just like C++, just with access to .NET classes.
Managed C++ is really cool but it has a couple of gotchas. The one that has bitten me more than once is the difference between destructors and finalizers. To make sure that I don’t fall into the trap again I’m going to elaborate on what is a very long comment in one of our source code files.
In standard C++ you would write something like:
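(A minimal stand-in, with a hypothetical class, for the kind of sample that belongs here.) In standard C++, cleanup is deterministic — the destructor runs the instant the object leaves scope:

```cpp
#include <cassert>

int g_cleanups = 0;

// Hypothetical RAII type: the destructor releases the resource
// deterministically. In C++/CLI the same ~Resource() syntax actually
// compiles to Dispose(), and a separate !Resource() finalizer is what
// the garbage collector calls later -- the gotcha described above.
class Resource {
public:
    ~Resource() { ++g_cleanups; }   // runs predictably, at scope exit
};

int cleanupsAfterScope() {
    {
        Resource r;
    }                               // destructor fires at this brace
    return g_cleanups;
}
```

The trap in Managed C++ is assuming that ~ClassName alone gives you the same guarantee for managed resources; if the destructor/Dispose is never called, only the finalizer runs, and nobody can tell you when.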
I’m currently in the process of moving all our source over to Visual Studio 2008. Most of our 3rd party libraries compiled right out of the box. However, the excellent Boost library was not so simple.
Boost has quite a complex build process that automatically discovers your compiler and builds the library with almost no user interaction. Unfortunately, the current release of Boost (1.34) does not recognize VS2008, and the build process will only pick up older versions of the compiler. Fortunately, once you know which files need to change, it’s not too difficult to add a new compiler to the build process.
For anyone else who’s trying to compile Boost with VS2008, the link below contains the files I updated to fix our build process.
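As background on what such changes do: Boost.Build picks up Visual C++ through its msvc toolset, and the standard way to declare a compiler explicitly is a `using msvc` line in `user-config.jam`. Something along these lines — the version number is VS2008’s (9.0) and the install path is an assumption for a default install:

```jam
# user-config.jam -- declare the VS2008 compiler to Boost.Build.
# The path is an assumption; point it at your own cl.exe.
using msvc : 9.0 : "C:/Program Files/Microsoft Visual Studio 9.0/VC/bin/cl.exe" ;
```

Because 1.34’s bundled msvc toolset predates version 9.0, simply declaring it may not be enough on its own, which is exactly why the build files themselves needed patching.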