Interview: Federico Mena-Quintero

Federico Mena Quintero will give a talk about profiling desktop applications "because the FOSDEM organizers kindly asked me for a talk on exactly that topic :)"

That's the spirit!

More seriously, for a good part of the last year I learned some things about performance analysis and did some interesting work for GNOME. Profiling is a black art for good and bad reasons: you really have to work in a methodical way to be productive, and the tools we have are pretty bad. So I want to tell people about some techniques that they may find useful when looking at a program that is slow for no apparent reason. I'll also tell you why programs become slow over time, even though no one wanted them to end up that way.

What effect would you like your talk to have?

Hopefully, [my talk will] remove some of the aura of voodoo that surrounds performance topics. People ask, "how can I make this faster?", and they reach for things like "use better compiler flags", "add a cache", "rewrite it in assembler". It's all voodoo until you realize that you have to do some actual *engineering* to figure out why things are slow, to fix them, and to be able to ensure that the bug doesn't reappear.

Given that description, will your talk be of interest to other (desktop) projects besides GNOME?

The content of my talk will certainly be useful to other desktop projects, and much of it is pretty general-purpose. To do a performance investigation, you first have to understand the architecture of what you are trying to fix; then you make a careful hypothesis about what is wrong (usually with the aid of profiling tools); then you confirm that hypothesis (say, by replacing the expensive procedure with a no-op); and finally you fix the problem, ensure that your fix works, and leave an infrastructure in place to ensure that the problem doesn't come back in the future. This is useful for any kind of software, not just GNOME or even desktop software.
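
As an illustration of the "replace the expensive procedure with a no-op" step, here is a minimal sketch in C. The thumbnail-regeneration function and the suspicion that it is the hot spot are made up for the example, not taken from an actual GNOME profile:

    /* Hypothesis: regenerating thumbnails is where the time goes.
     * Temporarily turn the suspected procedure into a no-op, re-measure,
     * and see whether the program becomes fast. */
    #include <stdio.h>

    /* #define CONFIRM_HYPOTHESIS 1 */

    static void
    regenerate_thumbnails (const char *directory)
    {
    #ifdef CONFIRM_HYPOTHESIS
            /* No-op stub: if the slowness disappears with this in place,
             * the expensive work really was here. */
            (void) directory;
    #else
            /* ... the real, expensive work would go here ... */
            printf ("regenerating thumbnails for %s\n", directory);
    #endif
    }

    int
    main (void)
    {
            regenerate_thumbnails ("/tmp/photos");
            return 0;
    }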

So, most optimizations do not revolve around bugs or tough compiler/linker hacks?

There's an interesting pattern that appears when you have never tuned performance in a piece of code, especially after many years. The first time you profile things, you find *big* hot-spots: code that is doing way too much work for no good reason -- you can usually kill that code easily and the program will perform much better. You kill a few big problems that way: refreshing things that didn't need to be refreshed, using trivially stupid algorithms, etc.

The next time you profile, you find deeper issues. You find problems with various moving parts, and the fix in each part is not obvious. You find places where results could be cached intelligently, or where you need a good re-architecting of the data structures.

It's only at the end of this process that you look into micro-optimizations. But most of our slow applications are slow because they are doing too much unnecessary work.

In your opinion, would using a higher-level language than C (e.g. an OO language like C++, Java, Python, Ruby or C#) make it less likely to run into badly optimized code, in the sense that such languages make development easier and demand less work from developers?

We'll have some of the same problems, and some new ones. It may be possible to get better tools faster, since virtual machines (or runtimes or whatever you want to call them) are often easier to instrument than "raw" compiled code. Instrumentation means adding hooks to your VM to ask questions like, "how many objects get allocated per second?" "what kinds of objects?" "who did those allocations?" "how many threads are waiting on this mutex?" etc. Doing that for raw code, a la Valgrind, is a lot harder.
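
A rough idea of what such an instrumentation hook amounts to, sketched in plain C rather than inside a real VM: wrap the allocation entry point and count what goes through it. The wrapper and counter names are purely illustrative:

    #include <stdio.h>
    #include <stdlib.h>

    /* Counters for the hook; in a real VM these would feed a profiler. */
    static unsigned long n_allocations = 0;
    static unsigned long bytes_requested = 0;

    static void *
    counted_malloc (size_t size)
    {
            n_allocations++;
            bytes_requested += size;
            return malloc (size);
    }

    int
    main (void)
    {
            int i;

            for (i = 0; i < 1000; i++)
                    free (counted_malloc (64));

            printf ("%lu allocations, %lu bytes requested\n",
                    n_allocations, bytes_requested);
            return 0;
    }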

My hope is that the increase in productivity that these languages offer will give you more time to actually think of a good architecture for your software, and to look at its performance more often. When you code in C or C++ (or any other non-garbage-collected language), you spend all your time chasing stupid memory problems, instead of fixing interesting bugs or performance problems.

You're one of the founders of GNOME, and are still heavily involved in it. What is your personal vision of the future of the GNOME platform from a technical point of view?

The GNOME platform has reached a point of maturity. Now we have to fix some final things, tuck in some loose ends, document it thoroughly, and most importantly, to figure out a way to improve the platform without breaking compatibility with everything that we already have. These things are not hard to do! It's just a lot of unglamorous work that people are reluctant to do.

Once we do that, we can devote our time to actually writing cool applications and a better desktop. I'd like us to start exploring better interaction models so that people don't have so much trouble when trying to use the file system, for example. But I'd consider the platform to be "almost done", and just in need of polishing and finishing touches.

Also, we haven't figured out a way for independent developers ("ISVs") to write applications that will run on different versions of GNOME. We free software hackers tend to focus on the absolutely-latest versions of the libraries and everything, and we pay no attention to how one can write an app that runs adequately with different versions of the libraries. Those independent developers have to resort to horrible things like dlopen()ing the libraries and doing dlsym() to find if particular functions are available; that's just ugly and it's a huge pain for them.
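
The dlopen()/dlsym() trick described above looks roughly like the following sketch; the library soname and the probed symbol are made-up examples of the pattern, not a recommendation:

    #include <dlfcn.h>
    #include <stdio.h>

    int
    main (void)
    {
            void *handle;
            void (*maybe_new_api) (void);

            /* Illustrative soname; link with -ldl */
            handle = dlopen ("libgtk-x11-2.0.so.0", RTLD_LAZY);
            if (!handle) {
                    fprintf (stderr, "could not load library: %s\n", dlerror ());
                    return 1;
            }

            /* "gtk_some_new_function" is a hypothetical symbol name */
            maybe_new_api = (void (*)(void)) dlsym (handle, "gtk_some_new_function");
            if (maybe_new_api) {
                    /* the newer API exists in this version; use it */
            } else {
                    /* fall back to the older API */
            }

            dlclose (handle);
            return 0;
    }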

Do you think that a tighter collaboration on common and interoperable technologies between the major free/open source desktop environments (e.g. GNOME and KDE, amongst others) is a critical goal to achieve in the near future?

I don't worry very much about interoperability between desktops, but rather between the big monolithic stacks that we have: the GNOME libraries, the OpenOffice foundation code, the Mozilla foundation code. It's ridiculous that we can't share font settings easily, or network proxies, or do cut & paste of rich text reliably.

Applications expect an "environment" to live in, and that environment is very badly defined. GNOME apps expect to be able to read some parameters from an "XSETTINGS manager" and they break when they run in a KDE desktop. KDE apps break in similar ways when run under GNOME. OpenOffice needs patching to be able to use anything from those desktops. Mozilla also thinks that it is a separate platform, and it also needs patching to be usable within those desktops.

Then what is your opinion about the standards and projects on freedesktop.org?

Freedesktop.org is very valuable. The problem is that there is no one to oversee that the standards get implemented well in the right places. You keep finding bugs in the way the standards are implemented in various apps and desktops. It would be better to have a compliance test suite, so that you can fix all the bugs in one shot. I'm sure that will happen eventually; it just hasn't happened yet.

Thanks a lot for these insights!


This interview is licensed under a Creative Commons Attribution 2.0 Belgium License.