Ubuntu Brainstorm is a website dedicated to collaborative greenlighting: anyone can post an idea for a change in Ubuntu, and developers and community members vote and comment on it. As part of an initiative to increase the visibility of the highest-rated ideas and speed their progress into the distribution, I’ve been asked to post my thoughts on an idea about multimedia performance.

The fundamental concern is a classic one for large systems: changes in one part of the system affect the performance of another. It’s modestly difficult to measure the performance effects of local changes, but far more difficult to measure the “network effects” of changes across the whole system. Without good performance metrics, developers working on one component may have no idea that their enhancements and fixes are actually degrading the performance of another component. And once you have good metrics, you still have to step back and take the larger view, because a 10% performance loss in a rarely used component may be a perfectly acceptable trade-off for a 20% performance improvement in a commonly used one.
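To make that trade-off concrete, here’s a back-of-the-envelope sketch (the component names, usage weights, and percentages are all invented for illustration): weight each component’s performance change by how heavily that component is actually used, and the net effect on the whole system falls out directly.

```python
# Weigh per-component performance changes by how often each component is
# actually exercised, to estimate the net effect on the whole system.
# All components, weights, and deltas below are hypothetical.
changes = {
    # component: (fraction of total workload, relative speed change)
    "video_decode":  (0.60, +0.20),   # 20% faster, used constantly
    "metadata_scan": (0.05, -0.10),   # 10% slower, rarely used
}

net = sum(weight * delta for weight, delta in changes.values())
print(f"Net system-wide change: {net:+.1%}")  # +11.5% for these numbers
```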

Three of the six proposed solutions (#1, #2, and #5) share a common problem: they try to address performance across the system by examining one tiny piece of the whole. There’s no harm in optimizing the performance of sound components like PulseAudio, and it might be worth exploring enabling kernel preemption, but these are micro-optimizations. Without good metrics there’s no way of knowing whether one small optimization had any real impact on the whole, or of verifying that it didn’t actually have a negative effect on overall performance. Solution #5 has an added problem (mentioned in the comments): there are good legal and philosophical reasons to require manual user action for codec installation, which is almost certainly why that solution was voted down to -56.
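To illustrate why a local number alone can’t settle the question, here’s a minimal sketch (the functions are stand-ins, not real PulseAudio or kernel code): a micro-benchmark of one component only means something when it’s paired with an end-to-end measurement taken before and after the change.

```python
import time

def median_time(fn, runs=100):
    """Median wall-clock time of fn over several runs."""
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        fn()
        samples.append(time.perf_counter() - start)
    return sorted(samples)[runs // 2]

# Stand-ins for a real component and the pipeline that contains it.
def component():        # e.g. one mixing pass in the sound stack
    sum(range(10_000))

def whole_pipeline():   # e.g. decode -> mix -> output
    for _ in range(10):
        component()

# Both numbers, captured before and after a change, show whether a
# local win actually translated into a system-level win.
print(f"component: {median_time(component) * 1e6:8.1f} us")
print(f"pipeline:  {median_time(whole_pipeline) * 1e6:8.1f} us")
```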

The other three proposed solutions (#3, #4, and #6) are variations on a theme, really just different angles on the true solution: testing, testing, testing. The good news is that there’s real work going on here right now. At the recent Ubuntu Developer Summit, automated testing for performance and functionality came out as one of the Top Five focus areas for this development cycle. As one sign of how high testing is on the priority list, a third of the jobs currently posted on the Canonical site emphasize testing: 5 have QA in the title, another is entirely focused on test automation, and 6 more include testing and test automation as key parts of the job description. (If you have a strong background in automated testing, please apply.)

There are a number of ways to be part of the solution. You might help build multimedia performance benchmarks using existing tools like cairo-perf-trace and Spandex, or generate performance tests from real applications. You might work on user-friendly tools that measure graphics and multimedia performance and display how that performance changes over time. You might write automated performance benchmarks for your own project, keeping in mind ways they can be integrated into a larger testing framework (a minimal sketch of one follows below). And if you aren’t interested solely in performance, but all this talk of testing piques your interest, you might join in to help with the overall test plan for the Natty release, working to improve automated testing, test frameworks, code coverage, or hardware coverage.
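For anyone who wants a concrete starting point, here’s that minimal sketch of a project-level benchmark; the command, run count, and results file are placeholders to swap for a real multimedia workload and whatever format a larger framework expects.

```python
#!/usr/bin/env python3
"""Time a command several times and append the result to a JSON-lines
file, so a separate tool can plot the trend across builds."""

import json
import subprocess
import time

COMMAND = ["ffmpeg", "-version"]          # placeholder workload
RUNS = 5
RESULTS_FILE = "benchmark-results.jsonl"  # placeholder output path

samples = []
for _ in range(RUNS):
    start = time.perf_counter()
    subprocess.run(COMMAND, check=True, capture_output=True)
    samples.append(time.perf_counter() - start)

record = {
    "command": " ".join(COMMAND),
    "timestamp": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
    "runs": RUNS,
    "best_s": min(samples),
    "median_s": sorted(samples)[RUNS // 2],
}

with open(RESULTS_FILE, "a") as f:
    f.write(json.dumps(record) + "\n")

print(record)
```

Appending one timestamped record per run keeps the history trivially greppable, and a separate reporting tool can read the file to chart performance over time.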