Monthly Archives: July 2006

Ballmer: Software is becoming a service

Remember my blog entry on “Software as a Service Industry”? Well, I had a chuckle when I read today’s ZDNet top story: “Ballmer: Software is becoming a service”. See, it’s not just me, lol.

I think Microsoft will have a very hard time turning into a services and solutions company. They’ve spent decades now focusing on building and selling great products. The paradigm shift will certainly confuse their customers for the first little while, if not their employees.

But I see it every day. Every time a customer comes in with a requirement that really only applies to their environment, I feel more strongly that selling software out of a box just won’t cut it any more, at least for the complex software we tools builders end up making. With the ongoing costs of developing and maintaining that software, it makes more sense to spread out the revenue to match. And it places an even higher importance on the extensibility of that software, just as we see in Eclipse projects today.

So, we’ll see how this all pans out, but if Mr. Ballmer says it’s true, it must be true. 🙂

"AMD to buy ATI"

Now, have you not seen that headline enough yet?

Any time there’s a bit of a shake-up in our industry I’m always intrigued. It’s not what the industry analysts have to say about it that draws me in, and certainly not what you read in the press releases from the parties involved. It’s the story behind the story that piques my interest.

So, I blast through all the reports and try to piece together what is really happening and what it means for our future. For the AMD/ATI deal, the Inquirer yet again puts forth an interesting view on the insider story. Whether what they say is true or not we may never know, but I have seen a lot of rumors posted there that eventually became fact, including the AMD/ATI deal itself.

I do think they present a good argument for what is happening, and it seems to be driven by the end of the MHz race (thanks, I can cook a roast in my PC case now, enough already!) and the push for many-core designs a la Sun’s Niagara architecture. AMD also has some pretty cool ideas on how to integrate co-processors that do cool things into their cache-coherent architecture, and I’m sure the ATI acquisition will help speed some of these along. And the Inq is pretty sure Intel is working on similar architectures.

So what does that mean for us tools developers? Well, these events give me more confidence in my prediction that a programming model change is a-coming. Applications will increasingly need to take advantage of a multi-threaded environment to get performance gains; we can no longer rely on ever-increasing MHz to save us. For C and C++, that means building more multi-threading constructs into the language, something the Parallel Tools Platform (PTP) people are working toward by building tooling for APIs like OpenMP.
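OpenMP itself is C/C++ territory, but the underlying shift applies to the Java side of our world too. Here’s a minimal sketch in plain Java (my own toy example, not from any CDT or PTP code) of the kind of restructuring the new model demands: splitting a loop across worker threads instead of waiting for a faster clock:

```java
import java.util.concurrent.atomic.AtomicLong;

public class ParallelSum {
    // Sum arr using nThreads workers, each handling a contiguous slice.
    static long parallelSum(long[] arr, int nThreads) {
        AtomicLong total = new AtomicLong();
        Thread[] workers = new Thread[nThreads];
        int chunk = (arr.length + nThreads - 1) / nThreads;
        for (int t = 0; t < nThreads; t++) {
            final int start = t * chunk;
            final int end = Math.min(arr.length, start + chunk);
            workers[t] = new Thread(() -> {
                long local = 0;                 // accumulate locally...
                for (int i = start; i < end; i++) local += arr[i];
                total.addAndGet(local);         // ...then publish once, atomically
            });
            workers[t].start();
        }
        try {
            for (Thread w : workers) w.join();
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return total.get();
    }

    public static void main(String[] args) {
        long[] data = new long[1_000_000];
        for (int i = 0; i < data.length; i++) data[i] = i;
        // Same answer as a serial loop, but the work is spread over 4 cores.
        System.out.println(parallelSum(data, 4));
    }
}
```

The point isn’t this particular splitting strategy; it’s that the programmer now has to think about partitioning and publication explicitly, which is exactly where language constructs and tooling can help.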

As I’m sure everyone who’s built a multi-threaded application (such as Eclipse plug-ins) knows, working in this environment is difficult and somewhat unpredictable. The door is wide open for a new set of analysis tools that we can use to figure out when things are going wrong. And I’m sure our experience with such tools in the embedded industry, where we have had to deal with unpredictable environments for a very long time now, will be of value to everyone.
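Much of that unpredictability boils down to unsynchronized access to shared state. A toy illustration (again my own, nothing to do with CDT internals): several threads hammer two counters, one a plain int and one an AtomicInteger. The atomic one always comes out right; the plain one usually loses updates, by a different amount each run, which is exactly why races are so hard to chase down without tooling:

```java
import java.util.concurrent.atomic.AtomicInteger;

public class LostUpdate {
    static int unsafeCounter = 0;                        // plain shared int
    static final AtomicInteger safeCounter = new AtomicInteger();

    // Run `threads` threads, each incrementing both counters `perThread` times.
    static void hammer(int threads, int perThread) {
        Thread[] ts = new Thread[threads];
        for (int t = 0; t < threads; t++) {
            ts[t] = new Thread(() -> {
                for (int i = 0; i < perThread; i++) {
                    unsafeCounter++;               // read-modify-write: not atomic
                    safeCounter.incrementAndGet(); // atomic
                }
            });
            ts[t].start();
        }
        try {
            for (Thread th : ts) th.join();
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
    }

    public static void main(String[] args) {
        hammer(4, 100_000);
        // safeCounter is deterministically 400000; unsafeCounter is usually
        // smaller, and the shortfall differs from run to run.
        System.out.println(unsafeCounter + " vs " + safeCounter.get());
    }
}
```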

It’s an interesting time again in our industry and we’ll all need to keep our eyes on it and be ready to hold on tight as yet another paradigm begins to shift.

Sustaining Open Source Projects Through Turnover

When you have an open source project such as the CDT that has been around for a while, you end up having to deal with turnover in the people working on that project. There are usually a couple of reasons I’ve seen for why this happens: either they have been revectored or promoted to work on something else, or they’ve left the company that was contributing the resource for a company that doesn’t want to invest its resources that way. (As an interesting side note, we now have quite a few examples of people who have switched companies but are still working on the CDT, including yours truly, but that’s a topic all on its own…)

In dealing with turnover, I find myself going through a paradigm shift from young project to mature project. In a young project, you are struggling to get people and organizations to contribute. So you find yourself accepting contributions that may not perfectly fit the mould and architecture you are trying to set out, because getting those contributions means getting people involved and showing the world that your project has momentum and is “the exciting place to be”.

But with turnover, and without proper documentation, automated tests, and good architectural fit, you start finding that the code that helped get your project going has become extra baggage. You struggle to add new features, and you find you need to either replace the functionality it provided or simply remove it. Without someone to keep the code alive, it quickly gathers “rust”, which starts to spread to places where you are trying to do new work.

So the lesson of the day for me is to keep the long-term vision for the project, including a well-laid-out architecture, front and center from day one. Try to influence new contributors to follow that vision, and manage the churn in that vision so that you can sustain the code as long as you can. This is all basic software engineering school stuff, but it applies to open source projects as much as it does to commercial ones. And I think I am now of the opinion that a strong vision like this can serve as much of a draw for contributors as a wide open door does. Or maybe the growth in the CDT lately has given me a bit more confidence. Or maybe it’s my new rose-colored glasses…

JUnits are my friend

Now I’m sure everyone who writes code in Eclipse is well aware of the power of the JUnit, but I just felt like expressing my appreciation for them right now.

I am in the middle of adding a few constructs to CDT’s new index that didn’t make it into 3.1.0 and was worried about whether the code I had just written was correct. Of course, the CDT is chock full of JUnit tests for the DOM and other features, but in the mad rush to get the new indexing framework in, I cut corners and didn’t write any JUnits for it. Instead, I had my new Index View that I used to browse the index and visually verify things. (Now, that view was supposed to be hidden since it’s not quite complete, but thanks to those who found it anyway and have raised bugs against it. :)

Well, now that I have a bit more time, I figured I had better take the plunge and start writing some. To my surprise, with the new indexer architecture it was actually pretty easy to programmatically create a project, import some files from my test plug-in into the project, and run the indexer over them. I was then able to easily write some code to search the index and make sure everything that was supposed to be there was there.
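For the curious, the shape of such a test is roughly the following. This is a deliberately toy sketch: the class name, the string-matching “indexer”, and the file contents are all made up for illustration, since the real CDT tests go through Eclipse project and index APIs that won’t fit in a few lines. The pattern is the same, though: set up some sources, run the indexer over them, then query the index and assert on what it should contain:

```java
import java.util.*;

// A toy stand-in for the index: maps a file name to its declared symbol names.
// The real CDT index is far richer; this only illustrates the test pattern.
public class IndexTest {
    static Map<String, Set<String>> buildIndex(Map<String, String> sources) {
        Map<String, Set<String>> index = new HashMap<>();
        for (Map.Entry<String, String> e : sources.entrySet()) {
            Set<String> symbols = new TreeSet<>();
            // "Indexer": record the identifier after each declaration keyword.
            String[] tokens = e.getValue().split("\\W+");
            for (int i = 0; i + 1 < tokens.length; i++) {
                if (tokens[i].equals("enum") || tokens[i].equals("class")
                        || tokens[i].equals("struct")) {
                    symbols.add(tokens[i + 1]);
                }
            }
            index.put(e.getKey(), symbols);
        }
        return index;
    }

    public static void main(String[] args) {
        // Step 1: "import" some test sources into a project.
        Map<String, String> project = new HashMap<>();
        project.put("colors.h", "enum Color { RED, GREEN };");
        project.put("shapes.h", "class Shape {}; struct Point {};");
        // Step 2: run the indexer over them.
        Map<String, Set<String>> index = buildIndex(project);
        // Step 3: search the index and check everything expected is there --
        // the same kind of check that caught my missing enum reference.
        System.out.println(index.get("colors.h").contains("Color")); // prints true
    }
}
```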

Alas, of course, it showed me that it wasn’t, and I now have to go and find out why that reference to my enum didn’t get added. In the end, writing JUnits will have saved me more time than it took to write them. No more excuses. And thanks to Mr. Joe Unit for saving the day yet again!

How many engineers does it take to turn a CDT?

We had our regular monthly CDT contributors call yesterday. These are usually low-key affairs where we quickly touch base and talk about release planning and the occasional technical issue. We’ve had calls that lasted only 20 minutes. Sometimes they’ll stretch to the whole hour if someone brings up a technical issue and we talk slowly enough about it.

This month’s meeting struck me a little differently, though. First of all, I was able to get a full head count, and we had 21 people on the call. Of those, I’d say 16 were people who have contributed code or are planning to contribute code. I also know of 3 or 4 such people who weren’t on the call. I found I had to cut off discussions and table them for future meetings because we were going to run past the hour we have allocated.

When I joined QNX last year and was handed leadership of the CDT, I remember mentioning to Mike M. that we had a hard time attracting contributors. At the time we really only had 5 or so people actively contributing. We knew the interest in the CDT was high and just needed to find a way to turn at least some of that interest into contributions so that we could continue to grow the CDT.

I’d have to say we are now finally getting the attention that the CDT needs. With contributors numbering around 20 and a lot of people out in the community testing and raising bugs, I’m starting to feel like we can actually reach the goals I personally had for the CDT and go way beyond them. We have a bright collection of talent now, and they are all doing great things. Even over the last week, as we opened up CDT 4.0 development, some cool enhancements have gone in (like Common Navigator support), and I can’t wait to try our first weekly build on Monday.

But the thing that really struck me after the meeting was that I am going to be a busy man. With this many people contributing to the CDT, it’s going to be a great challenge to make sure we don’t run over each other. Communication is going to be key, and I will take on the responsibility of making sure this communication happens and of facilitating the resolution of any conflicts that may arise. It’s going to be a great run, though, and I can’t wait to see what we accomplish as a team.

How many engineers does it take to push a button?

One of the benefits of being located in Ottawa is that I get to rub shoulders with the who’s who of Eclipse at interesting times. One of those times happened again yesterday as the button for releasing Callisto was pushed. Now, it wasn’t really a button, and it took about half an hour, from the time Denis started once the mirrors were ready until all the web sites were updated and we could download Callisto. But it was a moment.

It was particularly underwhelming for the newspaper guy who was there, but I did get a chance to interview with him and hopefully sent him off with something more interesting to write about than a bunch of computer geeks hitting refresh until we could see the magic “3.2” appear. But such is our life.

I came away very impressed with the work that Denis and his team do. Sometimes we forget how complex an operation a site like eclipse.org is. It takes a team of dedicated professionals to pull it off, and my hat’s off to Denis, Matt, and Nathan for pulling off one of the most challenging releases you’ll see in this industry. It was pretty cool to be in the nerve center as it was happening. Not to mention, they were all using Eclipse to manage the site, which was also cool.