Virtual performance better than physical

July 21st, 2010

I just thought this was too funny. VMware recently released version 7.1 of their Workstation product, and one of the new features was improved performance of the display drivers for Windows guests. So I decided to look at the Windows Experience Index (WEI) of a 32-bit Windows 7 guest.

Now, in order to appreciate the irony here, you have to look at the WEI of my host system.

The guest has a higher WEI than my host system! That’s pretty amazing, actually; I could probably play a DirectX 9 game inside that virtual machine. Probably not Crysis, but Half-Life 2 or Team Fortress 2 should work.

Now you know why I don’t mind working inside a virtual machine.

Meeting The Gu

March 30th, 2010

Yesterday we had a chance to meet with Scott Guthrie and Brian Keller. We talked for roughly two hours and had some interesting discussions.

The topics included:

  • The Security Development Lifecycle
  • Developer Productivity
  • Code Quality
  • Mocking frameworks
  • ASP.NET MVC in a SharePoint context
  • How to create a 64-bit developer environment within our organization
  • TFS and/or Visual Studio in the cloud
  • How do you convince DBAs to allow generated SQL instead of stored procedures
  • Creating WCAG 2.0 accessible sites with SharePoint and Silverlight
  • Silverlight on mobile devices

None of the answers were turnkey solutions, but most provided interesting insights or angles we had not yet considered.

If you get a chance to sit down with Scott then I suggest you take it. It will almost certainly be an interesting experience!

Married!

March 27th, 2010

Yesterday, Friday March 26, my girlfriend Yvette and I got married! We kept it small and went with just the family (and a photographer) to city hall.

It was a nice ceremony. The woman officiating had visited our house beforehand to prepare, and she had turned all the information she gathered into a moving story. After that we both said ‘I do’ and exchanged our rings. So that officially makes us husband and wife!

Today we received a very nice and very large bouquet of flowers from my co-workers. Thanks everybody!

How to choose a technology

March 23rd, 2010

A while ago somebody asked for guidance on which database technology to use for a new application. Even though our part of the organization uses mostly Microsoft technology, there still is a lot to choose from. In the .NET space Microsoft currently offers five distinct ways to access a database:

  • ADO.NET Core
  • ADO.NET Entity Framework
  • ADO.NET Data Services
  • ADO.NET Sync Services
  • LINQ2SQL

So how do you choose?

An architect typically defines (or borrows) principles and lets them guide him. So let’s give that a try.

From the ivory tower…

In our global reference architecture there is a principle that can get us started:

Apply a technology only for its intended use!

The Application Architecture Guide can help us here, as it defines various application archetypes and maps technologies to those archetypes. We can use these archetypes to classify the applications we build and maintain, and then select (or reject) technologies based on them. As it turns out, the applications we build mostly fall into these three archetypes:

  • Web Application Archetype
  • Service Application Archetype
  • SharePoint LOB Application Archetype

That immediately eliminates ADO.NET Data Services and ADO.NET Sync Services since those technologies are primarily aimed at Rich Clients.

From the store…

Another principle that applies in this case comes from our database administrators:

You must use stored procedures!

LINQ2SQL does not work well with stored procedures, and it is being superseded by the ADO.NET Entity Framework, so LINQ2SQL also has to go.

From the boss…

Now we’re left with the ADO.NET Entity Framework and ADO.NET Core. Both are good technologies for data access, so which do we prefer? That brings us to the final principle for this case:

Use as little custom code as possible!

This makes us favor the ADO.NET Entity Framework, but with one caveat: if there is a performance bottleneck in the data access layer that makes it impossible to meet the performance requirements, then the development team can use ADO.NET Core together with the EntLib data access block to get that extra bit of performance.
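To give an idea of what that fallback looks like, here is a minimal sketch of calling a stored procedure through the EntLib Data Access Application Block. The connection string name, stored procedure, parameter and column names are all made up for illustration:

    using System;
    using System.Data;
    using System.Data.Common;
    using Microsoft.Practices.EnterpriseLibrary.Data;

    public static class OrderReader
    {
        public static void PrintOrders(int customerId)
        {
            // "MyDatabase" is a hypothetical connection string name from the config file.
            Database db = DatabaseFactory.CreateDatabase("MyDatabase");

            // "dbo.GetOrdersByCustomer" is a made-up stored procedure; the block
            // creates the DbCommand with CommandType.StoredProcedure for us.
            DbCommand command = db.GetStoredProcCommand("dbo.GetOrdersByCustomer");
            db.AddInParameter(command, "CustomerId", DbType.Int32, customerId);

            using (IDataReader reader = db.ExecuteReader(command))
            {
                while (reader.Read())
                {
                    Console.WriteLine(reader["OrderId"]);
                }
            }
        }
    }

Note that this still honors the stored procedures principle from the DBAs; you only give up the generated mapping layer, not the rules from the store.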

All for you

When you have a lot to choose from and a hard time choosing, try borrowing or defining some guiding principles. These can help you reduce the problem and simplify the decision process, and as long as you apply them consistently they will make your work a little bit easier.

The right tool for the job

March 18th, 2010

Today a fellow architect and I got invited to look at a tender that the customer was complaining about. It was the customer’s opinion that the price was too high and that we should bring it down.

After having the problem explained to us, we tried to reduce costs on the proposed solution. It was going to be a completely custom-built solution involving some services, queues and a database. It did not look too complicated, but after trying to approach it from a couple of different angles we concluded that it was going to be hard to achieve significant savings on this design.

I started to reexamine the original business requirements and noticed that the problem could be approached from a totally different angle. What the customer wanted was to gather information from various systems, transform that information in a particular manner and store the result in a file. That actually looks a lot like an ETL problem!

We consulted an ETL specialist and concluded:

  • we could bring down the development costs from over 2,000 hours to well below 1,000 hours,
  • a lot of the risks went away,
  • there was going to be a lot less design and testing involved,
  • the TCO also went down because this solution requires a lot fewer server components.

All in all this new approach was going to save the customer a serious amount of money while minimizing risks!

So next time somebody complains about a tender, take a long, hard look to see if you have the right tool for the job!

TheAspiringArchitect has moved

January 26th, 2010

This blog started its life on Blogger. However, I soon got annoyed with Blogger’s limited capabilities and have therefore moved to a self-hosted WordPress installation. It took some effort, mostly because Blogger uses a fairly complicated structure for adding a custom domain, but the move is now complete.

Before PDC09

November 15th, 2009

My colleague Gerben and I arrived in Los Angeles yesterday to attend the PDC. There are pre-conference sessions on Monday and the actual conference starts on Tuesday. Today is reserved for sightseeing and getting used to the time difference.

The Hollywood Walk of Fame

With our brains in different time zones we woke up early and decided to ‘carpe diem’. We started by going to Hollywood to have a look at the walk marathon of fame. What really surprised me is that the starred sidewalk actually goes on for several miles…

We had breakfast in a small coffee shop and after that left for Griffith Park. This is a huge park that contains the hill with the Hollywood sign, the LA Zoo and the Griffith Observatory (amongst others). From the observatory you can overlook most of LA (depending on the amount of smog).

After this we had a quick look at Rodeo Drive and then went for lunch. With our bellies full we decided to go to Venice Beach. This proved to be quite an experience. There is a mile-long boulevard with lots of funny, interesting and strange things to look at. We watched a local basketball game and were entertained by some street artists.

We’re both wondering what this PDC will be about. Will there be revolutionary new announcements like last year or will it be ‘more of the same’? We both expect this year will not be as spectacular as last year was but we would love to be surprised.

To be continued…

Is Using Branches to Isolate Features Bad?

October 10th, 2009

Let’s look at two of the requirements we have for our development teams:

  • The mainline in version control has to be ‘done’ at all times
  • Work has to be checked in before leaving the office

The second requirement conflicts with the first one, which can cause the mainline to be broken for days or even weeks on end. Since we have to be able to deliver from the mainline, it has to be functional, stable and optimized at all times.

I’ve been toying with the idea of using branches to isolate the development of features for a while now. The reason for this is pretty simple: it prevents features that are not yet done from entering the mainline, thus minimizing the risk of ending up with a broken mainline.

After reading a couple of articles from a series on Branch-Per-Feature Source Control by Derick Bailey (and I hope he continues that series) I decided to take the plunge. One of the systems I architect for needed some serious refactoring in order to improve performance. Using branches to isolate those changes from the work being done on the mainline would minimize the risk of missing our deadline.

Recently Martin Fowler also wrote a piece about feature branches. He makes the point that when using feature branches you either get a dangerous merge at some point, because somebody has to integrate a lot of changes, or various branches start integrating without using the mainline (which he calls Promiscuous Integration). His preferred solution is putting more thought into the design in order to Branch By Abstraction. While I agree that this is a good idea, I don’t think it solves my original problem.
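For those who haven’t read Fowler’s piece, here is a minimal sketch of what Branch By Abstraction could look like in C#. All the names (IReportStore, the two implementations, the appSettings key) are made up for illustration; the point is that the old and the new code both live on the mainline behind one abstraction, with a toggle deciding which one runs:

    using System.Configuration;

    // Step 1: introduce an abstraction over the code that needs to change.
    public interface IReportStore
    {
        void Save(string report);
    }

    // Step 2: the existing implementation keeps shipping from the mainline.
    public class SlowReportStore : IReportStore
    {
        public void Save(string report)
        {
            // ... current, slow implementation ...
        }
    }

    // Step 3: the refactored implementation grows behind the same interface;
    // it is checked in daily but stays switched off until it is done.
    public class FastReportStore : IReportStore
    {
        public void Save(string report)
        {
            // ... new, optimized implementation ...
        }
    }

    // Step 4: a configuration toggle decides which implementation runs,
    // so the mainline stays releasable while the work is in progress.
    public static class ReportStoreFactory
    {
        public static IReportStore Create()
        {
            // "UseFastReportStore" is a hypothetical appSettings key.
            bool useFast;
            bool.TryParse(ConfigurationManager.AppSettings["UseFastReportStore"], out useFast);
            return useFast ? (IReportStore)new FastReportStore() : new SlowReportStore();
        }
    }

This satisfies both requirements above (the mainline stays ‘done’ and everybody checks in daily), but as I said, it does not help when the refactoring cuts across the system too broadly to hide behind a single seam.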

What is your opinion and/or experience on this topic?

Just how mature is your process?

July 1st, 2009

Last year we were audited for CMMI level 3 and we received the certificate for that level. At the time I didn’t particularly care since I had other things to do, but this morning I found two great articles by Tyner Blain on CMMI.

The first one (CMMI Levels explained) covers the basics and explains that having a CMMI level only means you have some sort of process and, depending on the level: are doing it (1), have written it down (2), have standardized it (3), are gathering statistics from it (4) or are improving upon it (5).

In the second one (What CMMI Level should we use?) he illustrates that having a high CMMI level does not mean you can make a great product. It does not even show that you have a good process. It just shows that you have a process and how rigorously you are following it.

So should you care about CMMI? It depends on what you want to achieve. Do you want to measure the maturity of your current process? Then CMMI is just the thing. However, if you want to improve on the existing one then you might want to look at RUP, Scrum or Lean.

Whatever you do, don’t forget that the goal is not following the process but building great software.

Host header problems with IIS 6

May 21st, 2009

Last week one of my colleagues was trying to create a production-like site on his laptop using IIS 6 and Windows Server 2003. He creates a site and gives it the host header mysite.corp so it can be differentiated from the other sites. In order to make this work he creates an entry for mysite.corp in his hosts file that points to 127.0.0.1 (or localhost). Instead of a working site he gets an HTTP 401.1 – Unauthorized: Logon Failed error page.
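For reference, the hosts file entry in question looks like this:

    # %SystemRoot%\system32\drivers\etc\hosts
    127.0.0.1    mysite.corp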

At this point we think it’s a simple ACL problem, so we add the NETWORK SERVICE account, the internet guest account (IUSR_<Host>) and the Launch IIS Process account (IWAM_<Host>), thinking this will solve the problem. It did not.

After some hard thinking and serious googling we found KB896861, which describes this problem. Service Pack 1 for Server 2003 (and SP2 for XP) introduced a security feature that protects against attacks on your loopback device. Luckily for us the knowledge base article also lists two solutions: the first creates an exception for each host header you use, and the second disables the feature altogether. We opted for the former and the problem went away.
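From memory, the exception-per-host-header solution boils down to one multi-string registry value plus a restart of the IISAdmin service; for our host name it looks like this:

    Key:    HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Lsa\MSV1_0
    Value:  BackConnectionHostNames   (Multi-String Value, create it if missing)
    Data:   mysite.corp               (one host name per line)

The second, more drastic option sets a DisableLoopbackCheck DWORD to 1 under HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Lsa, which turns the loopback check off completely. Check KB896861 for the exact steps before touching the registry.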