What larks!
I wonder what metrics Microsoft uses to calculate the benefit of a new feature. Take the new battery notification messages added in Windows 7, for example. On paper, and during testing, that must have seemed like a useful feature to have in the product – it certainly seems useful to me.
Instead, when the feature actually worked as it was supposed to, it turned into a mini PR issue for the company, as reports that ‘Microsoft is investigating battery notification issues…’ steadily appeared across the web.
Of course there’s nothing wrong with reporting that there might be an issue (thankfully it didn’t turn into the sky-is-falling fiasco of the T-Mobile data hiccup), but I shudder to think how much time and energy Microsoft had to invest in investigating the reports, speaking with partners and then testing against the reported occurrences. And that’s not even counting the ginormous cost of Steven Sinofsky (his hourly rate must be up there) writing his clarification post over on the Engineering Windows 7 blog. Great post, by the way!
I can imagine the next meeting of the ‘Windows 7 Battery Notification’ team. ‘Hey great feature folks. Nice work. But unfortunately we’ve had to make you all redundant. Steven’s post was charged to our cost center and the budget for the next year is all gone…’
I wouldn’t be surprised if a quiet Windows Update in the next month or two simply removes the feature altogether. There, problem solved!
Let this be a sobering thought for all developers out there unlucky enough to be tasked with estimating the cost of a simple new feature:
- Specification of feature: 4 hours
- Cost of development: 12 hours
- Testing and deployment: 8 hours
- Investigation into and write up of supposed problems caused by the feature: 2,396 hours
:-)
The road to hell is paved with good intentions.