Thursday, December 28, 2006

Cost of features

Microsoft seems to be hitting its head against a wall with the new Vista content protection, according to a very interesting report by Peter Gutmann.

Whether the actual implications are as fatal as stated remains to be seen, but one conclusion we can draw at this point is that not all end results can be known beforehand when designing an extremely complex piece of software like Vista.

There is no way of knowing how the software/hardware ecosystem will be affected by a gigantic release with lots of new, unknown features and traits. Transparency can help, by making the design and its implications visible early, but there is naturally no guarantee that anybody will spot a problem before it is too late. Afterwards the problem must also be acknowledged and corrected, which requires a fair degree of openness.

This time it seems that bending far over in one direction has invited a host of problems from another, unwelcome direction. It is very unfortunate and disturbing that the focus does not seem to be on making real customers happy, but rather on satisfying the unspecific dreams of digital rights owners.

Thursday, December 21, 2006

Openness, transparency and efficiency in IT

In Open Source - More Than Code, Bill Barr discusses whether you can really run IT efficiently while maintaining full transparency to stakeholders. I have also been looking at the effects of transparency.

The spirit of SOX, for example, seems to require that the business have full visibility into everything going on, down to a very technical level. In my opinion it should. First of all, the business is paying for the IT systems and their development. Secondly, the more involved it becomes, the greater the chance that the IT systems serve the business need. Following SOX and other rules can then be seen as just another minor reason for driving transparency. Extra paperwork and meetings are often seen as the main drawback, and given the sad state of documentation I have seen in a number of cases this is most certainly true, but in my opinion it is not a valid objection, since there is usually some value in improving documentation anyway.

The problem with transparency and involvement starts when business stakeholders interpret this as a mandate to lead and manage the IT development. This often results in micromanagement, grinding everything to a halt. The problems may be small or big, but in the end they all depend on the personalities and driving forces of the stakeholders you are working with. Keeping these stakeholders in the dark is often seen as a solution, but that can create even more serious antagonism further down the road. I think a better solution is to make them aware of other stakeholders' views and needs, in terms of costs and business risks, and of the fact that they may not necessarily have all the facts or the knowledge required.

I would not say that transparency or openness needs to be in conflict with efficiency, but there is a good chance they will be. In a deeply hierarchical, "chain of command" type organization where departmental silos do not talk to each other, I think you can have openness, transparency or efficiency. Achieving any two in combination would come at an extremely high price, and achieving all three is probably next to impossible.

That said, I still consider involvement of individuals, openness to change and transparency into what is being done the keys to efficiency. The individuals in a lean organization, once set a task, may well know how to achieve a solution, as long as the organization is truly open to this kind of involvement. Transparency guarantees that if some initial condition changes, or some theory is later found wrong, there are plenty of opportunities to spot and correct the error. The key, I think, is that everybody needs to agree on what the actual problem to be solved is, and to listen to each other.

Improving efficiency is easy if you don't need to consider any side effects. In the end, IT department efficiency should not be considered the ultimate goal, and optimizing at the project or code development level can in the worst case only reach a local optimum. Optimizing for business efficiency would be far better in many cases.

Monday, November 06, 2006

Knowledge for public good?

I would agree with the Gowers Review of Intellectual Property that the public good should be considered over private rights. I guess somebody will eventually take the time to sit down and plot a chart of the solutions to the problem, as a function of public good, private profit and the good they can bring over time, for a set of different IP types, but that may not be all that useful.

The biggest problem with IPR legislation, as I see it, is however that the law and the public perception of what is right are diverging. I would say that the damage to coming generations' perception of right and wrong simply cannot be corrected if that situation is allowed to continue.

Both society and corporations would in my opinion benefit the most if knowledge and IP works of any kind were truly open for others to mix, match and blend, within given limits that everybody is ready to acknowledge.

Sunday, October 08, 2006

Openness and efficiency

What is the point in keeping new innovative ways of working secret?

Some state that it is not any single idea, or any kind of secret you hold, that is the key to success, but hard work. People would very much like the opposite to be true, since that would mean they don't need to work their butts off, but can sit back and relax while somebody buys their idea. That is however seldom the case. You really need to work hard in order to do a better job than your competition. Most of them do a really lousy job anyway, so it should not be all that hard, and the rest never understood what the idea was all about anyway...

However, secrecy, internal competition and the "secrecy cult" loving corporate climate seem to have got the better of efficiency in today's world. Patents, non-disclosure agreements, licenses, information on a need-to-know basis and lawyers all help keep this cult of secrecy in power. Does it really make sense?

Nature teaches us that individuals die, and the knowledge they have acquired dies with them, but species survive. Other individuals in the species pick up some of the ideas and transmit them further. Holding secrets is not very smart in the long run, knowing that the individual will perish before long. Ideas can only be saved by copying them; the more copies the better. Keeping secrets may yield a short-term benefit, but it used to be profitable only for a short time, since in the long run somebody always came up with a better idea. With the prolongation of patent and copyright terms, this is perhaps not the case any more.

Organizations and individuals should in my opinion at least be open to new ideas. There is a saying that innovation happens elsewhere. I tend to think this is very true for major new ideas, since organizations, and the individuals inside them, get used to defending their own positions and focus on minor improvements that can be made without disrupting the stagnant state the organization lives in. If organizations are closed to new ideas, then in the long run they are doomed, which should perhaps not be seen as a bad thing, however. Something else will follow after a major failure.

Openness also implies sharing. You have to give something in order to get something. I think this is also true when sharing ideas. If you only listen you will be able to understand something, but the comments you get from sharing what you do can be very much more valuable.

Some laws however set limits on the openness an organization can display and on what can be shared. Health records or other personal information cannot be shared without serious risk of violating the law. Openness can generally not be targeted at only one subset of the organization's peers, but must be equal to all. Still, the law leaves plenty of room for sharing, openness and learning from others, for any organisation to become more efficient.

Friday, September 29, 2006

Are "best practices" really any good?

Why did somebody pick the term "best practices" to describe something that very often constitutes a local optimum in efficiency space? I find it very hard to believe that all approaches described as "best practices" are equally good, or even live up to the label "best practice" at all. Let's, for example, examine two approaches that seem to be fairly popular, and that at least seem to consider themselves "best practices". The first, patterns, is fairly technical in its offering and close to the code, while the second, ITIL, is more management related.

Patterns have a clear area where they are applicable, limiting their scope. Patterns can be used, copied and, most importantly, improved by anyone. Patterns are in a sense open for improvement. Patterns seem to have spread rapidly into real-life applications, some obviously more than others. Embracing patterns has been made fairly easy: inefficient code can be reimplemented using patterns by a technique called refactoring. Some tools even claim to refactor for you automatically at the press of a button.
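To make the refactoring idea concrete, here is a minimal, hypothetical sketch (not from the post; the exporter names and formats are invented) of reworking a hard-coded conditional into the Strategy pattern, one of the classic GoF patterns:

```python
# Before: a growing if/elif chain that must be edited for every new format.
def export_before(data, fmt):
    if fmt == "csv":
        return ",".join(map(str, data))
    elif fmt == "tsv":
        return "\t".join(map(str, data))
    raise ValueError("unknown format: " + fmt)

# After: each format is a strategy looked up in a registry. New formats
# are added by registering an entry, leaving existing code untouched.
EXPORTERS = {
    "csv": lambda data: ",".join(map(str, data)),
    "tsv": lambda data: "\t".join(map(str, data)),
}

def export(data, fmt):
    try:
        return EXPORTERS[fmt](data)
    except KeyError:
        raise ValueError("unknown format: " + fmt) from None
```

The refactored version behaves identically for the existing formats, which is the essence of refactoring: structure improves while behaviour stays the same.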

ITIL is described as a collection of best practices, but is it really as efficient as it can be? I would say it is too early to tell, although ITIL has been around for some years; it is only now in the process of being broadly adopted. ITIL seems more proprietary and does not seem to welcome feedback to the same extent, which limits the people working on improving the offering to those who work for the agency. Instead, ITIL seems more focused on providing training and certification at a price. Migration to ITIL processes is left to the organization pursuing excellence. ITIL also seems fairly vague about the limits of its applicability.

A "best practice" should logically be something that comes fairly close to being as efficient as possible. Many patterns fulfill this requirement. ITIL, I would say, is a good practice, at least in some respects, but in others it seems to fall short of being "best". I don't think the difference in technical depth between the two examples I picked is fully responsible for all the differences.

Some are baffled by the fact that a bureaucratic approach like ITIL is even considered a good approach. Everything, however, depends on the problem you have at hand. If you only need to manage a single server running a non-critical application used by one person to serve his own blog entries, ITIL is probably just as much overkill as writing that application using every pattern ever documented. For a major Fortune 500 enterprise in Europe, however, ITIL may be a very good approach to fulfilling legally mandated governance of financial applications.

What I am mainly disappointed in is the failure of "best practice" promoters to recognize and efficiently communicate the known limits. One major key to efficiency in IT lies in knowing what is suitable. In a similar way, one of the major reasons for failure is the belief that an approach is suitable when in reality it is not. Marketing seldom gets the limits across. Meanwhile, until there is a non-subjective declaration of content for each approach, the best solution is to be skeptical of all "best practices" and carefully evaluate their merits. And yes, I would prefer that "best practice" promoters instead start to talk about good operating practice.

Thursday, September 14, 2006

Efficiency in IT

Defining efficiency in IT is harder than defining inefficiency. Inefficiency could, for example, be defined as resources, time or effort being wasted in IT functions.

There is nowadays much talk about efficiently aligning IT with the business. IT functions should be getting the right things done, in an agile manner, as requested by the business. There is some merit to this, but mostly it is just empty talk. If the business has no clue what it wants, or no way of communicating what it needs, you will not get an efficiently aligned IT function either.

You must to a large extent know what to do from all stakeholders' points of view, in a balanced way. Some unfortunately take business alignment to mandate only the implementation of the often arbitrarily requested features. The truth is that a successful, efficient system must fulfill the needs raised by all stakeholders, regardless of whether they are business representatives, technical deployers or help desk support professionals. Some requirements are more important than others, but letting one party decide on priorities is not wise.

This does not mean that ALL functional or nonfunctional requirements should be implemented. It is equally important to be able to disregard requirements that are not well thought out. Featuritis is a disease that arises when the business is forced to state all requirements prematurely, up front (as in waterfall projects), to be sure that it won't have to get involved in a cumbersome change management process intended more to limit changes than to fulfill any other need.

So how do you know what to do? One thing you could try is to ask me, but I'll just answer that I don't know... The key, if there is any, is to involve everybody in your quest for efficiency. This has been shown to work in factory settings, so it should work in any setting. The people doing the work that will be affected by the change should know what can be done better and where it will lead, but getting the ideas out into the open will take a bit more than just asking for comments in an email sent to all. Another thing to rely on is experience. If you don't have experience, you have a good opportunity to acquire some by trial and error.

A useful guide to efficiency is to keep the spirit of Occam's razor in mind: try to do everything as simply as possible, but no simpler. Excessive complexity leads to waste, in the form of aspects that will never be used but will be in the way of further development.

Nature has a couple of important lessons for us to learn. Always start small and evolve from there. Kill off bad ideas early and often, before they become multimillion failures bringing the whole company down. Everything does not need to be perfect; just good enough will do. Interaction with stakeholders and suggestions for small changes should be welcomed at every opportunity. The ability to test out new small features cheaply, without substantial risk, can be one source of efficiency. Doing small, nondisruptive changes quickly is what the agile movement is pushing for. Note, however, that no agile approach by itself will guarantee efficiency.

Saturday, July 29, 2006

What is a limit?

According to common sense it is simple to state that one thing is better than another, at least as long as there is some aspect of the thing that can be measured. Stating that one approach to software engineering is better than another does not, however, belong to the category of things that can be analysed in a simple way, and does not seem to lend itself to any common sense approach. The problem is that there is not just one aspect to measure from which you can draw conclusions, but lots and lots of different aspects playing together, making an approach good or bad. There are also lots and lots of "parameters" that can be changed, and changing one thing will influence the others, sometimes in unpredictable ways. Furthermore, there is no guarantee that changing any single parameter will improve efficiency.

Theoretically there must exist a very best way to achieve something, but we don't know how good it is or where in the "parameter space" it lies. With any selected approach you can do better or worse, and finding the best way to achieve a goal can be done by trial and error.

For example, when the latest development tools are obtained and their usage made mandatory for a team of developers used to notepad/vi, delivery is delayed and the number of shipped bugs increases: the task for the team has become learning instead of creating software. After the price is paid, and the team has done the learning and got used to the new tools, the situation is totally different. Mundane tasks are automated by the tools, and tool-specific quality assurance measures reduce the number of bugs. In this example you could say that with notepad/vi the efficiency was trapped at a local maximum. Getting out of this local maximum could have been difficult, but once done, efficiency increased. There are presumably many other things that could have been done, costing more or less, but before trying them out you cannot really know whether they would have worked better than the selected approach. Neither is there any guarantee that the same approach will work on the next project.

Compare this to how scientists calculate the 3-D conformation of macromolecules such as proteins, and of the smaller molecules that bind to them. A simulation based on the Monte Carlo method is used, in which, by repeatedly and randomly altering the conformation and evaluating the resulting energy levels, you can get a good estimate of the overall lowest-energy conformation of the macromolecular system. Repeated optimisation cycles with random alterations to the parameters will not guarantee that all possible conformations are evaluated, but they do a pretty good job of estimating the best one.
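The same Metropolis-style idea can be sketched in a few lines. This is a toy illustration and not molecular modelling: the one-dimensional "energy landscape" below is invented, with a shallow local minimum near x ≈ 1.6 and the global minimum near x ≈ -0.5.

```python
import math
import random

def anneal(energy, start, step=0.1, temp=2.0, cooling=0.999, iters=20000, seed=1):
    """Monte Carlo minimisation: take random moves, always accept downhill
    ones, accept uphill ones with probability exp(-dE/T), and slowly lower T."""
    rng = random.Random(seed)
    x, e = start, energy(start)
    best_x, best_e = x, e
    for _ in range(iters):
        candidate = x + rng.uniform(-step, step)
        de = energy(candidate) - e
        if de < 0 or rng.random() < math.exp(-de / temp):
            x, e = candidate, e + de
            if e < best_e:
                best_x, best_e = x, e
        temp *= cooling  # cool down: accept fewer uphill moves over time
    return best_x, best_e

# Invented landscape: global minimum near x = -0.5 (E ~ -1.87), a worse
# local minimum near x = 1.57 (E ~ -0.77).
def energy(x):
    return 0.5 * x * x + 2.0 * math.sin(3.0 * x)
```

Runs with different seeds or starting points will not all find the global minimum, which mirrors the point above: random search gives a good estimate of the best conformation, not a guarantee.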

I don't know of any fully successful approach to studying software engineering where you could experimentally or computationally control the environment. Under normal conditions we only do things once, and afterwards we have learned some more. We can't repeatedly create the same software over and over again, altering just one factor and analysing the efficiency. In RAD races you have different teams competing to create something working and useful, but the teams are made up of individual members whose skill and motivation can vary to a large extent. Statistical approaches, with many parties contributing results over a long period of time, could also be a good possibility, but getting objective and impartial data would be difficult. Furthermore, the vast number of possible values for the vast number of parameters is far too large for any kind of exhaustive analysis.

If productivity is rising exponentially over time, it can be very difficult to pinpoint the most efficient approach. Getting stuck in a local optimum is very easy, and that is in my opinion the biggest problem for many organisations today.

A. A.

Tuesday, July 18, 2006

What is efficiency anyway?

Efficiency can be defined as the ratio of output to input for any given system. A better explanation could be skillfulness in avoiding waste. Google has a lot of other definitions as well.
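As a toy illustration of the ratio definition (the figures are made up for the example):

```python
def efficiency(useful_output, total_input):
    """Efficiency as the ratio of useful output to total input; dimensionless
    when both are measured in the same unit, e.g. person-hours."""
    if total_input <= 0:
        raise ValueError("total_input must be positive")
    return useful_output / total_input

# Hypothetical figures: of 200 person-hours spent, 120 went into work
# that actually shipped; the remaining 40% was rework and waiting.
print(efficiency(120, 200))  # 0.6
```

The hard part, as discussed below, is not the arithmetic but deciding where to draw the boundary when counting "useful output" and "input".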

The strictly economic aspect of efficiency is not always appropriate, although there may actually be people who think so. See for example Efficiency in Art as an illustration of what I mean. I would argue that we need to take a broader view of what constitutes efficiency in software engineering. Clearly economic aspects are important, but the suitability of the functionality to the need, as it changes over time, should be considered more important.

Another problem with stating what is efficient in software engineering is that it is very difficult to draw the boundary within which you measure input and output. Some waste today could turn out to reduce waste later on.

If you draw your boundary too tight, and just focus on the next object or line of code you need to write, you can easily get stuck in a local optimum, which is probably very far from the efficiency limit. If you make your boundary too wide, and try to embrace the world, you will not end up with a very good result either.

A. A.

Wednesday, July 05, 2006

Is there a limit to what software can do?

Grady Booch wrote an article for The Rational Edge a few years ago explaining the physical limitations of what software can do. The article is still available, but now on IBM developerWorks.

To summarize, there are some things in software engineering that are possible to achieve and some that are impossible. Of the possible ones, we currently know how to achieve some, while we have no clue about the rest. Of the ones we know how to achieve, some we can actually afford, while others are too expensive, leaving in between a grey area where we can either make a reasonable estimate, or will be able to afford them in a year or two. We are also likely to be restricted further by rules and regulations imposed by our management and organisations, some for well motivated reasons, others out of more obscure beliefs.

Furthermore, we unintentionally add errors and magnify any uncertainties in our understanding of the requirements. Some of the tasks in software engineering are also less motivating than others, making the end result even more uneven.

I somehow don't think the hard limits to software engineering efficiency are anywhere near. I feel that the way we go about doing things is nowhere near the level we could achieve.

My intention is to further describe, in upcoming posts to this blog, what is limiting our ability to solve IT-related problems as efficiently as possible.