The deadline

This text was motivated by a current project involving long hours of work. In this project, there was a deadline by which productive operation had to start. Of course, this deadline was set by higher-ups who knew little about the work done by the divisions below them (development, operations).
Most of you will know this scenario. Many of you, I guess, have read the excellent book The Deadline: A Novel About Project Management by Tom DeMarco.

What was the result of that impossible deadline? (Of course, some time later it turned out that getting ready by that end date had been impossible; fortunately this was realized early enough to move the deadline to a later date.)
Exactly the sort of things described in the remarkable blog entry IT Survivors - Staying Alive In A Software Job by Harshad Oak, which I much appreciate:
  • 6 workdays/week
  • 12+ hours a day (some freaks felt the need to work 15 hours...)
  • Highly dynamic decisions (a few hours before Friday's closing time, the project leader requested that we work on Saturday)
  • A 15-hours-a-day colleague expected the others to work as long as he did
  • Some colleagues on the edge
  • Irritated glances at colleagues not available for a meeting at 20:00
  • Meetings announced on short notice (a few minutes before)
  • No plan at all...


JGAP 2.5 released

Yesterday, a new version of JGAP was released. JGAP is a genetic algorithms package that is easy to use and comes with ready-to-use components such as genetic operators, selectors, and examples.

Try it out now!

There are many references available. Check out the JGAP references page.


Console Outputs in Unit Test

While reviewing unit tests in different projects, I usually find tests containing output statements (such as System.out.println or file output). My initial position on output in tests, especially console output, is that it should be avoided in every case.
Thinking further, it seems to me that gathering data in a file could be legitimate in some situations. But console output should still be avoided in general. The assertXXX methods and the fail method allow displaying any message the developer wants, so output to other channels (console, file, etc.) needs a separate motivation.

What could motivate a console output? Perhaps displaying informal messages to the developer, such as warnings or pure information. For the latter, the motivation does not seem strong enough to me to justify console output. For the former, I could imagine that certain circumstances might apply, although I cannot name one concretely right now. However, instead of emitting warnings, a unit test should fail or not fail, nothing in between, IMO. A unit test should be fine-grained enough to fulfill this postulate quite easily.
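As a minimal sketch of the alternative described above, the diagnostic detail can travel in the assertion's failure message instead of going to the console. The check() helper below stands in for JUnit's assertEquals, and normalize() is an invented method under test:

```java
// Sketch: put diagnostic detail into the assertion message instead of
// printing it to the console. check() stands in for JUnit's assertEquals;
// normalize() is a made-up method under test.
class MessageInsteadOfPrintln {

    static void check(String message, Object expected, Object actual) {
        if (expected == null ? actual != null : !expected.equals(actual)) {
            // The message only surfaces when the test fails, so there is
            // no console noise on the happy path.
            throw new AssertionError(message + " (expected: " + expected
                    + ", actual: " + actual + ")");
        }
    }

    static String normalize(String s) {
        return s.trim().toLowerCase();
    }

    public static void main(String[] args) {
        // Instead of: System.out.println("normalized: " + normalize(" ABC "));
        check("normalization should trim and lowercase", "abc", normalize(" ABC "));
    }
}
```

If the check passes, the test stays silent; if it fails, the message carries everything a println would have shown.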

For file output or the like (such as sending an email to an admin, although luckily I have not seen that in any unit test), one could think of keeping track of statistical data to recognize a "tendency" in behaviour, e.g. for logic depending on pseudo-randomness, such as hash codes.

To conclude, I still think that in most cases using console or file output causes more harm than good.

Do you have an example where such output would be justifiable in a unit test?


Online survey for JCP program

This is your opportunity to give your opinion on the JCP program, including the Java Specification Requests (JSRs). But allow for some time, as the questionnaire will occupy you for several minutes.

To the JCP online survey.

Update: the JCP online survey is closed now.


JSR 220 compared to JSR 250

Recently, I wrote about JSR 250, which IMHO is of poor quality. Apologies for writing about it again, but when I saw the draft of JSR 220 (Enterprise JavaBeans 3.0), I felt compelled to do so.

Just have a look at the available public review. It is structured so clearly, wonderful! The goals are stated in a clear, extensive way at the beginning of the document. Such clarity could be expected of any JSR. Then read JSR 250. Be sure to also check the comments given in the public review ballot of JSR 220 and of JSR 250, and notice the slight difference. For JSR 220, JBoss offered congratulations "to the JSR-220 EG for delivering such a quality specification." This is the first time I have read such positive comments on a JSR (maybe I have reviewed too few of them, who knows).

Several thoughts came to my mind after the comparison:
  • The importance of a JSR will probably influence its quality.
  • The person leading the specification will have a great influence on the final shape and quality of the JSR.
  • Since some persons participated in both JSRs, it seems unlikely that those shared members influenced the outcome significantly.


JSR 250 (Common Annotations) approved: Why?

Just these days, JSR 250 was approved in the public review ballot. Only IBM voted "no".

I cannot entirely understand the process. Here are my reasons (see also my former entries JSR 250 (Common Annotations): Opinion and Is JSR 250 already mature?).

Some weeks ago, on 2005-06-25 to be exact, I sent several pages of suggestions to jsr-250-comments@jcp.org (the address given for that purpose). No answer, nothing. On 2005-07-21, I asked the spec lead, Rajiv Mordani, at rajiv.mordani@sun.com for an explanation of why there was no answer to a well-meant, reasonable suggestion and contribution to JSR 250. Again, no answer, nothing. This is very frustrating for someone who spent several hours thinking and writing in order to enhance JSR 250 in its current form. I regard JSR 250 as weak, premature, and generally not acceptable. The comment from IBM gives at least *some* hint that my opinion is valid.
Also, the concrete suggestions made by individuals in a discussion about the first draft were not acknowledged and/or not considered.

I eagerly await the effect of this JSR on the community. Will the community accept and use the common annotations? I don't think so. But let's wait...

A frustrated blogger and neutralized JSR participant...


Physics and Computer Science

As I discussed in a former entry, Why Quantum Theory is important for Object Orientation, I see a connection, a conceptual link between physics and computer science.

That I am not alone with this opinion, although it may sound strange at first, can be seen from the article by Günther Meinhold: Einstein 2005: Modell und Wirklichkeit (Model and Reality), unfortunately, for all non-German speakers, in a foreign language ;-)

The author draws the following comparisons, which I have taken the liberty of translating to show the main idea behind the article:

Physics | Computer Science
Models of real objects | Models of the software product to develop
Models demonstrate how Nature works | Models show how the future software product works
Physical model | Logical object model
Mathematical model | Design object model
Mathematical calculation | Creation/development of source code
Mathematical results for measurable physical parameters | Executable source code with testable functionality
Experiments and observations as test basis for the models and their results | Test of the software's functionality against the product requirements

Yeahhh! I am not the only one out there who believes in physics influencing computer science.

Is there anyone out there who doesn't love Einstein? I can't imagine so (he is not the father of the atom bomb, by the way, so please don't argue that way).


Is JSR 250 already mature?

As discussed earlier in a more general manner, I have the feeling that JSR 250 is far from a publishable form. Here are some of my concrete suggestions on how to improve the current paper, released on June 21st. I already sent them to the JSR commission as an RTF document of several pages:
  1. Define what JSR 250 is really about. There is no common consensus about which aspects should be covered by the common annotations to be defined in JSR 250.
  2. I suggest including only annotations that have a generic character and support concerns of common interest that are not too complex (if they were, they could be part of a separate JSR).
  3. Group annotations: it should be quite obvious how to group the annotations currently defined in the public review paper. This would be a first and easy step toward getting a better feeling for where the journey is going.
  4. Context-free aspects that could be covered by annotations include:
  • Design Patterns, especially the topics
    • name of the pattern
    • role within a design pattern
    • description of the pattern itself and of its roles
  • Design by Contract-related issues
  • Architectural layers, such as (just to give an idea)
    • persistence
    • persistence mapping
    • interfaces to third-party systems
    • business logic
    • domain logic
    • transportation / protocols
    • view mapping
    • view logic
    • display

The current version of the JSR 250 public review does not seem adequate as a basis for consolidation or extension, as becomes apparent when reading it. As I suggested, there should be a general refurbishment of the whole paper. We are far from discussing whether a single annotation proposed by the JSR's expert group is useful, should contain additional information, or should be renamed.

The problem is of a more principled nature: the goal should be defined in clear terms, a firm stand should be taken, context-free (or, say, generic) annotations should be considered instead of special ones, and the direct support for J2EE specifics should be reconsidered.

Interesting resources:


JSR 250 (Common Annotations): Opinion

The public review of JSR 250 (Common Annotations for the Java Platform) was published on June 21st. After looking through the list of annotations provided, I felt somewhat puzzled. IMHO, there is no clear line visible, no clear concept recognizable, on which the selection of the annotation proposals is based.
I will soon write another entry explaining in more detail what I wrote to the JSR 250 expert group as my proposal for how the annotations of this JSR should be chosen.

From my point of view, this JSR should mostly (exclusively?) contain annotations that are of common interest and are context-free. One could joke about whether annotations specifically designed for J2EE could ever be of common interest. At least they are not context-free.

Context-free means in this context (intentional word repetition :-) ) that the thing annotated is context-free. Take Design Patterns, for example. They provide a context-free concept for improving the architecture of your software system, allowing easier maintenance and better documentation.

Naming conventions for variables

After several years of experience in software development, I have found a way of naming variables (attributes, fields, field declarations, as you like) that I feel comfortable with.

First of all, in any class representing a business object (i.e., one that contains a certain amount of business logic) there should only be private variables. Protected variables don't allow for implementing the Observer pattern! Use getters and setters instead. In a few classes it may be suitable to use public fields, because those classes are only used as very simple data containers.

My naming conventions for variables have evolved to the following:
  • At class level, for private and protected fields (if any), I use the prefix m_
  • At method level there are two cases:
    a) method signature: I use the prefix a_
    b) method body: no prefix, just the variable name.

Using this kit of simple conventions, it is very easy to determine the scope of a variable just by glancing at its name. In a setter method, the naming convention allows you to quickly find assignment errors and helps avoid "this.", which in most cases (that I have seen) is regarded as ugly.

After a possible prefix, the name of the variable starts with a lowercase letter. If multiple words are concatenated, each word from the second one on starts with an uppercase letter, such as: private int m_thisIsMyVariable;

Variables that are constants, by having the final static modifiers, are written all in uppercase letters. By the way, they are always public (at least if they are declared at class level).
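A small sketch of these conventions in one class (the Account class and its fields are invented for illustration):

```java
// Illustrative class showing the naming conventions described above.
class Account {

    // Constant: final static, all uppercase, public at class level.
    public static final int MAX_RETRIES = 3;

    // Private field at class level: prefix m_.
    private int m_balance;

    // Parameter in the method signature: prefix a_.
    public void setBalance(final int a_balance) {
        // No "this." needed; the prefixes keep the scopes apart.
        m_balance = a_balance;
    }

    public int getBalance() {
        return m_balance;
    }

    public int doubledBalance() {
        // Local variable in the method body: no prefix.
        int doubled = m_balance * 2;
        return doubled;
    }
}
```

In setBalance, a typo like a_balance = a_balance would be immediately visible, which is the point of the convention.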


Intentionally failing JUnit tests

In the JUnit mailing list, I followed a discussion about how to implement tests that fail intentionally unless an exception occurs, i.e. the occurrence of the exception is seen as correct and its absence as a failed test.
Several proposals were made. Let me partially cite the initial comment on the subject:

--------------- snip
Someone in our team writes his tests like this:

catch (NoObjectFoundException e) {
   assertTrue(true);
}

He says that he does it because otherwise Checkstyle would complain about the empty catch block.
I do it in this manner:

catch (NoObjectFoundException e) {
   // It's okay since we expect it.
}

--------------- snap

I find the "assertTrue(true)" in the catch block very funny (who codes like this?). It is of course legitimate and correct to satisfy Checkstyle, but in this way?

Most (serious) developers in the JUnit mailing list proposed the latter way (putting a comment like // ignore in the catch block), but then you will again get a Checkstyle violation.

What about my solution:

try {
   ...
} catch (NoObjectFoundException e) {
   ; // this is OK
}

The only simple thing I do is add a semicolon to the catch block.
The semicolon is itself an empty statement, there you go...
See JGAP for a sample application.
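For completeness, here is a self-contained sketch of the whole pattern. NoObjectFoundException, findObject() and the local fail() helper are invented stand-ins (a real test would use JUnit's fail()):

```java
// Self-contained sketch of the "expected exception" test pattern.
// All names here are invented for illustration.
class ExpectedExceptionDemo {

    static class NoObjectFoundException extends Exception {
    }

    static Object findObject(String key) throws NoObjectFoundException {
        // Hypothetical lookup that never finds anything.
        throw new NoObjectFoundException();
    }

    static void fail(String message) {
        // Stand-in for JUnit's fail().
        throw new AssertionError(message);
    }

    // Returns normally only if the expected exception was thrown.
    static void testFindObjectFailsForUnknownKey() {
        try {
            findObject("unknown");
            fail("NoObjectFoundException expected");
        } catch (NoObjectFoundException e) {
            ; // this is OK: the empty statement satisfies Checkstyle
        }
    }

    public static void main(String[] args) {
        testFindObjectFailsForUnknownKey();
    }
}
```

Note that the fail() call before the catch is what makes the absence of the exception a test failure; the AssertionError it throws is not caught by the catch block and propagates to the runner.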


JDO: My opinion

Due to the, say, failure of JDO (see the public review ballot) some developers (about a thousand) "signed" the Petition to the Java Community Process Executive Committee.

My opinion about JDO in some short statements is:
JDO is transparent in a different way than, for example, java.io.Serializable is. JDO manipulates class files, not the source code. I don't like this as you could run into difficulties when doing Test-Driven Development (TDD), and I like to see what is there (namely in the source code. There sometimes was a post on Joel on Software, The Law of Leaky Abstraction, about the problems with abstractions.).
JDO seems to be a good solution but not the best I could dream of. Just one remark on the attitude of SAP, a really big software development company, toward JDO: in later releases of the NetWeaver Developer Studio for Java, part of the huge NetWeaver architecture, JDO was not supported. Period. You could hack your way to JDO support, but who wants to develop software that way?

I would prefer a solution based on a code generator or, if possible, something similar to java.io.Serializable (although the latter may be too vague to argue about, as it only introduces a marker interface). A code generator would add the necessary flexibility as well as the necessary capability! And you could TDD your code as you like and do code coverage analysis on source code, which would make things much more fun. Some say: don't test third-party products. But relying on "a" third-party toolkit made by people unknown to the person expected to trust it is not the best sort of risk management. Just one argument: open source. Although not bad in general, many of these projects have a charm like: "Ah, there is someone who wants to contribute. I don't know him, but he is willing to add something, so let him do it. And if I, the project admin, find time, I will check what he really has done." I love open source, but I would not trust it blindly. There is often no support, in contrast to many commercial products (allow me to cut it short here).


When TDD is not optimal

As a big fan of TDD, I use it often, admittedly mainly in a somewhat diluted form. I do not write all tests before writing the code; at least half of them I write after "semi-completing" a method or a bunch of methods.

These days I wanted to bring the functionality of a third-party open-source framework into code of mine. But at the beginning I was not sure whether the concepts of my framework and the other one would allow a merger. Therefore the work of merging some logic of the other framework into mine had a touch of prototyping.
If I had written test cases before or even during this activity, they would probably have been a waste of time in case the merger had not worked out as I hoped. On the other hand, ensuring the correctness of the newly created logic is the number one priority, as unknown code is always a danger for a project.

Reflecting on it, I am quite content with my approach of not writing the tests until the current point, where I am nearly finished merging the two frameworks and can see that it works in general. There is enough time for that right now, I am inclined to say, although I know some people would raise their hands and remark on the semi-optimal approach chosen. An important aspect seems to be the experience of the developer involved, from the perspective of software development in general as well as from the specific perspective of the code to be extended.

Has anyone out there merged two code bases while writing test cases during, or even before, the merger? I would be very interested in the details.


Sourceforge in trouble?

For every SourceForge project there is a nice usage-statistics page reflecting the number of page views and downloads per day (or other time spans if you want).

But since January 15th, over one month ago, no further statistical data has been shown.

SourceForge knows about this and reports that they are moving to a new statistics system:

"Project statistics data for 2005 that is omitted will not be processed until launch of the new statistics system. This data has been collected, but has not been processed for display. To ensure accuracy and reduce performance impact to users, these dates will be omitted until the new statistics system is launched. There is not currently a date set for the rollout of the new stats system. Please keep an eye here for that date when we release it."

What I wonder about is the amount of time this takes. One month is almost beyond any basis for discussion, I strongly believe. There must be a deeper reason, not just small problems that arose while doing the job.


Why Quantum Theory is important for Object Orientation

Historical information
Quantum Theory is the most precise complex theory we have today. There is no other theory of such universality and brilliance. The theory was founded by Max Planck in the year 1900. In his early years, Planck wanted to study physics and was told that this would not be worthwhile, as everything of importance had already been discovered. Good for us that he did not follow that hollow advice.

After Planck laid the foundation, the world-famous Albert Einstein, who is being celebrated this year (Einstein Year, Year of Physics) because of the 100th anniversary of his groundbreaking 1905 papers and the 50th anniversary of his death, extended the great theory "casually". At that time, he was employed at the patent office in Bern, Switzerland, and was able to think about the problems of Quantum Theory while doing his daily business (which he must have found rather boring and which, I assume, used about 1% of his intellectual capacity). That is someone you would call a genius!

Other great names associated with the theory are Niels Bohr (Danish, outstanding character and physicist, mentor of Heisenberg), Werner Heisenberg (founder of the Uncertainty Principle), Erwin Schrödinger (who developed wave mechanics, which, as was discovered later on, is equivalent to Heisenberg's formulation), and not to forget Max Born, teacher of many great physicists such as Heisenberg, Robert Oppenheimer (director of the well-known Manhattan Project in Los Alamos), Fermi, and Edward Teller (father of the hydrogen bomb).

It's all about philosophy
For most people without a general understanding of Quantum Physics, it is not easy to see that this theory is capable of explaining our daily life's experiences (such as velocity, chemical reactions, etc.). One must be aware that Quantum Physics contains Newton's theory, which is only a special case of Quantum Physics! Secondly, Quantum Theory predicts (and this is common sense and beyond discussion in physicists' circles) that nothing is predetermined, meaning we are ruled by randomness. If you don't believe it, you are in the company of Einstein, but not of Stephen Hawking. Thirdly, Quantum Theory is the most precise theory we have. Period. Next, Quantum Theory and Einstein's Theory of Relativity meet when it comes to trying to explain black holes. This is because Quantum Theory copes with very small particles, and a black hole is something very, very small, while Einstein's theory is appropriate for explaining huge concentrations of mass, such as a black hole.

There is no Reality
Although Quantum Theory is the best theory we have, it is not a correct and complete one. If you don't believe this, I suggest reading some books about it. The theory is about giving us a quite good description of what reality might be and of how we can predict future states of our reality. Heisenberg's Uncertainty Principle alone prohibits knowing reality precisely!

As we can easily see, physics and object orientation are both concerned with objects. The type of object does not matter, be it a technical instance or a real instance. And there are no real instances, as said before. Both physics as a whole and the OO paradigm are only descriptions within a theory that is commonly seen as "good". Nothing more.

So, just being logical and asking a provocative question: if

  1. Quantum Theory is a very good theory, and
  2. Quantum Theory is surely much more complex than the theory of Object Orientation, and
  3. both theories are about describing objects as part of our "reality",

then how could someone regard OOP as a near-perfect paradigm? How could someone think OOP is a really good theory? I don't want to say that OOP is a bad thing; I love it. But as OOP is so extremely simple compared to Quantum Theory, and as OOP has its weaknesses, how could we think there is no potential left? Many people don't think so, fortunately. But some love OO so much that they become blind to its pitfalls. Just remember AOP as an attempt to make OOP more powerful and capable. And every time you try to make something more powerful, it soon gets too complicated. So AOP is doomed to die out in just a few years, as is classical OOP. AOP is too impractical to be used by the mass of developers out there. But it is a necessary step toward recognizing the possibilities and the necessity of forming a new theory for describing situations in a machine-understandable form.


Resources related to "Information Engineering"

The science I want to call Information Engineering here copes with the evaluation of data in order to obtain information, and with the extraction of high-priority information (filtering out data of lower interest) from a plethora of information. Information Engineering helps reduce the problem of information overflow; just remember the latest prominent example, Cassini-Huygens.

I came across the following resources while investigating the subject. My motivation was the advancement of the Java Genetic Algorithms Package, JGAP, developed by some other people as well as by myself.

Collection of Resources

Also see my other blog entry about Visions for Evolutionary Algorithms.

Visions for Evolutionary Algorithms

Browsing SourceForge and other open-source platforms, you find myriads of free software packages. Many of them have an air of sophistication. But what about their practical use? A recent article states that the NSA is in dire need of a capable software tool that can identify information with the potential of being valuable. The NSA wants to "connect dots" and needs informatics to do so.

These classification problems are among the most interesting ones I can imagine nowadays (in my role as a computer scientist and "informaniac"). What I think could help bring about a solution is Evolutionary Algorithms (EA). EAs take their cue from nature: evolution as described by Charles Darwin and inheritance as described by Gregor Mendel. Genetic Algorithms (GA) and Genetic Programming (GP) are part of the EA approach. The GA framework currently under development, JGAP (Java Genetic Algorithms Package), helps accomplish both concepts. Work is currently being undertaken to develop the GP part by adapting the GA code and infusing more flexibility into it.

If you are interested in knowing more, feel free to go to the SourceForge site of JGAP for project information, or use the project homepage, which provides various information about GAs of general interest.

A valuable resource about genetic engineering can be found in Wired magazine, with their article Life, Reinvented.

Also related: Games that make leaders: top researchers on the rise of play in business and education


Java UI Frameworks: A long way to satisfaction!

Similar to my former entry, Java Persistence: Failed, I have been stimulated by another entry, this time from Alexey Maslov, who asks: Do we need another UI framework?

In my comment on his entry, I stated that in my eyes there is no real UI framework for Java capable of representing a common platform for UI development. Either it is too expensive (I want a framework for free, please; look at Delphi, SAP, etc., where several standardized mechanisms exist to satisfy the developer), or it is unstable, too hard to learn, not capable enough, etc.

Sorry, I have not found a single framework fitting my simple needs. When trying out Luxor/XUL, at first I thought there was light at the end of the tunnel. But I soon recognized the limited features when it comes to event handling and other non-regular stuff (what is irregular about event handling, one might ask).

As long as we ask questions such as "Why does GridBagLayout get the tab order wrong?", we have a long way to go to satisfaction!

Developers Don't Write Documentation

An article titled Developers Don't Write Documentation can be found at OpenXource. It is written with some wit and is worth reading. As the title expresses, it is the author's opinion that developers shouldn't focus on writing documentation. Documentation here means documentation for the user of the software, not technical documentation, I assume.

Well, I also assume that most developers love to code, or to construct and develop an architecture, but don't like writing that much plain text for the stupid user (ironically meant, of course). This seems human. When working on a doctorate, you notice that thinking about great ideas, getting inspiration, and drawing graphics and schemata is refreshing and satisfying most of the time. But when it comes to clearly structuring the material and choosing correct, concise, and consistent formulations (a 3 C's law?), the brain often gets stuck.

For software, the reason could be the break in paradigms between writing code and writing natural language. The latter implies a more or less informal system: an inconsistent grammar, ambiguous expressions, and no compiler checking everything. The only thing you have at hand is perhaps the word processor, aiding to some extent with simple grammar checks that embody the intelligence of a small child (which is astonishing, as intelligent software is rare and hard to craft, IMO).

Test-Driven Development
There is always a problem in letting an outsider (not a developer) write documentation. How does he know how the system works and what the intentions of the developers are? In my eyes, a documenter needs to understand at least a bit of software development. And if he knows too much, he would probably love to code and would not be the right person for documenting.

To get out of this mess a little, I would like to suggest the idea of Test-Driven Development (TDD). TDD is not only a way of validating that your code works as expected but also (or mainly!) a documentation tool. TDD expresses what your code should do. It is the same with Design Patterns, although many people don't recognize the documentation power of both Design Patterns and TDD!

So go write your tests and thereby create a basis for other people to understand your well-validated masterpiece in order to write documentation for it. Isn't that a valid approach?
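As a tiny illustration of tests as documentation (the price rule and all names are invented), a descriptive test name plus its assertion can read like a specification sentence:

```java
// Invented example: the test name documents the business rule better than
// a prose comment would.
class PriceCalculatorTest {

    static int priceWithDiscount(int price, int percent) {
        return price - price * percent / 100;
    }

    // Reads as: "a ten percent discount reduces the price by a tenth".
    static void testTenPercentDiscountReducesPriceByTenth() {
        if (priceWithDiscount(200, 10) != 180) {
            throw new AssertionError("10% off 200 should be 180");
        }
    }

    public static void main(String[] args) {
        testTenPercentDiscountReducesPriceByTenth();
    }
}
```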


Test-Driven Development: Useful Techniques, Resources

Test-Driven Development (TDD) is the subject of many publications. I feel it was never merely a hype. From my point of view, it is a really useful concept that helps greatly in reducing errors. That seems proven, IMO. I have used TDD in several projects, currently with JGAP and with a software package based on JRefactory.

I found out that many test cases rely on the same testing techniques, such as:

Type Cast
When testing whether the return value of a method call conforms to a certain type, you could write:

   assertEquals("My String", vector.get(0).toString());

Better would be:

   assertEquals("My String", (String) vector.get(0));

The latter ensures that we really get a String back. The former would also pass if the toString() method of whatever type is returned happens to produce the expected string.
You can also use the type cast if you only want to ensure that the element at a given index in the list is not null and conforms to a specific type.
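A self-contained sketch of the difference (firstElement() stands in for vector.get(0) and deliberately returns a non-String whose toString() matches):

```java
// Sketch of the cast technique; firstElement() is a stand-in for
// vector.get(0) and returns a StringBuilder whose toString() matches.
class TypeCastDemo {

    static Object firstElement() {
        return new StringBuilder("My String");
    }

    public static void main(String[] args) {
        Object value = firstElement();

        // Comparing via toString() passes even though the type is wrong:
        if (!"My String".equals(value.toString())) {
            throw new AssertionError("toString comparison should match");
        }

        // The explicit cast catches the wrong type immediately:
        boolean castFailed = false;
        try {
            String s = (String) value; // throws ClassCastException here
            s.length(); // never reached
        } catch (ClassCastException e) {
            castFailed = true;
        }
        if (!castFailed) {
            throw new AssertionError("cast should fail for a StringBuilder");
        }
    }
}
```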

Elements in unordered collection
If the result of a method under test is an unordered List or a Map, it should nevertheless be tested that obligatory elements exist in the collection and that others do not. We can easily accomplish this by first building up an internal Map (e.g. java.util.HashMap), putting in all elements that are expected to be returned by the method under test.
After that we perform the method call. Then we iterate over the returned list and call a helper method such as

   assertInList(list, s);

This method could read like:

public void assertInList(final Map list, final String s) {
  if (list.containsKey(s)) {
    // Check the element off so that the later size check can detect
    // missing elements.
    list.remove(s);
  }
  else {
    fail("Object " + s + " not in list!");
  }
}

Of course, you could extend this helper method to handle types delivered in the java.lang package. Sometimes a developer states java.lang.Comparable and sometimes he omits the package. By extending assertInList to also check for the string java.lang concatenated with the string originally searched for, we could treat these cases as equal.

Don't forget to write

     assertEquals(0, list.size());
after testing against the obligatory elements, to ensure no other (unwanted) element is in the returned list under test.
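The whole technique can be sketched in a self-contained, runnable form (methodUnderTest() and its return values are invented; fail() is replaced by a plain AssertionError):

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Runnable sketch of the unordered-collection technique: expected elements
// go into a Map, each returned element is checked off, and an empty map at
// the end proves the returned list contained exactly the expected elements.
class UnorderedCollectionTest {

    static List<String> methodUnderTest() {
        // Stand-in for the real method; order is deliberately arbitrary.
        return List.of("beta", "alpha");
    }

    static void assertInList(final Map<String, Object> list, final String s) {
        if (list.containsKey(s)) {
            list.remove(s); // check the element off
        } else {
            throw new AssertionError("Object " + s + " not in list!");
        }
    }

    public static void main(String[] args) {
        Map<String, Object> expected = new HashMap<>();
        expected.put("alpha", null);
        expected.put("beta", null);

        for (String element : methodUnderTest()) {
            assertInList(expected, element);
        }
        // Anything still in the map was expected but never returned.
        if (expected.size() != 0) {
            throw new AssertionError("Missing elements: " + expected.keySet());
        }
    }
}
```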

Molecular tests
It is not forbidden to include multiple asserts within one test case. The advantage of only one assertion (or, more generally, one check) per test case is that you can immediately see the point of failure. But with my IDE I can easily figure out the failing assertion just one mouse click later. Multiple assertions in one test case help validate more than one thing without the need to construct the input configuration (instantiating objects, setting parameters) again and again.

Other publications about TDD:

Java Persistence: Failed

As the rejection of JSR 243 shows, one of the most promising persistence strategies for Java, JDO, is doomed to fail. One could say that errors can be corrected at a later time. But nowadays we cannot afford to wait too long on the most important aspects of an architecture.
I have been concerned about Java persistence in general since I began coping with persistence in Java. For me, there is no really satisfying mechanism available that fits the (sub-)enterprise level. The current release of SAP NetWeaver for Java shows that JDO is still in its early days, at least regarding its support in enterprise architectures. Look at SAP's legacy architecture (ABAP): there you have it all at no cost, with no need for the ugly things necessary with JDO, EJB persistence, Hibernate, and all the other stuff. I don't have anything against Hibernate in particular; it's a good piece of open-source software. But in my eyes it's not enough compared to what could be possible.

Look at the approach Yi Zhou developed. Admittedly, I have not read all of it; it may be ingenious, I don't know. But if it is, at what cost (complexity)?

BTW: Tell me which *integrated* Java architecture to use for commercial projects covering persistence, business logic, view mapping, GUI etc. other than J2EE. I personally don't like J2EE in any way. It's much too complicated, slow, non-agile, somewhat cryptic, and redundant in the way you need to code (Design Patterns... oh my god, most people are happy just knowing about AbstractFactory or Observer...).

What I have wondered all along is how a single person such as Doug Lea made it into the elite club of JSR reviewers. Good job, Doug. Although it is astonishing that he did not comment on his vote and is in the sole company of Apple...

Agile Software Development

Agile Principles

After reading the book Lean Software Development by the Poppendiecks (to be found in your favorite bookstore) I felt somewhat enlightened. The authors share their experiences and ideas about good project management and process. Some of them I liked very much. Here is my personal impression of Agility and XP. It should be clear that XP and Agility are no magic bullet - the same goes for SOA; see my weblog entry Service-Oriented Architecture: What's up with it?.

Decide As Late As Possible
On the one hand a good thing, on the other impractical. Try telling your customer he should trust you. I know of customers that examine the billed time down to the minute. What would be the chances of persuading such a customer to decide as late as possible just because you tell him it is a good thing and an agile principle?

Deliver As Fast As Possible
Catchwords are: schedule, cost of delay and trade-off decisions. From the economic point of view the message behind those terms is apparent. For the software developer it means discomfort in most cases when a deadline is overrun.

See the Whole
Seeing the whole means taking a bird's-eye view. The whole, from the point of view of the employees, should be having as much fun as possible during their work. For that, personal habits should be respected. Superstition is another element mentioned in the Poppendiecks' book. Superstition sometimes forces us to do things in a way that cannot easily be justified by logic. More than that, the worker wants to see his beliefs reflected in his activity.

Seeing the whole for the company means relationships to customers and partners. Here, contracts play an important role. Free yourself from the belief that fixed-price contracts and paid-per-hour contracts are the only possible solutions. As we all know (if not, drop a comment), a fixed-price contract is a dangerous mechanism in today's complex world of software technology. On the other hand, some customers would feel almost abnormal not making a fixed-price contract. Perhaps they want to feel unhappy?

A shared-benefits contract would be a good solution. The problem here is measurement. But that is the same problem as measuring the productivity of a worker.



Service-Oriented Architecture (SOA): Bubbles?

Service-Oriented Architecture (SOA) has been in the media for several months now. This has to do with the marketing of big companies, commercial blogs and people jumping on the buzzword (SOA is one, IMO). One can find articles about SOA in all the popular magazines here in Germany. In other countries this should be true as well. Just pick up some popular magazines (I could easily name some German ones).

Mind-making through the media
I have always wondered whenever one of these magazines or papers printed articles written or co-authored by employees of big companies. When reading the articles you soon get the idea that SOA is not a very big deal but more or less a hype. Although SOA is not bad at all, its congeniality is overemphasized, IMO. What is so great about SOA? I think it is a logical consequence that nearly every semi-talented softwerker or architect would have "invented" sooner or later, either in general or regarding the building blocks of the paradigm. Of course the personal inventions would not have been as complex as it all seems right now. And I am not talking about whole architectures already offered by big companies; it's mainly about common design principles.

What is a service?
SOA is about fine-grained software entities that are autonomous and can be integrated within a platform to form an application, sort of. Services, as the acronym SOA indicates, are an integral part of the concept. BTW: Can you tell me what a service is? Ask ten people and you get ten different opinions.
But SOA is not a concept as all-embracing as Object-Orientation itself. SOA is officially not well defined (everyone understands something different when hearing or reading the term, or only knows it vaguely). SOA just follows common design principles that have been known for a long time (a long time in the context of the software market can be a few years, but principles such as Loose Coupling, Open-Closed or Interface Segregation have been known for longer. Packages could be abstracted to services, with packages being better defined than services. This was cut short!).

Is SOA embracing anything?
When telling companies "By building a service-oriented architecture, your applications will be able to integrate within a bigger context, they will be more easily extendable etc." the truth is not fully matched.
Firstly, legacy applications cannot be converted into whatever we want. And if the target architecture is something seen as highly sophisticated (as SOA officially is), the task becomes nearly impossible.

SOA is daily business, isn't it?
Again, by publishing lead articles in popular channels (mostly semi-academic or just plain), the impression has arisen that SOA is the eighth wonder of the world. This is definitively not the case. SOA is not bad; it is a good thing. But SOA is not something to emphasize to the extent currently recognizable. My strong belief, drawn from experience with other hypes, is that SOA is lining up exactly with other bubbles. Perhaps it will take the community longer to identify it as a hype than with other hypes. But the root of this lies in the strength of the marketers and the "official complexity" of the subject.

What about performance?
OK, computers get faster every few months. It is also known that software complexity generally grows faster than hardware evolution can compensate. The excellent article Fuzzy Boundaries: Objects, Components, and Web Services shows that web services can also be seen as potential performance killers.

I'm not the only one
An interesting blog entry displaying a prediction list for the year 2005 contains a statement about SOA:

The term SOA will have been beaten to death and the software industry will invent or recycle some equally vague term to replace it.

The term has zero differentiation value at this point and marketing teams across the globe are looking to coin a replacement that will give them something more interesting to say about their middleware than "we move messages around really well."[...]

It is as legitimate for any company to promote its products (and sometimes slightly smooth reality) as it is for an article writer to sharpen reality from his point of view, trying to align with it as well as possible. As you noticed, I have not said much about what SOA really is. From my point of view this is not important here, as numerous articles (published internally and externally) exist on the matter.

Take care not to overestimate the importance of some buzzwords thrown in. To be honest, these buzzwords mostly affect non-technicians or semi-technically oriented people, because the truth is very complex and not easy to uncover without comprehensive study. As most of us remember from the crash of the New Market, the danger of overestimation is greatest when an area of expertise other than one's own main field is involved. Because then we have to believe someone!

This blog entry first appeared at SDN; this is a modified version.


5 Books for Java Developers

R.J. Lorimer posted his suggestions for 5 books that Java developers should take notice of.
As I am mainly coping with software architecture and Design Patterns, my list is related to that. Additionally, I have read several books on other topics (e.g. Streamlined Object Modelling) which I would not recommend here.

Here is my list as I have quite different suggestions than R.J.:

Test-Driven Development by Example
The classic book by Kent Beck, well written, giving great insight into the concept of TDD. Even if you don't follow the XP paradigm "test first" but write the tests after coding (I do it after the skeleton has been finished), it helps very much in getting a good start.

Design Patterns: Elements of Reusable Object-Oriented Software
The hyper-classic book on Design Patterns. IMO, it is not a really great book concerning its usefulness for Java developers, as the examples are mostly presented in Smalltalk or C++. Additionally, it is kind of outdated. But if you want to claim knowledge of Design Patterns, this is a must-read.

The Pragmatic Programmer
Really cool book with a unique content. Check it out!

Design Patterns for Object-Oriented Software Development
Another book covering the topic of Design Patterns. Centered on frameworks and viewing DPs from quite a different perspective than usual. More a recommendation for people really trying to dig deeply into the subject.

The Timeless Way of Building
A non-computer-science book by the architect Christopher Alexander. Well known in academic circles, but a tip for anyone who wants new insight into software architecture.

You find these books at your favorite bookstore. I don't want to link to any commercial store here.


Open Source Projects: Which license I chose

Recently I had to consider the license for a Java open source project administered by myself: JGAP. JGAP is a Genetic Algorithms Package that helps solve problems in a Darwinian way.

To Open-Source or not?
Well, the thing about open source software is that it is available to anyone accepting the license agreement. And that agreement is the point. Some people may not want to make their work public by giving their source code away just because one stupid piece of used code forces them to do so. On the other hand, it should not be possible to go for gold with open source software just depending on the weather today. Meaning, some obligations are wanted and should not be omittable for any reason.

But after realizing that with the LGPL you either have to make your source code public or make your software reverse-engineerable, it should be clear that no commercial product will want to use LGPL'd pieces of code.

Dual licensing
Fortunately, at some point a serious person asked to use JGAP in his commercial software package. That was motivation enough to think about licensing models. I came up with dual licensing, offering one license for open source projects and another one for commercially oriented software.

I chose the LGPL for the open source side. There are several other licenses fitting that case; no problem should arise here.
For the commercial usage of JGAP I decided on the MPL, as I was pointed to a Java project doing this as well. Therefore it should work, I thought, so I informed myself about the MPL and it seemed promising to me.
Now JGAP is set up with the LGPL as well as the MPL. You can choose the LGPL without any restrictions. The MPL can only be chosen after donating a certain amount to the PayPal account of the JGAP project. This seems a fair solution to me. What do you think?


Design Patterns - Boondoggle or State-of-the-Art?

Real-world discrepancies

Everyone's talking about methods and concepts that help make the development of software easier and more fruitful. Design Patterns are supposed to help as well. But there seems to be a discrepancy between academic research activities and the practical use of patterns in professional software development. In academia, patterns are a hot topic producing countless publications and creating highly paid chairs. In industry, my observations and interviews indicate a *slightly* different picture: nearly none of the object-oriented developers use patterns. And if they do, it is the easiest, best-known patterns. Frameworks injected into a company's software product deliver Design Patterns to some extent. One reason might be that framework developers are usually more competent than "ordinary" developers or programmers doing their job.

Side note: IMO, a developer merely applying one or more patterns without knowing them cannot be counted as a person who knows the applied Design Patterns. The reason for this is the definition of Design Patterns: they are a proven solution to a recurring problem. And a proven solution must be explicitly known to someone; otherwise it may be proven "globally", so to say, but not locally, i.e. in his mind. Well, this is sort of philosophical, anyway.

SAP world introspected

SAP is a huge, integrated enterprise software system that allows managing entire globally operating companies. I present my thoughts on SAP regarding Design Patterns here because it contains a typical software architecture (grown, as a consequence of historical development) and because it is well known to me through my work as a consultant.
In the case of SAP developers the situation is even worse. Having programmed in ABAP (the SAP built-in language) night and day, ALMOST NO ONE of them knows what a Design Pattern basically is. Am I wrong? Then leave a comment on this blog. But most probably my assumption is correct. On the other hand, patterns play an important role in NetWeaver, as for Web Dynpro the concept of UI patterns is being introduced to develop applications more intuitively.

Getting on it

I'll tell you something concrete about my strong assumptions, which are backed by many interviews, references and personal impressions from several software projects:

  • 90% of all developers don't know about the existence of Design Patterns.
  • 10% of all developers use Design Patterns.
  • Only 1% of all developers (or one tenth of the 10% mentioned above) invent new Design Patterns or variations from known ones.

My sources were: colleagues, students, a questionnaire conducted by a university with about a hundred participants, and a talk with a manager of a modelling software product who was involved in many interviews with developers. I don't claim to have the best sources, but they are an indicator and I strongly believe what I'm writing.

The journey into success

As John Vlissides (member of the Gang of Four, GoF) said, there are six phases in climbing the hill to being able to successfully (in our words: profitably) use Design Patterns. After having understood what they are, you begin to work with them, play with them. After a while you know more than two or three patterns. After that you even understand why they are implemented in their specific way. But only the last step puts you in a position to use them to your advantage. Before that, everything you do is - more or less - a waste of time (or a boondoggle, like the word in this entry's title).


The concept behind Design Patterns has been proven useful. It was adapted from the well-known architect Christopher Alexander, who invented the concept for use in (building) architecture. Now we are searching for ways of creating mechanisms and tools that help us cope with Design Patterns more effectively. The effort of learning them up to the point where we can use them as a daily tool is not acceptable. The community of developers needs something to overcome this antagonism. Here we need academia to lay the groundwork, and industry to sponsor projects and give feedback and new impulses as well!

Your opinion?

What do you think of Design Patterns? Which of them do you use? Which of them do you know by heart, and which of them do you have to look up in a pattern book or another resource?

Tell me by dropping a comment!




Annotations in Java (1.5)

Annotations were officially introduced to Java by JSR 175. Previously they were best known through XDoclet, IMO.

There have been many blogs and articles on the subject, but I will explain annotations from my point of view and assemble some resources you can find at the end of this blog entry. I took some information from the Sun article (although I wrote this entry before the article was available at that URL).

Purpose of Annotations in Java

The motivation for introducing annotations into the Java programming language was the need for semantic information about a given piece of code. Normally I would speak of entities, as an annotation could also be applied to an application as a whole in the form of a descriptor. But the latter is only part of my imagination and not explicitly specified within JSR 175, AFAIK.
Annotations in Java need to be well-defined. The reason is quite obvious: a compiler or another tool scanning source code should be able to automatically process the semantic information applied by a programmer.

Declaration of Annotations

To ensure well-defined annotations and to allow reuse of definitions made before, JSR 175 defines two modelling entities: annotation types and annotations. Annotations are based on annotation types and cannot exist without them.

Annotation Types

An annotation type is defined as if it were a Java interface. That means you handle it like a Java interface (with restrictions).
An annotation type implicitly extends the marker interface java.lang.annotation.Annotation. But it may not extend other annotation types or other interfaces; the latter would not make much sense.
How can we distinguish between normal interface definitions and annotation type definitions? An annotation type is defined using the @interface keyword instead of interface: the "keywords" @ and interface are concatenated to result in @interface. You could separate them with whitespace, but this is discouraged as a matter of style, as JSR 175 notes.

A declaration of an annotation type could look like this (example taken from JSR 175):

// Normal annotation type declaration with several elements

/**
 * Describes the "request-for-enhancement" (RFE)
 * that led to the presence of
 * the annotated API element.
 */
public @interface RequestForEnhancement {
    int id();           // Unique ID number associated with RFE
    String synopsis();  // Synopsis of RFE
    String engineer();  // Name of engineer who implemented RFE
    String date();      // Date RFE was implemented
}
As you can see, the declaration contains four elements. It would also be possible to leave out the element definitions altogether; then we would have a marker annotation type. As with a marker interface like java.io.Serializable, a tool could interpret an annotation based on a marker annotation type in a special way.
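For comparison, the marker annotation type behind the @Preliminary annotation used further below could be declared as simply as this (the declaration itself is my sketch; the JSR example only shows its usage):

```java
// A marker annotation type: no elements at all, comparable in spirit to a
// marker interface such as java.io.Serializable.
public @interface Preliminary { }
```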

Annotations based on Annotation Types

As said, an annotation is based on an annotation type. We could say it extends the annotation type interface and, with that, implements it. Strictly, though, we would not speak of implementation in the context of annotations, because they don't represent executable statements in native Java code.
An annotation is used similarly to Javadoc tags. Javadoc tags are dedicated types of comments: they begin with the at-symbol @ and are included within a comment section surrounded by /** and */. But annotations are more than comments. You don't surround them with comment markers as is the case with Javadoc. You just take the at-symbol, write the name of the annotation type to use after it, provide all elements of the type with values and place the whole annotation directly ahead of the program element to be annotated. Let's have a closer look at this and go through an example for clarification.

Usage of Annotations

JSR 175 states that "annotation type declarations are legal wherever interface declarations are legal, and have the same scope and accessibility."
With that in mind, and with annotations at hand for making use of annotation types, we can use annotations as a well-defined mechanism to "comment" on program elements.
Example of an annotation (taken from the JSR):

// Normal annotation
@RequestForEnhancement(
    id       = 2868724,
    synopsis = "Provide time-travel functionality",
    engineer = "Mr. Peabody",
    date     = "4/1/2004"
)
public static void travelThroughTime(Date destination) { ... }

The scope of the above annotation is a method named travelThroughTime. As you can see, the elements defined within the annotation type RequestForEnhancement (see the example in section Annotation Types) are assigned values. The order of the elements equals the order of definition. This is not a must, but it is proposed by the JSR.
The advantage of using the above annotation is that we have machine-processable information about a method, a variable declaration or a class. You could then use these pieces of information to check who has implemented how many methods, when the last change happened, etc. The example is quite simple and you could easily criticize it in terms of usefulness. But use your imagination... XDoclet is somewhat similar and is used extensively in the field of J2EE. Although there is some criticism about it, too, I won't dig into that here.
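How such information becomes machine-processable can be sketched with the reflection API. Reading annotations at run time requires RUNTIME retention, which the RequestForEnhancement declaration shown earlier does not state, so this sketch adds the @Retention meta-annotation:

```java
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.reflect.Method;

// Sketch: reading an annotation back at run time via reflection.
public class RfeReader {

    @Retention(RetentionPolicy.RUNTIME) // required for run-time visibility
    public @interface RequestForEnhancement {
        int id();
        String synopsis();
        String engineer();
        String date();
    }

    @RequestForEnhancement(
        id       = 2868724,
        synopsis = "Provide time-travel functionality",
        engineer = "Mr. Peabody",
        date     = "4/1/2004"
    )
    public static void travelThroughTime() { }

    public static void main(String[] args) throws Exception {
        Method m = RfeReader.class.getMethod("travelThroughTime");
        RequestForEnhancement rfe = m.getAnnotation(RequestForEnhancement.class);
        System.out.println(rfe.engineer() + ": " + rfe.synopsis());
    }
}
```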

It is also possible to use marker annotations. An example says it all:

// Marker annotation
@Preliminary public class TimeTravel { ... }

The meaning of the @Preliminary annotation should be easy to follow.


Not having said very much about annotations because of several constraints, they seem a valid approach to handling metadata within source code. There should be no question about the usefulness of metadata, but the form of the JSR 175 approach could be reviewed. In my eyes the proposal is quite easy to realize. IMO, it presents a concept that is very concise and handy.
Future considerations could cope with tool support for annotations or with how to use them practically (e.g. in which package to declare them, etc.).



Genetic Algorithms with JGAP

JGAP - Framework for Genetic Algorithms (and Genetic Programming)

There are several frameworks available to help you build your own GA implementation without coping with the sticky details. One of these frameworks is JGAP, of which I am the administrator. There has been much feedback from JGAP users, most of it very positive. The new release 2.0 was published at the beginning of this year, and we got over a thousand page views in one day. This is not much compared with some other projects, but I find it great for a rather non-popular subject like Genetic Algorithms.

JGAP can be used as a starting basis because it's not overly complex and provides the basic functionality. JGAP puts you in the position to concentrate on the real problems of a GA: setting up a fitness function, choosing a representation for your problem and fine-tuning the parameters. There is no need to hassle with boring stuff like programming random functions, genetic operators and so on, although it is possible to implement your own extensions quite easily. Here's a class diagram of the most important JGAP classes (for version 1.1 and partly version 2.0):


Implementing a GA

Using JGAP

JGAP comes with a simple example demonstrating the procedure of using the framework. The story behind the example: for a given number between 1 and 100, find the minimum set of coins required to reach that number. The coin values available are 25, 10, 5, 2 and 1.

The Fitness Function

The fitness function measures the quality or fitness of a solution encoded by the internal state of an individual. In our example we would simply calculate the difference between the number to be reached, say 100, and the number reached by the individual (using coins of several values).
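The logic of such a fitness function can be sketched in plain Java. The class and method names here are illustrative only, not JGAP API; in JGAP you would subclass its fitness function base class and return this value from its evaluate method. The positive lower bound of 1 reflects my understanding that JGAP expects positive fitness values, which is worth checking against the JGAP documentation:

```java
// Plain-Java sketch of the fitness logic: the closer the reached amount is
// to the target, the higher the fitness; a perfect match scores highest.
public class CoinsFitness {
    public static int fitness(int amountReached, int target) {
        int difference = Math.abs(target - amountReached);
        return Math.max(1, target - difference); // keep fitness positive
    }
}
```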

Representing an Individual

For the example, an individual can be represented as a chromosome consisting of 5 IntegerGenes. An IntegerGene is a JGAP class representing an integer number - that's easy. In our case, the index of the IntegerGene (1 to 5) indicates which coin value to use (25, 10 etc.), and the value of the IntegerGene determines how many coins of that value (25, 10...) to use.
With that we can easily encode possible solutions without knowing the solution exactly. And the fitness function can easily evaluate the state (resp. the coin values) of each single individual within the population.
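The encoding just described can be sketched in plain Java (again with illustrative names rather than JGAP API; in JGAP the five counts would live in a chromosome of five IntegerGenes):

```java
// Plain-Java sketch of the encoding: the gene at index i stores how many
// coins of value COIN_VALUES[i] the individual uses.
public class CoinsIndividual {
    static final int[] COIN_VALUES = {25, 10, 5, 2, 1};
    private final int[] genes; // each entry >= 0, like an IntegerGene's value

    public CoinsIndividual(int[] genes) { this.genes = genes; }

    // The amount this individual reaches with its coins.
    public int amount() {
        int total = 0;
        for (int i = 0; i < genes.length; i++) {
            total += genes[i] * COIN_VALUES[i];
        }
        return total;
    }

    // The number of coins used -- the quantity the example wants to minimize.
    public int coinCount() {
        int n = 0;
        for (int g : genes) { n += g; }
        return n;
    }
}
```

For instance, three 25s, two 10s and one 5 reach 100 with six coins, while four 25s reach 100 with only four.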


With JGAP it is easy to implement Genetic Algorithms. It offers all the basic functionality required to do so. Hosted at SourceForge, it can be used freely, so you are able to "go for gold" with evolutionary algorithms at no cost. Admittedly, GAs - or, generally speaking, EAs - are not a solution to every problem. They do not make you a magician. But for some problems they are definitively better than other algorithms (take NASA, who designed an antenna for space with the help of GAs).
Watch out for other frameworks, like ECJ by Sean Luke. ECJ is more complex than JGAP, which means it is more powerful on the one hand, but on the other you need more effort to implement your solution.