My opinion about JDO, in a few short statements:
JDO is transparent in a different way than, for example, java.io.Serializable: JDO manipulates class files, not the source code. I don't like this, because you can run into difficulties when doing Test-Driven Development (TDD), and I like to see what is there, namely in the source code. (Joel on Software once had a post, "The Law of Leaky Abstractions", about the problems with abstractions.)
JDO seems to be a good solution, but not the best I could dream of. Just one remark about the stance of SAP, a really big software development company, toward JDO: in later releases of the NetWeaver Developer Studio for Java, part of the huge NetWeaver architecture, JDO was simply not supported. Period. You could hack around and get JDO working with it, but who wants to develop software that way?
I would prefer a solution based on a code generator or, if possible, something similar to java.io.Serializable (although the latter may be too vague to argue about, as it is only a matter of introducing a marker interface). A code generator would add the necessary flexibility as well as the necessary capability! And you could TDD your code as you like and do code coverage analysis on source code, which would make things much more fun. Some say: don't test third-party products. But relying on "a" third-party toolkit, made by people unknown to the person expected to trust them, is not the best sort of risk management. Just one argument: open source. Although not bad in general, many of these projects have a charm along the lines of: "Ah, there is someone who wants to contribute. I don't know him, but he is willing to add something, so let him do it. And if I, the project admin, find the time, I will check what he has really done." I love open source, but I would not trust it blindly. There is often no support, in contrast to many commercial products (allow me to cut it short here).
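To illustrate the java.io.Serializable style of transparency I mean: the class carries only a marker interface, the JVM's serialization machinery does the rest, and no tool rewrites the class file behind your back. A minimal sketch (the `Customer` class and `roundTrip` helper are hypothetical names of my own, not from any framework):

```java
import java.io.*;

// A plain POJO "marked" for serialization the way java.io.Serializable
// works: no generated bytecode, no post-compile enhancement step --
// the class you test is exactly the class that runs.
class Customer implements Serializable {
    private static final long serialVersionUID = 1L;

    private final String name;

    Customer(String name) { this.name = name; }

    String getName() { return name; }
}

class RoundTrip {
    // Serialize a Customer to a byte array and read it back,
    // demonstrating that the marker interface alone is enough.
    static Customer roundTrip(Customer in) {
        try {
            ByteArrayOutputStream bytes = new ByteArrayOutputStream();
            try (ObjectOutputStream out = new ObjectOutputStream(bytes)) {
                out.writeObject(in);
            }
            try (ObjectInputStream oin = new ObjectInputStream(
                    new ByteArrayInputStream(bytes.toByteArray()))) {
                return (Customer) oin.readObject();
            }
        } catch (IOException | ClassNotFoundException e) {
            throw new RuntimeException(e);
        }
    }
}
```

A code-generator-based persistence solution could stay just as visible: the generated source would sit next to your own, open to review, coverage analysis, and TDD.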
These days I wanted to bring the functionality of a third-party open-source framework into code of mine. At the beginning, though, I was not sure whether the concepts of my framework and the other one would allow a merger. Therefore it had a touch of prototyping: merging some logic of the other framework into mine.
If I had written test cases before or even during this activity, they would probably have been a waste of time in case the merger had not worked out as I hoped. On the other hand, ensuring the correctness of the newly created logic is the number-one priority, as unknown code is always a danger for a project.
In retrospect, I am quite content with my approach of not writing the tests until this point, where I am nearly finished merging the two frameworks and can see that the result works in general. There is enough time for that right now, I am inclined to say, although I know some people would raise their hands and remark on the semi-optimal approach chosen. An important factor seems to be the experience of the developer involved, both with software development in general and with the specific code being extended.
Has anyone out there merged two code bases while writing test cases during, or even before, the merger? I would be very interested in the details.
But since January 15th, over a month ago, no further statistical data has been shown.
SourceForge knows about this and explains that they are moving to a new statistics system:
"Project statistics data for 2005 that is omitted will not be processed until launch of the new statistics system. This data has been collected, but has not been processed for display. To ensure accuracy and reduce performance impact to users, these dates will be omitted until the new statistics system is launched. There is not currently a date set for the rollout of the new stats system. Please keep an eye here for that date when we release it."
What I wonder about is the amount of time needed to do so. One month is just about beyond any basis for discussion, I strongly believe. There must be a deeper reason, and not just small problems that arose while doing the job.
Quantum Theory is the most precise complex theory we have today. There is no other theory of such universality and brilliance. The theory was founded by Max Planck in the year 1900. As a young man, Planck wanted to study physics and was told that this would not be worthwhile, as everything of importance had already been discovered. Good for us that he did not follow that hollow advice.
After Planck laid the foundation, the world-famous Albert Einstein, who is being celebrated this year (Einstein Year, Year of Physics) because of the 100th anniversary of his groundbreaking 1905 papers and the 50th anniversary of his death, extended the great theory "casually". At the time, he was employed at the patent office in Bern, Switzerland, and was able to think about the problems of Quantum Theory while doing his daily business (which he must have found somewhat boring, and which, I assume, used about 1% of his intellectual capacity). That is someone you would call a genius!
Other great names connected with the theory are Niels Bohr (Danish, an outstanding character and physicist, mentor of Heisenberg), Werner Heisenberg (originator of the Uncertainty Principle), Erwin Schrödinger (who developed wave mechanics, as Heisenberg did, but from a different perspective, as was discovered later on), and, not to forget, Max Born as teacher of many great physicists such as Heisenberg, Robert Oppenheimer (director of the well-known Manhattan Project in Los Alamos), Fermi, and Edward Teller (father of the hydrogen bomb).
It's all about philosophy
For most people without a general understanding of Quantum Physics, it is not easy to see that this theory is, first of all, capable of explaining our everyday experiences (such as velocity, chemical reactions, etc.). One must realize that Quantum Physics contains Newton's theory, which is only a special case of Quantum Physics! Secondly, Quantum Theory predicts, and this is common sense and beyond discussion in physicists' circles, that nothing is deterministic, meaning we are ruled by randomness. If you don't believe it, you are in the company of Einstein, but not of Stephen Hawking. Thirdly, Quantum Theory is the most precise theory we have. Period. Finally, Quantum Theory and Einstein's Theory of Relativity meet when it comes to trying to explain black holes: Quantum Theory deals with very small particles, and a black hole is something very, very small, while Einstein's theory is appropriate for describing huge concentrations of mass, such as a black hole.
There is no Reality
Although Quantum Theory is the best theory we have, it is not a correct and complete one. If you don't believe that, I suggest reading some books about it. The theory is all about giving us a quite good description of what reality might be and how we can predict future states of our reality. Heisenberg's Uncertainty Principle alone prohibits knowing reality precisely!
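For the curious reader, the Uncertainty Principle can be stated in one line. In its standard modern form for position and momentum it reads:

```latex
\Delta x \, \Delta p \;\geq\; \frac{\hbar}{2}
```

In words: the product of the uncertainties in a particle's position and momentum can never fall below half the reduced Planck constant, so a perfectly sharp picture of "reality" is ruled out in principle, not just in practice.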
As we can easily see, Physics and Object Orientation are both concerned with objects. The type of object does not matter, be it a technical instance or a real one; and there are no real instances, as said before. Both Physics as a whole and the OO paradigm are only descriptions, theories that are commonly considered "good". Nothing more.
So, just being logical and asking a provocative question: If:
- Quantum Theory is a very good theory and
- Quantum Theory for sure is much more complex than the theory of Object Orientation and
- both theories are about describing objects as part of our "reality"
then how could anyone think of OOP as a near-perfect paradigm? How could anyone think OOP is a really good theory? I don't want to say that OOP is a bad thing; I love it. But since OOP is so extremely simple compared to Quantum Theory, and since OOP has its weaknesses, how could we think there is no potential for more? Fortunately, many people don't think so. But some love OO so much that they become blind to its pitfalls. Just remember AOP as an attempt to make OOP more powerful and capable. And every time you try to make something more powerful, it will soon become too complicated. So AOP will be doomed to die out in just a few years, as will classical OOP. AOP is too impractical to be used by the mass of developers out there. But it is a necessary step toward recognizing the possibilities and necessities for forming a new theory of describing situations in a machine-understandable form.