My personal belief is that design paradigms exist to assist good software design, but there are times when they should be left out in the cold. I should state this is my personal view, against the herd mentality of those who won't think for themselves - but hear me out, I have very good reasons:
- Interface all your BOs (business objects)
Now this is just silly - I have a large three-tier system with some 30-40 BO classes alone. The specification doesn't say these will be swapped out - so why interface them? Another senior developer in the group got into a heated argument with me that you just should, because it was "good practice" and we "may" need to swap them out in future. But he couldn't give me any reason why or when (designing for a hypothetical future also runs against the design-for-now agile methodology), nor why we couldn't extract interfaces at a later date, nor an answer to the fact that during development we modify the implementations extensively and would have the extra work of keeping the interfaces in step. I understand that interfaces are useful for unit tests, but there are ways around this (see the sketch below), and it is a trade-off between testing convenience and readability. As an aside, calls through interfaces can even add a slight runtime dispatch overhead!
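To illustrate one of those workarounds - a minimal sketch, where the Invoice class and its methods are hypothetical and Mockito is assumed to be on the test classpath - mocking libraries such as Mockito can stub concrete (non-final) classes directly, so unit testing alone doesn't force an interface onto every BO:

```java
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;

// A plain concrete BO, used directly by the tier above. If a second
// implementation ever appears, an IDE's "extract interface" refactoring
// can introduce one at that point, updating all call sites in one step.
class Invoice {
    private double total;

    void addLine(double amount) { total += amount; }
    double getTotal()           { return total; }
}

class InvoiceTest {
    void totalCanBeStubbedWithoutAnInterface() {
        // Mockito can mock concrete classes, so the test does not
        // force an interface onto the production code.
        Invoice stub = mock(Invoice.class);
        when(stub.getTotal()).thenReturn(42.0);
        assert stub.getTotal() == 42.0;
    }
}
```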
- Singletons are always good, especially for your constants.
The singleton pattern is widely classed as an anti-pattern. From a pure OO perspective it can completely break encapsulation and cohesion, e.g. keeping all constants in a single class - from your DB strings, to your hash keys in the BOs, to error message keys in the UI. The constants should live in separate base classes, one for each package's classes to extend or reference. Admittedly this is not always possible, but it's no excuse to plonk everything in a central singleton - which can get very unwieldy.
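As a rough sketch of the split (all names and values here are illustrative, not from any real system), each tier keeps its own small constant holder instead of one global singleton:

```java
// In the persistence package: only DB-related constants live here.
final class DbConstants {
    static final String JDBC_URL = "jdbc:postgresql://localhost/app";
    private DbConstants() {} // no instances - this is just a namespace
}

// In the UI package: only message keys live here.
final class UiMessageKeys {
    static final String ERR_NOT_FOUND = "error.notFound";
    private UiMessageKeys() {}
}
```

Each class then sees only the constants relevant to its own package, rather than dragging the DB strings into the UI and vice versa.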
- Always use object-relational mapping libraries (e.g. Java's Hibernate/TopLink)
The traditional approach of executing raw SQL is sometimes better. There is a lot to be gained from ORM libraries - which I won't go into here - but they are not always the best tool for the job: for example, when you have limited memory, when your application executes roughly ten or fewer statements, or when you need visibility into what is actually happening while debugging, without digging through log files. Executing raw SQL can often be more efficient because you can write SQL exact to your requirements (no blanket selects) and use less memory - a single result set.
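A minimal raw-JDBC sketch of what I mean - the connection URL, table, and column names are placeholders, and it assumes a suitable driver and an existing orders table:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class OrderLookup {
    public static void main(String[] args) throws Exception {
        // Select only the two columns actually needed - no blanket
        // SELECT * - and hold exactly one result set in memory.
        try (Connection con = DriverManager.getConnection("jdbc:h2:mem:demo");
             PreparedStatement ps = con.prepareStatement(
                 "SELECT id, total FROM orders WHERE customer_id = ?")) {
            ps.setLong(1, 42L);
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    System.out.println(rs.getLong("id") + " " + rs.getDouble("total"));
                }
            }
        }
    }
}
```

Every statement the application runs is right there in the source, which is exactly the debugging visibility an ORM hides behind its logs.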
- Three-tier architectures are better/are the future
If you're programming for a limited device (such as a handheld PC) you shouldn't be using a three-tier system: it doesn't need to scale to hundreds of concurrent users - it only has one. Simple is nearly always better in this situation; two tiers will do - forms, and a backend with some limited business logic. If you have a central server to synchronise with, defer and offload as much logic as you can to it - obviously not form validation or similar; I'm talking about middle-tier work that doesn't need to be shown or used in real time on the limited device.
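A rough two-tier sketch of the idea (all class and method names are purely illustrative): the form talks straight to a local store, validation stays on the device, and heavier middle-tier work is queued for the central server at sync time:

```java
import java.util.ArrayDeque;
import java.util.Deque;

class LocalStore {
    private final Deque<String> pendingForServer = new ArrayDeque<>();

    void save(String record)       { /* write to on-device storage */ }
    void deferToServer(String job) { pendingForServer.offer(job); }
    Deque<String> drainOnSync()    { return pendingForServer; }
}

class OrderForm {
    private final LocalStore store = new LocalStore();

    void submit(String customer, int quantity) {
        // Validation happens in real time, on the device...
        if (customer.isEmpty() || quantity <= 0)
            throw new IllegalArgumentException("invalid order");
        store.save(customer + ":" + quantity);
        // ...but non-interactive work (repricing, reporting, etc.) is
        // deferred until the next synchronisation with the server.
        store.deferToServer("reprice:" + customer);
    }
}
```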
There are many other examples, but remember that design patterns and paradigms are there to help you - to prevent common design and coding pitfalls and to paper over deficiencies in programming languages. They don't dictate how you must design; use common sense and know when not to use them! :)
1 comment:
You're so right about BO interfaces! I maintain a similar system at a large company, written in Java, where the outsourced company interfaced as many classes as possible - but the interface methods are specialised (e.g. displayInvertedRasterImage) and will never have a second implementation. It makes my life hell debugging and managing the source.