When programming, you take a generic algorithm and customize it for a task. Sometimes you have a copy of an implementation of that algorithm lying around that you can copy. OO tells us not to do that. Someone once said, "If you're going to make a mistake, make it in a big way." Keeping one sacred copy that must be correct could certainly accomplish that, but it also makes it possible to fix a problem once instead of hunting it down everywhere it has spread through the code. Replicating code is a huge commitment: you're banking that nothing is wrong with it, that your program will never change how the data it works on is represented, and that people enjoy looking at endless permutations of a single piece of code.
Object Orientation proposes to eliminate this duplication. Yet the very act of separating your program into objects creates a ripe new area for it: sequences of setting, querying, and passing data. A common situation:
my $money = $player->query_money();
if($player->query_max_money() >= $money + $payout) {
    $player->set_money($money + $payout);
    $nickels_on_floor = 0;
} else {
    $nickels_on_floor = $money + $payout - $player->query_max_money();
    $player->set_money($player->query_max_money());
}
No matter which way we make the set_money() method work, we're doomed. If it enforces a ceiling, we have to query again to see whether the ceiling was enforced. If it doesn't enforce a ceiling, we have to check the value and enforce it ourselves each and every time. Either way, one or two of these sequences of logic get strewn around the program. The problem is that the client needs something slightly more complex than the server is prepared to provide. We could, and perhaps should, make the object we're calling, $player, return an array: success or failure, how much money actually changed hands, how much more the player could carry. That is the approach of providing far more information than could ever be needed. It leads to bloated code and logic we aren't sure is actually being used, which means untested code going into production and becoming a time-bomb for the future, should anyone actually start using it. Less dramatically, we could modify the target object to return just one more piece of information each time we realize we need it. That leads to a sort of feature envy, where the server goes out of its way to provide things in terms of one particular client's expectations, producing an API geared towards that client and incomprehensible to everyone else. Also, there is temptation to write something like:
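One way to split the difference is to have the mutator enforce the ceiling but report the overflow, so the client never has to re-query to find out what actually happened. The following is a minimal sketch, with a hypothetical add_money() method and accessor names not taken from the original example:

```perl
package Player;

use strict;
use warnings;

sub new {
    my ($class, %args) = @_;
    return bless { money => $args{money} || 0, max_money => $args{max_money} || 100 }, $class;
}

sub query_money     { $_[0]->{money} }
sub query_max_money { $_[0]->{max_money} }

# Enforce the ceiling, but return how much would not fit, so the
# caller learns the outcome without a second round trip.
sub add_money {
    my ($self, $amount) = @_;
    my $room     = $self->{max_money} - $self->{money};
    my $overflow = $amount > $room ? $amount - $room : 0;
    $self->{money} += $amount - $overflow;
    return $overflow;    # 0 means the full amount was banked
}

1;
```

The client's accessor dance collapses to a single call: `my $nickels_on_floor = $player->add_money($payout);`.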
package Util;
Beware of Utility, Helper, Misc, etc packages. They collect orphan code. The pressure to move things out of them is very low, as they all seem to fit by virtue of not fitting anywhere else. They grow indefinitely in size because the class of things that don't seem to belong anywhere is very large. The effect snowballs as the growth of other objects is stymied while the "Utility" package booms.
Snafus like this cause the number of accessors to grow to accommodate all of the permutations of accessing the data. You'll often see a set_ function, a query_ or get_ function, and an add_ function for each value we encapsulate.
package Casino;

use ImplicitThis;
ImplicitThis::imply();

sub pay_out {

    # This method would go in $player, since it is mostly concerned with $player's
    # variables, but we don't want to clutter up $player's package, and we don't
    # know if anyone else will ever want to use this code.

    my $player = shift;
    my $payout = shift;

    my $money = $player->query_money();
    if($player->query_max_money() >= $money + $payout) {
        $player->set_money($money + $payout);
        $nickels_on_floor = 0;
    } else {
        $nickels_on_floor = $money + $payout - $player->query_max_money();
        $player->set_money($player->query_max_money());
    }
}
Associating methods with our client object that reasonably belong in the server object ($player, in our case) isn't always the worst solution. In fact, if you put them there and leave them until it is clear they are needed elsewhere, you'll find that one of four things is true: they are globally applicable, they apply only to this client, they apply to a special case of the client, or they apply to a special case of the server.
1. Applies only to this particular client: leave the server's accessors in the client, in this case, Casino.
2. Nearly every client can benefit from this code: put the logic in the server, in this case, $player.
3. Applies to a special case of clients: consider a Facade for $player. Not worth it? Toss-up between #1 and #2.
4. Applies to a special case of servers: subclass $player's package.
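Case 2 is the interesting refactor: the pay_out logic migrates wholesale into the server. A sketch of what that might look like, assuming the same query_/set_ style accessors as the Casino example (the new() constructor here is a stand-in added for illustration):

```perl
package Player;

use strict;
use warnings;

sub new { my ($class, %args) = @_; return bless { %args }, $class }

sub query_money     { $_[0]->{money} }
sub query_max_money { $_[0]->{max_money} }
sub set_money       { $_[0]->{money} = $_[1] }

# Case 2: nearly every client benefits, so pay_out lives in the server.
# Returns the number of nickels that hit the floor.
sub pay_out {
    my ($self, $payout) = @_;
    my $total = $self->query_money() + $payout;
    my $max   = $self->query_max_money();
    if ($total <= $max) {
        $self->set_money($total);
        return 0;
    }
    $self->set_money($max);
    return $total - $max;
}

1;
```

Casino then shrinks to `my $nickels_on_floor = $player->pay_out($payout);`, and no client ever reimplements the ceiling check.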
It's okay to do it "wrong". Each new thing that gets built gives you more and more insight into how things really need to work. The important thing is to keep incorporating these lessons into the code, to keep the code in line with reality, and never to be afraid of breaking your code. If you're afraid of breaking your code in the name of making it better, it has you hostage. If you're afraid of breaking it because you think it will take too long to get working again after the change, it has already grown rigid, and only frequently breaking and fixing it will let it regain its flexibility. Take Jackie Chan, for instance. Having broken countless bones, he's only gotten stronger, braver, more skilled at walking on a broken leg, more knowledgeable about his limits, and adept at healing.

Alternatively, if you're afraid of subtle bugs creeping in undetected, you've got murky depths syndrome. Perhaps much of it is poorly understood Lava Flow code: laid down once, built on top of, and permanent for it. Reading through the dark murky code is a good start, and a pass now and then keeps the possibilities and implications fresh in your mind. However, this is time consuming and will ultimately miss implications. There is no substitute for knowledge of the code, and neither is there a substitute for testing. There is a special class of code in which every bit of logic is exercised on every execution. Mathematical modeling applications that work on large, well-understood datasets fit this description: any subtle bug would bias the output dramatically the first time the buggy program was run. Normal programs don't have this luxury. Their bugs lurk for ages, possibly until maintenance headaches dictate that the program be abandoned and rewritten. We can't understand everything in a large program, but we can contrive datasets and test applications that work out every feature of our module.
Writing artificial applications that use our modules, and coming up with bizarre, improbable-in-every-way datasets, simulates that "luxury" case where all of the dark murky depths are used on every run: in our case, every test run. The only time dark murky depths and Lava Flow code, the code most in need of a refactor, can be attacked is when we have a definitive method for discovering whether or not we've broken it. "Measure twice, cut once." If you're anything like me, you like flying by the seat of your pants. If you go to the movies or watch the telly, you know that every fighter pilot struggles with this issue: do I trust my intuition and wing it, or do I go by the book? Luke Skywalker destroyed the Death Star without that damn useless targeting computer. The architects who build skyscrapers certainly have to think outside the box, so to speak, to come up with techniques and ideas for building beyond the bounds of what is believed possible, but no one would trust an architect who couldn't back up his gut instincts with cold, hard math. The solution? Code with your heart, but be the first to know when you make the inevitable wrong guess. Write seat-of-your-pants code, but write first-class scientific tests.
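Perl's standard Test::More module makes this kind of scientific test cheap to write. Here is a sketch that feeds contrived boundary data, empty pockets, an exact fit, and an overflow, through a capped pay_out method; the tiny Player package is a stand-in defined inline for illustration, not code from the original example:

```perl
use strict;
use warnings;
use Test::More tests => 3;

# A stand-in Player with a money ceiling, purely for illustration.
package Player;
sub new { my ($class, %args) = @_; return bless { %args }, $class }
sub pay_out {
    my ($self, $payout) = @_;
    my $total = $self->{money} + $payout;
    my $spill = $total > $self->{max_money} ? $total - $self->{max_money} : 0;
    $self->{money} = $total - $spill;
    return $spill;    # nickels on the floor
}

package main;

# Contrived dataset that walks the boundary: room to spare,
# an exact fit at the ceiling, then one nickel too many.
my $p = Player->new(money => 0, max_money => 100);
is($p->pay_out(50), 0, 'payout fits with room to spare');
is($p->pay_out(50), 0, 'payout fits exactly at the ceiling');
is($p->pay_out(1),  1, 'overflow lands on the floor');
```

Every run of the suite drags the whole code path into the light, which is exactly the property the mathematical-modeling programs get for free.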
This article is originally from the Perl Design Patterns Book.