Tuesday, November 06, 2012

269 and Peer 1

So there is some discussion this year of a 269-269 electoral college tie. The level of thinking involved, though, is shallow. This is an important lesson in disaster planning. Peer 1 and their customers didn't plan well. You can read the story here. The connection will become clear in a paragraph.

The quick summary is that Peer 1 relied on the building's fuel pumps (75 Broad Street), which were in the basement near the fuel tanks. Those fuel pumps failed when they were flooded by Hurricane Sandy as well as a water main break. Peer 1 went to heroic efforts with a manual bucket brigade, and even avoided downtime (it was so close that some customers chose to shut down to prevent data corruption). However, the failures in planning became very clear. Their generators could not operate that long (a week) without downtime. They clearly didn't consider a flood. They were only prepared for a power outage, like a substation explosion. Nothing more. No extended outages, nothing. From the description it isn't even clear they had two generators. This comes from a failure to consider the possibilities, and a complacency about the current model.

In a 269-269 tie, Congress votes on the president. But there are other possibilities. An elector is not legally required, let alone automatically counted, to vote for the candidate he or she is pledged to. If there is a 269-269 tie, the first thing that will happen is that the winner of the popular vote will pressure electors from the other side to switch. If just one switches, it is over. Look for the Democrats to do that this time, because in a 269-269 tie they are most likely the winner of the popular vote, and more importantly they stand to lose in the House. So even if they lose the popular vote, look for another argument.

When preparing for the unknown, you have to think beyond the current model, and test its assumptions with different scenarios, and then see if those scenarios are worth planning for in advance (cost/benefit or any other perspective or priority you have).

Sunday, June 10, 2012

Kotlin - Yet Another Language

What is the point? Yet another language on the JVM with some syntactic-sugar variation on everything else available. What for?

The next real programming language that changes something will do something that cannot be done in current languages. Python has its low barrier to entry and dynamic interpretation. Ruby has its textual flexibility, so it is used as a DSL, and really showcased in Rails. Scala has its static typing, with some flavor of functional programming. Functional programming in general is a different paradigm, and its flagship language is OCaml.

So what is the point of Kotlin? Just a slight twist on the same paradigms. Really not worth it. It didn't even do anything original with checked exceptions - it just got rid of them for the usual flimsy reasons.

The next language that will be significant will enable something that wasn't possible, or wasn't possible elegantly, previously. I suppose there may be space for a language designed so well that it eliminates pain points. Like IDEA used to be for IDEs. In truth, what bothers me most about the language is that it takes away resources from making IDEA truly great again.

Anyway, here are some things I would love to see in a new language:

1. Natural database access, with static checking. So much business programming is done against the database, but lining the two up (especially in the context of refactoring) is a major pain. Even Rails doesn't quite hit the sweet spot.

2. Natural XML access with static checking. When you do a lot of XML programming, it gets very hairy very fast. But the biggest problem is that it is a different language, essentially, with its own types and its own syntax, and you have nothing but string literals to deal with it in your programming language. How about variables that are XML types, loaded from the XSD (or a WSDL directly) by the compiler, and statically checked?

In the HTML world, you kind of have PHP for this, but it has problems, enough to justify a new language.

We spend way too much time either building RAD environments that are inadequate or creating glue code for no reason other than keeping a language general purpose. Solving this problem is what will make the next great language.

Thursday, February 02, 2012

Friends don't let friends buy whole life

Why is whole life insurance a bad deal? Let me count the ways.

First, the fees are very high, so they may say things like "Guaranteed 6% return." Just ask if that is after fees.

Second, the cost of the term insurance is overpriced - it is basically a hidden fee. Quote term insurance separately at AccuQuote or SelectQuote or Zander (Zander doesn't even ask who you are) and see. Then consider the difference a hidden fee.

Third, if you die before your savings exceed the face value of the insurance, they keep all the savings. Talk about a lousy term insurance deal.

Fourth, if you die after your savings exceed the face value of the insurance, you only get your savings - what happened to the term policy you continued to pay for out of your earnings? It was still in effect until you get too old to insure. So you are paying an insurance for the insurance company to make more money off of you.

So what if your friend sells life insurance, but you aren't sure what to do? So you ask another friend. But what if the seller of the life insurance is a mutual friend? Will you get an honest answer? Think about it.

So if you check a reference, sometimes a friend, someone you trust, will be less than candid, because he has a conflict of interest. Always keep your eyes open.

Thursday, December 08, 2011

Dave Ramsey and Employment discrimination

So Dave Ramsey describes your potential employees fitting in with your philosophy. On page 143, he describes that as the "loving people" part of his philosophy, and those who don't get it as not understanding "when we stopped what we were doing and helped some hurting people." Very adroitly done to avoid an admission of religious discrimination.

On page 219, he speaks about a "devotional" but couches it as motivational speakers. Quite grey.

However, in his EntreLeadership podcast #2 he lets the mask slip. At 17:45 he says the "why" of his philosophy is "Because we are Christians." Who is the we in that sentence? He then goes on to expound on an example of "helping some hurting people." Namely: Praying for someone. So if you aren't the right religion, or you don't want to participate in a prayer session, you don't fit in.

He may get away with it by staying in Nashville, but that stuff won't just get you sued in other parts of the country; you will lose (even in Nashville, it is still a Federal problem).

Besides this, the book also encourages another form of discrimination, perfectly legal in Tennessee, but one that will get you sued in 20 states.

Page 146:
This may be the best advice in this whole book ... we not only get to meet the hire's spouse and get to know them before they join our team, we can solicit their input on whether they think this position will work for their spouse.
Merely asking whether they are married, with nothing else, is evidence of discrimination based on marital status. And in much of the country it won't do you much good in addressing the concern anyway, because "living in sin" is very common. Will you ask them if they have a roommate too?

However, beyond mere evidence, we have the advocacy of actual discrimination based on marital status:

Page 152:
I also won't let a team member stay if they decide to have an affair. If their spouse can't trust them, neither can I.
If an employee sleeps with someone, that's OK. If they were married to someone else, they are fired immediately. That is discrimination based on marital status. It is legal in Tennessee, but Dave Ramsey really makes himself look ignorant here (I don't doubt that he actually knows this is a problem) when he pretends it is not a problem for a significant portion of his audience.

Anyway, the bottom line is that today, common sense is illegal - be prepared for it.

Thursday, September 08, 2011

Why Swing Sucks

The mouse listener edition.

Of course pot shots are easy. How would you do it better? First, the most frustrating thing about Swing is the inability to have a reasonable default behavior. You spend a lot of time getting stuck on silly minutiae like this.

Second, consider what the opposite behavior would look like. Mouse exited happens when the mouse leaves the entire space occupied by the panel, not just its visible part. Then the same solution would apply, just in the reverse. If you want the mouse to exit when it hovers over a different component, you have the same workaround. Which is the more likely, understood, default behavior?

The answer is probably different per component.

Which is why the listener framework is irredeemable in Swing - the listeners are just too dumb to be anything but a struggle.

Monday, August 22, 2011

How not to apply for a Job

So our company put out a job posting and asked for a resume and cover letter. We got back an e-mail that says:

Subject: Job offer
From: n....@yahoo.com

Salary range? Benefits?
Thanks and good shabbos
Sent from my Verizon Wireless BlackBerry

No resume, no cover letter, and no name, for a job whose posting specifically advertised the need for strong writing skills. Needless to say, I didn't respond.

Yes, the unemployment rate in this state is a bit over a full percentage point better than the national average, but really? This person, whoever they are, is going to be wondering for a long time why they can't find a job.

Monday, May 02, 2011

So what is wrong with Properties in C#?

This has to be the best argument against them. But I'm not enough of a purist to say it wins the day.

Wednesday, March 30, 2011

But not quite ...

So what went wrong with checked exceptions? Well, they try to force a programmer to think about something that they don't want to, and prevent them from thinking about what they do want to.

So they turn them off, swallowing exceptions left and right. Just letting an exception propagate is much better than swallowing all the time. As far as the compiler is concerned, checked exceptions are just as important a part of an API design as the method name, return type, or parameter list. But from the point of view of the API designer or the programmer, they are just not very important and tend to be forgotten, causing pain in the API.

Ironically, it is precisely in Java that this is a problem. Java is a language of corporate development. Corporate development doesn't generally involve a great amount of thinking about such high-level robustness, so a typical Java developer is all the more likely to look at their code and say, "It compiles, it runs, looks good," and not think about error scenarios. It literally never occurs to them.

Of course not all Java developers, but enough that it matters.

So I think the correct balance in such a language is to have compiler warnings. Let the compiler tag classes that can throw checked exceptions, and if you fail to deal with them explicitly (throws or catch), it will generate a compiler warning. You either care about that or you don't, but you are not swallowing exceptions.
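The difference between swallowing and propagating is easy to see in code. Here is a minimal sketch (the config-file names are hypothetical; a missing file stands in for any volatile resource):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class ReadConfig {
    // The anti-pattern: the exception is swallowed, and the caller
    // silently gets a bogus empty default with no sign of trouble.
    static String swallowed(Path p) {
        try {
            return Files.readString(p);
        } catch (IOException e) {
            return ""; // the failure simply disappears
        }
    }

    // Better: declare it and let the caller decide what the failure means.
    static String propagated(Path p) throws IOException {
        return Files.readString(p);
    }

    public static void main(String[] args) {
        Path missing = Path.of("no-such-file.conf");
        System.out.println("swallowed: \"" + swallowed(missing) + "\"");
        // propagated(missing) would throw NoSuchFileException here,
        // which is the point: the caller is forced to notice.
    }
}
```

Under the warning scheme above, `swallowed` would compile clean only because the author made an explicit choice; forgetting the exception entirely would be flagged, not silently tolerated.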

Wednesday, March 23, 2011

In Defense of Checked Exceptions

I fought in the Checked Exceptions wars, much like your father ...

I have been defending Checked Exceptions since 2004, but lately I have been thinking that it just doesn't fit with where Java is in the computing universe. Ironic since Java is the only language that has them as a compile time constraint (to my knowledge).

Just to reiterate, this is my argument. A followup post will explain why I am walking away from it.

A potentially cogent argument for checked exceptions:

An overly simplistic way of thinking about checked exceptions is that if, given the exact same input parameters, the method could sometimes throw an exception and sometimes not, then that exception should be a checked exception. It tells the user of the API that this is an error condition that they cannot account for ahead of time, prior to calling the method. The programmer writing good code that works needs to realize that his or her application must handle that situation. If by handle that means "crash," then converting to a runtime exception may indeed be appropriate, but that is the client's choice, not the API's.

This is overly simplistic because not all appropriate checked exceptions fit this description, and there are always grey areas, but this guideline may help develop a reasonable API.

Good examples of such exceptions are SQLException and ClassNotFoundException. A class loaded by reflection, by definition, may not actually be in the classpath at runtime (if you could be sure it would be, you wouldn't need reflection). The database may not be working, or may not be reachable, at runtime.

For operations which always fail given the same input parameters, a runtime exception could well be appropriate (such as ArrayIndexOutOfBounds) but should be documented in the javadocs so that users of the API know what parameters are valid.
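The guideline can be sketched in a few lines. This is a hypothetical `ConfigStore`, invented for illustration; the `reachable` flag stands in for the volatile environment:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical checked exception: the config server may or may not be up.
class ConfigUnavailableException extends Exception {
    ConfigUnavailableException(String message) {
        super(message);
    }
}

public class ConfigStore {
    private final Map<String, String> cache = new HashMap<>();
    boolean reachable = true; // stands in for the volatile environment

    // Same key, different outcomes depending on the environment: checked.
    String fetch(String key) throws ConfigUnavailableException {
        if (!reachable) {
            throw new ConfigUnavailableException("config server is down");
        }
        return cache.getOrDefault(key, "");
    }

    // A null key always fails, regardless of the environment: runtime,
    // with the valid parameters documented in the javadocs.
    void put(String key, String value) {
        if (key == null) {
            throw new IllegalArgumentException("key must not be null");
        }
        cache.put(key, value);
    }

    public static void main(String[] args) throws ConfigUnavailableException {
        ConfigStore store = new ConfigStore();
        store.put("timeout", "30");
        System.out.println(store.fetch("timeout")); // prints 30
    }
}
```

The caller of `fetch` cannot rule out failure before the call, so the compiler makes them confront it; the caller of `put` can, so a documented runtime exception suffices.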

One thing which regularly trips up developers, even experienced ones, is that they think they have to handle the "Exception" and not the problem it represents. For example, one of the arguments (expressed here) that checked exceptions are not appropriate is that the caller isn't in a position to deal with the exception.

This is really backwards thinking. When you call a method which relies on a database connection (or other volatile resource such as a file or server connection), you have to know that you may not get what you asked for. Returning null is only appropriate if you could expect to get null legitimately (a value in a column of a database, for example), and the method should be documented to return null. A checked exception means you have to think about what to do about the fact that the method cannot guarantee a consistent result. That is what a checked exception should be telling you: Hey, this may not work out, what do you want to do about that? How do you want to solve the problem when the database went down, or is misconfigured, or whatever?

You should wrap the checked exception when you want to abstract it. For example, if you are writing a datasource API which can flexibly read from a flat file, a JDBC datasource, and/or an XML file, you would want to wrap the various exceptions so that calling programs don't care which one was actually used. But in all cases the sources are volatile, and may not be available or properly configured at runtime, so the wrapper should be a checked exception.
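A minimal sketch of that wrapping, with an invented `DataSourceException` and a stub standing in for the real flat-file/JDBC/XML implementations:

```java
import java.io.IOException;
import java.sql.SQLException;

// Hypothetical wrapper: callers see one checked type no matter which
// backing store (flat file, JDBC, XML) was actually used.
class DataSourceException extends Exception {
    DataSourceException(String message, Throwable cause) {
        super(message, cause);
    }
}

public class RecordSource {
    static String load(String key) throws DataSourceException {
        try {
            return readFromBackingStore(key);
        } catch (SQLException | IOException e) {
            // Still checked: every backing store is volatile at runtime.
            throw new DataSourceException("Could not load record " + key, e);
        }
    }

    // Stand-in for the real flat-file/JDBC/XML implementations.
    private static String readFromBackingStore(String key)
            throws SQLException, IOException {
        if (key == null) {
            throw new IOException("no backing store configured");
        }
        return "record:" + key;
    }

    public static void main(String[] args) throws DataSourceException {
        System.out.println(load("42")); // prints record:42
    }
}
```

The original cause travels along as `getCause()`, so nothing is lost; the caller just no longer has to care which backing store failed.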

What to do about exceptions behind an interface which cannot throw checked exceptions (say Comparable, or an ActionListener, or the like)? The first thought should be that the API is telling me my operation must be guaranteed. If pressing a button hits the database, then before this method ends I have to inform the user of failure, and do whatever I wanted to do about that failure, so the event queue can continue on its merry way not caring about the result. Otherwise, the user presses the button and simply nothing happens. Is that what you want? Comparable is saying: this operation has to work out, with a deterministic result. If it doesn't, it is up to the implementor to figure it out, not the client. The result is either greater than, less than, or equal to. There is no option of "I can't figure it out, sometimes." If, on the other hand, I can never figure it out given this input, that is often a ClassCastException, an appropriate use of a runtime exception.
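The button case might look like this. It is a sketch, not a recipe: `saveOrder()` and the `status` field are hypothetical stand-ins for a real database call and a real status label:

```java
import java.awt.event.ActionEvent;
import java.awt.event.ActionListener;
import java.sql.SQLException;

// actionPerformed cannot declare a checked exception, so the failure
// must be fully resolved before the method returns and the event
// queue moves on.
public class SaveButtonListener implements ActionListener {
    String status = "";

    @Override
    public void actionPerformed(ActionEvent e) {
        try {
            saveOrder();
            status = "Saved";
        } catch (SQLException ex) {
            // Tell the user here; nothing upstream will do it for us.
            status = "Save failed: " + ex.getMessage();
        }
    }

    // Stand-in for a real database call, simulating a down database.
    void saveOrder() throws SQLException {
        throw new SQLException("database unreachable");
    }

    public static void main(String[] args) {
        SaveButtonListener listener = new SaveButtonListener();
        listener.actionPerformed(null);
        System.out.println(listener.status);
    }
}
```

The catch block is doing real work, not swallowing: the user sees the outcome either way, which is exactly the guarantee the interface demands.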

I think this leads to the core problem: Some programmers view checked exceptions as coding problems (how to handle them), not as design considerations (how to construct a program which handles serious potential environment errors in a robust way), and therefore don't handle them properly.

If your answer is that my system assumes that the runtime environment is pristine and never fails, and that level of robustness (or lack thereof) is acceptable to you, then indeed wrap exceptions in a runtime exception. It is a step above swallowing the exception, although probably little better than catching Exception and logging the result. But that is the client's choice, not the API's. The API is telling you something will go wrong which you could not account for programmatically. It is up to you if that bothers you.

In large applications where exception swallowing becomes the norm, I would bet that it is a symptom of larger problems, such as developers producing specific functionality rather than well-functioning programs, and/or a highly coupled system without clear boundaries of responsibility, leaving it unclear when to pass exceptions and when to wrap them, or simply not caring about error handling. Error handling should be as much a requirement as the core functionality. If that requirement is crash early, crash often, then wrapping in runtime exceptions, or even swallowing, may be appropriate. At least it is a deliberate decision, and having APIs which throw checked exceptions encourages the deliberation, helping good programmers develop good code that works.

Monday, March 14, 2011

Why multiple choice tests are a bad thing

This is a great example of a bad multiple choice test. Multiple choice tests are hard to write because the one writing the question has to anticipate the misunderstandings that can occur. The first two questions are a disaster and the last one isn't any better. Let's take this a question at a time:

1. Sara works full-time at the Big Save store and earns $2,500 per month. Who pays the contributions to Social Security on the $2,500 per month in wages she earns?
A. Only Sara.
B. Only Big Save, her employer.
C. Both Sara and Big Save, her employer.
D. I don’t know.

So the question is "who pays?" I'm sure they are looking for C (even before checking the answers at the bottom), but in fact you could justify every single answer. From an economic perspective, the employer bases the salary it can pay on the added payroll tax and costs, so really Sara is paying for it in a lowered salary, certainly in the aggregate of all the Saras in all the Big Saves of the country. From the perspective of who writes the check, the answer is B. From the perspective of who sees a deduction on their income, or inflation in their payroll expense, that would be C. And of course you could write D just because the question is such a mess.

2. Amanda has $5,000 saved up from working at different jobs. She puts her money in a savings account that pays four percent a year in interest. How much money will be in her account at the end of the first year and at the end of the second year?
A. End of first year: $5,100; end of second year: $5,400.
B. End of first year: $5,200; end of second year: $5,400.
C. End of first year: $5,200; end of second year: $5,408.
D. I don’t know.

Well, this is a compound interest question. They want C. At least A and B are clearly wrong. But the real answer is D: there isn't enough information, because you don't know how often the interest compounds. The question assumes annually. I don't know of any savings account yielding 4% currently, but I have never seen one that compounds only annually.
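For the record, here is the arithmetic the quiz is after, as a quick sketch with the compounding frequency made explicit:

```java
public class CompoundInterest {
    // Balance after `years` years at nominal rate `annualRate`,
    // compounded `periodsPerYear` times per year.
    static double balance(double principal, double annualRate,
                          int periodsPerYear, int years) {
        return principal * Math.pow(1 + annualRate / periodsPerYear,
                (double) periodsPerYear * years);
    }

    public static void main(String[] args) {
        // The quiz's assumption, annual compounding: $5,200 then $5,408.
        System.out.printf("annual:  %.2f, %.2f%n",
                balance(5000, 0.04, 1, 1), balance(5000, 0.04, 1, 2));
        // Monthly compounding, the common case for savings accounts,
        // gives slightly more each year.
        System.out.printf("monthly: %.2f, %.2f%n",
                balance(5000, 0.04, 12, 1), balance(5000, 0.04, 12, 2));
    }
}
```

With monthly compounding the first-year balance is already above $5,200, so a test-taker who knows more than the question's author gets "penalized" into answer D.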

5. John drove his car to the local Gas and Shop store. On the way to the store he got distracted while talking to his friend in the car and hit a street sign. Neither he nor his friend was hurt in the accident, but the front end of the car was damaged. What type of automobile insurance coverage will provide reimbursement for damages to his car?
A. liability.
B. collision.
C. comprehensive.
D. I don’t know.

If you get collision *or* comprehensive, you will be covered. Technically, when you read down the itemized list of insurance coverage, they may be broken out, but when people think of insurance they often think of levels: liability is less than collision, which is less than comprehensive. So B and C are both financially literate answers, unless you are an insurance agent.

Friday, January 28, 2011

The whole employee

The latest buzz in HR is to provide financial management advice to employees, to avoid situations where you give employees a $300 a month raise and they go get themselves a $400 a month car payment.

It opens an interesting question: how much of your employees' personal lives is your business? Sometimes employees make it your business in their raise requests. But in general, what is the right balance?

I think the answer lies in the difference between being helpful and being intrusive. If it is offered as resources to employees to make them happier, and happier employees are more productive, with the recognition that it is personal and not something you can push, then it is certainly worth doing.

However, if you think that micromanaging your employees' lives is the way to make them satisfied with their salary, I don't see how anything gets better.

Monday, July 12, 2010

Why AGPL (and to a lesser degree GPL) is a business problem

iText went AGPL. On their page they list two additional requirements, and the AGPL does give the licensor the ability to add certain restrictions. One of them is:

In accordance with Section 7(b) of the GNU Affero General Public License, you must retain the producer line in every PDF that is created or manipulated using iText.

That didn't really meet my understanding of section 7(b), so I asked the Free Software Foundation for a clarification:

I read that portion of the license and it says:

Notwithstanding any other provision of this License, for material you
add to a covered work, you may (if authorized by the copyright holders of
that material) supplement the terms of this License with terms:

...

b) Requiring preservation of specified reasonable legal notices or
author attributions in that material or in the Appropriate Legal
Notices displayed by works containing it;


I would understand "that material" to be the source code, and they can require retaining the author attributions and legal notices in their source code. However, a restriction that says you cannot change how the source code performs its function, or requires that it put specific text in an output result of the software, would seem to not be authorized by section b, and would seem to violate the letter and spirit of the AGPL. Can you please clarify?

Thank you very much.

Francois Marier was kind enough to get back to me and replied:

Section 7(b) allows you to add either or both of two separate things.
First is "Requiring preservation of specified reasonable legal notices or author attributions in that material" -- and you're right that "that material" here is source code that you add to the software (per the paragraph at the top of the list in section 7). So the requirement in question does not meet that criteria.

The second thing it allows you to add is a requirement to add similar text "in the Appropriate Legal Notices displayed by works containing it." This goes beyond the source code: per the definition of Appropriate Legal Notices in section 0, these appear in interactive user interfaces provided by the software. However, that still doesn't go so far as including notices in the noninteractive output of the program. So the requirement does not meet this criteria either, and is not acceptable under section 7(b).

Please note that this is not legal advice.

So given the above if you were a company that had a business model that would involve distributing AGPL software, and you wanted to use something from some other Vendor plus iText, you would have a real problem. On the one hand, iText is giving you a non-AGPL compliant license restriction, on the other hand, it is their code, they can do that if they want to. Even if the FSF has a problem with taking and modifying the AGPL for your own purposes, that would be between iText and them, it would not automatically entitle you to use iText under a pure AGPL. Maybe you could make that case in court, but who would want to take that chance?

So the bottom line is that the ability to add restrictions of a limited type, coupled with the inability to add further restrictions can take the whole GPL/AGPL license into a host of incompatibilities. I hope the FSF figures out a better solution for the next version.

Friday, May 07, 2010

Thoughts on Windows 7

So why is Windows 7 so much better received than Vista? My guess is simply that it is backwards compatible with Vista, and the pain of the transition was already taken out on Vista. If Windows 7 came first I think it would have had the same problems.

(BTW, what is the 7 in Windows 7? 7 from what? If Windows 95 was 4, then 98 was 5, ME 6 and XP 7. If NT was 4.0, then 2000 was 5, XP 6 and Vista 7. Why 7? Wikipedia says that 2000 was 5 and XP was 5.1, so Vista was only 6, but then Windows 7 is really only 6.1. I remember when Microsoft justified its moving away from version numbers with Windows 95 by saying everyone was confused by them.)

Need a new Job

It is almost impossible to imagine that a profitable company functions this way, but courtesy of the blog Statically Typed by wheaties we have one horror story after another. Just wow. And he tags it Mea Culpa!

I think the most unbelievable part of the story is the part about how subversion hiccuped on them and they therefore threw out version control (!!!). Of course the naming convention on the file system isn't going to do much good if that hard drive crashes. Do these people even know what backups are???

Whatever it is, I want to know where he works so I can be sure to never, ever buy any of their products.

Tuesday, January 26, 2010

Business Management and the Scientific Method

It is pretty rare that I read a Bob Lewis column and find myself not only disagreeing with everything in it, but finding that it didn't give me any new insights.

This is one of those rare exceptions.

He starts with a description of how debates settle nothing but the question of who is the better debater. The problem is he then misses the point:

When I listen to a debate, at the end I’m only persuaded as to which of the two opponents is the superior arguer. I’m not convinced one position is superior to the other because the whole point is that the superior debater could just as easily persuade me of the opposite proposition.


The real takeaway from that observation is that any debate essentially demonstrates who is the better debater. In order to know which one is right, you have to understand what they are saying, know how to look at evidence beyond what either of them may have presented, and know how to decide which one is correct. The key point is that this is true whether the debaters passionately believe in what they are arguing or are just advocates for a client. The pureness of the debaters' motives doesn't affect this; it is a matter of the effect on the listener.

Missing that point he then goes on to the scientific method, extolling the virtues of two debaters (Leonard Susskind and Stephen Hawking) because their motives were pure. But in reality if you don't understand the subject matter of the debate, you are only judging by the debating skills of the two.

To compound the error, the article completely misses the different techniques that the scientific method has developed to insulate biases (such as falling for superior debating skills) from affecting the results of the research, and instead seeks to apply only the scientific method's machinery of falsifiable predictions. He explains:

Here it is: Scientists understand they can’t ever prove a theory. All they can do is try to disprove them (”falsify them,” according to the vocabulary introduced by Karl Popper, the epistemologist who first explored this subject in depth). Fail to falsify one enough times in enough ways and researchers start to gain confidence that they’re on the right track.

The way they try to falsify a theory is to explore its implications. Some of those implications result in predictions — that under a specific set of conditions, the researchers should be able to observe a specific phenomenon. The researchers then either create those conditions or find them, and either observe the predicted phenomenon or something different. If they observe something different they’ve falsified the theory.


He then sets up what are essentially straw-man predictions and attempts to apply them to IT theories that he doesn't like. However, having dropped one of the main controls for bias in science (a reproducible experiment), the value of the scientific method and its applicability to IT management is nowhere supported by what came before it in the article.

Fans of the scientific method tend not to like the fact that it doesn't apply cleanly to all situations. Not all situations can be analyzed with repeatable experiments. And not all situations allow predictions that are both accurate and meaningful. IT governance is such an example.

In Bob Lewis' own words, here is the prediction he ascribes to the view that IT should be run as a business:

For example, if IT organizations that are run as a business deliver more value than any others, then companies within which they are run this way should be more successful and profitable than competitors that don’t.

Except this would be a bad test. The theory isn’t that running IT as a business is better than the average of all other ways. The theory is that it’s best practice — that it is superior to all known alternatives.

So a better test would be to compare running IT as a business to a specific alternative. Here’s one: Integrate IT into the enterprise, providing strong strategic business leadership and disciplined governance. Compare the success and profitability of companies that use the two different approaches and you’ll have made a start.


The straw man here is the idea that a given theory of IT governance predicts demonstrable "success and profitability." Sure, all things being equal, but they never are. Consider Company A, in 2006, heavily invested in sub-prime mortgages. Consider Company B, in 2006, heavily invested in government work that was heavily funded by the American Recovery and Reinvestment Act (otherwise known as the stimulus package). Is their theory of IT governance going to be the controlling condition of their success between 2006 and 2010?

I suppose you could argue that a good, strategic, IT organization would have predicted the risk of sub-prime mortgage investments, but even if the company had diversified, in mid-2008 it would have looked like a failure.

Of course I am picking on the most extreme, obvious example, but take something as simple as whether you were Starbucks or McDonald's at the end of 2008: what happened afterward (Starbucks down, McDonald's up) has nothing to do with IT governance. Even if Starbucks had the integrated IT approach and McDonald's had the IT-as-a-business model, the "success and profitability" of each company would in fact tell us nothing about their IT practices.

Wednesday, December 30, 2009

Every interaction is with a potential customer

Today I had to schedule a pickup of a FedEx package (paid for by Dell, as it was a return of a defective DVD drive). The FedEx automated systems, online as well as over the phone, require you to have a FedEx account in order to schedule a pickup. By the time I was being transferred to a live person, I had no idea whether they would even pick up the package. When I spoke to the real live person, they were very helpful and scheduled the pickup without problems, but the whole experience left me with the impression that FedEx would not be a good choice for a shipping company. I don't make that decision now, but I may in the future, and FedEx might have lost a customer simply because they didn't care enough about interactions with people who are not current customers.

Wednesday, June 17, 2009

JavaFX backwards compatibility

Interesting to see that JavaFX isn't very finicky about backwards compatibility. Quite a contrast with Java. This tells me that it isn't very popular. I still haven't found a good business toolkit for it.

Thursday, August 21, 2008

Microsoft's Vista blindspot

Microsoft has dedicated a whole website to trying to convince people that Vista isn't so bad after all. The problem repeats one of Keough's descriptions of how he fell for replacing Coca-Cola with New Coke. The relevant commandment is about trusting outside experts: with New Coke, user studies showed that people liked it more in blind taste tests. What those studies missed is that people had an emotional connection to the old Coke, regardless of the sweeter taste.

Here Microsoft is missing that the issue with Vista is not its usability in a vacuum; it is what happens when you try to get it to work with your legacy applications and hardware. No drivers, problems, required upgrades, and so on and so on. They don't test for that in their experiment.

This study has all the feel of some attempt at internal political justification, and not actually marketing a product anyone wants.

And I say this as someone who runs Vista and has no issue with it.

Thursday, July 03, 2008

The most important UI design advice you will ever get

Follow it and your users will thank you forever. Or, as even better praise, they won't notice at all, because it is so obvious to them that this is how it was supposed to work.

When static typing just won't do

Although I'm a fan of static typing, when calling web services it just doesn't make sense. The effort spent mapping the WSDL into types just to make the code compile is already high, and once you add the maintenance cost, it becomes unbelievable. Compare doing this in Ruby on Rails with what it would take using Axis. The difference is just obscene.

The obvious conclusion is that you can still run your code inside a JVM, but use JRuby or something else dynamic for the web service calls, if you have any significant number of web services to consume.

The long-term solution has to be a compiler that understands a WSDL and does static checking against it. That won't help if you want to discover web services dynamically at runtime, but that is really a separate problem from simply consuming a known web service in an application.
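To make the contrast concrete, here is a minimal Ruby sketch of the dynamic approach. Instead of generating stub classes from a WSDL the way Axis does, a dynamic client can turn any method call into a service operation at runtime via method_missing. Everything here is illustrative: the class name, the endpoint URL, and the transport lambda (a stand-in for a real HTTP/SOAP call, which in practice you might make with a gem like savon) are all hypothetical.

```ruby
# A dynamic web-service client: no generated stubs, no recompiling
# when the service adds an operation.
class DynamicServiceClient
  def initialize(endpoint, transport)
    @endpoint  = endpoint
    @transport = transport # callable: (endpoint, operation, args) -> response
  end

  # Any undefined method becomes a web-service operation name.
  def method_missing(operation, **args)
    @transport.call(@endpoint, operation.to_s, args)
  end

  def respond_to_missing?(_name, _include_private = false)
    true
  end
end

# Fake transport that just echoes what it was asked to do, so the
# sketch runs without a network. A real one would build and POST a
# SOAP envelope here.
echo = ->(endpoint, op, args) { { endpoint: endpoint, operation: op, args: args } }

client = DynamicServiceClient.new("http://example.com/service", echo)
result = client.get_quote(symbol: "JAVA")
# result[:operation] is "get_quote" -- the operation name came straight
# from the call site, with no stub classes generated ahead of time.
```

The cost, of course, is exactly what the post concedes: nothing checks at compile time that get_quote exists or that its arguments are right. You trade the WSDL-mapping busywork for runtime discovery of mistakes.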

Sunday, June 15, 2008

Know your business

Via Instapundit we have this timeline of Wikipedia vs. the AP.

The problem for the AP is that they have (or should have) a lower tolerance for getting it wrong. I'll leave it to them to address their specific business problem, but the general lesson is to know your business, and not try to compete on the other guy's terms. Wikipedia has the luxury of a reputation that forgives inaccuracy under the premise that it is easily corrected. The AP lacks that advantage, so instead of competing on speed, it needs to compete on its strength.

Many companies are afraid of this kind of competition because it creates what is called differentiation, and when you are different, you risk losing business even if you execute flawlessly, because the market demanded what made the competition different from you, instead of the other way around. It is much easier to play it safe, especially with a boss, or a board, that will have no problem retroactively deciding that being different was a bad decision. Playing it safe means being just like the other guy, only doing it better, or being able to hide behind the excuse that any failure wasn't due to the concept, because, you see, your product competes feature for feature.

It is those who are in a position to take that chance who can become the next Google.

Monday, May 26, 2008

When your business model is interrupted

Here is a failure of imagination. No one knows how to make money on the Internet, or so conventional wisdom goes (don't tell them), so a business can reach millions more people and lose more money. Like the music business, it is a failure to recognize changing realities and adjust to them. Of course, I wouldn't know how to make money in that business either, which is why they don't pay me several million dollars a year. But there is clearly money to be made. They just forgot that their core selling point to consumers is hard news, not opinion.

Thursday, April 24, 2008

Vista's problem

So Microsoft sends out a list (via its Microsoft at Work RSS feed) of the top 10 things about Vista. It is a very anemic list. With the exception of Windows Aero, everything on that list can be done in XP. And this is their top 10. No wonder they are having so many problems with it. It is Windows ME without the backwards compatibility.