By David A. Cook


If you are not familiar with the quote "It's only a model!", it comes from the movie Monty Python and the Holy Grail. Patsy is a character in the film, played by Terry Gilliam. Serving as King Arthur's assistant, he has only a few simple tasks throughout the entire film, such as using two halves of a coconut to simulate the hoofbeats of Arthur's nonexistent horse. Patsy has only one line in the film: "It's only a model," said when the Knights of the Round Table first catch sight of the castle at Camelot. And indeed, it is only a model. As a matter of fact, if you wish to see the dance routine from the Knights of the Round Table/Camelot Song performed by Lego characters, I suggest you Google "knights of the round table Legos" and have a good laugh.

"It's just a model"? Models are a critical part of what we do for a living. From 2003 through 2009, I supported work on the Airborne Laser while I was employed working for a modeling and simulation company named AEgis Technologies. AEgis Technologies has a lot of people who really understood the modeling and simulation business. In modeling and simulation, you worry a lot about errors. Both Modeling and Simulating can produce type l and type II errors, just as any other branch of statistics. Quickly put, a type 1 error is when the model and simulation shows that something will not work, but it actually will work in the real world (for example, the model and simulation shows an aircraft cannot safely clear a mountain, but in the real world it could have succeeded). A type II error is the other way around, your simulation shows something will work, but it doesn't in real life (for example, your simulation shows you can easily clear mountain on takeoff, but in reality, you crash). When you make this kind of error, especially in the avionics industry, millions of dollars may be wasted, and people may die.

There's also a Type III error, although this one belongs more to the field of verification and validation. A Type III error is when you have solved the wrong problem. Given the problem "can an airplane make it over a mountaintop?", for example, you instead show that a helicopter could. You have solved a problem based upon an invalid interpretation of the requirements. Your model is wrong. A Type III error is not really a modeling and simulation error; it is an error in verification (is the problem being solved in a correct way?) and validation (are you solving the correct problem?). You see, a model must be developed in a correct manner (correctly abstracting and encapsulating behavior from the real world), and it must also be a valid model: a model of the thing you are actually trying to use as a pattern. For example, if you're trying to model the behavior of a Boeing 747 but you capture the flight characteristics of a small Cessna, it doesn't matter how good the model is; it is the wrong model and cannot be used to solve your problem. The initial assumptions or requirements are wrong, so your model will be wrong.
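The 747/Cessna mismatch can be made concrete with the same kind of sketch. In the hedged example below, the arithmetic verifies perfectly, yet the answer is invalid because the wrong aircraft was captured; the climb-rate figures are rough illustrative values only, and the names are mine, not from any real flight model:

```python
# Minimal sketch: a correct computation applied to the wrong model.
# Climb rates are rough illustrative values, not engineering data.

CESSNA_172 = {"name": "Cessna 172", "climb_rate_fpm": 700}
BOEING_747 = {"name": "Boeing 747", "climb_rate_fpm": 2500}

def minutes_to_clear(aircraft: dict, ridge_height_ft: float) -> float:
    """Time (in minutes) to climb from sea level to a given ridge height."""
    return ridge_height_ft / aircraft["climb_rate_fpm"]

# The question is about a 747, but we (wrongly) captured a Cessna.
print(f"Cessna model: {minutes_to_clear(CESSNA_172, 10_000):.1f} minutes")
print(f"747 reality:  {minutes_to_clear(BOEING_747, 10_000):.1f} minutes")
# The computation verifies, but it answers the wrong question:
# a valid-looking result from an invalid model.
```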

To paraphrase Shakespeare: "Requirements? Ay, there's the rub!" It's all about requirements. That's where all our problems start, isn't it?

What does it take to build a good model? I guess that depends on what you're doing. Which leads us to the real topic of this column.

#1 – Modeling is a tricky business. Unless you're comfortable tossing around words like encapsulation, abstraction, verification, validation, coupling, cohesion, inheritance, polymorphism, multiple inheritance, and overloading, you should not be developing the models. Oh sure, you can help, but you lack the tools necessary to do it right. Leave it to the experts to build the model, and be content just using it.

#2 – Using a model requires great familiarity with it. Models and frameworks are seldom trivial. You have to be reasonably smart to pick the right model, and very smart to use the model right.

#3 – There is no perfect model. There are lots of models that will work, and a huge number that won't. Find the one that works best for you, and understand that if it meets 80% of your needs, it's probably good enough. You can spend all your time and budget trying to find the perfect model; it doesn't exist. This is probably related to Pareto's Principle, the "80/20 rule." Sometimes it's better to find a usable model quickly instead of waiting for the perfect one. And remember, the model that is perfect for you will probably not be perfect for anyone else, so the savings from reuse will shrink.

#4 – Be wary of tools that claim to produce executable code from models. Remember that the code produced from a model is only as good as the model itself. You need people trained in both modeling and the use of the model-producing tool to get reliable results.

#5 – Using model-based development and model-based testing is hard. Don't expect to buy the tool in January, take a one-week class in February, and have reliable test cases flying out of the tool by March.

I close with the following anecdote. Back in the 1940s, supposedly, a Texas Aggie was looking for work over the summer. He was an engineer looking to apply his talent. He asked around and found out there was a great need for outhouses that had little odor. Using the highest-quality cedar wood and his knowledge of airflow and ventilation, he built a model of a "sweet-smelling outhouse." After he completed the model, he traveled around South Texas, showing the model and installing hundreds of his sweet-smelling outhouses. Over the summer, he made quite a lot of money. The next summer, he decided to go back into the same business. However, upon driving back to South Texas, he was confronted by many former customers who complained that his outhouses did not work. He decided to investigate and went to one of the first outhouses he ever installed. He came back out a few seconds later and said, "Well, of course it stinks! Do you have any idea what somebody did in there?"

It's all about understanding your requirements, isn't it?

David Cook, Ph.D.

Professor of Computer Science

Stephen F. Austin State University

cookda@sfasu.edu
