For another interesting take on models and their future, check out this story on the Pensacola News Journal's website: Supercomputers Aid Hurricane Forecasting. It gives an interesting look at the state of modeling, what the models do, and so forth. I'm certainly looking forward to the Maryland model in the next year or two. The Gulfstream Doppler should do a nice job of helping get more accurate initialization data into the model.
On a side note, and this is the computer programmer in me, I wonder if NOAA has ever contemplated a distributed-computing-based model concept. Given the success of things like SETI@home, there's a huge pool of computer aficionados out there with tons of CPU cycles going to waste. I would bet good money there are plenty of people (myself wholeheartedly included) who would love to donate their idle CPU time to crunching data for a distributed model system. All you'd need is a thin client for computers to run: it would download "wedges" of grid data for processing and return the results to the central server. Depending on participation, you could probably jam out a model result fairly quickly. Just label it an experimental model so people don't take it as NHC gospel, and it could work wonders. Heck, at USF, where I go to college, they leave their lab computers on 24/7. That's hundreds of computers sitting idle, most of which are 1.5 GHz+ machines. And that's just one college campus....
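To make the "wedge" idea concrete, here's a minimal sketch of how a coordinator might split a model grid into overlapping slices, hand each slice to a volunteer client, and reassemble the returned results. Everything here is hypothetical (the `Wedge` structure, the function names, and the "work" itself, which is simulated by a single Jacobi-style smoothing pass over each slice); a real model step would be far more involved, and a real system would ship the wedges over the network instead of a local function call.

```python
# Hypothetical sketch of a distributed "wedge" workflow: the coordinator
# slices the grid into horizontal wedges (each carrying one halo row above
# and below), clients process their wedge independently, and the
# coordinator stitches the interior rows back together. The per-wedge
# "model step" here is just one Jacobi-style averaging pass.

from dataclasses import dataclass


@dataclass
class Wedge:
    wedge_id: int
    start: int        # index in the full grid of this wedge's first interior row
    rows: list        # interior rows plus one halo row on top and bottom


def split_into_wedges(grid, wedge_height):
    """Coordinator side: carve the grid's interior into wedges with 1-row halos."""
    wedges, wid, i = [], 0, 1
    while i < len(grid) - 1:
        bottom = min(i + wedge_height, len(grid) - 1)
        rows = [row[:] for row in grid[i - 1:bottom + 1]]
        wedges.append(Wedge(wid, i, rows))
        wid, i = wid + 1, bottom
    return wedges


def process_wedge(wedge):
    """Client side: one smoothing pass over the wedge's interior rows."""
    rows = wedge.rows
    out = []
    for i in range(1, len(rows) - 1):
        new_row = []
        for j in range(len(rows[i])):
            left = rows[i][j - 1] if j > 0 else rows[i][j]
            right = rows[i][j + 1] if j < len(rows[i]) - 1 else rows[i][j]
            new_row.append((rows[i - 1][j] + rows[i + 1][j] + left + right) / 4.0)
        out.append(new_row)
    return wedge.start, out


def reassemble(grid, results):
    """Coordinator side: write each wedge's interior rows back into the grid."""
    new_grid = [row[:] for row in grid]
    for start, rows in results:
        for k, row in enumerate(rows):
            new_grid[start + k] = row
    return new_grid


# Simulate one round: split, "send" each wedge to a client, gather, reassemble.
grid = [[0.0] * 3 for _ in range(5)]
grid[2][1] = 4.0
results = [process_wedge(w) for w in split_into_wedges(grid, wedge_height=2)]
new_grid = reassemble(grid, results)
```

Because each wedge carries its own halo copy of the neighboring rows, the clients never need to talk to each other during a pass, which is exactly what makes this kind of problem a decent fit for donated idle CPU time.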