Gonzo blogging from the Annie Leibovitz of the software development world.


Applying predprey models

I haven’t learned yet. Ever have someone tell you not to start reading a book late at night? You laugh at those warnings, right? I did. It started when I began reading Disclosure at around 9 one night; I didn’t put the book down until I finished it at 6 the next morning. I did the same thing with The Lost World, Airframe, and Timeline, reading each one straight through in nearly a single sitting. Michael Crichton has that unique ability to do that to me, something only Stephen King has managed in the past.

I picked up Prey a couple of months ago. It looked interesting (blending nanotechnology and computer programming), so I decided I would give it a shot. Man, it was a great read. It flowed right off the pages and had a lot to say about the technological terrors mankind creates (albeit a little hard-hitting and judgemental, but that’s ok too).

Back when we had wood-burning computers, I did some programming around flocking, dynamic population models, AI, etc., using some of the neural-net and heuristic algorithms available at the time (and trying to come up with my own, failing miserably). They’re all about the relationships between predators and their prey: how they interact, what they do when they get hungry, what patterns they follow when they hunt, whether they learn from following paths that return nothing, and so on.

One algorithm was the infamous Lotka-Volterra Predprey model (which the book makes some reference to and extends). The Lotka-Volterra model is a simple model of predator-prey interactions. The differential equations were developed independently by Lotka (1925) and by Volterra (1926). It answers (or allows you to answer) the question “Do predators influence prey populations?”. The Lotka-Volterra model follows these principles:

Each prey gives rise to a constant number of offspring per year

In other words, there are no other factors limiting prey population growth apart from predation.

Each predator eats a constant proportion of the prey population per year

In other words, doubling the prey population will double the number eaten per predator, regardless of how big the prey population is.

Predator reproduction is directly proportional to prey consumed

Another way of expressing this is that a certain number of prey consumed results in one new predator; or that one prey consumed produces some fraction of a new predator.

A constant proportion of the predator population dies per year.

In other words, the predator death rate is independent of the amount of food available.

Lotka and Volterra made a number of guesses when they wrote their equations; they did not carry out any experiments, so the model rests on assumptions that turn out to be inaccurate. The Lotka-Volterra model makes no allowance for many biological features (such as appetite). A lot of these algorithms are expressed mathematically, and you can spend literally hours just looking at complex graphs and trend models all to see, well, nothing. What is interesting in all this is the fact that a) you can express these models using fairly simple algorithms (a minimal simulation sketch follows the definitions below) and b) it would be great if there were a way to demonstrate and view the evolution of such an algorithm rather than staring at graphs. The predprey model can be mathematically expressed as:

dX/dt = aX – bXY

dY/dt = cbXY – dY

where:

X = size of the prey population

Y = size of the predator population

a = number of offspring per prey per year

b = proportion of the prey population consumed by one predator per year

c = conversion of prey consumed into new predators (i.e. if this were 0.1, then every 10 prey consumed would give rise to 1 new predator)

d = proportion of predator population dying per year
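
Since the claim is that these models boil down to fairly simple algorithms, here is a minimal sketch of what that looks like in code: a straightforward Euler integration of the two equations above, written in C#. The starting populations, parameter values, and step size are illustrative assumptions I picked for the sketch, not anything from the model’s literature or the book.

using System;

class PredPrey
{
    static void Main()
    {
        // Illustrative starting populations and rates (assumptions, not canonical values)
        double x = 100.0;  // X = size of the prey population
        double y = 10.0;   // Y = size of the predator population
        double a = 1.1;    // offspring per prey per year
        double b = 0.02;   // proportion of prey consumed by one predator per year
        double c = 0.1;    // conversion of prey consumed into new predators
        double d = 0.4;    // proportion of the predator population dying per year
        double dt = 0.01;  // time step, in years

        for (int step = 0; step <= 5000; step++)
        {
            if (step % 500 == 0)
                Console.WriteLine("t = {0,5:F1}  prey = {1,8:F1}  predators = {2,8:F1}",
                    step * dt, x, y);

            // dX/dt = aX - bXY ; dY/dt = cbXY - dY
            double dx = (a * x - b * x * y) * dt;
            double dy = (c * b * x * y - d * y) * dt;
            x += dx;
            y += dy;
        }
    }
}

Let it run and the two populations chase each other in the boom-and-bust cycles that all those graphs are trying to show.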

Enter Terrarium

Terrarium is a sample application built by Microsoft. It’s a game for software developers that provides a great introduction to software development on the .NET Framework. In Terrarium, developers create herbivores, carnivores, or plants and then introduce them into a peer-to-peer, networked ecosystem for a survival-of-the-fittest type competition.

The Terrarium server tends to be up and down, but I highly recommend it for anyone who’s interested in beating down their geek friends through programming. In the early 80s we had something called C-Robots (sometimes Pascal-Robots); these were engines that let you, through programming, create a robot and battle it to the death in an arena-type environment. Ten robots in, one comes out. I still have my “killer” bots somewhere on CD. Terrarium revisits this idea, but provides a more biological environment and, IMHO, a better way to build your predators.

You can download the client and, armed with a copy of Visual Studio .NET and the documentation, make your first creature by creating a new class (based on a base class they provide) and introducing it into the ecosystem. You release it, it communicates with the server (through the Terrarium client), and it starts interacting with other people’s creations. I have a 4th-generation creation I call “Xymos” (after the corporation in the book). It’s a carnivore (herbivores are so boring, and plants, well…) that roams the playing field in blocks, looking for fuel and slowing down when it begins to run low (it switches from active to passive based on how it feels; a rough sketch of that switching idea is below). My creature is pretty good and follows the predprey model. I’ve been evolving it: when I find some other creature that kills it, I make some changes (usually to the pathing) and re-release it. The latest generation can survive for a few days, but there is currently a bit of a problem gathering food, so I need to figure that out (my first gen only survived about 10 minutes before it dropped dead of exhaustion).
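
For the curious, the active/passive switch is really just an energy-threshold state machine. The sketch below is not the real Terrarium API; the class, field, and method names are hypothetical stand-ins for the idea only.

// Hypothetical sketch of the active/passive switching described above.
// None of these names come from the actual Terrarium class library.
class XymosLikeCreature
{
    const double LowFuelThreshold = 0.3;  // assumed fraction of maximum energy

    public double Energy;      // current "fuel"
    public double MaxEnergy;   // capacity
    public bool IsPassive;     // true when conserving energy

    // Called once per game tick (however the host engine drives it).
    public void Tick()
    {
        // Switch modes based on how the creature "feels" about its fuel level.
        IsPassive = Energy < MaxEnergy * LowFuelThreshold;

        if (IsPassive)
            SlowDownAndScanForFood();   // passive: conserve energy, grab easy meals
        else
            RoamInBlockPattern();       // active: sweep the playing field block by block
    }

    void SlowDownAndScanForFood() { /* movement and hunting details omitted */ }
    void RoamInBlockPattern()     { /* movement and hunting details omitted */ }
}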

Anyways, if you’re into writing games, programming in .NET, or just want to see how your creation fares against others, give it a shot and I’ll see you on the battlefield.

Microsoft lied!

Okay, so this post will probably blow my MVP nominations for this year, and for most of you out there it’s nothing new, but dammit, Microsoft lied to us!

With the introduction of .NET, one of the big claims I kept hearing was how all the code was compiled down to IL (Intermediate Language), and the CLR, the .NET runtime, would process everything the same way because it was the same code. Gone were the holy wars of VB being slow, C++ being bloated, and C being lickety-split fast. Now everything ran the same because it was the same. Or so they said.

Today I spent the better part of 8 hours in an “advanced” .NET debugging course. Except for the last half hour, when we were actually given a problem to solve, the rest was going through labs where you typed something and watched what the result was. Type in “C-A-T”. Do you see a CAT? You typed “C-A-T”. Yeah. Anyways, rather than screw around with one of the labs, we decided to use our newfound knowledge of the IL code that .NET is based on and poke around a bit.

We created two applications, side by side, with exactly the same code, except one was written in VB.NET and the other in C#. Two applications (TheApp) with a class (TheClass), a method (TheMethod, you’re seeing a pattern here right?), and a variable (TheVariable for those of you that haven’t been following along). Given what Microsoft has told us, the two, when compiled, would look exactly the same to .NET.

Nope.
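
For reference, the method we compiled looked roughly like this in C# (the post doesn’t include the source, so this is a reconstruction from the names above and the IL below, not the original code):

// Reconstructed (not original) C# source for the test method.
namespace TheApp
{
    public class TheClass
    {
        private string _theVariable;

        public void TheMethod(string theParameter)
        {
            _theVariable = theParameter;
        }
    }
}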

Here’s the IL dump from the C# version:


.method public hidebysig instance void TheMethod(string theParameter) cil managed
{
  // Code size       8 (0x8)
  .maxstack  2
  IL_0000:  ldarg.0
  IL_0001:  ldarg.1
  IL_0002:  stfld      string TheApp.TheClass::_theVariable
  IL_0007:  ret
} // end of method TheClass::TheMethod

And here’s the VB.NET version:


.method public instance void TheMethod(string theParameter) cil managed
{
  // Code size       10 (0xa)
  .maxstack  8
  IL_0000:  nop
  IL_0001:  ldarg.0
  IL_0002:  ldarg.1
  IL_0003:  stfld      string TheApp.TheClass::_theVariable
  IL_0008:  nop
  IL_0009:  ret
} // end of method TheClass::TheMethod

Hmmm. One of these things is not like the other. hidebysig? In the C# version but not the VB.NET one. The code size is different (but not by much). Wow, look at that stack! VB’s is 4x the size of the C# one. And what the hell are those “nop” things? In assembler, nop is a non-instruction; it basically tells the computer to do nothing (or as close to nothing as a computer can get). So VB.NET has graciously added a few of those to our code.

Just for kicks another bud built the J# version. Hey, it all compiles to the same IL code right? Uh-huh.


.method public hidebysig strict virtual instance void
        TheMethod(string theParameter) cil managed
{
  // Code size       10 (0xa)
  .maxstack  2
  IL_0000:  ldarg.0
  IL_0001:  ldarg.1
  IL_0002:  stfld      string TheJSharpApp.TheClass::theVariable
  IL_0007:  br.s       IL_0009
  IL_0009:  ret
} // end of method TheClass::TheMethod

At least the “nop” commands are gone, but they’re magically replaced with a br.s to IL_0009, an unconditional short branch to the very next instruction, which amounts to much the same do-nothing. Gotta love that J#.

So let’s do the math. A one-line method in a class produces 3 operations in C#, 4 in J#, and 5 in VB.NET (not counting the ret). Using Chewbacca Defense logic (and math), if I had, say, an application with enough classes and enough methods, the computer would spend 40% more time churning through the VB.NET app than the C# one. Does that sound the same to you, kids?

So Microsoft lied to us and here it is in black and white. C# is better than VB.NET! Let the holy wars begin!

Source control for agile development

Or rather, the quandary of finding one. When it comes to version (revision) control systems, developers working in an agile environment are in a somewhat confusing situation: which system should we use?

Microsoft’s offering is SourceSafe, and it’s the one most of us are probably using (probably because it comes with Visual Studio, so it’s considered “free” and part of the package). However, not much has happened with SourceSafe since the current version was released almost 5 years ago. Why didn’t MS release SourceSafe .NET with Visual Studio .NET? Will there ever be a new version of SourceSafe? With the coming of Whidbey, things might change, but that’s still a year off.

There are several reasons to look for something better than SourceSafe. It is slow. It is even slower across a VPN. It does not handle multiple users editing the same file very well. It has problems opening and closing ss.ini, especially over a slow network. There is no SourceSafe service to restart when you need it to release ss.ini; you have to reboot the server, which is kind of nasty when you are running SourceSafe on a multi-purpose server machine. But worst of all: it corrupts the repository every now and then, and Microsoft’s best practices implore us to run a corruption analyzer every week! Nice.

So what are the alternatives? The most obvious one has to be CVS, which works great over the Internet but is a bit lacking if you want the same kind of Visual Studio integration as SourceSafe. There are a couple of SCCAPI implementations for CVS, but I haven’t found one that works very well; Igloo looks to be the most promising. There is a modern replacement for CVS as well, but naturally it is even less compatible with VS.NET.

So what about the commercial alternatives? There is Borland’s StarTeam and Rational/IBM’s ClearCase, neither of which seems to support an agile environment, and ClearCase is apparently a beast to understand. Then there is Merant’s PVCS, which we’re currently using as our corporate repository, but because of its pessimistic locking scheme, any refactoring just can’t be done effectively. There is also a new kid on the block, SourceGear’s Vault, appealing because it runs on .NET, less appealing because it tries to look and feel like SourceSafe.

So, in short, I’m quite undecided when it comes to which version control system to use. What I want is something that plays well with VS.NET, works efficiently across slow networks, is built on a database, is transactional, is easy to back up and restore, and supports an agile development environment. If it supports multiple checkouts of the same file as well, that would be a nice bonus.

Does anybody have any good recommendations?

PS I’ve added a new commenting system to this site so you can leave feedback on each post. Enjoy!

VB6+

Okay, that’s a rather silly name, but Bill McCarthy has an article called “It’s Time for VB6+” over at FTPOnline. In the article he says Microsoft’s loyal customers (Corporate America) deserve better: they deserve another version of VB6.

I can sympathize somewhat with the words being said here because at work, we’re in a pretty bad place. There are tons of VB6 code all over the place running mission-critical apps, and a corporation being somewhat forced into moving to .NET isn’t the best way to migrate smartly. I don’t think a VB6+ is the answer though, because it becomes just another ugly compatibility bridge between VB6 and .NET.

There are a few things Microsoft has already done to bridge that gap. They have several documents, white papers, and the like on how to migrate from VB6 to .NET. There is an upgrade wizard that will analyze a VB6 app and suggest ways to move it to .NET. Additionally, they created the Microsoft.VisualBasic namespace, which gives VB6 apps compatible functions that map to their .NET equivalents.
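
To make that last point concrete, here is a small sketch of what those compatibility functions look like next to their idiomatic .NET counterparts. It assumes a reference to Microsoft.VisualBasic.dll and is shown in C# only to stay consistent with the other snippets in this post; a migrated VB6 app would call the same functions from VB.NET.

using System;
using Microsoft.VisualBasic;   // the compatibility namespace discussed above

class CompatDemo
{
    static void Main()
    {
        string s = "Hello, VB6";

        // VB6-style helpers from Microsoft.VisualBasic.Strings...
        Console.WriteLine(Strings.Left(s, 5));     // "Hello"
        Console.WriteLine(Strings.Mid(s, 8, 3));   // "VB6" (1-based start, like VB6)
        Console.WriteLine(Strings.Len(s));         // 10

        // ...and their idiomatic .NET equivalents.
        Console.WriteLine(s.Substring(0, 5));      // "Hello"
        Console.WriteLine(s.Substring(7, 3));      // "VB6" (0-based start)
        Console.WriteLine(s.Length);               // 10
    }
}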

The thing is, it’s wrong in so many ways to just “wizard” an app from one language to another (and VB.NET is not VB6 upgraded). The way things are done is different and thus requires some rethinking. It’s not like just grabbing the house and moving it to a new foundation: the foundation has changed and doesn’t necessarily support the way the walls fit together. True, there is some glue here that will help those old walls hold together, but don’t expect them to weather any kind of storm. Without doing some re-engineering, it’s like writing C-style code in a C++ world. It’s not OO, it’s anti-OO.
