Wednesday, April 18, 2007

Lab 10

Flow Chart for the first algorithm in Lab 10



The main difference between the Hartley and Shannon measures of information is that Shannon uses the average amount of uncertainty, weighted by probability, to measure how much information is present in a message, thus accounting for the higher or lower probability of the alternatives in a problem. Hartley's measure, on the other hand, cannot account for the different probabilities of different alternatives. To find Shannon's measure of entropy, you take the average, over a set of weighted alternatives, of the information needed to remove each alternative's uncertainty, with each alternative counted in proportion to its probability. Hartley's method simply takes the number of alternatives, treated as equally likely, and uses its logarithm as the amount of information needed to remove the uncertainty.
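The contrast can be illustrated with a short Python sketch (not part of the lab code, just a worked example of the two standard formulas: Hartley's log2 of the alternative count, and Shannon's probability-weighted average):

```python
import math

def hartley(n_alternatives):
    # Hartley measure: log2 of the number of alternatives,
    # treating all of them as equally likely.
    return math.log2(n_alternatives)

def shannon_entropy(probabilities):
    # Shannon entropy: the probability-weighted average uncertainty,
    # in bits; zero-probability alternatives contribute nothing.
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Four equally likely alternatives: the two measures agree at 2 bits.
print(hartley(4))                   # 2.0
print(shannon_entropy([0.25] * 4))  # 2.0

# Skewed probabilities: Shannon's measure drops below Hartley's,
# because the likely outcome carries less surprise.
print(shannon_entropy([0.7, 0.1, 0.1, 0.1]))
```

With unequal probabilities the Shannon value is strictly smaller than the Hartley value for the same number of alternatives, which is exactly the distinction described above.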

1 comment:

Jinjeet said...

Hi Drew! I'm Drew, too....or perhaps Drew2 since google returns you before me on searches for Drew Philbrick.

Just out of curiosity, what's your middle name? Mine is Scott.

Andrew Scott Philbrick, that's me, living in sunny Phoenix, AZ.

Sorry to post a casual comment on your otherwise quite technical blog.

Nice to meet a fellow Philbrick, especially one with such a fine first name!

-Drew2