What is the best way to explain the difference between these two types of entropy? I can see the difference myself and can give all sorts of intuitive reasons for how the concepts work and how they relate. At the same time, I can see why my (undergraduate) physicist friends would be skeptical when I tell them that no, I haven't got it backwards: a string of all '1's has nearly zero entropy while a perfectly random string has maximum entropy. After all, if your entire physical system degenerates into a mush with no order, about which you know nothing, then you say it is full of entropy.
How would I make them understand the concepts before nerdy undergraduate arrogance turns off their brains? Preferably by giving them the kind of intuitive grasp that will last, rather than just persuading them via charm and appeal to authority. I prefer people to comprehend me rather than merely be able to repeat my passwords. (Except where having people accept my authority and dominance will get me laid, in which case I may have to make concessions to practicality.)
Ugh. I realize you probably know what you are talking about, but a category error like this (a string doesn't have a Shannon entropy; a probability distribution over strings does) is probably not going to help you explain it...
Edit: Actually, I suppose that sort of thing is not really a problem if they're used to the convention where "a random X" means "a probability distribution over Xs", but if you're having to introduce information entropy in the first place, I expect that's probably not the case. The real problem is that the string of all 1s is a distractor: it will make people think the fact that it's all 1s is relevant, rather than just the fact that it's a fixed string.
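To make the fixed-string point concrete: Shannon entropy is a property of a probability distribution, and a fixed string corresponds to a distribution that puts probability 1 on a single outcome. A minimal sketch (the `shannon_entropy` helper is my own, not a standard library function):

```python
import math

def shannon_entropy(probs):
    """H(p) = -sum_i p_i * log2(p_i), in bits, for a distribution p."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fixed string -- all 1s or otherwise -- is a distribution with
# probability 1 on a single outcome, so its Shannon entropy is zero bits:
print(shannon_entropy([1.0]))

# A uniform distribution over four outcomes (two fair coin flips) has
# the maximum possible entropy for four outcomes: 2 bits.
print(shannon_entropy([0.25] * 4))
```

Under this convention, "a random string of n bits has maximum entropy" is a statement about the uniform distribution over n-bit strings, not about any particular string drawn from it.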
Edit once again: Oh, did you mean Kolmogorov complexity? Then never mind. "Entropy" without qualification usually means Shannon entropy.
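If Kolmogorov complexity is what's meant, the usual intuition pump is compression: a compressor's output length is a crude upper bound on Kolmogorov complexity. A sketch using `zlib` as the stand-in compressor (the exact compressed sizes depend on the library and settings):

```python
import random
import zlib

random.seed(0)

ones = b"1" * 10_000                                        # perfectly regular string
rand = bytes(random.getrandbits(8) for _ in range(10_000))  # typical "random" string

# The regular string has a short description ("ten thousand ones"),
# so it compresses to almost nothing; the random string has no
# shorter description than itself and barely compresses at all.
print(len(zlib.compress(ones)))  # tiny compared to 10_000
print(len(zlib.compress(rand)))  # close to 10_000
```

This is the sense in which the all-1s string has "nearly zero entropy" while a random string is "maximum entropy": it is a statement about individual strings, which Shannon entropy, being a property of distributions, cannot make.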