What is the best way to explain the difference between these two types of entropy? I can see the difference myself and can give all sorts of intuitive reasons for how the concepts work and how they relate. At the same time, I can see why my (undergraduate) physicist friends would be skeptical when I tell them that no, I haven't got it backwards: a string of all '1's has nearly zero entropy, while a perfectly random string has maximum entropy. After all, if your entire physical system degenerates into a mush with no order that you know nothing about, then you say it is full of entropy.
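The "all '1's" intuition above is the algorithmic (Kolmogorov) one: a string's complexity is the length of its shortest description. Kolmogorov complexity itself is uncomputable, but a general-purpose compressor gives a rough upper bound, so a quick sketch along these lines (sizes below are illustrative, not exact) can make the point concrete:

```python
import os
import zlib

# Compressed size is a crude upper bound on Kolmogorov complexity:
# a highly regular string has a short description, a random one doesn't.
ones = b"1" * 10_000               # the "all '1's" string from the question
random_bytes = os.urandom(10_000)  # incompressible with high probability

print(len(zlib.compress(ones)))          # tiny: the regularity compresses away
print(len(zlib.compress(random_bytes)))  # near 10,000: no shorter description
```

The regular string shrinks to a few dozen bytes, while the random one barely compresses at all, matching the "random string = maximum entropy" claim in the algorithmic sense.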
How would I make them understand the concepts before nerdy undergraduate arrogance turns off their brains? Preferably by giving them the kind of intuitive grasp that would last, rather than just persuading them via authoritative speech, charm, and appeal to authority. I would rather people comprehend me than be able to repeat my passwords. (Except where having people accept my authority and dominance will get me laid, in which case I may have to make concessions to practicality.)
Sniffnoy's comment has it right. Information-theoretic (Shannon) entropy is closely analogous to thermodynamic entropy, and the two don't contradict each other as you seem to think. Neither is defined on individual bit strings (aka microstates, aka messages); both are defined on probability distributions over them. See this wikipedia page for details. If you have in mind a different notion of entropy based on algorithms and Kolmogorov complexity, you'll have to justify its usefulness to your physicist friends yourself, and I'm afraid you won't find much success. I don't have much use for K-complexity myself, because you can't make actual calculations with it the way you can with Shannon entropy.
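To make the distributions-not-strings point concrete, here is a minimal sketch of Shannon entropy, H(p) = -Σ p log₂ p (the distributions and helper name are my own illustration, not from the thread). A source that always emits "1111" is a point-mass distribution with zero entropy; a source emitting every 4-bit string equally often is the maximum-entropy distribution. The individual string "1111" has no Shannon entropy of its own:

```python
from math import log2

def shannon_entropy(dist):
    """Entropy in bits of a distribution given as {outcome: probability}."""
    return sum(-p * log2(p) for p in dist.values() if p > 0)

# A source that always emits "1111": point mass, zero entropy.
certain = {"1111": 1.0}

# A source emitting each 4-bit string with equal probability: maximum entropy.
uniform = {format(i, "04b"): 1 / 16 for i in range(16)}

print(shannon_entropy(certain))  # 0.0 bits
print(shannon_entropy(uniform))  # 4.0 bits
```

This is why the physicists' intuition and the questioner's are both right: "all '1's" is low-entropy as the output of a deterministic source (or as a simple string, in the Kolmogorov sense), while a maximally disordered system corresponds to the uniform distribution over microstates, which is maximum Shannon entropy.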