Here's a poser that occurred to us over the summer, and one that we couldn't really come up with any satisfactory solution to. The people who work at the Singularity Institute have a high estimate of the probability that an Unfriendly AI will destroy the world. People who work for http://nuclearrisk.org/ have a very high estimate of the probability that a nuclear war will destroy the world (by their estimates, if you are American and under 40, then nuclear war is the single most likely way in which you might die next year).
It seems like there are good reasons to take these numbers seriously, because Eliezer is probably the world expert on AI risk, and Hellman is probably the world expert on nuclear risk. However, there's a problem - Eliezer became an expert on AI risk because he believes AI is a bigger risk than nuclear war. Similarly, Hellman chose to study nuclear risk rather than AI risk because he had a higher-than-average estimate of the threat of nuclear war.
It seems like it might be a good idea to know what the probability of each of these risks actually is. Is there a sensible way to correct for the fact that the people studying these risks are those who had high estimates of them in the first place?
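The selection effect in the question above can be made concrete with a toy simulation. Everything in it is a made-up illustration, not a model of the actual risks: assume each potential researcher draws a noisy private estimate of a risk whose "true" annual probability is some fixed value, and only the people whose estimate clears a threshold choose to study that risk full-time. The published expert estimates then come only from the selected group.

```python
import random

random.seed(0)

# All parameters are hypothetical, chosen purely for illustration.
TRUE_P = 0.05       # assumed "true" annual risk
NOISE = 0.04        # spread of private estimates around the truth
THRESHOLD = 0.08    # only people estimating above this enter the field

# Everyone's private estimate (clamped at zero, since it's a probability).
population = [max(0.0, random.gauss(TRUE_P, NOISE)) for _ in range(100_000)]

# The self-selected "experts": those whose estimate exceeded the threshold.
experts = [p for p in population if p > THRESHOLD]

mean_all = sum(population) / len(population)
mean_experts = sum(experts) / len(experts)

print(f"mean estimate, whole population:     {mean_all:.3f}")
print(f"mean estimate, self-selected experts: {mean_experts:.3f}")
# The expert average overshoots the true value purely because of the
# selection rule, even though no individual is being dishonest.
```

The point is only that conditioning on "chose to study risk X" predictably inflates the group's average estimate of X; a sensible correction would need some model of that entry threshold, which is exactly the hard part.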
200 Soviet nukes lost in Ukraine -- article from Sept 13, 2002. There have been reported losses of nuclear submarines at sea since then as well (though those are unlikely to be recoverable). Note: even if that window is closed now, it was open then, and no terrorist group used that channel to acquire nukes -- nor is there, as your citation notes, even a single recorded attempt to do so -- in the entirety of that window of opportunity.
When dozens of disparate extremist groups fail even to attempt to acquire a specific category of weapon, we can safely generalize to a principle governing how 'terrorists' interact with 'nukes' (in this case): they are exceedingly unlikely to want them.
In this case, I assert it is because all such groups are inherently political, and as such the knowable political fallout (pun intended) of using a nuclear bomb is sufficient that it in and of itself acts as a deterrent against their use: I am possessed of a strong belief that any terrorist organization that used a nuclear bomb would be eradicated by the governments of every nation on the planet. There is no single event more likely to unify the hatred of all mankind against the perpetrator than the rogue use of a nuclear bomb; we have stigmatized them to that great an extent.
A Pravda article about an accounting glitch is not terribly convincing. Accounting problems do not even mean that the bombs were accessible at any point (assuming they existed), much less that they have been available 'on the "black market" (thanks to sloppy soviet handling practices) for decades'! Srsly.
(Nor do lost submarines count; the US and Russia have difficulties in recovering them, black-market groups are right out, even the drug cartels can barely build working shallow subs.)