I preface this by saying that I just started watching Breaking Bad.
Regardless, it's strange watching this show, especially when the DEA works with the drug cartel guys to set up the other drug players. I've heard that American drug policy protects drug cartels, but I really didn't fully understand how until I watched this show. Replace drug cartels with corporations and US drug policy with regulation and...ta da!
So is it just a matter of self-preservation (which I understand), or do these guys really think they're making a difference by driving up the price of the stuff they're "taking off the streets," making somebody that much richer?