As far as I can see, drug companies are in it to keep us ill so that they can keep selling us drugs that don't make us better - just controlling symptoms.
Hell, if they make you better they can't get any more money from you - what is their impetus to make you well?