Medical schools should teach doctors about what really destroys health. Dvjorge.
Lol. Sorry Dvjorge, but the medical establishment is controlled by BIG PHARMA, and the global corporations that control big pharma are the banks. You and I will NEVER see an answer to the many health problems that have plagued the 20th and 21st centuries and beyond, because those who control the medical and political paradigm don't want, and never will want, cures to these health problems. You can hope and wish all you want, Dvjorge, but this will never come to pass. This isn't a conspiracy theory, and it never was.

Just look at the cancer industry: worth multi-billions worldwide, and still no cure. Cancer is the no. 1 disease that keeps the herd, the general global population, down. I believe our masters call it de-population. The reason medical doctors worldwide aren't taught about these health problems is that disease and death make money. It's better to treat patients with pharma drugs and manage symptoms and disease than to go straight for the cure button and make human beings healthy and alive. Being alive and healthy isn't lucrative to the BIG BOYS AT THE TOP.

Doctors are more brainwashed at state-run medical universities than ever before; you could say pre-programmed. It's just treat the disease state rather than what comes before the disease. This just shows how far we've come as a human race: controlled by multi-national global corporations that set out to exploit human beings' weakest points, disease and illness, and to turn both into a global money-making empire worth multi-billions every year. The last laugh is on all of us, not on those in control of our health.