As in traditional radiation therapy, the release of energy disrupts the
internal chemistry of the affected cells, damaging their DNA and thus
preventing them from reproducing or even performing their normal
metabolic functions. The cancer cells thus destroyed are then subject to
the body's own cleansing mechanisms, which flush the dead tissue out of
the body.
The details of mapping the beam's path are daunting, since a nozzle
must be used to simultaneously deliver multiple proton beams from
different angles, each with a different energy. The problem -- besides
the expense of a week's worth of high-priced labor -- is that while the
doctors and technicians are plotting out the paths for the beam to
follow, the tumor continues to grow, thus making success less likely.
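To see why the search space explodes, consider a toy depth-dose model
(purely illustrative, not clinical physics): each beam's energy sets the
depth of its Bragg peak, and the planner must choose angles, energies
and weights so the summed dose covers the tumor while sparing healthy
tissue. A minimal Python sketch, using a rough range-energy rule of
thumb:

    # Toy illustration (not clinical physics) of multi-beam dose planning.
    import math

    def bragg_dose(depth_cm, energy_mev):
        """Crude single-beam depth-dose curve peaking near its Bragg depth."""
        peak_depth = 0.0022 * energy_mev ** 1.77  # rough range-energy rule for water
        if depth_cm > peak_depth:
            return 0.0                            # dose falls off sharply past the peak
        return math.exp(-((depth_cm - peak_depth) ** 2) / 0.5)

    def total_dose(depth_cm, beams):
        """Sum weighted contributions from every beam in the plan."""
        return sum(w * bragg_dose(depth_cm, e) for (e, w) in beams)

    # A plan here is just a list of (energy_MeV, weight) pairs; real plans
    # also vary angle and spot position, which multiplies the search space.
    plan = [(120, 0.5), (135, 0.7), (150, 1.0)]
    for d in range(0, 18, 2):
        print(f"depth {d:2d} cm: dose {total_dose(d, plan):.2f}")

Even in this toy, every added beam multiplies the number of
combinations to evaluate, which is why manual planning can consume a
week.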
IBM's solution is to use a supercomputer to quickly plot out the
necessary path for the proton beam to follow, presenting numerous
alternative therapy plans to the attending physician in just 15 minutes
(instead of a week). As a result, patients can be treated as soon as
they are shuttled from the MRI or CT scanner to the proton-beam therapy
room, ensuring that the tumor has not had time to grow further and thus
greatly enhancing the treatment's chance of success.
"By applying our automation techniques to proton cancer treatment, we
also improve the model that predicts what happens when the beam hits
the tumor, compared to manual methods," said Nassif. "It produces
thousands of different ways to perform the treatment, just like it
produces thousands of alternative ways to optimize chip fabrication."
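That "thousands of alternatives" workflow maps naturally onto a
parallel search: generate candidate plans, score each against a dose
objective concurrently, and surface the best few to the physician. A
minimal sketch of the pattern in Python -- the plan encoding and
scoring function here are hypothetical stand-ins, not IBM's actual
algorithm:

    # Sketch: score many hypothetical candidate plans in parallel, keep the best.
    import random
    from concurrent.futures import ProcessPoolExecutor

    def random_plan(seed):
        """Hypothetical plan: five (angle, energy, weight) spots."""
        rng = random.Random(seed)
        return [(rng.uniform(0, 360), rng.uniform(80, 200), rng.random())
                for _ in range(5)]

    def score(plan):
        """Stand-in objective: reward tumor dose, penalize spill elsewhere."""
        tumor = sum(w for (_, _, w) in plan)
        spill = sum(abs(a - 180) / 180 * w for (a, _, w) in plan)  # fake penalty
        return tumor - 0.5 * spill

    def search(n_candidates=10_000, keep=10):
        plans = [random_plan(i) for i in range(n_candidates)]
        with ProcessPoolExecutor() as pool:       # spread scoring across cores
            scores = list(pool.map(score, plans, chunksize=256))
        ranked = sorted(zip(scores, plans), key=lambda p: p[0], reverse=True)
        return ranked[:keep]                      # alternatives for the physician

    if __name__ == "__main__":
        for s, _ in search():
            print(f"candidate plan scored {s:.3f}")

A cluster spreads the same scoring loop across many nodes rather than
one machine's cores, which is where a Power7 cluster earns its keep.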
Working with its partners at the University of Texas MD Anderson Cancer
Center (Houston), IBM is hoping to reduce the cost of proton therapy by
as much as 60 percent, as well as speed up treatment planning, using
Power7 cluster supercomputers, which perform the computational tasks up
to 1,000 times faster than the manual methods used by doctors today.
Seems to me that this is more a case of computing power getting the job done (that would otherwise take much longer) than analytics innovation. A more apt title would have been "IBM's cluster computing power solving cancer!" The predictive algorithms that the article mentions were sure screaming for more computing power in years past.
The Power 730 cluster is impressive.
With academic research, I'm sure other, more efficient computational platforms could be envisioned that could do this task in the same amount of time but at a fraction of the cost.
From what I can see, the majority of the costs are in the accelerators themselves. That being the case, it seems like a pipelined architecture would be very useful: patients could be fed into the MRI or CT on one end, and while they transit to the accelerator, the computations would follow; see the sketch below.
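To make that pipelining idea concrete, here is a minimal Python sketch,
assuming three stages (scan, plan, beam) with the actual work stubbed
out; while one patient is being scanned, the previous patient's plan is
already being computed:

    # Sketch of a three-stage patient pipeline using queues between stages.
    import queue
    import threading

    def stage(name, inbox, outbox):
        """Pull a patient, do this stage's (stubbed) work, pass downstream."""
        while True:
            patient = inbox.get()
            if patient is None:                   # shutdown sentinel
                if outbox is not None:
                    outbox.put(None)
                break
            print(f"{name}: {patient}")
            if outbox is not None:
                outbox.put(patient)

    scan_q, plan_q, beam_q = queue.Queue(), queue.Queue(), queue.Queue()
    threads = [
        threading.Thread(target=stage, args=("MRI/CT scan", scan_q, plan_q)),
        threading.Thread(target=stage, args=("plan compute", plan_q, beam_q)),
        threading.Thread(target=stage, args=("proton beam", beam_q, None)),
    ]
    for t in threads:
        t.start()
    for patient in ("patient-1", "patient-2", "patient-3"):
        scan_q.put(patient)                       # new patients enter at the scanner
    scan_q.put(None)                              # flush and shut down the pipeline
    for t in threads:
        t.join()

The queues decouple each stage from the next, so throughput is limited
by the slowest stage rather than the sum of all three.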
One article on cost in the WSJ commented that the accelerators were about the size of a football field.
"Last year there were over 12 million cancer patients worldwide receiving various therapies, and that number is predicted to increase to 21 million by 2030"
Can someone shed some light on why this is so? Is the worldwide population growing, or is something else happening?
As more and more developing countries choose to use 'manufactured' foods while abandoning traditional 'healthier' options, this is to be expected.
As regards the increase in cancer patients in step with population growth, I am not sure whether the percentages have remained the same. If anything, I would expect them to have increased if my surmise above holds true!