In the recent "Himalayan tsunami," more than 10,000 pilgrims visiting Uttarakhand in north India went missing and entire villages were swept away. Could any tech projects have saved them?
Bangalore -- Last week I attended a seminar on high performance computing (HPC) where several professors from the Indian Institutes of Technology (IITs) presented papers on the kind of R&D work they were doing in their labs. There were also a handful of officials from the government, including senior representatives from the Centre for Development of Advanced Computing (C-DAC), which works in the supercomputing domain.
All seemed extremely passionate and committed to the research and development of HPC in India -- working on system software to address the performance and usability challenges of high-performance machines, developing compilers and assemblers for high-level languages, exploring next-generation peta- and exascale supercomputing architectures, parallelizing and optimizing application programs... the list goes on.
But what struck me as strange was the fact that nobody seemed to be in touch with the ground reality. And for me, the ground reality at that particular point in time was the recent "Himalayan tsunami," in which more than 10,000 pilgrims visiting Uttarakhand in north India (home to some of the most important Hindu pilgrimage sites) went missing and entire villages were swept away.
A journalist friend of mine had just returned from this hill state and was giving me shocking details of the tragedy as well as the many factors that had led to the flooding: abnormally heavy rain, swollen rivers, and a huge quantity of water, probably released by the melting of ice and glaciers in the high temperatures of May and June. This water not only filled the lakes and rivers until they overflowed but also breached dammed lakes in the upper reaches of the valley.
So I asked around: how is it that we, as Indians, can develop grid computing facilities for scientific and engineering applications in bioinformatics, monsoon modeling, computational fluid dynamics, and open-source drug discovery (OSDD), and still not be able to do anything about tragedies like this? I know nobody can prevent a natural disaster, but we could at least have done something to mitigate this one. Everyone says they saw it coming, given the environmental degradation taking place in that hill state -- and, of course, because there was no way to channel the water to other areas.
River linking project
I met S. Ramakrishnan, former Director General of C-DAC, who narrated this interesting story. The Indian government has for decades been planning to link 37 rivers in the country to put an end to droughts and floods, and he said this river linking project had inspired a few professors since 2003. One team from Madurai-Kamaraj University in south India had come up with a model that could forecast the behavior of rivers in south India. They had created a model of sluices that could transfer water optimally.
This team had managed to collect real data from a few rivers in south India as well as a few in north India. Incidentally, collecting data from different sources is no cakewalk, because any organization that holds it (government or quasi-government) is reluctant to share it. Once this raw data was acquired, a forecasting module, a fault-tolerant architecture, and a geospatial visualization of the results were developed. The system could forecast when the water would arrive, how much there would be, and what effect it would have downstream.
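The article gives no internals of the team's forecasting module, but the general idea -- predicting when water will arrive downstream, and how much -- can be illustrated with a classic textbook device: routing an upstream inflow hydrograph through a linear reservoir. The sketch below is purely illustrative; the residence time, the inflow figures, and the function names are all my own assumptions, not the Madurai-Kamaraj model.

```python
# Toy illustration of river-flow forecasting, NOT the Madurai-Kamaraj
# team's actual model (the article describes no internals). It routes
# an upstream inflow hydrograph through a single linear reservoir and
# reports when the downstream flow will peak and at what level.

def route_linear_reservoir(inflow, k=6.0, dt=1.0, s0=0.0):
    """Route an hourly inflow series (m^3/s) through a linear
    reservoir with residence time k (hours); return outflow series."""
    storage = s0
    outflow = []
    for q_in in inflow:
        q_out = storage / k             # linear reservoir: Q = S / k
        storage += dt * (q_in - q_out)  # water-balance update
        outflow.append(q_out)
    return outflow

def forecast_peak(outflow):
    """Return (hour_of_peak, peak_flow) for a routed hydrograph."""
    peak = max(outflow)
    return outflow.index(peak), peak

if __name__ == "__main__":
    # Hypothetical storm: flow ramps from 100 to 900 m^3/s, then recedes.
    inflow = [100, 300, 600, 900, 700, 400, 200, 150, 120, 100]
    out = route_linear_reservoir(inflow)
    hour, peak = forecast_peak(out)
    print(f"Downstream peak of {peak:.0f} m^3/s expected at hour {hour}")
```

Even this toy version shows the two properties a flood forecaster needs: the downstream peak arrives later than the upstream one (lead time for warnings) and is attenuated by storage along the channel.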
It struck me as odd that we could have the best of both worlds by interlinking the rivers -- a methodology that lets each region draw water voluntarily for hydrological purposes while also preventing floods and droughts -- and yet we were not able to take this any further.
I believe many of us lack the courage of our convictions; we look at this only as a project and not as a societal challenge. In India, we have the right methodologies, the resources, and the talent pool, but something is missing.
We should be able to identify the problem to solve in a generic way, start small, and let the story develop from there. But unfortunately, in India the real story starts only after the Mumbai blasts -- or, in this case, the Uttarakhand tragedy.
And to add insult to injury, someone also mentioned that former Indian president APJ Abdul Kalam, during his presidency (2002 to 2007), was part of a team that visited Canada to study that country's river linking projects and forecasting models. But nothing had come out of that either.
No doubt, the Indian HPC community is quite small -- just 500 to 750 engineers working in this domain, though there are over 1,500 users. But the work being done in the labs is of extremely high caliber. Admittedly, India does not have a high number of processors when it comes to HPC compared to the US or any other developed country, but it makes up for it by getting talented engineers to work on more efficient algorithms to achieve better results.
And while sitting through some of the interesting papers on HPC at the seminar, I couldn't help thinking that this dedication and commitment at the research level comes to naught or worse -- having the intellectual capability to mitigate disasters but not the ability to do so -- and that this is a sort of tragedy in itself, more lamentable than the 6,000-odd people dead or missing in the hill state.