For techies, the most interesting aspect of the NBA is what is happening in the rafters rather than on the basketball court. As an ESPN article recently described, all those clipboards wielded by assistant coaches and statisticians trying to track every play and player formation are being replaced by camera-based game analysis.
SportVU is an Israeli company (acquired in 2008 by Illinois-based Stats) that uses a broad range of technology to increase the speed, precision and quality of sports coaching and coverage. In the NBA, ten teams are using cameras set up in the catwalk areas of their arenas to record, measure and analyze information that was simply impossible to capture before, including precise measures of ball handling, player position and competitive team formations. The camera data feeds into a database that rapidly analyzes it and returns whatever the managers query. Clipboard manufacturers may replace the buggy whip as the metaphor for obsolescence as sports management becomes a leader in melding smart, networked components, big data and real-time analysis to get an edge on the basketball court.
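To make the idea concrete, here is a minimal sketch of the kind of question a coaching staff can now ask of camera-derived tracking data. The player names, field layout and sample points are invented for illustration; this is not SportVU's actual schema or API.

```python
# Hypothetical sketch: summarizing camera-derived player tracking data.
# The (player, x, y, timestamp) samples below are invented for illustration;
# a real tracking system's schema, sampling rate and APIs would differ.
from math import hypot
from collections import defaultdict

# Each sample: (player, x_feet, y_feet, t_seconds), captured many times per second.
samples = [
    ("Rondo",   10.0, 20.0, 0.00),
    ("Rondo",   12.5, 21.0, 0.04),
    ("Rondo",   15.0, 22.5, 0.08),
    ("Garnett", 40.0, 25.0, 0.00),
    ("Garnett", 40.5, 25.2, 0.04),
    ("Garnett", 41.0, 25.5, 0.08),
]

def distance_covered(samples):
    """Total distance (feet) each player covers over the tracked span."""
    last = {}                        # player -> last (x, y) seen
    totals = defaultdict(float)      # player -> feet traveled
    for player, x, y, _t in sorted(samples, key=lambda s: (s[0], s[3])):
        if player in last:
            px, py = last[player]
            totals[player] += hypot(x - px, y - py)
        last[player] = (x, y)
    return dict(totals)

print({p: round(d, 1) for p, d in distance_covered(samples).items()})
# {'Rondo': 5.6, 'Garnett': 1.1} -- the kind of per-player summary a coach
# can query seconds after a possession ends.
```

Multiply that by every player, every frame and a full game, and the case for a real database and analytics pipeline over a clipboard makes itself.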
The basketball example is fun (I'm a Boston Celtics fan), but the coaching staff's dumping of clipboards for iPads illustrates the promise and the problems of designing and deploying new systems that incorporate video, advanced sensors and big data analysis. The NBA franchises have the money (when the season isn't stalled by a lockout) to invest in new technology. You need money, a plan and the component and system skills to make these smart systems happen. If you are missing any of those pieces, you are going to be stuck with that clipboard. Oh yeah, and you also need the skills and experience to build these smart systems, something not taught in schools and generally not part of a design engineer's portfolio.
To get a handle on just what it takes to build these hybrid component/computer/analytical systems, I talked to someone working on some of the really big projects. Tim Durniak is IBM's CTO for the Public Sector and is deeply involved in the company's Smarter Cities projects. Aside from offering some great advice for planners contemplating city projects (start with smaller, well-defined projects that are customer facing and have a strong payback in making the citizen-government interface more fluid and transparent), Durniak noted that the technology trends sweeping the private sector, including mash-up applications, cloud computing and big data analysis, are also key to taking the smarter city concept from idea to reality.
"People are mashing stuff up, for example using a web server to handle customer queries," said Durniak. Also, and this was the statement that got me started down the road to an update on machine to machine interfaces, the need for engineers to rethink their skill level and where CIOs should start investing their technology dollars, Durniak mentioned that the smarter cities projects often provide unexpected benefits as the instrumentation takes place. Leaky pipes, mistimed traffic signals and inefficient electrical power grids all give up their secrets as instrumentation and analysis takes place.
However, unlocking those secrets and delivering on what the "Internet of Things" and machine-to-machine markets have been promising will require rethinking, retooling, new skills and a new, integrated approach to system building that I find missing in many companies. Consider:
Design engineers are directed to build products that are less costly and have strong consumer appeal rather than products that contain the sensors and connections necessary to enable "Internet of Things"-type systems.
CIOs tend to be conservative, more focused on cost efficiencies from consolidating systems and on incremental improvement in their companies. For more on this, see a column I wrote on CIO risk taking: http://www.informationweek.com/news/global-cio/interviews/240001018
Broad-based systems that require smart sensors, big data and big data analysis, robust secure networks and vertical industry expertise are new territory for most companies and demand an inter-company team approach, which is also new territory for companies operating in expertise silos.
Jeff Kaplan, director of ThinkStrategies, sees four obstacles that need to be overcome to make these connected systems happen.
"First, there has to be a substantial increase in available bandwidth to accommodate the explosion of 'big data' being produced by the new endpoints created by M2M solutions.
Second, the deployment and administration of M2M applications has to be simplified. Today's applications require too much customization, and can't be deployed easily enough yet to become pervasive and mainstream.
Third, there are still too many security concerns about compliance issues or access control problems.
Finally, there aren't enough skilled M2M engineers among third-party consulting firms and corporate managers within enterprise organizations who can guide the planning, implementation and administration of M2M deployments."
I agree with Jeff. While it is fun to think about those NBA-type applications that make buggy whips out of clipboards, a broad deployment of new components, skills and executive attitudes will be required to make the Internet of Things a reality.
Eric Lundquist is vice president and editorial analyst for the InformationWeek Business Technology Network.