Is social media a bunch of BS? (There's a new book
which argues yes.) Even if we stipulate that it's not, there's an unspoken consensus that our online interactions with the friends we've never met aren't achieving their full potential.
Enter Intel Labs, where scientists and engineers are investigating what I'd call Social 4.0 (my phrase, not theirs). Back in October, the folks at Intel were kind enough to take me around their facilities in Hillsboro, Ore., and Santa Clara, Calif. My payback for their generosity has been a three-month delay in reporting on what I saw, perhaps because I was so awed by the breadth and depth of the research. Beginning today, I intend to tilt the other way, with a series of posts that'll Intel you out by the time we're done.
When it comes to social, Intel is doing a lot of heavy lifting that could better knit heavily hyped but underachieving social tools into the core of our daily computing habits. (See, social is still so amorphous that it resists a coherent explanation of what it would take to become really useful.) Perhaps it'll take a psychologist with computer smarts -- or vice versa -- to uncover the answer.
"Social media is going to become far more dynamic," Intel senior researcher Margie Morris told me. "It's going to help people [voice emotions] in a more fluid way." With her background as a clinical psychologist -- I was hoping she wouldn't send me a bill after fielding all my questions -- Morris (http://www.intel.com/content/www/us/en/research/people/intel-labs-bio-margie-morris.html) has standing to help cook social's secret sauce.
She's leading an Intel Labs project dubbed Emotions through Images, which I'd characterize as pairing Intel's deep technical smarts with the still-lightweight apps pushing out social content. Then the whole thing is projected out en masse to, hopefully, engage the masses. Consider it a kind of virtual Middle Ages town square, updated for the twenty-first century, but without the bubonic plague.
As Intel's formal description (edited) puts it: "Images taken by individuals on mobile phones with Instagram are projected on a large interactive display. Intel sentiment analysis software studies the captions to infer the 'mood' of the images, which are categorized according to the Circumplex Model of emotion. [This is a kind of psychologist's heat map, which color-codes the pictures.] Individuals are invited to express how an image makes them feel with a touch screen Mood Map. Individuals can also associate images, using emotion to compose arrangements and colors of the entire installation. These compositions allow us to capture and share the collective vibe of events."
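For the curious, here's roughly what Circumplex-style classification looks like under the hood. The Circumplex Model plots emotion on two axes: valence (unpleasant to pleasant) and arousal (calm to excited). Intel's actual software isn't public, so this is a minimal sketch assuming a caption analyzer has already produced valence and arousal scores in [-1, 1]; the quadrant names and colors are my illustrative picks, not Intel's.

```python
# Sketch of Circumplex-style mood mapping. Assumes valence and arousal
# scores in [-1, 1] already exist for each caption (from some upstream
# sentiment analyzer). Quadrant labels and hex colors are hypothetical.

def circumplex_quadrant(valence: float, arousal: float) -> str:
    """Map a (valence, arousal) pair onto one of four mood quadrants."""
    if valence >= 0 and arousal >= 0:
        return "excited"   # pleasant, high energy
    if valence >= 0:
        return "content"   # pleasant, low energy
    if arousal >= 0:
        return "tense"     # unpleasant, high energy
    return "gloomy"        # unpleasant, low energy

# Hypothetical heat-map colors, one per quadrant.
QUADRANT_COLORS = {
    "excited": "#ffd54f",
    "content": "#81c784",
    "tense":   "#e57373",
    "gloomy":  "#7986cb",
}

def color_for_caption(valence: float, arousal: float) -> str:
    """Pick the display color for an image based on its caption's mood."""
    return QUADRANT_COLORS[circumplex_quadrant(valence, arousal)]
```

Each projected image then gets tinted or grouped by its quadrant color, which is what produces the "heat map" effect the description alludes to.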
Vibe -- that's the ticket. Here, look:
The pix are projected by a separate Intel Labs project called Display with Boundaries. Led by Intel researchers Doug Carmean and Carl Marshall, the project aims to change how and where we display and interact with our content.
Success entails much more than just right-angling the image so it's projected onto a wall. Flat-panel displays are clear and regular. Not so painted walls or tables, which have irregularities and different absorption characteristics peppered throughout their surfaces.
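To give a feel for the problem: one textbook fix for uneven surfaces is radiometric compensation -- drive the projector brighter where the surface reflects less, so the viewer perceives a uniform image. This is a standard projector-camera technique, not necessarily what Intel's team implemented; the sketch below assumes you've already measured per-pixel reflectance of the wall.

```python
# Toy radiometric compensation: given the brightness we want the viewer
# to perceive and the measured reflectance (albedo) of each surface
# patch, compute what the projector should actually emit. A patch that
# reflects only 40% of incoming light needs 2.5x the drive -- up to the
# projector's physical limit, where we clamp. Illustrative only.

def compensate(desired: list[float], albedo: list[float]) -> list[float]:
    """Per-pixel projector intensities (0..1) approximating `desired`
    perceived brightness on a surface with reflectance `albedo` (0..1)."""
    out = []
    for d, a in zip(desired, albedo):
        if a <= 0:
            out.append(1.0)          # surface reflects nothing; max out
        else:
            out.append(min(1.0, d / a))  # clamp to projector range
    return out

# A mid-gray target on a wall that darkens toward one side:
print(compensate([0.5, 0.5, 0.5], [1.0, 0.8, 0.4]))
# the output ramps brighter where the wall is darker
```

The clamp is where flat panels win: once the math asks for more light than the projector can emit, dark patches of wall simply can't be fully corrected.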