Archive for the ‘Observation’ Category

The End of the Social Network, the Dawn of the Social Object

A few friends and I are all sitting around the couches in our living room, when Kerry receives a Snapchat from our mutual friend Matt. It’s a video. “I’m opening it!” she exclaims. Panic ensues. We’re sure it’s going to be hilarious — and we only have 5 seconds to take it in. Sienna and I leap across the armchair and plop down next to Kerry right as she holds her thumb down in the bottom right of her iPhone screen.

It doesn’t matter what the video was — the point is that it’s gone now.

A few years ago I wrote an essay on the importance of designing information objects for share-ability. That was in 2012, when Facebook went public.

My argument then was that the network a piece of content originates on doesn’t matter — rather than condemning it to life on that network alone, we should design and package it to take on a life of its own as it travels from place to place.

Now networks, it seems, are dissolving into content. Even the largest social network can’t maintain its outer walls. As MG Siegler puts it, “we’re entering the era of the great unbundling of Facebook.”

“There are powerful categories emerging that once fell under the ‘social’ umbrella that can now stand on their own. The latest ones that interest me are based around new types of specific intents,” he explains, pointing out that photos were the first to break free of the walls of a larger network — then messaging became an independent use case.

As Peter Morville pointed out in Ambient Findability in 2005, “at the heart of many of today’s killer applications lies the power and prevalence of gossip.” The social network has long built on the idea of “communal construction of meaning,” presented by Bruno G. Bara in his study Cognitive Pragmatics. Once an information object is put out into the ether, the internet works together to construct the significance of it.

But are people sick of communal expression?

“Part of Snapchat’s lightweight appeal is that message recipients can’t comment on a photo you’ve sent,” says Ellis Hamburger in a piece for The Verge on the science of Snapchat’s success. It’s not just the ephemeral nature of the content, but the fact that once we had huddled together to laugh over Matt’s video on Snapchat, we’d exhausted its greater meaning.

Total control — that seems to be the prevailing advantage of using Snapchat to distribute a simple visual commentary you might otherwise send on a network with more permanence.

Total freedom, then, might be the motivator that explains the rise of the anonymous social network. In the growing genre of apps that include Secret, Rumr, Whisper and the recently shuttered Rando, the social object is longer lasting, but will never be credited to its owner.

“Anonymity reduces our hesitation to create and express ourselves,” says Ryan Hoover. Fear of judgement is a consistent thread in the discussion — as our resumes are slowly replaced by our digital reputations, no social network is a safe place to think out loud.

The buzz surrounding these apps is no surprise — the psychological attraction we have to sharing dirty secrets in a public forum was popularized long ago in high school bathroom stalls, and even in a wildly popular art-project-turned-book-series.

This space is growing quickly — and catching the eye of Silicon Valley’s top investors — with more players like WUT, Shrtwv, and others putting their own spin on the anonymous social object. Their success is unclear, but their existence is telling.

When the founders of Snapchat pitched the product to Lightspeed Ventures, they insisted it would create “a more real and authentic mode of communication, one where you weren’t ‘performing’ for every present audience in most social media.”

Performing online will still have its place. But Instagram’s one-to-one messaging option, recently replicated in a similar feature by Vine, makes it clear that even in a massive public forum, we’re hungry for more ways to speak in secrecy.

I’m excited to watch communication become more and more fragmented; maybe it’s a growing social anxiety that will force us to reach for new ways to interact.


Command & Control: Human Knots

C. Todd and other students in the Product Design class attempt to unravel their human knot.

This summer I’m studying Product Design in a program called Boston Startup School, a 6-week training program run by Techstars Boston. I’ve fallen way behind on blogging, but hopefully this is the first of many reflections on the amazing lectures and projects we’ve been doing at the Harvard iLabs!

Today Rob Rubin, previously VP of Engineering at Carbonite and now VP of Education at edX, got Boston Startup School up and moving with a few awesome exercises. For a Monday morning, yoga, dancing and untying human knots were daunting, but now I know the coffee’s working!

Our first exercise had pairs of people attempting to ‘manage’ their partners step by step to make 60 steps in 2 minutes. We then changed roles and had the ‘manager’ simply time the exercise, leaving their partner to instruct themselves. The lesson was immediately clear: micromanaging slows things down enormously, and managing for autonomous productivity gets things done way faster and way easier.

That lesson was illuminated in each activity we did — when groups of 5-7 people created ‘human knots,’ leaving one person out of the knot to manage the unraveling process may have made things more streamlined, but it didn’t make things happen any faster, and people felt less proud of the results than they did when the whole team managed the process together. In this exercise, one of the coolest discoveries was that no person in the knot could really understand the full complexity of the issue. We each saw the knot of the person in front of us, and had to trust each other to call out the solutions we saw, and respect that they would be accurate. As Rubin said, “not one single person at Apple knows how the whole MacBook works.” They each understand one part of the problem, and together, form the solution.

Here are a few other lessons learned this morning:

1. Never underestimate the power of the visual cortex. Visualizing processes and metrics is key to conveying their meaning.

2. You can expect what you can inspect. Metrics are essential to continuous learning, improvement and discovery. Know what you’re going to measure, and keep track of that data. Dig for data points everywhere.

3. Competitive environments do not breed productivity. Instead, being open, honest and direct facilitates a much more productive workplace. You should work with people you like, trust and respect, and support each other’s successes.

4. The power of the note-taker is enormous. Like a historian, the person taking notes (e.g. during a usability test) is in control of what information survives the moment and influences future decisions. The note-taker chooses what is remembered, and therefore ultimately decides what is changed about a product later down the road.

The Power of Boredom

Listening to your wandering mind is far more important than you might think — in a world that constantly gives us something to do, the art of boredom is fast disappearing, and with it we may be losing a window of creative brilliance that we can’t access any other way.

In an article in Wired magazine titled “The Importance of Mind-Wandering,” Jonah Lehrer collects some fascinating research on the cognitive implications of the wandering mind. The two most striking points come from Dr. Jonathan Schooler’s lab at UCSB. Their studies distinguish between ‘default’ and ‘executive’ network regions in the brain: the ‘default’ state is the wandering mind, and the ‘executive’ network is the part of the brain that is cognitively aware. Surprisingly, Schooler’s lab found that the two could work together when we’re distracted:

The observed parallel recruitment of executive and default network regions—two brain systems that so far have been assumed to work in opposition—suggests that mind wandering may evoke a unique mental state that may allow otherwise opposing networks to work in cooperation.

As Lehrer puts it, mind-wandering might not be as mindless as we previously thought. What’s really fascinating, though, is what the lab discovered concerning a subject’s meta-awareness of their own day-dreaming. Neural activity in both regions was much higher when the subject was unaware of their own mind-wandering; however, people who didn’t catch themselves day-dreaming scored much lower on creativity tests. In other words, tapping into your wandering mind on purpose is key to creativity, but there may be a whole world of creative energy buried in our subconscious.

In a lecture on media’s impact on society, Dr. Michael Rich, The Director of the Center on Media and Child Health at The Children’s Hospital Boston, told a story of how Einstein discovered the equation that made him famous. Einstein’s first job was as a patent officer in Bern, Switzerland, and he was so bored with his job that he spent his daily commute through the city contemplating his boredom. It was on one of these walks that he discovered the theory of relativity. As the Health Resource Network puts it, “He did have a curious mind… and he wasn’t afraid to think differently than other people around him believed.” Dr. Rich lamented in his lecture the rapidly deteriorating time that a child spends being bored as they grow up. With so many media-rich distractions at their disposal, children spend much less time enduring the boredom that defined the cognitive growth of previous generations.

Ultimately, there’s a strong case for getting distracted every once in a while. In an article in Vogue a few years back, an artist whose name I can’t recall made a powerful suggestion: “Always leave time to do nothing.”

The Creative: Cornerstone of the Modern Community

A recent article in The Atlantic titled ‘The Recession’s Surprise Survivor: The Arts’ got me thinking (again) about the graphic designer’s role in society. But in a bigger sense, the creative’s role in America’s current economic situation is fascinating; when money is tight, budgets for the arts are some of the first things to go, but in a recession it becomes evident that people need an outlet and a way to feel that they still can afford to enjoy their existence. It’s within that basic need, which becomes urgent in times of real loss, that the arts become invaluable. Here’s an excerpt from the article:

As austerity descends on the states, theater and art budgets are fighting to keep taxpayer support. So where is the growth coming from? It’s coming from the private sector, where decidedly un-artistic occupations like engineering, technology, and health care are siphoning artists from the freelance world. For example, the NEA suggests that the growing demand for new health care facilities and the hospitality industry will lead to increased demand for interior designers. Explosive growth in online advertising and interactive multimedia serves as a boon for artists and illustrators, while the growth of digital publishing provides new opportunities for fledgling writers and authors.

Clearly it’s encouraging to a designer like myself that the artistic world has a bright future in America, but from a marketing standpoint it’s been evident for a while that as consumers are given more choice, more freedom and more access to customization, they expect much more out of every experience they have. Suddenly the average Joe knows when a room isn’t well designed, or an ad is low-quality, or a website should function more intuitively. Suddenly platforms are multiplying, and each new technology demands a whole new host of creative skills.

Collaboration, as well, is becoming increasingly common, and the definition of a “collaboration” is held to higher standards. People aren’t happy with clothing collaborations that just have someone’s name on them anymore. As Frank the Butcher, art director of Boylston Trading Company, said in a HypeBeast interview, “Collaborations on any front… only work when they are true creative partnerships… The days of just slapping an ‘x’ between names are over. People want more than just double billing.” Now, a theater group might collaborate with a dance troupe on a project for a clothing store that’s doing a new online marketing series, for which it’s hiring a design studio that’s also working with a freelance illustrator. The same old, same old doesn’t work in advertising, and the answer to satisfying the increasingly demanding attention of consumers lies within the arts.

How Social Media is Pre-Historic

In a moment of foreshadowing, McLuhan wrote the following:

Electric circuitry is recreating in us the multi-dimensional space orientation of the “primitive.”

Multiple things recently have led me to the conclusion that social media, far from being an innovative thing of the future, is simply the natural result of human evolution and the closest we have ever been to human pre-history. Social media appeals to more basic human needs than any other medium of communication.

In The Medium is the Massage, Marshall McLuhan talks about how “all media are extensions of some human faculty – psychic or physical.” While he uses examples such as radio – an extension of your ears – and TV – an extension of your sight – social media makes his point even more poignant. While Facebook might be easy to mock and while incorporating “retweet” into an everyday conversation might still feel weird, these mediums just represent things that humans have been doing for centuries — but magnified to their maximum potential and broadcast across the web.

Marketers across the board should be glad to hear that what this scary ‘social media’ business really is, when broken down, is just ‘word of mouth’ magnified.

In his book Socialnomics, Erik Qualman talks about how ‘word of mouth’ is now ‘world of mouth,’ and how information once passed around verbally through webs of people is now passed directly from one person to the hundreds of people in their network. But it’s still the same thing that has been driving ideas for centuries – word of mouth.
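A back-of-envelope sketch makes the ‘world of mouth’ idea concrete. Assume, purely for illustration, that each person forwards a message to some number of others per hop; real social graphs overlap heavily, so this naive model overstates actual reach, but the orders of magnitude tell the story:

```python
# Naive fan-out model: total people reached after `hops` rounds of
# sharing, if every person forwards the message to `fanout` others.
# Illustrative only -- real networks overlap, so true reach is smaller.
def naive_reach(fanout: int, hops: int) -> int:
    return sum(fanout ** i for i in range(1, hops + 1))

print(naive_reach(3, 3))    # front-porch word of mouth: 3 + 9 + 27 = 39
print(naive_reach(150, 2))  # one post to ~150 followers, re-shared once: 22650
```

The point of the toy numbers: a few verbal hops reach dozens, while a single hop through a social feed reaches thousands — the same mechanism, magnified.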

I really liked how Rick Sanchez, at the Gravity Summit a few weeks ago, talked about how what social media was actually doing was bringing us back to our roots. He introduced the idea of a “cyber-porch,” and explained that social platforms are really just giving us the chance, in our busy schedules and modern lives, to do what our grandparents, and their parents, and their parents before them did so much of: chatting on the front porch.

It goes deeper than that though– social media pulls on some of our deepest human needs and desires. Qualman explains that we “have the dichotomous psychological need to be an individual, yet feel connected to and accepted by a much larger social set,” a contradiction in terms that can’t be answered by television, print, or anything on the internet before, say, 2002. I remember the first time I was asked by a site to set up a ‘profile’ when I was in middle school. I couldn’t figure out why they wanted a photo, a bio, my favorite music, my favorite movies, my favorite quote and where I lived – but I loved filling out all the details and I loved even more the idea that my profile would be viewed and connected to others. It’s the perfect match to fulfill our basic needs.

So how, exactly, does social media take us back all the way to prehistory? That concept revolves not only around social media, but around the entire 21st century digital saturation that we’ve come to expect from almost every experience we have. We live in a 360, 3-D, interactive, real-time world – and while that sounds like the stuff of the future, Marshall McLuhan leads me to believe that what we’re really doing is stripping away the limitations on our perspective that came hand-in-hand with the dawn of civilization.

This applies to our interactions with the digital world especially — augmented reality, 3-D movies, advances in video gaming, responsive web design and social media are how the world SHOULD be seen. McLuhan points out that when, in the Renaissance, humans discovered the vanishing point and learned to paint perspective, they created the detached observer. You were no longer a part of the painting, and the painting no longer depicted more than exactly what you saw. “The instantaneous world of electric informational media,” however, “involves all of us, all at once. No detachment of frame is possible.” In other words, humans are back in the picture.

McLuhan goes on to describe the limitations that the alphabet put on this species that had previously lived in a primarily olfactory environment – pre-historic people perceived space differently, and therefore depicted it much differently. Time and space were integrated together, and both were boundless and horizonless in nature. In their art, pre-historic people put in everything they knew – not according to visual logic, but according to the full experience, so that they fully explained everything they wished to represent.

Isn’t that sort of what we do now?


“Plato pointed out that writing would place us in an unfortunate position: we would read ABOUT objects and then think that we understood them; we would read ABOUT subjects and then imagine that we knew them, as if from firsthand experience.”

If Plato had any hesitations about written word, God only knows what he’d be saying about social media.

Lately I’ve been thinking a lot about the implications of social media, which made Emerson College professor Thomas Cooper’s new book really fascinating. Fast Media, Media Fast is a book about the oversaturation of media in our lives in the information age, and it proposes the concept of a Thoreau-like “media fast,” in which you cut out all media, some media, or certain media for a period of time.

The most striking points he makes, to me, are those that question whether original thinking is possible – what do I actually know when I can only hear myself think? Does being a constant consumer of information impact your ability to be a TRUE creator?

For young generations that are practically born with a smartphone in hand, the question is no longer how the wired world transforms self and society, but what it means that their handheld devices ARE self and society.

I wonder if I could really take a media fast – if I could shut out as much media as could possibly be practical for a month, maybe even more. And what would I learn if I did?

Infographic Takeover

We live in an age of constant, non-stop media saturation. By the time you’ve arrived at the office by 9am, you’ve opened multiple applications on your phone, read a few emails, tweeted, and checked in on Foursquare. By noon, you’re expected to have read the news, covered all the important blogs, tackled your over-crowded email, and finished half your to-do list for the day.

The amount of information we process on an hourly basis is growing exponentially, which means something big for Design: the advent of infographics.

Visual communication has always been an important element in the way we process information, but never before has design been so relevant in every communication effort we interact with. As John Berger says in Ways of Seeing, “seeing comes before words,” and with so little time to process those words, the initial view of a document can be all the attention it gets.

So along come infographics – the ingenious practice of condensing mounds of information into one compact graphic composition. As every single type of written word becomes threatened by the digital revolution, information in paragraphs of Times New Roman 12 pt type doesn’t stand a chance.

Resumes, for example, now need to take it to never-before-required levels of creativity to stand out of the crowd. When Chris Spurlock, a journalism student at the University of Missouri, posted his infographic resume online after graduation, the power of an infographic really showed itself.

He probably didn’t realize that a resume, of all things, could go viral on the internet, but his did — soon, even the Huffington Post had covered how awesome his resume was. In recent months, infographics have turned up everywhere, so I suppose it was only a matter of time before they became automated. Soon-to-be graduates like myself are going to have to think of something else to compete against Chris Spurlock, because a new website is automating the infographic-creation process, using your LinkedIn profile to build visual resumes for everyone.


Here’s an example of a resume:

Crap. Now everyone is a designer! I don’t know what democratizing the process means for the future of design, infographics, and resume-creation, but it can’t possibly be very appealing to designers. I don’t know about other design-minded creatives entering the industry, but I feel threatened.

The only thing that gets me about the onslaught of infographics (they are honestly everywhere, and they’re multiplying like rabbits) is that I have always loved them, and they are going to become watered down FAST. The examples may be more exciting than a traditional resume, but they’re not well-done infographics. Putting together an infographic is one part design, two parts math and logic, and organizing information well visually can’t be done across the board by feeding data into an algorithm. Once the data or process is organized on paper, the infographic should be a unique organization of that information in visual form. The site is going to attempt to use the same process to turn anyone’s LinkedIn information into graphics, and I don’t see that working very well.
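A minimal sketch shows why the automated approach produces sameness. Assuming a toy skills dictionary as input (hypothetical data, not any real site’s method), the generator stamps the identical template onto everyone’s information — here reduced to a plain-text bar chart:

```python
# Toy "automated visual resume": render a dict of skill -> self-rating
# (0-10) as a text bar chart. Every input gets the exact same template,
# which is the one-size-fits-all problem automated services run into.
def text_bar_chart(data, width=20):
    label_w = max(len(label) for label in data)  # align labels in a column
    rows = []
    for label, score in data.items():
        bar = "#" * round(score / 10 * width)    # scale rating to bar width
        rows.append(f"{label:<{label_w}} | {bar} {score}/10")
    return "\n".join(rows)

skills = {"Design": 8, "Typography": 6, "HTML/CSS": 5, "Illustration": 7}
print(text_bar_chart(skills))
```

Swap in anyone else’s skills and the output changes only in its numbers — a hand-designed infographic, by contrast, would reorganize the visual form around the story the data tells.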

Keeping in mind what really powerful infographics can accomplish, here’s one of my favorite sources for great information design: GOOD Infographics. Here are two of my favorite infographics they’ve posted this year, the first by Hyperakt and the second by Column Five.

Injuries at Burning Man Festival


Which Korea has the Bigger Army?