The Creative: Cornerstone of the Modern Community

A recent article in The Atlantic titled ‘The Recession’s Surprise Survivor: The Arts’ got me thinking (again) about the graphic designer’s role in society. In a bigger sense, the creative’s role in America’s current economic situation is fascinating: when money is tight, budgets for the arts are some of the first things to go, but in a recession it becomes evident that people need an outlet and a way to feel that they can still afford to enjoy their existence. It’s within that basic need, which becomes urgent in times of real loss, that the arts become invaluable. Here’s an excerpt from the article:

As austerity descends on the states, theater and art budgets are fighting to keep taxpayer support. So where is the growth coming from? It’s coming from the private sector, where decidedly un-artistic occupations like engineering, technology, and health care are siphoning artists from the freelance world. For example, the NEA suggests that the growing demand for new health care facilities and the hospitality industry will lead to increased demand for interior designers. Explosive growth in online advertising and interactive multimedia serves as a boon for artists and illustrators, while the growth of digital publishing provides new opportunities for fledgling writers and authors.

Clearly it’s encouraging to a designer like me that the artistic world has a bright future in America, but from a marketing standpoint it’s been evident for a while that as consumers are given more choice, more freedom, and more access to customization, they expect much more out of every experience they have. Suddenly the average Joe knows when a room isn’t well designed, or an ad is low-quality, or a website should function more intuitively. Suddenly platforms are multiplying, and each new technology demands a whole new host of creative skills.

Collaboration, as well, is becoming increasingly common, and the definition of a “collaboration” is held to higher standards. People aren’t happy with clothing collaborations that just have someone’s name on them anymore — as Frank the Butcher, art director of Boylston Trading Company, said in a HypeBeast interview, “Collaborations on any front… only work when they are true creative partnerships… The days of just slapping an ‘x’ between names are over. People want more than just double billing.” Now, a theater group might collaborate with a dance troupe on a project for a clothing store that’s launching a new online marketing series, for which they’re hiring a design studio that’s also working with a freelance illustrator. The same old, same old doesn’t work in advertising, and the answer to satisfying the increasingly demanding attention of consumers lies within the arts.


How Social Media is Pre-Historic

In a moment of foreshadowing, McLuhan wrote the following:

Electric circuitry is recreating in us the multi-dimensional space orientation of the “primitive.”

Several things recently have led me to the conclusion that social media, far from being an innovative thing of the future, is simply the natural result of human evolution – and the closest we have ever been to human pre-history. Social media appeals to more basic human needs than any other medium of communication.

In The Medium is the Massage, Marshall McLuhan talks about how “all media are extensions of some human faculty – psychic or physical.” While he uses examples such as radio – an extension of your ears – and TV – an extension of your sight – social media makes his point even more poignant. While Facebook might be easy to mock, and while incorporating “retweet” into an everyday conversation might still feel weird, these mediums just represent things that humans have been doing for centuries — but magnified to their maximum potential and broadcast across the web.

Marketers across the board should be glad to hear that this scary ‘social media’ business, when broken down, is really just ‘word of mouth’ magnified.

In his book Socialnomics, Erik Qualman talks about how ‘word of mouth’ is now ‘world of mouth’: information once passed around verbally through webs of people is now passed directly from one person to the hundreds of people in their network. But it’s still the same thing that has been driving ideas for centuries – word of mouth.

I really liked how Rick Sanchez, at the Gravity Summit a few weeks ago, talked about how what social media was actually doing was bringing us back to our roots. He introduced the idea of a “cyber-porch,” and explained that social platforms are really just giving us the chance, in our busy schedules and modern lives, to do what our grandparents, and their parents, and their parents before them did so much of: chatting on the front porch.

It goes deeper than that, though – social media pulls on some of our deepest human needs and desires. Qualman explains that we “have the dichotomous psychological need to be an individual, yet feel connected to and accepted by a much larger social set,” a contradiction that couldn’t be answered by television, print, or anything on the internet before, say, 2002. I remember the first time a site asked me to set up a ‘profile,’ back in middle school. I couldn’t figure out why they wanted a photo, a bio, my favorite music, my favorite movies, my favorite quote, and where I lived – but I loved filling out all the details, and I loved even more the idea that my profile would be viewed and connected to others. It’s the perfect match for our basic needs.

So how, exactly, does social media take us back all the way to prehistory? That concept revolves not only around social media, but around the entire 21st century digital saturation that we’ve come to expect from almost every experience we have. We live in a 360, 3-D, interactive, real-time world – and while that sounds like the stuff of the future, Marshall McLuhan leads me to believe that what we’re really doing is stripping away the limitations on our perspective that came hand-in-hand with the dawn of civilization.

This applies to our interactions with the digital world especially — augmented reality, 3-D movies, advances in video gaming, responsive web design, and social media are how the world SHOULD be seen. McLuhan points out that when Renaissance humans discovered the vanishing point and learned to paint perspective, they created the detached observer. You were no longer a part of the painting, and the painting no longer depicted more than exactly what you saw. “The instantaneous world of electric informational media,” however, “involves all of us, all at once. No detachment of frame is possible.” In other words, humans are back in the picture.

McLuhan goes on to describe the limitations that the alphabet put on this species that had previously lived in a primarily olfactory environment – pre-historic people perceived space differently, and therefore depicted it much differently. Time and space were integrated together, and both were boundless and horizonless in nature. In their art, pre-historic people put in everything they knew – not according to visual logic, but according to the full experience, so that they fully explained everything they wished to represent.

Isn’t that sort of what we do now?

An argument for destroying the 8-hour work day

Money in both the public and private sectors is pouring into inactivity. We are paying employees to occupy a desk 8 hours a day to complete 2 tasks that could have been completed in 2 hours and 15 minutes. These tasks may be very important, and may require a lot of energy, hard work and ingenuity, but they were completed from 10:30am-12:45pm, then the employee went to lunch, and now they’re sitting at their computer hitting the refresh button on Facebook. Or even better, they were started at 10:30am, dragged out until 4:30pm, and then finally thrown together poorly at 4:55pm.

In the creative industry – actually, in ANY industry – productivity is the result of a good mood, inspiration, and the energy needed to complete the task. But if there are 3 tasks to be completed for the day, why not let employees leave once they are completed? In our hyper-connected world, being on call 24/7 is hardly asking too much – you’re turned on 24/7 anyway. So if another task does come up once the employee has left for the day, feeling the need to slot it into the 9-5 availability period almost seems archaic.

If I could come into work when I wanted to come into work, but was paid for the timeliness and quality of the tasks I was held responsible for completing, I would do everything faster, better, and could easily do MORE. If I could sleep in when necessary, travel to gain inspiration when I felt it was needed, but was happy to do a 14-hour work day if I had a lot to accomplish, my life, and the life of my employer, would be improved.

Jason Fried, cofounder of 37signals, agrees. He insists that productivity comes in waves, and that pushing through periods of low productivity is a waste of time for everyone. To address the issue, he offers employees a 30-day paid sabbatical (on top of vacation time) if they’ve put in three weeks of work and feel that they need to recharge.

According to the Society for Human Resource Management, only about 2% of U.S. employers offer a “results-only” work environment. The Wall Street Journal, however, recently highlighted the growing trend toward unlimited vacation time – many white-collar companies, like Netflix, have stopped tracking vacation time in order to focus on what really matters: the quality of employees’ work.

What employees do with such endless possibilities varies – many end up working more and vacationing less, simply because they don’t feel like they can ask for time off. That goes to show that ultimately, office policies aren’t as important as the office mentalities that become policy in practice. No country I’ve ever visited works as hard as America does, but it’s becoming increasingly evident that taking time off shouldn’t be taboo – it should be encouraged.

Need further evidence? Check out this GOOD infographic on The Overworked American.

Over-Saturation

“Plato pointed out that writing would place us in an unfortunate position: we would read ABOUT objects and then think that we understood them; we would read ABOUT subjects and then imagine that we knew them, as if from firsthand experience.”

If Plato had any hesitations about the written word, God only knows what he’d say about social media.

Lately I’ve been thinking a lot about the implications of social media, which made Emerson College professor Thomas Cooper’s new book really fascinating. Fast Media, Media Fast is about the over-saturation of media in our lives in the information age, and it proposes a Thoreau-like “media fast,” in which you cut out all media, some media, or certain media for a period of time.

The most striking points he makes, to me, are those that question whether original thinking is possible – what would I actually know if I could hear only my own thoughts? Does being a constant consumer of information impact your ability to be a TRUE creator?

For young generations practically born with a smartphone in hand, the question is no longer how the wired world transforms self and society, but what it means that their handheld devices ARE self and society.

I wonder if I could really take a media fast – if I could shut out as much media as could possibly be practical for a month, maybe even more. And what would I learn if I did?

Shoutout: Allan Peters

I discovered Allan Peters through Fabien Barral’s great blog the Graphic-Exchange.

I love this project for BBDO’s 80th anniversary, a speakeasy party. It has a great sense of humor, but maintains the aesthetic of a Prohibition-era speakeasy.

Splitscreen

This was shot with a CELL PHONE camera – what?!

Infographic Takeover

We live in an age of constant, non-stop media saturation. By the time you’ve arrived at the office at 9am, you’ve opened multiple applications on your phone, read a few emails, tweeted, and checked in on Foursquare. By noon, you’re expected to have read the news, covered all the important blogs, tackled your overcrowded email, and finished half your to-do list for the day.

The amount of information we process on an hourly basis is growing exponentially, which means something big for design: the advent of infographics.

Visual communication has always been an important element in the way we process information, but never before has design been so relevant in every communication effort we interact with. As John Berger says in Ways of Seeing, “seeing comes before words,” and with so little time to process those words, the initial view of a document can be all the attention it gets.

So along come infographics – the ingenious practice of condensing mounds of information into one compact graphic composition. As every form of the written word is threatened by the digital revolution, information in paragraphs of 12 pt Times New Roman doesn’t stand a chance.

Resumes, for example, now demand never-before-required levels of creativity to stand out from the crowd. When Chris Spurlock, a journalism student at the University of Missouri, posted his infographic resume online after graduation, the power of an infographic really showed itself.

He probably didn’t realize that a resume, of all things, could go viral on the internet, but his did — soon, even the Huffington Post had covered how awesome his resume was. In recent months infographics have been everywhere, so I suppose it was only a matter of time before they became automated. Soon-to-be graduates like myself are going to have to think of something else to compete against Chris Spurlock, because the new website Vizualize.me is automating the infographic-creation process, using your LinkedIn resume to build a visual resume for anyone.


Here’s an example of a Vizualize.me resume:

Crap. Now everyone is a designer! I don’t know what democratizing the process means for the future of design, infographics, and resume-creation, but Vizualize.me can’t possibly be very appealing to designers. I don’t know about other design-minded creatives entering the industry, but I feel threatened.

The only thing that gets me about the onslaught of infographics (they are honestly everywhere, and they’re multiplying like rabbits) is that I have always loved them, and they are going to become watered down FAST. The Vizualize.me examples may be more exciting than a traditional resume, but they’re not well-done infographics. Putting together an infographic is one part design, two parts math and logic, and organizing information well visually can’t be done across the board by feeding data into an algorithm. Once the data or process is organized on paper, the infographic should be a unique arrangement of that information in visual form. Vizualize.me is going to attempt to use the same process to turn anyone’s LinkedIn information into graphics, and I don’t see that working very well.

Keeping in mind what really powerful infographics can accomplish, here’s one of my favorite sources for great information design: GOOD Infographics. Here are two of my favorite infographics they’ve posted this year, the first by Hyperakt and the second by Column Five.

Injuries at Burning Man Festival


Which Korea has the Bigger Army?