By Veronica Penaloza, Media Intern
In my past semester as an intern at the agency, I’ve come to recognize that a huge topic for GSD&M’s media department is ad blocking. The following is my reflection on the topic and how we’ve created solutions that make a difference for our clients and consumers.
Ad blocking is a sensitive topic in the media world. With the rise of programmatic buying, DSPs (demand-side platforms) and SSPs (supply-side platforms), GSD&M recognized that turning off banner ads is more than a fad: it is here to stay, and blocking tools will only improve with time.
According to a survey by Adobe and PageFair, 28 percent of people in the U.S. browse the web using ad blockers. We have found that ad blockers directly affect vendor performance and are one of the reasons partners under-deliver impressions. Consumers are at the forefront of today’s interconnected world and will remain so for the long term.
With this in mind, GSD&M has taken a mindful approach to the current digital trend and is moving forward with alternative options that still allow content to reach the target consumer through a sponsored-content model, which includes native ads.
GSD&M has already had the opportunity to conduct creative and native-ad media placements with a few key clients. LeapFrog, for example, leveraged its unique children’s educational products through “influencers who believed in the brand’s message” on social media platforms. The media team drafted a native strategy that gave community mavens the freedom to create their own content while incorporating LeapFrog’s brand experience into their own personal content. The campaign outperformed initial KPIs on different social media platforms.
To further the brand’s efforts, our LeapFrog media team partnered with BuzzFeed to produce and publish the video “9 Things I’m Excited to Teach My Kids: Presented by BuzzFeed and LeapFrog.” With more than 400 comments and over 600,000 views on YouTube, the partnership succeeded in reaching an audience that might have otherwise never experienced the content.
Looking ahead into 2016, a lot of GSD&M’s media plans will include more consumer interaction and innovative organic placements as more brands express interest in joining the stream of sponsored content. With ad blocking specifically, we recognize its ability to alter the industry, and so we are rethinking how to further grow the relationships brands have with consumers.
Remaining on the vanguard and leading by example, GSD&M will continue to serve ads that live organically in the user path, adapt to publishers’ internal strategies and solve the current challenge of ad blocking. Once again, GSD&M stays ahead of the curve in the digital media space, bringing our vision of “Ideas that Make a Difference” to our clients’ media catalogs.
By Rye Clifton, Product Strategy Director
One of the biggest elements of RadioShack’s The Phone Call was keeping everything a secret. We wanted the Super Bowl spot and contest to be a big surprise, which meant we had to launch a new campaign, a contest, a series of messages across social accounts, paid media, social skins, landing pages, and PR… all in a 30-second window.
At times the rehearsals felt ridiculous, but they were necessary. We learned several things along the way that helped us optimize and shave time: using tabs, arranging lists in reverse order, and color coding action items on a white board. Our best practice run had us going live in less than 25 seconds.
As the spot aired, we went heads-down. By the twenty-first second, the contest was up, the first tweet was live, and responses started pouring in. Within the first couple of minutes we had four trending topics: #RadioShack, Mary Lou Retton, California Raisins, and Kid ‘N Play. Topsy.com showed initial sentiment levels over 90% positive.
During the contest we logged over 100,000 organic tweets, equating to over 270 million potential related impressions (this accounts for all the conversations surrounding the celebrities, the prizes, and the campaign in general). This was helped by paid targeting, a range of mentions by people like Judd Apatow, Perez Hilton, and Andy Roddick… and a wide variety of news sources from WSJ and Forbes to People and E! Online. The contest alone had over 40,000 entries, and Ad Age said we won the night with a 22-times increase in social mentions. By Monday morning there were over 1,100 articles listed on Google News.
After five months of preparation, we broke through.
Now on to phase two.
(Photos from the war room)
I am of the mindset: Advertising that “works” is rooted in proven science and math.
A few weeks ago, 10 people liked this Instagram photo within 3 seconds of posting, and it made me wonder: Is there a way to measure content “efficiency” – how quickly content moves through a medium (audience) towards a desired direction?
We constantly measure the effectiveness of content a la number of likes, views, RTs, etc. But, in the real-time digital world, speed is also an important factor. Speed combined with traction creates efficiency. And efficiency typically wins in our increasingly digital world.
If we want to truly understand the impact of content we’re producing, what if we applied physics concepts to digital advertising?
Physics: A science that deals with matter and space and their interactions.
Content Physics: A science that deals with content (matter) and its audience (space) and their interactions.
Velocity: The speed at which something moves in a given direction (positive or negative). If the object returns to its starting position, then the velocity is zero.
To calculate Velocity, V = Displacement / Time Taken
Content Velocity: The speed at which content (something) moves in a given direction (positive or negative). If the content (object) returns to its starting position, then the velocity is zero.
To calculate Content Velocity, CV = Displacement / Time Taken
The higher the absolute value of content velocity, the more efficient the content’s influence over that time period. And when brands are in a low season, their content velocity may slow down, maybe even hitting zero for a day or two.
This could also dig into how conducive specific platforms are to high levels of content velocity. Just as a ball may move at different speeds through different liquids, content may move at different speeds through specific platforms, depending on each platform’s properties – how tightly packed the atoms are, the viscosity, the density.
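As a rough sketch, the formula above could be computed from two engagement snapshots. Everything here is hypothetical for illustration: the `Snapshot` type, the choice of “net engagement” as the displacement measure, and the units are all assumptions, not any real platform’s API.

```python
from dataclasses import dataclass


@dataclass
class Snapshot:
    t: float           # seconds since the content was posted
    engagement: float  # net signed engagement at that moment (e.g., likes minus unlikes)


def content_velocity(start: Snapshot, end: Snapshot) -> float:
    """CV = Displacement / Time Taken.

    Displacement is the signed change in engagement between two snapshots,
    so content that ends up back where it started has a velocity of zero.
    """
    elapsed = end.t - start.t
    if elapsed <= 0:
        raise ValueError("end snapshot must come after start snapshot")
    return (end.engagement - start.engagement) / elapsed


# The Instagram example: 10 likes within 3 seconds of posting.
cv = content_velocity(Snapshot(t=0, engagement=0), Snapshot(t=3, engagement=10))
```

In this sketch, `cv` comes out to roughly 3.3 likes per second, and a brand’s quiet day (no net change in engagement) yields a velocity of exactly zero, matching the definition above.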
</Nerd Moment> Anyway, happy Tuesday!
By Kathie Haydon
Data, and large amounts of it, is nothing new to anyone who works in media. I can remember sitting in the media lab as an assistant media planner at a big DOS computer with its blinking yellow cursor, typing in MRI code lines from a 1,200-page binder.
Yes, the digitization of pretty much every channel has opened up the floodgates when it comes to data, and we in media, the original number geeks, are embracing the fascinating evolution of our practice. It makes us smarter in everything we do, from audience development to channel planning to instantaneous optimizations and closing the loop with business results.
But then there is the dangerous side of this data influx: the tendency to lose all constraint and become a data hoarder just for the sake of having it, as well as the darkness that can follow as you climb into a deep hole for days (even weeks) trying to make sense of it all. The cardinal rule of data is quality over quantity. It’s only as good as the actions that come from it. If you’re spending 80% of your time compiling and sifting through the data and only 20% of your time thinking about it, then you have the ratio all wrong.
The good news is that with the sophistication of data comes a whole set of tools that are helping us rewrite the 80/20 rule. The most basic example is Clear Decisions—this tool has transformed the act of entering MRI codes into a DOS computer to get one set of simple demographics against a brand into sitting at your desk and being able to drag and drop coding from multiple sources, as well as manipulate it to build complex audiences while creating crosstabs against thousands of data points with the click of a mouse.
But it’s not just about the existing industry software solutions that are being rolled out to keep pace with the data available; it’s about creating your own tools to fill the gaps or overcome subpar products.
We are working on the next iteration of our longstanding engagement tool, ALLI Plan. Simply put, we are refreshing an already powerful product to better account for changes in how people are consuming media. The key measures of attention, loyalty and lifestyle will remain, but we are expanding those measures to pull in the deeper data out there, ranging from social chatter around TV shows to time spent with magazines’ tablet editions. This expansion of data is only valuable if it can be layered onto our existing baseline measures in a way that makes sense and is easily updated.
GSD&M’s proprietary dashboard, Telescope, is transforming the way we report to our clients. It has made us faster while reducing the propensity for human error, and the net result is better work. Why? Because reports that used to take us 14 hours to compile now take less than two. We can get to actionable insights more quickly, and the tangible outcome is substantial improvement on key performance indicators, in some cases 70 percent.
Welcome to the new world, where we spend 20% of our time compiling information and 80% thinking about it. That is a powerful new ratio.
Our client Katie Livesay (senior analyst) talks about L.L.Bean’s biggest challenges in analyzing data, how implementing the Telescope platform helped automate data flow and how it changed the way she views reporting.
What is the largest hurdle when it comes to analyzing your data?
The most difficult aspect of analyzing our digital data was being able to get a quick snapshot of program performance. Data was coming in from various sources and required manually pulling and aggregating. This took eight to 10 man hours each week in addition to equal, if not more, agency time.
GSD&M recommended implementing Telescope to help automate your data flow. What did you have to do differently to get ready to use the tool?
We worked with the team to show how we wanted to connect our site-side analytics with the ad-serving data. I set up an automated nightly report to be delivered to them, and that was it. They figured out how to show the data in a single view.
How has Telescope changed the way you view reporting?
Telescope has enabled us to view program performance daily (vs. weekly) and has freed up staff hours to focus on optimizing program performance and to proactively respond to the marketplace. Telescope simplified not only the reporting process but also reporting deliverables, which can now be easily shared with senior management.
Has Telescope taken the manual burden out of reporting?
We no longer need to pull data from multiple sources. Automated reports are sent nightly to Telescope, and they handle the rest. It just happens.
In business today, many feel trapped by the amount of data that is created by the marketplace interacting with our brands. We feel compelled to measure and have a desire to understand every ounce of data, every tweet and every line of a report. Data is real-time, complex and growing at an exponential rate, which makes keeping up a seemingly uphill battle. More data was produced in 2011 than in the entire history of mankind through 2010! What do we do? How do we give our data relevance and meaning? Is it possible to transform our data into information? Ultimately, how do we produce actionable business insights from data that allow us to make decisions to move our brands forward? To get a handle on the fire hose of ones and zeroes, we must define success, categorize data into relevant groups, integrate analytics into every department, use the correct tools and automate the process.
Albert Einstein once said, “Not everything that counts can be measured. Not everything that can be measured counts.” It is easy to want to measure everything, but it takes true talent to show that what you are measuring is important. This is the single greatest thing to focus on and will allow you to filter out the noise. Concentrate on segmenting your data to expose a possible insight into a specific persona that aligns most appropriately with your brand. Invest the time to understand what data you can access, and choose tools that allow you to quickly transform it into bite-size portions of information. Additionally, be patient, because it takes time to set up a process that lets you correctly understand your data.
Each year at SXSW, designers and developers get a peek at all the latest technologies, coding languages, platforms and methodologies for interactive work, which is fantastic. However, on the first day (Friday), I found myself attending two design panels back-to-back that reshaped my personal framework for approaching interactive design problems altogether. They turned things upside down for me a bit, and because they were juxtaposed in the same afternoon, I left downtown on the first day with a new sense of what it means to conceptualize a digital solution for our life experience.
The first panel I attended was Designing For Context (see the slides here). Originally, I thought I was going to hear about devices, browsers and platforms, but instead I learned that big-picture conceptual thinking about the physical world should be the first way in which we approach design thinking. When you have to work out design problems for every foreseeable contextual environment and situation, it’s important to start with the human experience. The panelists laid out five main elements that impact interactive projects: Time, Ecosystems, Location, Form & Technology, and Brands & Relationships. These are largely different questions than browser compatibility, page optimization, and click-to-touch cross-device experiences. They are fundamental dynamics that we experience in our daily lives, and they should be at the core of what we are trying to create, whether it be a digital solution that complements, enhances and/or corrects the world. Have a look at the sketchnotes that I took on this panel below.
The second panel I attended was titled Design From The Gut: Dangerous or Differentiator. The panelists discussed the battle between emotional, gut-based design and design based strictly on research, data, and analytics. Through conversation, both of these dispositions were said to have value depending on the project type, scope, timing and “filters” (experience) of those involved. With larger projects (such as designing Facebook or Twitter), research is a much more highly valued asset than the emotional intuition of the design team involved (which also has its valid and appropriate role). For smaller business projects, it really comes down to the gut of the creator/designer, who hopefully stays on the same page as the client, has a proven track record of success, and performs a small amount of research (such as A/B testing, surveys, and double-checking their solutions with valued friends and colleagues). Have a look at the sketchnotes for this talk below.
In conclusion, as I move forward in trying to blend a new set of universal human-to-digital experiential questions (Designing For Context) and a balanced application of both research and emotion (Design From The Gut), I’m hoping the outcomes of my future design projects will be more meaningful, purposeful, and effective. I encourage fellow designers and creatives to consider these dynamics in their own work processes too! Let’s build a better world through better design together.
HubSpot’s Science of Social Media webinar will be certified by Guinness World Records as the largest online marketing seminar ever. Fast-talking Dan Zarrella shared data-backed social media insights and scored a brief spot on Twitter’s trending topics with the hashtag #smsci. Here are some 140-character highlights and the stats behind them:
Creating better content is greatest correlation to growth of social media following. #smsci
Social proof is a risk reduction mechanism #smsci #danzarrella Peeps want to use safe source, but be first to send as well.
Best time 2 tweet is towards the end of the week, the rate of RTs spikes on Friday. #smsci
Nouns and verbs perform better than adjectives. Make sure #Snooki can read it (5th grade level) #smsci
(Note: Putting this in our Reinvention SXSW category because I think adding gaming elements requires you to think outside the box of the typical transactional/meat and potatoes experiences we have interacting with many digital properties. You have to up the ante in your thinking to make user experience game-like and thus more fun!)
This is what every SXSW panel should be like, in my humble opinion. In Game On: Design Patterns for User Engagement, the very charming and clearly-sharp-as-a-tack Ms. Direkova led us on an exploration of incorporating gaming elements in your work and the “gamification” seen in many popular digital properties. Think gaming features are lame? Then chew on stats like registration increasing 200% on one site when gaming elements were incorporated — and the opportunity to harvest lots of analytics to throw at the feet of your doubtful bosses/clients post-launch. Eat those apples! Nadya elaborated on three aspects to consider for the user journey:
Yes. That’s right. When it comes to using personalization and targeting, marketers have been likened to 17-year-old virgins rushing to get into a girl’s pants. Often, brands get so excited about the technology that they don’t think through the proper way to use it or how it might impact the long-term relationship. Marketing’s use of consumer personalization, just like a love relationship, requires time to build trust and intimacy between the two parties. If either party is forced into it too early or that trust is broken, the relationship is over immediately.
The panel today on “How to Personalize Without Being Creepy” did a great job of debating how wonderful personalization can be for a user, as well as how creepy it can turn out to be.