Anastasia Leng is the Founder & CEO of CreativeX, a company that powers creative excellence for the world’s most loved brands. By analyzing creative at scale, the technology aims to advance creative expression through the clarity of data.
You learned marketing at Google and stayed for 6 years. What were your key takeaways from this experience?
Marketing at Google is far from traditional marketing. The work I did during my time there, from 2007 to 2012, was a mixture of marketing, product, and business development. All of my work focused on launching, positioning, and convincing people to use or buy a new technology or product for the very first time. Here are the top three learnings that I still carry with me today (and annoy our marketing team about):
1. Always put users first: It seems simple enough, but it’s astounding how many marketers treat this as a platitude. Don’t assume that what you want is what your users want (a mistake I see over and over again). In fact, a 2016 Thinkbox study and a 2018 Reach Solutions study compared marketers’ beliefs with those of the general public only to find that we erroneously attribute many of our own beliefs to our customers. The researchers described this as an “empathy delusion” and it really put some data behind the fact that we need to do a better job understanding our users.
2. Always avoid jargon: Google did a wonderful job instilling in us the value of clear and simple communication. Even their terms and conditions were written in a way that someone without a legal degree had a chance to comprehend. As a result, I have a Pavlovian cringe response to terms like “thought leadership” or “omnichannel” and I do my best to push our team, and myself, to articulate our views in concise, human, approachable language.
3. Measure everything: Early on in my Google career, I made the rookie mistake of rationalizing my justification for a decision by saying that “we did it this way in the past, so we should do it this way again here.” I chose comfort and familiarity over truly understanding what the situation in front of me actually warranted, and the response from my counterparts was enough for me to avoid making this mistake again. It is obvious but rarely practiced: use data to inform your decisions.
CreativeX is actually your second start-up, could you share the genesis story behind it?
I left Google in 2012 to start Hatch, an ecommerce company that sold customizable lifestyle products. Our thesis was that the typical online shopping experience was exhausting, with consumers having to scroll through pages and pages of products that weren’t quite right. Small to medium-sized businesses took on the burden of predicting consumer demand and were left holding leftover inventory that didn’t sell. Our solution was to create a customizable retail experience, a place where every product could be tweaked to meet the customer’s specifications while reducing the inventory risk carried by the maker.
It remains an idea I deeply believe in, but ecommerce businesses are tough to get off the ground without significant capital investment. As we were building Hatch, we naturally spent a lot of time thinking about how to get consumers over to our site and we were forced to compete for consumer attention with all the usual suspects (Google, Facebook, etc.) but with a fraction of the financial resources. Given that we couldn’t outbid the major ecommerce players, we started wondering how we could outsmart them. We were making data-informed decisions about everything: our audience, the time of day we were advertising, the keywords, etc. Everything but the creative itself. We realized that creative assets were the most important part of our marketing yet the part we understood the least.
We started to build technology to address that problem, and it was that technology, initially intended for our own internal analytics, that led to the birth of CreativeX. Today, CreativeX provides technology to help brands reach creative excellence by measuring, tracking, and improving creative quality, brand consistency, and in-content representation.
Could you discuss the different machine learning technologies that are used at CreativeX to break down images and videos into thousands of attributes?
CreativeX processes every single creative asset pulled into our system (images, videos, and GIFs) and uses a variety of technologies to gather and create a comprehensive set of metadata that enables us to correctly categorize those assets in a customized way.
We analyze four elements of every creative asset.
1. The image and video file: We extract common properties from each file, including asset length, dimensions, file type, etc.
2. The image and video content: We use two types of technologies to understand the content inside each image and video.
- Computer Vision: This allows us to understand the contents of any visual at scale; the data is returned as dozens, sometimes hundreds, of tags for every creative asset.
- Optical Character Recognition (OCR): This allows us to pick up any words used inside the creative. The technology determines not only the amount of text being used, but also any text-specific branding requirements (e.g. taglines, positioning, language).
3. The copy accompanying each visual: If the creative is live, we also pull in any accompanying text description.
4. The sound file for video: Every audio track is transcribed into parseable text, which enables audio rules to be set up for each brand.
We’ve built tools to combine all of that data in smart ways to scalably and accurately analyze content for both the presence of objects and the concepts that marketers want to measure.
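The four analyzed elements described above can be sketched as a simple pipeline that merges per-detector outputs into one metadata record per asset. This is a minimal illustration, not CreativeX's actual system: every function, field, and returned value here is a hypothetical stand-in for real computer vision, OCR, and speech-to-text services.

```python
from dataclasses import dataclass


@dataclass
class Asset:
    """A creative asset (image, video, or GIF) and its basic file properties."""
    asset_id: str
    file_type: str     # e.g. "mp4", "jpg", "gif"
    duration_s: float  # 0.0 for still images
    width: int
    height: int


# --- Stub detectors standing in for real CV / OCR / speech-to-text services ---

def computer_vision_tags(asset: Asset) -> list:
    """Hypothetical CV call; real systems return dozens to hundreds of tags."""
    return ["person", "outdoor", "logo"]


def ocr_text(asset: Asset) -> str:
    """Hypothetical OCR call picking up on-screen text."""
    return "Just Do It"


def transcribe_audio(asset: Asset) -> str:
    """Hypothetical speech-to-text call; empty for still images."""
    return "" if asset.duration_s == 0 else "introducing our new lineup"


def analyze(asset: Asset, copy: str = "") -> dict:
    """Combine the four analyzed elements into one metadata record."""
    return {
        "asset_id": asset.asset_id,
        # 1. File properties
        "file": {
            "type": asset.file_type,
            "duration_s": asset.duration_s,
            "dimensions": (asset.width, asset.height),
        },
        # 2. Visual content: CV tags plus OCR'd on-screen text
        "tags": computer_vision_tags(asset),
        "on_screen_text": ocr_text(asset),
        # 3. Accompanying copy (if the creative is live)
        "copy": copy,
        # 4. Audio transcript (videos only)
        "transcript": transcribe_audio(asset),
    }


record = analyze(Asset("vid-001", "mp4", 15.0, 1080, 1920), copy="Shop now")
```

In a real system each stub would call out to a trained model or external API, but the shape of the output (one unified metadata record per asset) is what makes downstream categorization and rule-checking tractable.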
How important is customizing the visual cues and elements that are measured?
The ability to customize what we track for each brand is critical. Data is only as powerful as its ability to provide clarity on something that’s topical to your organization, which is why one-size-fits-all computer vision may be tough for marketers to use off the shelf. This is the problem we struggled with in the early days of Hatch: we might detect the presence of dresses and understand how frequently we’re using them, but if you’re a car company, that insight is irrelevant. That is why we’ve invested a tremendous amount of time in customizing the type of detection we provide, so we can map it to what is unique about each brand, its industry, and its challenges. That often includes building detection that reflects the brand’s guidelines or voice, how it’s positioned in the market, and how it differentiates from its competitors, ultimately getting to the heart of the big creative questions the marketers on that team are debating.
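One way to picture this kind of customization is as a set of brand-specific rules expressed as predicates over the per-asset metadata. The rule names, tags, and tagline below are invented for illustration and do not reflect any real brand's guidelines or CreativeX's implementation.

```python
# Hypothetical brand-specific rules, each a predicate over the metadata
# (visual tags and OCR'd on-screen text) produced for an asset.
BRAND_RULES = {
    "logo_present": lambda meta: "logo" in meta["tags"],
    "tagline_on_screen": lambda meta: "just do it" in meta["on_screen_text"].lower(),
    "no_competitor_logo": lambda meta: "competitor_logo" not in meta["tags"],
}


def check_rules(meta: dict) -> dict:
    """Return a pass/fail result for each of the brand's custom rules."""
    return {name: rule(meta) for name, rule in BRAND_RULES.items()}


meta = {"tags": ["person", "logo"], "on_screen_text": "Just Do It"}
results = check_rules(meta)
```

The point of the design is that the detection layer stays generic while the rule layer is swapped out per brand, so a car company and a fashion retailer can run entirely different checks over the same underlying tags.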
What type of actionable insights can be gained from this application?
CreativeX technology can help you gain insights into the creative quality, brand consistency, compliance, and representation of all your image and video content. With this information, marketers can determine how much of their content meets their minimum standard of quality and is set up for success based on the unique parameters required on each platform, and how much money they (and their agencies) are spending on producing and promoting content that does, and does not, adhere to those standards. They can measure how consistently their brand teams are communicating about the brand (are they marching to the beat of the same drum? Consistently using the same distinctive brand assets?) and how representative their casting decisions have been. All of this helps marketers take back control of their creative content and truly understand and measure, at scale, the health and alignment of their creative decisions.
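As a rough sketch of the spend analysis described above, the compliance rate and the spend behind non-compliant content reduce to simple aggregations over per-asset records. The records and dollar figures here are invented for the example.

```python
# Hypothetical per-asset records: whether each asset met the brand's
# minimum quality standard, and the media spend behind it.
assets = [
    {"id": "a1", "meets_standard": True,  "spend_usd": 12_000},
    {"id": "a2", "meets_standard": False, "spend_usd": 8_000},
    {"id": "a3", "meets_standard": True,  "spend_usd": 5_000},
    {"id": "a4", "meets_standard": False, "spend_usd": 15_000},
]

total_spend = sum(a["spend_usd"] for a in assets)
compliant_spend = sum(a["spend_usd"] for a in assets if a["meets_standard"])

# Share of assets meeting the standard, and spend behind those that don't.
compliance_rate = sum(a["meets_standard"] for a in assets) / len(assets)
spend_at_risk = total_spend - compliant_spend

print(f"{compliance_rate:.0%} of assets meet the standard")   # 50%
print(f"${spend_at_risk:,} spent on non-compliant content")   # $23,000
```

In practice these aggregations would be sliced by platform, market, or agency, but the underlying arithmetic is this simple.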
CreativeX has performed both a racial and gender analysis of thousands of ads, what were some of the results from this analysis?
We analyzed 2,378 FMCG (fast-moving consumer goods) ads in the US and found that, despite much attention given to the topic of representation, inclusive representation still requires much work. Our analysis of racial diversity, for example, showed that Black people are more likely to be cast in ads where sport or exercise is a theme and less likely to be cast in leadership roles. When we looked at gender representation, we found that brands are still perpetuating negative gender stereotypes: men dominate professional roles, and women are more likely to feature doing certain domestic activities like cleaning. Even with fewer on-screen appearances, men feature in more speaking roles, though we’re seeing some progress with the increased portrayal of women in leadership roles.
What are some other ways that you can see machine learning improving the advertising landscape in the next 5 years?
One of our investors used to say that lots of industries that claim to use machine learning have machines, and they have learning, but it’s not always clear that it’s the machines that are learning.
My view is that we’ll see deeper (or in some cases, actual) application of machine learning in advertising to continue to improve the bread and butter things that the industry is already doing: predicting consumer propensity to click and buy (targeting), generating creative variations based on consumer data (dynamic ad creative), parsing through more data to generate insights (reporting).
Machine learning is likely to get put on the case of figuring out what other signals can replace the loss of third-party cookies on Chrome and IDFA on iOS and how we can continue to personalize advertising despite the loss of that information.
Is there anything else that you would like to share about CreativeX?
A bit cheeky but… we’re hiring! If you’ve made it to the bottom of this article and are interested in how to better unite data and creative expression, we’d love to talk!
Thank you for the great interview. Readers who wish to learn more should visit CreativeX.