MozCon 2019 in Seattle, WA featured dozens of speakers and a wealth of information.
Every year since 2006, SaaS company Moz has hosted its own digital marketing conference, and this was the first year I've been able to go. Despite the large crowd, I found everyone extremely friendly compared to other summits I've attended.
Any digital marketer can gain quite a bit from going, but obviously not everyone can make it. Or maybe you did attend and your dog ate your notes (that excuse still works, right?). So I thought I'd put together as helpful a synopsis of this year's event as I can muster - hopefully making your lives a little easier.
Keep in mind these summaries are full of my own biases about what was most impactful and helpful to me personally in each presentation, so feel free to download the slides by clicking the titles if you're looking for more details - or click the presenter's name below and reach out to them on Twitter. Every image links to its respective source.
Now, without further ado, here’s the MozCon 2019 recap you’ve clicked to read.
- Projections aren't necessarily true. For instance, despite many predictions that new mediums kill old ones, old media tends to stick around (see radio, TV, and apps not replacing websites). The latest example is "voice search killing web searches" - not founded in the data.
- Google still sends the overwhelming majority of traffic.
- Both mobile and desktop are still important enough to consider for websites + search.
- Voice may not be worth a big investment just yet.
- Web-based video, websites, and podcasts are still excellent investments, while mobile apps aren't as much.
- Zero-click searches are now 48% of all searches across mobile and desktop (last year that percentage reflected just mobile searches).
- Among the ranking factors, coming from a trusted source and having the most accurate info are trending as important in 2019.
- Social is becoming more about viral content instead of traditional shares (see: Twitter showing you tweets with lots of engagement, even when you may not be following the user).
- Rich snippets on SERPs continue to be a big focus, but that also means less trackable marketing. 😞
Human > Machine > Human: Understanding Human-Readable Quality Signals and Their Machine-Readable Equivalents by Ruth Burr Reedy
Folks in the SEO field tend to focus on inputs like keyword use, link volume, and metadata, but you also have output data on the SERP. Look at the output data for clues.
Natural language processing is evolving - see "why does my TV look strange" = a snippet explaining the soap opera effect phenomenon.
Try Google's Natural Language API demo and see it in action (you really should, it's cool!).
- When thinking about Tag Manager, Analytics, and Data Studio, think of them working together like the factory, warehouse, and showroom, respectively.
- When setting up Google Analytics, name things clearly and document what you did. If you don't explain why something exists, you might forget later.
- An advanced tip for Analytics: save the CID, which is GA's client ID, as a custom dimension. Below is how you find it in Inspect Element, and here is the custom script you can use on CodePen.
- Create a custom parameter in Google Analytics (how to) that can talk to a webhook platform like Zapier.
- Utilize Google Data Studio to blend your data together and have it appear in one place. Here are some cool chart examples.
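The CID tip above links to the actual CodePen script, but the underlying idea is easy to illustrate. Here's a rough Python sketch of the cookie parsing (the function name and the common `GA1.2.<id>.<timestamp>` cookie shape are my assumptions, not from the talk; the real setup does this in JavaScript on the page and pushes the value into a custom dimension via Tag Manager):

```python
# Hypothetical sketch: extract the GA client ID (CID) from a _ga cookie value.
# A _ga cookie typically looks like "GA1.2.1234567890.1609459200";
# the client ID is the last two dot-separated fields joined together.

def cid_from_ga_cookie(cookie_value):
    """Return the client ID portion of a _ga cookie value."""
    parts = cookie_value.split(".")
    if len(parts) < 4:
        raise ValueError(f"Unexpected _ga cookie format: {cookie_value!r}")
    return ".".join(parts[-2:])

print(cid_from_ga_cookie("GA1.2.1234567890.1609459200"))
# → 1234567890.1609459200
```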
By and large, there's no such thing as a national SERP - every results page is tailored to the user's perceived location.
Across a dataset of 1.2 million SERPs, 73% contained some kind of local feature.
SERPs varied by 15%-85% with a simple change in the searcher's zip code for the SAME TERM.
Multisampling is your friend - tracking terms in a multitude of zip codes simultaneously.
What do you know, Moz is coming out with a tool that specializes in that (nice plug)! Get an invite: Mz.cm/LMA2019
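The multisampling idea above boils down to a simple calculation once you have rank samples. Here's a hypothetical Python sketch (the zip codes and rank numbers are made-up illustration data, not from the talk):

```python
# Hypothetical multisampling sketch: given the rank a page holds for the SAME
# term sampled from several zip codes, summarize how much location moves it.

def rank_spread(ranks_by_zip):
    """Summarize best, worst, and spread of local ranks across zip samples."""
    ranks = list(ranks_by_zip.values())
    return {
        "best": min(ranks),
        "worst": max(ranks),
        "spread": max(ranks) - min(ranks),
    }

# Made-up Seattle-area zip codes and ranks for one tracked term.
samples = {"98101": 3, "98052": 7, "98004": 12, "98109": 4}
print(rank_spread(samples))
# → {'best': 3, 'worst': 12, 'spread': 9}
```

A large spread is exactly the 15%-85% variance problem: a single-location rank report hides how differently the same term performs a few zip codes away.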
You want to make content that has a high likelihood of driving results, no? Here's a hint: look at what's worked in the past! New methodology: research, rethink, remix.
Research is about knowing which platform your audience is on and the channels they're visiting.
Rethinking is gathering ideas that have already worked - for example, browsing top posts on Reddit or niche forums to find content your target audience has already upvoted or that has gone viral.
Remixing is imagining how you could reinvent what's worked in the past to be applicable today. This is not straight-up copying what was done before.
ProductHunt is a cool website to see what tech products and platforms are trending (thank you Ross for mentioning this site - I'm a huge fan of it myself!).
Planned Editorial: Preparing content for an event that's coming - as much as you can until you have the last bits of information to hit publish (think: the birth of a celebrity child).
Planned Reactive Editorial: Focusing on interest and responses gained from your planned editorial items (think: featuring a common reaction to child being born).
Reactive Editorial: modifying your content to react to outside phenomena or hot news (think: relating recent birth to pattern of children being born during month of June).
When done properly you can get a heavy amount of all three in one larger, all-encompassing piece of content. But you need to proactively seek out reactive editorial opportunities.
When it comes to creating a GMB presence from scratch, primary category and additional categories are the most effective way to optimize the listings.
Generally, filling out every field and using the latest features allow for engagement signals and potential conversions.
We don't know for sure whether a GMB-generated homepage helps listings rank, but it doesn't hurt. And an actual website helps listings rank better.
When building citations, use a secondary phone number to avoid spam calls, but use it as the additional phone number field on your GMB listings to make sure the data is still tied together.
Use the Google Indexation Tester to determine if your citation links are paying off.
Adding multiple service areas to a listing doesn't help rank, it's simply visual.
Google posts and reviews helped in ranking. But with enough volume, it hits the point of diminishing returns.
Other highlights: track local rankings from multiple zip codes around the city, ensure your citations are indexed, and remember it's the aggregate of all activities that moves the needle.
The goal should be to be rather than seem.
Old SEO practices were all about seeming rather than being; we now call that black hat SEO.
How do we move from seeming to being? By content thoroughness, accessibility, and page speed.
Focus on adjustments that improve the user experience.
When it comes to tactics, there's plenty of room for SEOs to play around in while staying kosher.
Building a Discoverability Powerhouse: Lessons From Merging an Organic, Paid, & Content Practice by Heather Physioc
Rather than aiming for complementary team dynamics, strive for interdisciplinary ones.
Settling for complementary means you can divide into silos as your team grows, and the reports you send clients or stakeholders easily reflect that.
Cross-training means each team member can speak intelligently about what the other one is doing - multi-channel reporting should be done in the same room and not cobbled together.
Fine tuning Google SERPs involves keeping the item's data accurate, visually appealing, and complete. Google wants GMB to be about "real things in real places."
From 2004 to 2014, Google was still using the closest US Post Box to measure distance to listings; since then, it's been measured by the user's location (thank the Lord).
To show up, Google cares about three things: Relevance (gauges whether or not the listing provides what the user is looking for based on what's provided in the GMB listing + reviews + social), Prominence (how well-known is it based on if it's linked to or if others are talking about it), and Proximity (is the listing close enough to be considered good?).
Depending on the query, listings may show up in different orders. "Near me" may prefer literal proximity, whereas "open now" will deprioritize listings that are closed at the time of the user's search.
Since 2015, Google has doubled-down on local search, as they've been continually improving how listings look on mobile with new features and enhanced UI.
The GMB cover photo is a critical image that will be the first impression users see.
It's possible to allow stars to appear on organic listings if you use proper schema and as long as you don't do review gating.
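The "proper schema" point above refers to schema.org review markup. As a rough illustration, here's a Python sketch that emits the kind of JSON-LD fragment involved (the product name, rating value, and review count are placeholders I made up; the real markup must reflect genuine, ungated reviews on the page):

```python
import json

# Minimal sketch of schema.org AggregateRating markup, the kind of structured
# data that can make review stars eligible on organic listings.
# All values below are placeholders, not real data.
markup = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Widget",
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.6",
        "reviewCount": "128",
    },
}

# In practice this JSON goes in the page inside
# <script type="application/ld+json"> ... </script>
print(json.dumps(markup, indent=2))
```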
Google posts can be better than blogging.
Be proactive on Google Q+A - answer the questions or someone else will.
Try other new features like GMB Profile Messaging, Bookings, and Services as it relates to your industry.
If you focus on being the best brand, you'll be rewarded.
Unfortunately, 47% of B2B marketers don't measure ROI from their content marketing strategy - but luckily 70% prioritize content quality over quantity.
The most frequently used visual content types are stock photos (40%) and infographics (37%).
Folks remember stories 22 times more than just facts.
[Wil is pretty angry at Google for using suckers' money to pay themselves for platform flaws]
In general, most companies are wasting a lot of money on CPC helping fund Google's sometimes-sloppy keyword logic (example: assuming GA = Georgia and adding irrelevant phrases to the mix).
Be careful with the word "with."
The four main problems with the state of keyword research are: your data is too small (monthly search volume is a joke metric), your data is too siloed (you need to look at paid data, too, when considering SEO opportunities), you have bias (you resist data that says your assumptions are wrong), and your data is slow (use his thing!).
It's scary how there's an actual limit on negative terms, because of course there is. Too many limitations and that would prevent Google from making more money.
This quote from Ben Gomes, VP of Search at Google, is pretty intriguing: "You can view the rater guidelines as where we want the search algorithm to go. They don't tell you how the algorithm is ranking results, but they fundamentally show what the algorithm should do." Source.
Here's how Google Fights Disinformation. It's basically by seeking out trustworthiness.
According to John Mueller, setting up author profiles on your website with the proper structured data can help tell Google who the authors of your content are.
Running tests is an important practice.
"Technical problems are people problems."
We often dress up our follow-up findings to sound more authoritative than we are, to avoid looking stupid. AKA, the Supplementary Findings are actually more like The Findings I Should've Found The First Time Round But Didn't, So I'm Choosing To Call Them Supplementary Findings To Sound Like An Expert. (This cracked me up!)
As a tech SEO, the best you can do is influence priorities, but you're not actually able to implement them.
Instead of arguing for every change with a list of 50+ recommended tweaks, pick one or two of the largest, most impactful changes and push for those. If results improve, the nice-to-haves can follow.
Not all stories are successes, but we should be able to learn from each one nonetheless.
75% of households will have at least one smart speaker by 2020.
Search is moving from answers to actions.
Voice is so accessible, a 1-year-old can use it.
AnswerThePublic is a cool tool that gives you a visualization of questions and comparisons for similar terms and phrases.
Voice is being used for finding a quick fact, asking questions, getting directions, and much more.
Optimize featured snippets for voice, utilize voice schema, use bots and actions for vCommerce.
Technical SEO is a much broader term than most assume, since it can be applied in almost every link building, digital PR, content strategy, or "traditional" SEO situation. Therefore it's a mistake to relegate the definition to simply "website infrastructure."
So here's a new definition: "Any sufficiently technical action undertaken with the intent to improve search results." Thanks, Russ Jones.
Checklist technical SEO is what most generally consider to be website infrastructure-related items.
General technical SEO is made up of crawling, indexing, rendering, and internal linking analysis.
Blurred-responsibility technical SEO relates to user experience, front end web development, and structured data.
Advanced applied technical SEO would be like testing, data science, and automation.
Knowledge of coding can make advanced, applied technical SEO much easier - but it's not the only way into the practice.
- There's a relatively new section on Google SERPs known as the People Also Ask section. Here's what it looks like.
- PAA has recently blown up. Even though it looks like a featured snippet, CTR is actually pretty low.
- The Inverted Pyramid for Answers goes Answer > Detail > Data.
- Want your top 20 questions for your own website or topic? You can sign up for them here: https://moz.com/20q Thanks, Dr. Pete!
Mobile-first indexing is entity-first indexing. The knowledge graph is growing, and featured snippets are increasing in number.
People Also Ask is exploding (here's a similar blue graph to the one we saw from Dr. Pete above but its colors are inverted).
Generally it's difficult to track and attribute traffic from SEO - so we need to constantly be doing searches ourselves to see what it's doing.
Over time Google realized pages are an inefficient way to organize answers, so hello Jump Links on AMP Featured Snippets. Not only will it grab just the part of the page it thinks is relevant, but a click will have the page move immediately to that part in the paragraph and highlight it for the user. Chrome users will be able to share links to words or phrases on pages soon.
Google wants to index more than just websites - and windowing content into a SERP is how Google plans to do it.
Try indexing via API instead of crawling - it's faster and less reliant on links.
When it comes to conversion rate optimization, there could be potentially thousands of places where demand is outstripping supply.
What if your eCommerce site's search results got indexed for 0 result pages? Check if they're a proper 404... if not, you're probably getting a lot of frustrated users landing on your pages from organic searches (lol Best Buy).
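The zero-result problem above is a classic "soft 404": an empty search results page that returns HTTP 200 and gets indexed anyway. Here's a small Python sketch of the check you'd run on crawl data (the URLs, statuses, and result counts below are hypothetical):

```python
# Sketch of a soft-404 check for internal search result pages. A zero-result
# page that returns HTTP 200 can get indexed and send frustrated organic
# visitors to an empty page; it should return a proper 404 instead.

def is_soft_404(status_code, result_count):
    """A 200 response with zero results is a likely soft 404."""
    return status_code == 200 and result_count == 0

# Hypothetical (url, status_code, result_count) rows from a crawl.
pages = [
    ("/search?q=blue+widgets", 200, 14),  # fine: real results
    ("/search?q=asdfgh", 200, 0),         # soft 404: indexed, but empty
    ("/search?q=qwerty", 404, 0),         # correct behavior
]
flagged = [url for url, status, count in pages if is_soft_404(status, count)]
print(flagged)
# → ['/search?q=asdfgh']
```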
Custom extraction allows you to scrape any data from the HTML of web pages when crawling. Thanks, Screaming Frog, for that definition.
You can use Screaming Frog as the tool to perform a custom extraction by hitting Configuration > Custom > Extraction.
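Screaming Frog's custom extraction supports XPath, CSS selectors, and regex. For a feel of what it's doing under the hood, here's an equivalent regex extraction in plain Python (the HTML snippet, class names, and values are made up for illustration):

```python
import re

# The same idea as Screaming Frog's regex-based custom extraction: pull
# arbitrary fields out of a page's HTML. Sample markup is hypothetical.
html = """
<div class="product">
  <span class="price">$19.99</span>
  <span class="sku">SKU-1042</span>
</div>
"""

prices = re.findall(r'<span class="price">([^<]+)</span>', html)
skus = re.findall(r'<span class="sku">([^<]+)</span>', html)
print(prices, skus)
# → ['$19.99'] ['SKU-1042']
```

In the real tool you'd paste the selector or pattern into Configuration > Custom > Extraction and get one extracted column per crawled URL.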
35% of what people purchase on Amazon and 75% of what people watch on Netflix are based on recommendations. I guess they're pretty important.
Figure out what "bolt on" products customers proactively buy at scale and vet your category navigation URLs in main menus - make sure they're not too thin.
It's generally true that folks that land on a blog post are less likely to convert than people who land on a purchase page. That just makes sense - it's based on intent. But think about what people are more likely to link to - your blog content of course! That is, if it's incredible and not crap.
Bad content is worthless; amazing content is incredible.
75% of articles have zero external links.
Make your content worthwhile. How? Try starting with presenting original research. Only 27% of companies publish original research, it's a huge opportunity to be the authoritative source to something that no one's done before. Imagine being the best page on the internet for your topic.
Relationships in content marketing are invaluable, so collaborate with influencers. Since content creators are the 1% and consumers are the 99%, don't let go when you find a good one. As a result, you'll get better content quality and better social reach, while also growing your professional network.
Write guest posts, you coward.
You don't need 1000 articles, you need 100 really good ones.
Upgrading to visual formats means finding unicorns to make baby unicorns. Make your blog posts infographics, videos, or something else.
SEO is changing. The top 10 local ranking factors of 2005 aren't the same as the top 10 2019 ranking factors.
Poor user experience has shown massive decreases in traffic. See: Forbes post-ad attack.
Best practices aren't as clear cut as some may suggest. The same optimizations can result in improvements on one website, but do completely nothing to traffic on the other. This is why testing is so vital.
After a change, it takes 2-4 days on average to see results. Luckily, Google can discover and index changes quickly, and the effects of changes are reversible as long as you're quick.
Best case scenario: set up a parallel universe. Take two pages that have very similar traffic and are similar in general, and apply changes to one while leaving the other as your control group. Give it some time and see which performs better, then roll the winning version out everywhere.
Make sure to have numbers attached to each of your potential changes to keep track of them.
Local Search ranking factors are: Link signals, on page signals, and behavior signals.
Google My Business grew 32% over last year - so embrace it as your new home page and give it some love. It might be the first impression users get, so spruce it up and use the new features no one else is using.
Google Posts stay live for 7 days (unless you use the "Event" post type), and attached images are 1200 by 900 pixels (for now) - but they're cropped oddly depending on screen form factor. Also try uploading videos - you can use any video under 100MB and under 30 seconds.
Be wary that depending on the type of post you choose, you'll have a different potential amount of real estate to use. For example, a "What's New" post gives you the most with 3 visible text lines and a CTA link.
Add UTM tracking to CTAs - then you can know if they're performing.
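Tagging a CTA link is just appending three query parameters. Here's a small Python sketch (the base URL and parameter values are examples; pick source/medium/campaign names that match your own GA conventions):

```python
from urllib.parse import urlencode

# Sketch: build a UTM-tagged CTA link so GMB clicks show up as their own
# segment in Google Analytics. All values here are example placeholders.
def utm_url(base, source, medium, campaign):
    params = urlencode({
        "utm_source": source,
        "utm_medium": medium,
        "utm_campaign": campaign,
    })
    return f"{base}?{params}"

print(utm_url("https://example.com/", "google", "gmb", "post-june"))
# → https://example.com/?utm_source=google&utm_medium=gmb&utm_campaign=post-june
```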
Google Q+A: check it regularly. If you're not answering, someone else will. 3 upvotes on an answer make it show up natively, but the most upvoted answer shows first. If you include a phone number or URL in your response, it will get filtered - so avoid that.
A good number of people don't really get the feature and type random stuff in - they either think it's a review or a lead form. Pay attention!
Load your own questions on Q+A and answer them yourself! That's totally allowed.
Ableism refers to discrimination in favor of able-bodied people.
When writing for your website, think about inclusive, gender-neutral language.
Using words like "crazy" or "stupid" can be potentially upsetting - beyond that, they're often stand-ins for more accurate, precise words.
85% of consumers place more importance on visuals than on text information when shopping online for clothing or furniture according to The Intent Lab.
62% of millennials would like to be able to search by image.
58% of millennials would like to be able to click to purchase directly from content.
Pinterest is the platform for visually-based purchases.
Product images should: be on brand, be clutter free, have a clear focal point in the foreground, provide context, use customized stock photography, include multiple angles, and add value.
The #1 factor in ranking in local results is location / proximity. Case study.
Use LocalFalcon to put in a business, then a keyword and scan. Get some local pack insights.
PlacesScout also provides visibility by providing map pack reporting.
Use UTM codes in Google My Business. The native reporting tools aren't great.
Tracking calls from Google My Business is a good idea.
Something Joy observed: when she changed a GMB listing's "website" field from the local page to the homepage, traffic went up, even though the homepage's content was less relevant. But test this for yourself.
Reviews account for 15% of how Google ranks a local business. Review gating is a big no (asking customers about their experience first, then sending the positives to leave a review and the negatives to a private form). GatherUp is a good platform that provides alternative, Google-compliant methods (also, Aaron and Mike are really cool guys!).
Keep in mind, "removed" reviews aren't truly deleted - they're just hidden.
The Possum Filter occurs when Google believes that two listings in the same industry that are very close together are actually duplicates, and it hides the weaker of the two.
Keyword stuffing in GMB titles massively improves ranking, but don't do it because it's against Google's TOS and you may get removed permanently if you try it.
The types of featured snippets are: paragraph, list, table, video, and accordion (similar to People Also Ask).
2 main takeaways: Always be testing, create high quality content for people, not machines.
National SERPs don't exist; it's about very specific locations.
24% of SERPs have a featured snippet.
People Also Ask is featured on 93% of featured snippet SERPs.
Featured snippets are only on 34% of PAA SERPs.
65% of all SERPs have a PAA box.
50% of all FS are part of a carousel.
Keep start word triggers in mind. Lists, paragraphs, and tables each have trigger words like "how," "does," or "best."
Keep written length in mind - if you write something beyond 270 characters, your competitor's 115-character answer might show up as the snippet instead (also think about voice - shorter answers are better).
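That length budget is easy to check before you publish. Here's a tiny Python sketch (the 270-character limit comes from the talk; the sample copy is placeholder text):

```python
# Sketch: flag candidate answer paragraphs that blow past the ~270-character
# featured snippet budget mentioned in the talk.

SNIPPET_LIMIT = 270

def snippet_ok(text, limit=SNIPPET_LIMIT):
    """True if the answer fits within the snippet character budget."""
    return len(text) <= limit

# Placeholder answers: one concise, one padded out past the budget.
short_answer = ("A featured snippet is the highlighted answer box that "
                "appears at the top of some Google results.")
long_answer = "x" * 300

print(snippet_ok(short_answer), snippet_ok(long_answer))
# → True False
```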
Constantly requesting re-indexing to see results may get you penalized, although it can't be proven.
Google's Natural Language Processing API has a front end available for free.
That's a wrap
I hope you found this recap helpful. If you're a presenter from MozCon 2019 and you believe I've mischaracterized some of your points or felt I didn't present your findings accurately - please leave me a comment on this post or shoot me a tweet and I'll make some edits. ✌
Edit: Also worth reading is Ariel Macon's Top Takeaways list. It's the TL;DR version, highly recommend: https://www.linkedin.com/pulse/top-takeaways-from-mozcon-ariel-macon/
Did you find this article valuable?
Support David V. Kimball by becoming a sponsor. Any amount is appreciated!