Posted by MiriamEllis

Researchers estimate that it’s up to 25 times more expensive for a company to acquire a new customer than to keep an existing one, making ongoing investments in consumer satisfaction a priority. There's nothing more disheartening to a local business owner than receiving a very negative review — and given that as little as 13% of consumers will patronize a business with a 1- or 2-star rating, there may be nothing more important than the owner taking every possible step to resolve negative reviews with speed and skill. Negative reviews don’t write themselves. While looking at restaurant reviews recently, I came across an owner-consumer interaction that perfectly encapsulates the typical steps that take a transaction from bad to worse. It serves as a diagram of how these costly scenarios begin, proceed, and escalate, ultimately resulting in permanent damage to the company’s reputation. The blame isn’t one-sided, and my goal here isn't to make the customer or the owner out to be "the villain." Rather, I'd like to point out key elements that actually worsen the situation, rather than improve it. Both owners and consumers sincerely want to feel satisfied, and the good news is that, in most cases, the only thing standing in the way of this is responsible communication. Let’s take a look!

The key to the "Food Truck Fiasco"

This story begins at a family-owned Philly Cheesesteak food truck that signed up to be a concession at a festival in the Southwest. One customer describes what happened on the day of the event this way, with my interpretation to the right:
The customer’s complaints are certainly understandable: he was honestly disappointed that it took so long for his food to be ready and then felt the portions were overpriced. It didn’t help matters that the staff over-promised and under-delivered in estimating the wait time. Up until this point, the consumer is blameless. But then he makes two mistakes:
Regardless of the customer’s tone, the owner’s job is to be professional at all times. I’ve seen adept business owners handle even the rudest customers with a skill that leaves me in awe, but in this case, the owner of the food truck went down the worst possible road. Far from remedying the initial negative review, the owner’s response brought the customer back with further negativity, including taking off stars. Here’s how the owner responded (Eds. note: original spelling and grammar intact), with my interpretation on the right:
Reading between the lines of the owner’s response, a picture emerges of a business that underestimated how busy it would be at an event and did not have adequate cooking facilities or staff to fulfill orders within a normal timeframe. This was the initial mistake that set the stage for all that transpired. Unfortunately, the owner then worsened the scenario by making the following additional mistakes:
Perhaps the most powerful element of the owner response is that it isn't just for a single customer to read, but for all future customers as well. Respond well, and you may not only win a second chance with the customer, but also prove to all future potential customers that they will be treated with respect, empathy, and fairness by your company.

Crafting a powerful owner response

If the food truck owner were my client, this is a sample of how I would have helped him respond, with my key on the right:
Contrast the owner’s real response with this sample suggested response, and you are likely to come away with a completely different, more positive impression of the business. A few quick suggestions for coming across well:
Those are quick tips that should immediately help you to improve your reputation in the eyes of all who read your owner responses. Ready to dig deeper into developing a powerful, permanent mindset for all future tough transactions? Read on.

3 empowering tactics for better reputation management

Every business encounters criticism. Meet this reality better prepared with these three tips:

1. In business, we wear the mask.

When your spouse tells you you're inattentive, when your friend points out that you chew with your mouth open, when your children berate you for not letting them adopt another dog, it’s personal. It’s your privilege to respond with tears, embarrassment, a lecture, or whatever you feel you need to express at that moment, reacting to personal criticism in your private life. In business, it’s different. In a civil society, and particularly in a business setting, it’s simple reality that we tend to suppress strong reactions and strong words for the sake of professionalism. If you feel the color rising to your face when a customer insinuates that you actually founded your whole company for the purpose of ripping him off for $9.99, try picturing in your mind the image of the most serene, inscrutable face of a statue you’ve ever seen. Perhaps it’s the face of the Buddha, or a classical Greek god, or a Tlingit totem being. Imagine donning that mask, like a zone of safety, between the disgruntled customer’s business complaint and your personal life. It’s cooler behind the mask, and you can respond to almost any commercial criticism knowing your personal feelings are completely safe behind the barrier you’ve established.

2. Muster empathy to integrate as much of yourself into the interaction as you feel comfortable with.

Now that you’ve tried on the mask, and you’ve got your worries, your insecurities, loves, family, and everything else personal safely behind its barrier, see how much of yourself you feel safe putting outside the mask for the world to see.
Your life may feel too divided if your business and personal worlds are kept 100% separate, and you may not be able to pour the full passion of your heart and intellect into the business you are building if you have to be a statue at all times. Some customers may be so irrational in their expectations or conduct that the only way to manage them is with a marble coolness or a wooden face, but hopefully that will be the exception. For most customers, this technique will help you integrate your genuine human feelings into a situation in which distress is being expressed. Picture a person you not only really love, but also of whom you feel protective. For just a moment, substitute that special person for the complaining customer. Imagine that it is your grandmother who had to wait in line for 45 minutes (she might have gotten heat stroke), or your nephew who was still hungry after being overcharged for lunch (he’s had trouble getting up to a healthy weight), or your spouse who was treated rudely (how dare someone disrespect him/her), or your friend whose product broke after a week of use (she can’t afford to replace it). Suddenly, that customer is transformed from an unknown complainer into an important person who deserves fair, empathetic treatment. Integrate as much of the empathy you’d feel for a friend or relative as you can for the customer. The health of your local business, and your good feelings about the way you conduct it, depend upon turning as many unknown neighbors as you can into loyal customers and, hopefully, friends.

3. Master catching complaints before they become negative reviews.

It may seem counterintuitive to want to receive as many complaints as possible, but when you consider that they are your best safeguard against the publication of negative reviews, making your business complaint-friendly is incredibly smart! Implement these tips:
Speaking of GetFiveStars, I highly recommend taking the time to read the series of articles they’ve been publishing regarding the subject of consumer complaints, including some really insightful surveys. My favorite tip from co-founder Mike Blumenthal is this one: “Make a complainer feel like your most valued customer because, in some ways, they are.”

Happier endings for everybody

The art of customer service is one you’ll be training yourself and your staff in for as long as you serve the public. Even if you’ve made every effort to catch complaints on the spot, no method is foolproof, and every business is almost guaranteed to have to deal with a negative review here and there. Some customers will not speak up for themselves, even when expressly invited to do so, because they are shy, dread confrontation, or are so accustomed to being treated poorly that they don’t believe their voice will be genuinely heard. They may utilize online reviews as a substitute for having to "make a fuss" in person about their dissatisfaction. Then there are those truly awful customers no business can avoid. They may have entitlement issues, unrealistic expectations, unpleasant personalities, or even have made it a life practice to throw tantrums in hopes of receiving free stuff. They may utilize online reviews as a place to spew rude language and invent false accusations because they have personal problems. No business is immune to either type of customer, but if you plot out your company’s reputation management course, you can weather most storms and end up looking like one smooth sailor! Your plan might look something like this timeline: I continue to be amazed at how many negative reviews slip through and sit unanswered on major review platforms, raising doubts in potential customers’ minds and giving a neglectful impression of the business.
With the right mindset that delineates comfortable boundaries between your personal and business worlds, cultivation of empathy, a clear plan, and concentrated devotion to staff training, no business need dread negative feedback; in fact, you can view it as a powerful resource for making meaningful improvements that resolve existing issues. And when those negative reviews do squeak through your process, a beautiful, professional response can write a happy ending, just like this one: *Review star screenshots used in this post from Yelp. Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read! via The Moz Blog http://tracking.feedpress.it/link/9375/4049037
Posted by Philip Walton, Developer Programs Engineer

Autotrack is a JavaScript library built for use with analytics.js that provides developers with a wide range of plugins to track the most common user interactions relevant to today's modern web. The first version of autotrack for analytics.js was released on GitHub earlier this year, and since then the response and adoption from developers has been amazing. The project has been starred over a thousand times, and sites using autotrack are sending millions of hits to Google Analytics every single day. Today I'm happy to announce that we've released autotrack version 1.0, which includes several new plugins, improvements to the existing plugins, and tons of new ways to customize autotrack to meet your needs. Note: autotrack is not an official Google Analytics product and does not qualify for Google Analytics 360 support. It is maintained by members of the Google Analytics developer platform team and is primarily intended for a developer audience.

New plugins

Based on the feedback and numerous feature requests we received from developers over the past few months, we've added the following new autotrack plugins:

Impression Tracker

The impression tracker plugin allows you to track when an element is visible within the browser viewport. This lets you much more reliably determine whether a particular advertisement or call-to-action button was seen by the user. Impression tracking has been historically tricky to implement on the web, particularly in a way that doesn't degrade the performance of your site. This plugin leverages new browser APIs that are specifically designed to track these kinds of interactions in a highly performant way.

Clean URL Tracker

If your analytics implementation sends pageviews to Google Analytics without modifying the URL, then you've probably experienced the problem of seeing multiple different page paths in your reports that all point to the same place. Here's an example:
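The duplication meant here is several URL variants (trailing slash, index filename, query string) all pointing at one page. The normalization that fixes it can be sketched in a few lines of plain JavaScript; this is only an illustrative sketch, not the plugin's source, and the function and option names are invented for the example:

```javascript
// Sketch of clean-URL-style path normalization (illustrative only;
// the real plugin is configured through analytics.js, not this function).
function cleanUrlPath(url, opts) {
  const u = new URL(url, 'https://example.com');
  let path = u.pathname;

  // Remove an index filename such as "index.html" from the end of the path.
  if (opts.indexFilename && path.endsWith('/' + opts.indexFilename)) {
    path = path.slice(0, -opts.indexFilename.length);
  }

  // Strip a trailing slash, but never reduce the root path "/" to "".
  if (opts.stripTrailingSlash && path.length > 1 && path.endsWith('/')) {
    path = path.slice(0, -1);
  }

  // Drop query parameters unless explicitly kept.
  return path + (opts.keepQuery ? u.search : '');
}

// All of these variants collapse to the same canonical page path:
cleanUrlPath('/about/', { stripTrailingSlash: true });            // '/about'
cleanUrlPath('/about/index.html',
             { indexFilename: 'index.html',
               stripTrailingSlash: true });                       // '/about'
cleanUrlPath('/about?source=email', {});                          // '/about'
```

With this kind of normalization in place, the three report rows merge into one, which is exactly the effect the plugin aims for.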
The clean URL tracker plugin avoids this problem by letting you set your preferred URL format (e.g. strip trailing slashes, remove index.html filenames, remove query parameters, etc.), and the plugin automatically updates all page URLs based on your preference before sending them to Google Analytics. Note: setting up View Filters in your Google Analytics view settings is another way to modify the URLs sent to Google Analytics.

Page Visibility Tracker

It's becoming increasingly common for users to visit sites on the web and then leave them open in an inactive browser tab for hours or even days. And when users return to your site, they often won't reload the page, especially if your site fetches new content in the background. If your site implements just the default JavaScript tracking snippet, these types of interactions will never be captured. The page visibility tracker plugin takes a more modern approach to what should constitute a pageview. In addition to tracking when a page gets loaded, it also tracks when the visibility state of the page changes (i.e. when the tab goes into or comes out of the background). These additional interaction events give you more insight into how users behave on your site.

Updates and improvements

In addition to the new plugins added to autotrack, the existing plugins have undergone some significant improvements, most notably in the ability to customize them to your needs. All plugins that send data to Google Analytics now give you 100% control over precisely what fields get sent, allowing you to set, modify, or remove anything you want. This gives advanced users the ability to set their own custom dimensions on hits or change the interaction setting to better reflect how they choose to measure bounce rate. Users upgrading from previous versions of autotrack should refer to the upgrade guide for a complete list of changes (note: some of the changes are incompatible with previous versions).
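The visibility-tracking idea described above can be sketched as a pure function that maps a Page Visibility API state to an analytics-style event hit, plus a listener that sends it. This is a hedged illustration of the concept, not the plugin's source; the field values are only examples:

```javascript
// Map a Page Visibility API state ('visible' or 'hidden') to an
// analytics.js-style event hit. Illustrative sketch, not the plugin itself.
function visibilityHit(visibilityState) {
  return {
    hitType: 'event',
    eventCategory: 'Page Visibility',
    eventAction: 'change',
    eventLabel: visibilityState,
    // Hits fired when the tab goes into the background shouldn't count as
    // user interactions for bounce-rate purposes.
    nonInteraction: visibilityState === 'hidden'
  };
}

// In a browser you would wire this to the Page Visibility API, e.g.:
// document.addEventListener('visibilitychange', function() {
//   ga('send', visibilityHit(document.visibilityState));
// });
```

The nonInteraction flag matters here: without it, every backgrounded tab would look like an engaged session.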
Who should use autotrack

Perhaps the most common question we received after the initial release of autotrack is who should use it. This was especially true of Google Tag Manager users who wanted to take advantage of some of the more advanced autotrack features. Autotrack is a developer project intended to demonstrate and streamline some advanced tracking techniques with Google Analytics, and it's primarily intended for a developer audience. Autotrack will be a good fit for small- to medium-sized developer teams who already have analytics.js on their website or who prefer to manage their tracking implementation in code. Large teams and organizations, those with more complex collaboration and testing needs, and those with tagging needs beyond just Google Analytics should instead consider using Google Tag Manager. While Google Tag Manager does not currently support custom analytics.js plugins like those that are part of autotrack, many of the same tracking techniques are easy to achieve with Tag Manager’s built-in triggers, and others may be achieved by pushing data layer events based on custom code on your site or in Custom HTML tags in Google Tag Manager. Read Google Analytics Events in the Google Tag Manager help center to learn more about automatic event tracking based on clicks and form submissions.

Next steps

If you're not already using autotrack but would like to, check out the installation and usage section of the documentation. If you already use autotrack and want to upgrade to the latest version, be sure to read the upgrade guide first. To get a sense of what the data captured by autotrack looks like, the Google Analytics Demos & Tools site includes several reports displaying its own autotrack usage data. If you want to go deeper, the autotrack library is open source and can be a great learning resource. Have a read through the plugin source code to get a better understanding of how some of the advanced analytics.js features work.
Lastly, if you have feedback or suggestions, please let us know. You can report bugs or submit any issues on Github. via Google Developers Blog http://developers.googleblog.com/2016/08/autotrack-turns-10.html
Below is what happened in search today, as reported on Search Engine Land and from other places across the web. The post SearchCap: Google AMP in search, Search Analytics semantics & more appeared first on Search Engine Land.
Please visit Search Engine Land for the full article. via Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing http://searchengineland.com/searchcap-google-amp-search-search-analytics-semantics-255004

Posted by ronell-smith

Topic Trends, the latest feature added to Moz Content, allows marketers to quickly access a snapshot of the most popular and the most relevant content in any vertical. By accessing the content in the Content Search index, Topic Trends highlights the topics that were written about most frequently in the previous five days. Since the presidential election is the hottest thing going at the moment, it's little surprise that election news is dominant: This feature is based on the Sharing Trends Graph, which highlights the number of articles matching your search in the Moz Content index, in addition to factoring in the median number of shares per article. By typing "Featured Snippets" into the search field, for example, you get a two-line graph that's rich in details that can instantly inform your content strategy:
“We're using the graph as a rough indicator of audience interest in the topic,” says Jay Leary, Moz's senior product manager for Moz Content and one of the lead architects behind the product. “It’s sort of like a Google Trends, but instead of searches for a topic, we're looking at the sharing of articles about the topic." Below the graph you'll find a list of results along with their content metrics, including Reach, Links, Discovery Date and a host of other metrics, including those associated with social shares:
Created by content marketers, for content marketers

Ever since Matthew J. Brown announced the Beta version of Moz Content at MozCon 2015, we've been focused on designing, creating and delivering a tool that will make it easier for marketers to create the types of content that'll resonate with their audiences. The Tracked Audits feature is ideal for brands who already have an audience, but if you're just getting started, the focus is usually on research. That's where Content Search comes in. The Tracked Audits feature, for example, provides marketers with all of the information a normal content audit would, but has the added dimension of an extended and customizable timeline. Instead of spending hours auditing content manually (which I've done numerous times; it's no fun), you can simply have updates emailed to you detailing everything you need to know to track content performance. Also, thanks to Content Search, you can find the most popular pieces of content from across the web via a simple topic search: Not only will this allow you to be better informed about the content types and content topics you should create, but it also alerts you to who your main competitors are for the content you want to create.

Share your feedback

If you haven't tried Moz Content yet, now is as good a time as any to give it a whirl: https://moz.com/content If you've already been using the product, try out the newest feature and let us know what you think. Either way, we'd love to hear from users of the product. We're always looking for ways to improve it and welcome your input. Feel free to share your thoughts in the comments below. via The Moz Blog http://tracking.feedpress.it/link/9375/4044373
The company says that AMP pages will not receive a rankings boost. The post AMP breaks out of news into the main Google search results appeared first on Search Engine Land.
via Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing http://searchengineland.com/amp-breaks-news-main-google-search-results-254965
Columnist Brian Harnish tackles how to analyze links from a Google Webmaster Guidelines perspective, and how to make sure that your client's linking activities don't cause them to lose everything they have gained. The post Link profile analysis: How to prevent penalties by being proactive appeared...
via Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing http://searchengineland.com/link-profile-analysis-prevent-penalties-proactive-254512
Link building is an art that many, including columnist Andrew Dennis, have learned through trial and error. Today, he shares some of what he's learned about how to identify and obtain better links. The post Link prospecting tips and tricks appeared first on Search Engine Land.
via Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing http://searchengineland.com/link-prospecting-tips-tricks-254590
No more arguing about what Google means by positions in the Google Search Analytics report. Google has defined the multitude of metrics and cases in a help document. The post New Google help document defines Search Analytics impressions, position and clicks appeared first on Search Engine Land.
via Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing http://searchengineland.com/google-help-document-defines-impressions-position-clicks-254950

Posted by gfiorelli1

In 2011 I wrote a post here on Moz. The title was "Wake Up SEOs, the New Google is Here." In that post I presented some concepts that, in my personal opinion, we SEOs needed to pay attention to in order to follow the evolution of Google. Sure, I also presented a theory which ultimately proved incorrect; I was much too confident about things like rel="author", rel="publisher", and the potential decline of the Link Graph influence. However, the premises of that theory were substantially correct, and they remain correct five years later:
Many things have changed in our industry in the past 5 years. The time has come to pause, take a few minutes, and assess what Google is and where it's headed. I'll explain how I "study" Google and what I strongly believe we, the SEOs, should pay attention to if we want not only to survive, but to anticipate Google's end game, readying ourselves for the future. Obviously, consider that, while I believe it's backed up by data, facts, and proof, this is my opinion. As such, I kindly ask you not to take what I write for granted, but rather as an incentive for your own investigations and experiments.

Exploring the expanded universe of Google

SEO is a kingdom of uncertainty. However, one constant never changes: almost every SEO dreams of being a Jedi at least once in her life. I, too, fantasize about using the Force… Gianlu Ka Fiore Lli, Master Jedi. Honestly, though, I think I'm more like Mon Mothma. Like her, I am a strategist by nature. I love to investigate, to see connections where nobody else seems to see them, and to dig deeper into finding answers to complex questions, then design plans based on my investigations. This way of being means that, when I look at the mysterious wormhole that is Google, I examine many sources:
Now, when examining all these sources, it's easy to create amazing conspiranoiac (conspiracy + paranoia) theories. And I confess: I helped create, believed, and defended some of them, such as AuthorRank. In my opinion, though, this methodology for finding answers about Google is the best one for understanding the future of our beloved industry of search. If we don't dig into the "Expanded Universe of Google," what we have is a timeline composed only of updates (Panda 1.N, Penguin 1.N, Pigeon…), which is totally useless in the long term: Instead, if we create a timeline with all the events related to Google Search (which we can discover simply by being well-informed), we begin to see where Google's heading: The timeline above confirms what Google itself openly declared: "Machine Learning is a core, transformative way by which we’re rethinking how we’re doing everything." Google is becoming a “Machine Learning-First Company,” as defined by Steven Levy in this post. Machine learning is becoming so essential in the evolution of Google and search that perhaps we should go beyond listening only to official Google spokespeople like Gary Illyes or John Mueller (nothing personal, just to be clear... for instance, read this enlightening interview of Gary Illyes by Woj Kwasi). Maybe we should start paying more attention to what people like Christine Robson, Greg Corrado, Jeff Dean, and the staff of Google Brain write and say. The second timeline tells us that, starting in 2013, Google began investing money, intellectual efforts, and energy on a sustained scale in:
2013: The year when everything changed

Google rolled out Hummingbird only three years ago, but it feels like decades ago, and that's not just a saying. Let’s quickly rehash: what's Hummingbird? Hummingbird is the Google algorithm as a whole. It's composed of four phases:
This last phase, Search, is where we can find the “200+ ranking factors” (RankBrain included) and filters like Panda or anti-spam algorithms like Penguin. Remember that there are as many search phases as vertical indices exist (documents, images, news, video, apps, books, maps...). We SEOs tend to fixate almost exclusively on the Search phase, forgetting that Hummingbird is more than that. This approach to Google is myopic and does not withstand a very simple logical square exercise.
If even one of the three elements of the logical square is missing, organic visibility is missing; think about non-optimized AngularJS websites, and you’ll understand the logic. How can we be SEO Jedi if we only see one facet of the Force?

Parsing and indexing: often forgotten

Over the past 18 months, we've seen a sort of technical SEO Renaissance, as defined by Mike King in this fundamental deck and despite attempts to classify technical SEOs as makeup artists. On the contrary, we're still struggling to fully understand the importance of the Parsing and Indexing phases. Of course, we can justify that by claiming that parsing is the most complex of the four phases. Google agrees, as it openly declared when announcing SyntaxNet. However, if we don't optimize for parsing, then we're not going to fully benefit from organic search, especially in the months and years to come.

How to optimize for parsing and indexing

As a premise to parsing and indexing optimization, we must remember an oft-forgotten aspect of search, which Hummingbird highlighted and enhanced: entity search. If you remember what Amit Singhal said when he announced Hummingbird, he declared that it had “something of Knowledge Graph.” That part was — and I'm simplifying here for clarity's sake — entity search, which is based on two kinds of entities:
Why does entity search matter?

It matters because entity search is the reason Google better understands the personal and almost unique context of a query. Moreover, thanks to entity search, Google better understands the meaning of the documents it parses. This means it's able to index them better and, finally, to achieve its main purpose: serving the best answers to the users' queries. This is why semantics is important: semantic search is optimizing for meaning. It's not a ranking factor, it's not needed to improve crawling, but it is fundamental for Parsing and Indexing, the big forgotten-by-SEOs algorithm phases.

Semantics and SEO

First of all, we must consider that there are different kinds of semantics and that, sometimes, people tend to get them confused.
Logical semantics

Structured data is the big guy right now in logical semantics, and Google (both directly and indirectly) is investing a lot in it. A couple of months ago, when the mainstream marketing gurusphere was discussing the 50 shades of the new Instagram logo or the average SEO was (justifiably) shaking his fists against the green “ads” button in the SERPs, Google released the new version of Schema.org. This new version, as Aaron Bradley finely commented here, improves the ability to disambiguate between entities and/or better explain their meaning. For instance, now:
At the same time, we shouldn't forget to always use the most important property of all: “sameAs”, one of the few properties that's present in every Schema.org type. Finally, as Mike Arnesen recently explained quite well here on the Moz blog, take advantage of the semantic HTML attributes itemref and itemid.

How do we implement Schema.org in 2016?

It is clear that Google is pushing JSON-LD as the preferred method for implementing Schema.org. The best way to implement JSON-LD Schema.org is to use the Knowledge Graph Search API, which uses the standard Schema.org types and is compliant with JSON-LD specifications. As an alternative, you can use the recently rolled out JSON-LD Schema Generator for SEO tool by Hall Analysis. To solve a common complaint about JSON-LD (its volume and how it may affect the performance of a site), we can:
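To make the JSON-LD discussion concrete, a minimal Schema.org block including the sameAs property can be generated and serialized like this. This is an illustrative sketch under stated assumptions: the helper function is invented for the example, and the organization name and URLs are placeholders.

```javascript
// Build a minimal Schema.org Organization object as JSON-LD, including
// the sameAs property. Illustrative sketch; names and URLs are placeholders.
function organizationJsonLd(name, url, sameAsUrls) {
  return JSON.stringify({
    '@context': 'http://schema.org',
    '@type': 'Organization',
    name: name,
    url: url,
    sameAs: sameAsUrls
  });
}

const jsonLd = organizationJsonLd(
  'Example Co',
  'https://www.example.com',
  ['https://twitter.com/example', 'https://en.wikipedia.org/wiki/Example']
);
// The resulting string is embedded in the page inside a
// <script type="application/ld+json"> tag.
```

Generating the markup from data like this, rather than hand-writing it per page, is one way to keep the JSON-LD footprint small and consistent across a site.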
The importance Google gives to Schema.org and structured data is confirmed by the new and radically improved version of the Structured Data Testing Tool, which is now more actionable for identifying mistakes and testing solutions thanks to its JSON-LD (again!) and Schema.org contextual autocomplete suggestions.

Semantics is more than structured data #FTW!

One mistake I foresee is thinking that semantic search is only about structured data. It's the same kind of mistake people make in international SEO, when reducing it to hreflang alone. The reality is that semantics is present from the very foundations of a website, found in:
HTML

Since its beginnings, HTML included semantic markup (e.g.: title, H1, H2...). Its latest version, HTML5, added new semantic elements, the purpose of which is to semantically organize the structure of a web document and, as W3C says, to allow “data to be shared and reused across applications, enterprises, and communities.” A clear example of how Google is using the semantic elements of HTML is its Featured Snippets or answer boxes. As declared by Google itself (“We do not use structured data for creating Featured Snippets”) and explained well by Dr. Pete, Richard Baxter, and very recently Simon Penson, the documents that tend to be used for answer boxes usually display these three factors:
The conclusion, then, is that semantic search starts in the code and that we should pay more attention to those "boring," time-consuming, not-a-priority W3C error reports.

Architecture

The semiotician in me (I studied semiotics and the philosophy of language in university with the likes of Umberto Eco) cannot help but consider information architecture itself as semantics. Let me explain.

Everything starts with the right ontology

Ontology is a set of concepts and categories in a subject area (or domain) that shows their properties and the relations between them. If we take the Starwars.com site as an example, we can see in the main menu the concepts in the Star Wars subject area:
Ontology leads to taxonomy (because everything can be classified)

If we look at Starwars.com, we see how every concept included in the Star Wars domain has its own taxonomy. For instance, the Databank presents several categories, like:
Ontology and taxonomy, then, lead to context

If we think of Tatooine, we tend to think about the planet where Luke Skywalker lived his youth. However, if we visit a website about deep space exploration, Tatooine would be one of the many exoplanets that astronomers have discovered in the past few years. As you can see, ontology (Star Wars vs celestial bodies) and taxonomies (Star Wars planets vs exoplanets) determine context and help disambiguate between similar entities.

Ontology, taxonomy, and context lead to meaning

The better we define the ontology of our website, structure its taxonomy, and offer better context to its elements, the better we explain the meaning of our website — both to our users and to Google. Starwars.com, again, is very good at doing this. For instance, if we examine how it structures a page like the one on TIE fighters, we see that every possible kind of content is used to help explain what a TIE fighter is:
In the case of characters like Darth Vader, the information can be even richer. The effectiveness of the information architecture of the Star Wars website (plus its authority) is such that its Databank is one of the very few non-Wikidata/Wikipedia sources that Google uses as a Knowledge Graph source.
What tools can we use to semantically optimize the structure of a website?
There are, in fact, several tools we can use to semantically optimize the information architecture of a website.
Knowledge Graph Search API
The first is the Knowledge Graph Search API: with it we can get a ranked list of the entities that match given criteria. This can help us better define the subjects related to a domain (ontology) and can offer ideas about how to structure a website or any kind of web document.
RelFinder
A second tool we can use is RelFinder, one of the very few free tools for entity research. As you can see in the screencast below, RelFinder is based on Wikipedia. Its use is quite simple:
RelFinder will detect entities related to both (e.g. George Lucas or Marcia Lucas), their disambiguating properties (e.g. George Lucas as director, producer, and writer), and factual ones (e.g. lightsabers as an entity related to Star Wars and first seen in Episode IV). RelFinder is very useful if we need to do entity research on a small scale, such as when preparing a content piece or a small website. However, if we need to do entity research on a bigger scale, it's much better to rely on the following tools:
AlchemyAPI and other tools
AlchemyAPI, which was acquired by IBM last year, uses machine and deep learning to do natural language processing, semantic text analysis, and computer vision. AlchemyAPI, which offers a 30-day trial API key, is based on Watson technology; it allows us to extract a huge amount of information from text, with concepts, entities, keywords, and taxonomy offered by default.
Resources about AlchemyAPI
Other tools that allow us to do entity extraction and semantic analysis on a big scale are:
Lexical semantics
As said before, lexical semantics is the branch of semantics that studies the meaning of words and their relations. In the context of semantic search, this area is usually defined as keyword and topical research. Here on Moz you can find several Whiteboard Friday videos on this topic:
How do we conduct semantically focused keyword and topical research?
Despite its recent update, Keyword Planner can still be useful for performing semantically focused keyword and topical research. In fact, that update could even be deemed a logical choice from a semantic search point of view. Terms like "PPC" and "pay-per-click" are synonyms, and even though each surely has a different search volume, Google presents two very similar SERPs if we search for one or the other, especially if our search history already exhibits a pattern of searches related to SEM. Yet this dimming of keyword data is less helpful for SEOs in that it makes it harder to forecast and prioritize which keywords to target. This is especially true when we search for head terms, because it exacerbates a problem Keyword Planner already had: combining stemmed keywords that — albeit having "our keyword" as a base — have nothing in common, because they mean completely different things and target very different topics. However (and this is a pro tip), there is a way to discover the most useful keywords even when they all have the same search volume: how much advertisers bid for them. Trust the market ;-). (If you want to learn more about the recent changes to Keyword Planner, go read this post by Bill Slawski.)
Keyword Planner for semantic search
Let's say we want to create a site about Star Wars lightsabers (yes, I am a Star Wars geek). What we could do is this:
Google will offer us these Ad Groups as results. Each Ad Group is a collection of semantically related keywords. They're very useful for:
Remember, then, that Keyword Planner allows us to do other kinds of analysis too, such as breaking down how the discovered keywords/Ad Groups are used by device or by location. This information is useful for understanding the context of our audience. If you have one or a few entities for which you want to discover topics and grouped keywords, working directly in Keyword Planner and exporting everything to Google Sheets or an Excel file can be enough. However, when you have tens or hundreds of entities to analyze, it's much better to use the AdWords API or a tool like SEO PowerSuite, which allows you to do keyword research following the method I described above.
Google Suggest, Related Searches, and Moz Keyword Explorer
Alongside Keyword Planner, we can use Google Suggest and Related Searches — not to simply identify topics that people search for and then write an instant blog post or a landing page about them, but to reaffirm and perfect our site's architecture. Continuing with the example of a site or section specializing in lightsabers, if we look at Google Suggest we can see that "lightsaber replica" is one of the suggestions. Moreover, among the Related Searches for "lightsaber," we see "lightsaber replica" again, which is a clear signal of its relevance to "lightsaber." Finally, we can click through and discover "lightsaber replica"-related searches, thus creating what I define as the "search landscape" of a topic. The model above is not scalable if we have many entities to analyze. In that case, a tool like Moz Keyword Explorer can be helpful thanks to the options it offers, as you can see in the snapshot below:
Other keyword and topical research sources
Recently, Powerreviews.com presented survey results stating that Internet users tend to prefer Amazon over Google when searching for information about a product (38% vs. 35%).
So, why not use Amazon for keyword and topical research, especially if we are doing it for ecommerce websites or for the MOFU and BOFU phases of our customers' journey? We can use Amazon Suggest, or we can use a free tool like the Amazon Keyword Tool by SISTRIX. The Suggest function, though, is present on (almost) every website that has a search box (your own site, even, if you have it well-implemented!). This means that if we're searching for more mainstream and top-of-the-funnel topics, we can use the suggestions of social networks like Pinterest (e.g. explore the voluptuous universe of "lightsaber cakes" and related topics). Pinterest, then, is a real topical research goldmine thanks to its tagging system.
On-page
Once we've defined the architecture and the topics, and prepared our keyword dictionaries, we can finally work on the on-page facet of our work. The details of on-page SEO are another post for another time, so I'll simply recommend you read this evergreen post by Cyrus Shepard. The best way to grade the semantic search optimization of a written text is to use TF-IDF analysis, offered by sites like OnPage.org (which also offers a clear guide to the advantages and disadvantages of TF-IDF analysis). Remember that TF-IDF can also be used for competitive semantic search analysis and to discover the keyword dictionaries used by our competitors.
User behavior / Semiotics and context
At the beginning of this post, we saw how Google is heavily investing in better understanding the meaning of the documents it crawls, so as to better answer the queries users perform. Semantics (and semantic search) is only one of the pillars on which Google is basing this tremendous effort. The other pillar consists of understanding user search behaviors and the context of the users performing a search.
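Circling back to the on-page point above: TF-IDF is simple enough to compute by hand. A minimal Python sketch, with toy documents invented purely for illustration (tools like OnPage.org run the same math against real competitor pages):

```python
import math

def tf_idf(term, doc, corpus):
    """Frequency of `term` in `doc`, discounted when the term is
    common across the whole corpus (docs are lists of tokens)."""
    tf = doc.count(term) / len(doc)
    docs_with_term = sum(1 for d in corpus if term in d)
    idf = math.log(len(corpus) / docs_with_term)  # assumes term occurs in >= 1 doc
    return tf * idf

# Toy corpus: three one-sentence "pages."
corpus = [
    "the tie fighter is a starfighter".split(),
    "the lightsaber is a weapon".split(),
    "the force is strong".split(),
]

# "the" appears in every document, so its IDF (and score) is zero;
# "lightsaber" is rare, so it scores higher for the page that uses it.
score_common = tf_idf("the", corpus[1], corpus)
score_rare = tf_idf("lightsaber", corpus[1], corpus)
```

The intuition is exactly the one the post relies on: terms that distinguish a page from the rest of the corpus carry its meaning, while ubiquitous words carry none.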
User search behavior
Recently, Larry Kim shared two posts based on experiments he ran, supporting his theory that RankBrain is about factors like CTR and dwell time. While these posts are super actionable, present interesting original data, and confirm other tests conducted in the past, these so-called user signals (CTR and dwell time) may not be directly related to RankBrain but, instead, to user search behaviors and personalized search. Be aware, however, that my statement above should be taken as a personal theory, because Google itself doesn't really know how RankBrain works. AJ Kohn, Danny Sullivan, and David Harry wrote additional interesting posts about RankBrain, if you want to dig into it (for the record, I wrote about it too here on Moz). Even if RankBrain may be included in the semantic search landscape due to its use of Word2Vec technology, I find it better to concentrate on how Google may use user search behaviors to better understand the relevance of the parsed and indexed documents.
Click-through rate
Since Rand Fishkin presented his theory — backed up with tests — that Google may use CTR as a ranking factor more than two years ago, a lot has been written about the importance of click-through rate. Common sense suggests that if people click more often on one search snippet than on another that perhaps ranks in a higher position, then Google should take that user signal into consideration and eventually lift the ranking of the page that consistently receives the higher CTR. Common sense, though, is not so easy to apply when it comes to search engines, and Googlers have repeatedly declared that they do not use CTR as a ranking factor (see here and here). And although Google has long since developed a click fraud detection system for AdWords, it's still not clear whether it would be able to scale it for organic search.
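Whatever Google does internally, comparing a snippet's actual CTR against the CTR expected for its position is the diagnostic behind the experiments mentioned above. A sketch, where the baseline curve is a made-up placeholder rather than any published industry figure:

```python
# Expected CTR by ranking position. These values are illustrative
# placeholders only, not a real published click curve.
EXPECTED_CTR = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05}

def ctr_gap(position, impressions, clicks):
    """Actual CTR minus the CTR we'd expect at this position.
    Positive = the snippet over-performs; negative = it under-performs."""
    actual = clicks / impressions
    return actual - EXPECTED_CTR.get(position, 0.02)

# A snippet ranking #2 with a 25% CTR beats its expected 15%:
gap = ctr_gap(2, impressions=1000, clicks=250)
```

Run against Search Console data, a report like this surfaces the snippets worth rewriting, regardless of whether CTR itself moves rankings.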
On the other hand — let me be a little conspiratorial — if CTR is not important at all, then why has Google changed the pixel widths of the title tag and meta description? Just for "better design"? But as Eric Enge wrote in this post, one of the few things we know is that Google filed a patent about CTR (Modifying search result ranking based on a temporal element of user feedback, May 2015). Google is surely using CTR in testing environments to better calculate the value and grade of other ranking factors and — this is more speculative — it may give stronger weight to click-through rate in those subsets of keywords that clearly express a QDF (Query Deserves Freshness) need. What's less discussed is the importance CTR has in personalized search: we know that Google tends to paint a custom SERP for each of us depending on both our search history and our personal click-through rate history, and both are key in helping Google determine which SERPs will be the most useful to us. For instance:
Finally, even if Google does not use CTR as a ranking factor, this doesn't mean it's not an important metric and signal for SEOs. We have years of experience and hundreds of tests proving how important it is to optimize our search snippets (and now Rich Cards) with the appropriate use of structured data in order to earn more organic traffic, even if we rank worse than our competitors.
Watch time
Having good CTR metrics is totally useless if the pages our visitors land on don't fulfill the expectation the search snippet created. This is similar to the difference between a clickbait headline and a persuasive one: the first will probably cause a click back to the search results page, while the second will engage and retain the visitor. The ability of a site to retain its users is what we usually call dwell time, but which Google defines as watch time in this patent: Watch Time-Based Ranking (March 2013). This patent is usually cited in relation to video, because the patent itself uses video as its content example, but Google doesn't restrict its definition to videos alone: In general, "watch time" refers to the total time that a user spends watching a video. However, watch times can also be calculated for and used to rank other types of content based on an amount of time a user spends watching the content. Watch time is indeed a more useful user signal than CTR for understanding the quality of a web document and its content. Are you skeptical and don't trust me? Trust Facebook, then, because it also uses watch time in its News Feed algorithm: We're learning that the time people choose to spend reading or watching content they clicked on from News Feed is an important signal that the story was interesting to them.
Context and the importance of personalized search
I usually joke that the biggest mistake a gang of bank robbers could make is to bring along their smartphones.
It'd be quite easy to do PreCrime investigations simply by checking their activity board, which includes their location history on Google Maps. In order to fulfill its mission of offering the best answers to its users, Google must not only understand the web documents it crawls so as to index them properly, and not only improve its own ranking factors (taking into consideration the signals users give during their search sessions); it also needs to understand the context in which users perform a search. Here's what Google knows about us: It's because of this compelling need to understand our context that Google hired the entire Behav.io team back in 2013. Behav.io, if you don't know it already, was a company that developed alpha-test software based on its open source framework Funf (still alive), the purpose of which was to record and analyze the data that smartphones keep track of: location, speed, nearby devices and networks, phone activity, noise levels, and more. All this information is required in order to better understand the implicit aspects of a query, especially one made from a smartphone and/or via voice search, and to better process what Tom Anthony and Will Critchlow define as compound queries. However, personalized search is also determined by (again) entity search, specifically by search entities. The relations between search entities create a "probability score," which may determine whether a web document is shown in a given SERP or not. For instance, let's say that someone performs a search about a topic (e.g. Wookiees) for which she never clicked on a search snippet of our site, but did click on another site with content about that same topic (e.g. Wookieepedia) which linked to the page about it on our site (e.g. "How to distinguish one wookiee from another?").
Those links — specifically their anchor texts — would help our site and page earn a higher probability score than a competitor site that isn't linked to by the sites present in the user's search history. This means that our page will have a better probability of appearing in that user's personalized SERP than our competitors'.
You're probably asking: what's the actionable point of this patent?
Link building/earning is not dead at all, because it's relevant not only to the Link Graph, but also to entity search. In other words, link building is semantic search, too.
The importance of branding and offline marketing for SEO
One of the classic complaints SEOs have about Google is how it favors brands. The real question, though, should be this: "Why aren't you working to become a brand?" Be aware! I am not talking about "vision," "mission," and "values" here — I'm talking about plain and simple semantics. All throughout this post I spoke of entities (named and search ones), cited Word2Vec (vectors are "vast amounts of written language embedded into mathematical entities"), talked about lexical semantics, meaning, ontology, and personalized search, and implied topics like co-occurrences and knowledge bases. Branding has a lot to do with all of these things. I'll try to explain it with a very personal example. Last May in Valencia I debuted as a conference organizer with The Inbounder. One of the problems I faced when promoting the event was that "inbounder," which I thought was a cool name for an event targeting inbound marketers, is also a basketball term. The problem was obvious: how do I make Google understand that The Inbounder was not about basketball, but digital marketing? The strategy we followed from the very beginning was to work on the branding of the event (I explain more of The Inbounder story here on Inbound.org). We did this:
As a result, right now The Inbounder occupies the entire first page of Google for its brand name and, more importantly in semantic terms, Google presents The Inbounder events as suggested and related searches. It associates the event with all the searches I could ever want. Another example is Trivago and its global TV advertising campaigns: Trivago was very smart in constantly showing "Trivago" and "hotel" in the same phrase, even making its motto "Hotel? Trivago." This is a simple psychological trick for creating word associations. As a result, people searched on Google for "hotel Trivago" (or "Trivago hotel"), especially just after the ads were broadcast. One of the results is that Google now suggests "hotel Trivago" when we start typing "hotel" and, as in the case of The Inbounder, presents "hotel Trivago" as a related search.
Wake up SEOs, the new new Google is here
Yes, it is. And it's all about better understanding web documents and queries in order to provide the best answers to its users (and make money in the meantime). To achieve this objective, ideally becoming the long-desired "Star Trek computer," Google is investing money, people, and effort into machine/deep learning, neural networks, semantics, search behavior, context analysis, and personalized search. Remember, SEO is no longer just about "200 ranking factors." SEO is about making our websites become the sources Google cannot help but use for answering queries. This is exactly why semantic search is of utmost importance, and not just something worth the attention of a few geeks passionate about linguistics, computer science, and patents. Work on parsing and indexing optimization now, seriously implement semantic search in your SEO strategy, take advantage of the opportunities personalized search offers you, and always put users at the center of everything you do. In doing so you'll build a solid foundation for your success in the years to come, both via classic search and with Google Assistant/Now.
Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read! via The Moz Blog http://tracking.feedpress.it/link/9375/4039617
Below is what happened in search today, as reported on Search Engine Land and from other places across the web. The post SearchCap: Bing & Google Olympics, Yahoo tests & more appeared first on Search Engine Land.
Please visit Search Engine Land for the full article. via Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing http://searchengineland.com/searchcap-bing-google-olympics-yahoo-tests-254926
Powered by Bing Predicts, the new "Events to Watch" tool will be updated daily and offer a schedule of "must-watch" Olympic moments. The post Bing adds 2016 Rio Olympic Games search feature & “Events to Watch” prediction tool appeared first on Search Engine Land.
Please visit Search Engine Land for the full article. via Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing http://searchengineland.com/bing-adds-2016-rio-olympic-games-search-feature-events-watch-prediction-tool-254900
Searching your name will return a "Stay in the loop" tool at bottom of results that makes it easy to set up a Google Alert for yourself. The post New Google Alert widget saves us from repeatedly Googling ourselves appeared first on Search Engine Land.
Please visit Search Engine Land for the full article. via Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing http://searchengineland.com/new-google-alert-widget-saves-us-repeatedly-googling-254878
Olympic event information will be delivered via Google search, along with broadcaster videos on YouTube. The post Google gears up for 2016 Rio Olympics with search updates & YouTube highlight videos appeared first on Search Engine Land.
Please visit Search Engine Land for the full article. via Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing http://searchengineland.com/google-gears-2016-rio-olympics-search-updates-youtube-highlight-videos-254855
Columnist Winston Burton discusses how search engine optimization (SEO) has evolved over time and wonders whether the job title is truly representative of the work we now do. The post Is “SEO” the right term anymore? appeared first on Search Engine Land.
Please visit Search Engine Land for the full article. via Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing http://searchengineland.com/seo-right-term-anymore-254487
Columnist Andrew Shotland shares insights gleaned from a large-scale statistical analysis of local search ranking factors in Google. The post How does Google’s local algorithm work in 2016? appeared first on Search Engine Land.
Please visit Search Engine Land for the full article. via Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing http://searchengineland.com/googles-local-algorithm-work-2016-254579
Google Maps for iOS adds multiple destination support for driving directions. The post Google adds multi-stop support to Google Maps iOS app appeared first on Search Engine Land.
Please visit Search Engine Land for the full article. via Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing http://searchengineland.com/google-adds-multi-stop-support-google-maps-ios-app-254846
Days after Verizon announces it will acquire Yahoo, Yahoo is testing a new search bar at the top with a missing logo at the left. The post Yahoo testing new search bar where logo is on the right, not the left appeared first on Search Engine Land.
Please visit Search Engine Land for the full article. via Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing http://searchengineland.com/yahoo-testing-new-search-bar-logo-right-not-left-254840
Posted by Cyrus-Shepard
Is it time to rewrite the SEO playbooks? For what seems like forever, SEOs have operated by a set of best practices that dictate how best to handle redirection of URLs. (This is the practice of pointing one URL to another. If you need a quick refresher, here's a handy guide on HTTP status codes.) These tried-and-true old-school rules included:
These represent big concerns for anyone who wants to change a URL, deal with an expired product page, or move an entire website. The risk of losing traffic can mean that making no change at all becomes the lesser of two evils. Many SEOs have delayed site migrations, kept their URLs ugly, and put off switching to HTTPS because of all the downsides of switching.
The New Rules of 3xx Redirection
Perhaps because of the downsides of redirection — especially with HTTPS — Google has worked to chip away at these axioms over the past several months.
30x redirects don't lose PageRank anymore. Do these surprising changes mean all is well and good now? Yes and no. While these are welcome changes from Google, there are still risks and considerations when moving URLs that go way beyond PageRank. We'll cover these in a moment. First, here's a diagram that attempts to explain the old concepts vs. Google's new announcements. Let's cover some myths and misconceptions by answering common questions about redirection.
Q: Can I now 301 redirect everything without risk of losing traffic?
A: No.
All redirects carry risk. While it's super awesome that Google is no longer "penalizing" 301 redirects through loss of PageRank, keep in mind that PageRank is only one signal out of hundreds that Google uses to rank pages. Ideally, if you 301 redirect a page to an exact copy of that page, and the only thing that changes is the URL, then in theory you may expect no traffic loss under these new guidelines. That said, the more moving parts you introduce, the more things start to get hairy. Don't expect your redirects to non-relevant pages to carry much, if any, weight. Redirecting your popular Taylor Swift fan page to your affiliate marketing page selling protein powder is likely dead in the water. In fact, Glenn Gabe recently uncovered evidence that Google treats redirects to irrelevant pages as soft 404s. In other words, it's a redirect that loses both link equity and relevance. See: How to Completely Ruin (or Save) Your Website With Redirects
Q: Is it perfectly safe to use 302s for everything instead of 301s?
A: Again, no.
A while back we heard that the reason Google started treating 302 (temporary) redirects like 301s (permanent) is that so many websites were implementing the wrong type (302s when they meant 301s) that it wreaked havoc on how Google ranked pages. The problem is that while we now know that Google passes PageRank through 302s, we still have a few issues. Namely:
Rand Fishkin summed it up nicely: On Google's announcement that "30xs pass pagerank" -- be wary. Test. Don't assume. Pagerank isn't the only or most important ranking signal. Google's made announcements like this before that later showed to work differently in the real world. Pays to be a skeptic in our field.
Q: If I migrate my site to HTTPS, will I keep all my traffic?
A: Maybe.
Here's the thing about HTTPS migrations: they're complicated. A little backstory: Google wants the entire web to switch to HTTPS. To this end, it announced a small rankings boost to encourage sites to make the switch. The problem was that a lot of webmasters weren't willing to trade a tiny rankings boost for the 15% loss in link equity they would experience by 301 redirecting their entire site. This appears to be the reason Google made the switch to 301s not losing PageRank. Even without PageRank issues, HTTPS migrations can be incredibly complicated, as Wired discovered to their dismay earlier this year. It's been over a year since we migrated Moz.com, and we're glad we did, but there were lots of moving parts in play and the potential for lots of things to go wrong. So as with any big project, be aware of the risks as well as the rewards.
Case study: Does it work?
Unknowingly, I had the chance to test Google's new 3xx PageRank rules when migrating a small site a few months ago. (While we don't know when Google made the change, it appears it's been in place for a while now.) This particular migration not only moved to HTTPS, but to an entirely new domain as well. Other than the URLs, every other aspect of the site remained exactly the same: page titles, content, images, everything. That made it the perfect test. Going in, I fully expected to see a drop in traffic due to the 15% loss in PageRank. Below in the image, you can see what actually happened to my traffic. Instead of the expected decline, traffic actually saw a boost after the migration. Mind. Blown.
This could possibly be from the small boost that Google gives HTTPS sites, though we can't be certain. Certainly this one small case isn't enough to prove decisively how 301s and HTTPS migrations work, but it's a positive sign in the right direction.
The New Best Practices
While it's too early to write the definitive new best practices, there are a few salient points to keep in mind about Google's change to how PageRank passes through 3xx redirects.
When in doubt, see Best Practice #1. Happy redirecting! via The Moz Blog http://tracking.feedpress.it/link/9375/4030137
Below is what happened in search today, as reported on Search Engine Land and from other places across the web. The post SearchCap: AdWords reports, CTR data & Google Maps ads appeared first on Search Engine Land.
Please visit Search Engine Land for the full article. via Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing http://searchengineland.com/searchcap-adwords-reports-ctr-data-google-maps-ads-254806
The agency looked at expanded text ad performance from both brand and non-brand traffic. The post Merkle’s early data on expanded text ad CTRs: results are mixed appeared first on Search Engine Land.
Please visit Search Engine Land for the full article. via Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing http://searchengineland.com/expanded-text-ads-ctr-early-results-merkle-254766
Google Maps ads are changing to help local businesses become more visible. Columnist Will Scott discusses the four features you should be most excited about. The post Excited about Google’s new map ads? You should be! appeared first on Search Engine Land.
Please visit Search Engine Land for the full article. via Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing http://searchengineland.com/excited-googles-new-map-ads-254629
New to the world of search engine optimization (SEO)? Columnist John Lincoln explains some things you might not know about this online marketing discipline. The post 9 things most people don’t understand about SEO appeared first on Search Engine Land.
Please visit Search Engine Land for the full article. via Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing http://searchengineland.com/9-things-people-dont-understand-seo-252839
In this week’s Search In Pictures, here are the latest images culled from the web, showing what people eat at the search engine companies, how they play, who they meet, where they speak, what toys they have and more. Google’s Gary Illyes in scary clown mask Source: Twitter Real Google...
Please visit Search Engine Land for the full article. via Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing http://searchengineland.com/search-pics-nba-players-google-pokemon-go-gamers-google-koolaid-254751
Should SEOs and Marketers Continue to Track and Report on Keyword Rankings? - Whiteboard Friday
7/29/2016
Posted by randfish
Is the practice of tracking keywords truly dying? There's been a great deal of industry discussion around the topic of late, and some key points have been made. In today's Whiteboard Friday, Rand speaks to the biggest challenges keyword rank tracking faces today and how to solve for them.
Video Transcription
Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we're going to chat about keyword ranking reports. There have been a few articles that have come out recently on a number of big industry sites around whether SEOs should still be tracking their keyword rankings.
The case against — and solutions for — keyword ranking data
A. People, places, and things
So let's start with the case against keyword ranking data. First off, "keyword ranking reports are inaccurate." There's personalization, localization, and device type, and these bias results and remove any "one true ranking." We've done a bunch of analyses of these, and this is absolutely the case. Personalization, it turns out, doesn't change rankings that much on average. For an individual it can change rankings dramatically. If they visited your website before, they could be historically biased toward you. Or if they visited your competitor's, they could be biased that way. Their previous search history might have biased them in a single session, those kinds of things. But with the removal of Google+ from search results, personalization is actually not changing things as dramatically as it used to. Localization, though, is still huge, absolutely, and device differences, still huge.
Solution
But we can address this, and the way to do that is by tracking these things separately. So here you can see I've got a ranking report that shows me my mobile rankings versus my desktop rankings. I think this is absolutely essential. Especially if you're getting a lot of traffic from both mobile and desktop search, you need to be tracking those separately. Super smart. Of course we should do that. We can do the same thing on the local side as well. So I can say, "Here, look. This is how I rank in Seattle. Here's how I rank in Minneapolis. Here's how I rank in the U.S. with no geographic personalization," if Google were to do that. Those types of rankings can also be pretty good. It is true that local rank tracking has gotten a little more challenging, but we've seen that folks like, well, Moz itself, but also STAT (GetStat), SERPs.com, and Search Metrics have all adjusted their rank tracking methodologies in order to have accurate local rank tracking. It's pretty good. Same with device type, pretty darn good.
B. Keyword value estimation
Another big problem expressed by a number of folks here is that we no longer know how much traffic an individual keyword sends. Because we don't know how much an individual keyword sends, we can't really say, "What's the value of ranking for that keyword?" Therefore, why bother to even track keyword rankings? I think this is a little bit of spurious logic. The leap there doesn't quite make sense to me. But I will say this: if you don't know which keywords are sending you traffic specifically, you still know which pages are receiving search traffic. That is reported. You can get it in your Google Analytics, your Omniture report, whatever you're using, and then you can tie that back to keyword ranking reports showing which pages are receiving traffic from which keywords. Almost all of the rank tracking platforms, Moz included, have a report that shows you something like this.
It says, "Here are the keywords that we believe are likely to have sent these percentages of traffic to this page," based on the keywords that you're tracking, the pages that are ranking for them, and how much search traffic those pages receive.

Solution

So let's track that. We can look at pages receiving visits from search, and we can look at which keywords they rank for. Then we can tie those together, which gives us the ability to make not only a report like this, but a report that estimates the value contributed by content and by pages rather than by individual keywords. In a lot of ways, this is almost superior to our previous methodology of tracking by keyword. Keyword value can still be estimated through AdWords, through paid search, but this can be estimated on a content basis, which means you get credit for how much value the page has created, based on all the search traffic that's flowed to it, and where that sits in your attribution lifecycle of people visiting those pages.

C. Tracking rankings and keyword relevancy

Pages often rank for keywords that they aren't specifically targeting, because Google has gotten way better with user intent. So it can be hard or even impossible to track those rankings, because we don't know what to look for. Well, okay, I hear you. That is a challenge. This basically means we have to broaden the set of keywords we look at and accept that we're going to have to sample. We can't track every possible keyword, unless you have a crazy budget, in which case go talk to Rob Bucci up at STAT, and he will set you up with a huge campaign to track all your millions of keywords.

Solution

If you have a smaller budget, you have to sample, and you sample by sets of keywords. Like, these are my high conversion keywords — I'm going to assume I have a flower delivery business — so "flower delivery" and "floral gifts" and "flower arrangements for offices."
My long tail keywords, like "artisan rose varieties" and "floral alternatives for special occasions," and my branded keywords, like "Rand's Flowers" or "Flowers by Rand." I can create a bunch of different buckets like this, sample the keywords in them, and then track each bucket separately. Now I can see: these are sets of keywords where I've generally been moving up and receiving more traffic; these are sets where I've generally been moving down; these are sets that perform better or worse on mobile or desktop, or better or worse in certain geographic areas. Now I can really start to get true intelligence from there.

Don't let your keyword targeting — your keyword targeting meaning which keywords you're targeting on which pages — exclusively determine what you rank track. Sure, go ahead and take that list and put it in there, but then also do some more expansive keyword research to find the broader sets of search terms and phrases you should be monitoring. Now we can really solve this issue.

D. Keyword rank tracking with a purpose

This one, I think, is a pretty insidious problem. For many organizations, ranking reports are more of a historical artifact: we're not tracking them for a particular reason; we're tracking them because that's what we've always tracked, and/or because we think we're supposed to track them. Those are terrible reasons to track things. You should be looking for real value and actionability.

Solution

What I want you to do is identify the goals of rank tracking first, like: What do I want to solve? What would I do differently based on whether this data came back to me one way or another? If you don't have a great answer to that question, definitely don't bother tracking that thing. That should be the rule of all analytics.
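The bucket-based sampling described earlier can be sketched in a few lines: group sampled keywords into named sets, then compare each bucket's average position between two dates. This is a hypothetical illustration; every keyword and position below is invented:

```python
# Hypothetical sketch of bucket-based keyword sampling: compare each
# bucket's average rank between two snapshots. Negative movement means
# the bucket moved up (improved); all data here is invented.

buckets = {
    "high conversion": ["flower delivery", "floral gifts"],
    "long tail": ["artisan rose varieties", "floral alternatives for special occasions"],
    "branded": ["rand's flowers", "flowers by rand"],
}

def bucket_movement(buckets, ranks_before, ranks_after):
    """Average rank change per bucket across the keywords sampled in it."""
    movement = {}
    for name, keywords in buckets.items():
        deltas = [ranks_after[k] - ranks_before[k]
                  for k in keywords if k in ranks_before and k in ranks_after]
        movement[name] = sum(deltas) / len(deltas)
    return movement

before = {"flower delivery": 6, "floral gifts": 9, "artisan rose varieties": 14,
          "floral alternatives for special occasions": 20,
          "rand's flowers": 1, "flowers by rand": 2}
after = {"flower delivery": 4, "floral gifts": 7, "artisan rose varieties": 15,
         "floral alternatives for special occasions": 18,
         "rand's flowers": 1, "flowers by rand": 1}

movement = bucket_movement(buckets, before, after)
# The high-conversion bucket improved by two positions on average,
# while the long-tail and branded buckets barely moved.
```

The same per-bucket comparison can be run separately for each device or geography segment to answer the "better or worse on mobile" question.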
So if your goal is to say, "Hey, I want to be able to attribute a search traffic gain or loss to what I've done on my site or what Google has changed out there," that is crucially important. I think that's core to SEO. If you don't have that, I'm not sure how we can possibly do our jobs.

We attribute search traffic gains and losses by tracking broadly: a broad enough set of keywords, hopefully in enough buckets, to get a good sample; and by tracking the pages that receive that traffic, so we can see if a page goes way down in its search visits. We can look at, "Oh, what was that page ranking for? Oh, it was ranking for these keywords. Oh, they dropped." Or, "No, they didn't drop. But you know what? We looked in Google Trends, and the traffic demand for those keywords dropped," so we know this is a seasonality thing, or a fluctuation in demand, or those types of things. And we can track by geography and device, so that we can say, "Hey, we lost a bunch of traffic. Oh, we're no longer mobile-friendly." That is a problem. Or, "Hey, we're no longer ranking in this geography. Oh, that's because these two competitors came in and took over that market from us."

We could also identify pages that need only a small amount of work to produce a big change in traffic. So we could track pages that rank on page two for given keywords. If we have a bunch of those, we can say, "Hey, maybe just a few on-page tweaks, a few links to these pages, and we could move up substantially." We had a previous Whiteboard Friday on doing that with internal linking and have seen some remarkable results. We can also track keywords that rank in position four to seven on average.
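Filtering a rank report for those two opportunity classes, page-two pages and position-four-to-seven keywords, can be sketched as follows. The report rows are hypothetical and invented for illustration:

```python
# Hypothetical sketch: flag "quick win" opportunities from a rank report,
# i.e. keywords averaging positions 4-7 and pages stuck on page two
# (positions 11-20). All rows below are invented.

report = [
    {"keyword": "flower delivery", "page": "/delivery", "avg_position": 5.2},
    {"keyword": "floral gifts", "page": "/gifts", "avg_position": 14.0},
    {"keyword": "flowers by rand", "page": "/", "avg_position": 1.1},
    {"keyword": "office flower arrangements", "page": "/office", "avg_position": 6.8},
]

def quick_wins(report):
    """Keywords in positions 4-7: small pushes here can multiply traffic."""
    return [row["keyword"] for row in report if 4 <= row["avg_position"] <= 7]

def page_two_pages(report):
    """Pages ranking 11-20: a few tweaks or links may lift them to page one."""
    return sorted({row["page"] for row in report if 11 <= row["avg_position"] <= 20})
```

Running `quick_wins(report)` on the sample data surfaces "flower delivery" and "office flower arrangements" as the candidates worth a small optimization push.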
Those are your big wins, because if you can move up from position four, five, six, or seven to one, two, or three, you can double or triple the search traffic you receive from those keywords. You should also track long tail, untargeted keywords. If you've got a long tail bucket, like we've got up here, I can say, "Aha, I don't have a page that's even targeting any of these keywords. I should make one. I could probably rank very easily, because I have an authoritative website and some good content," and that's really all you might need.

We might also look at up-and-coming competitors. I want to track who's in my space and who might be creeping up. So I should track the most common domains that rank on page one or two across my keyword sets. I can track specific competitors too. I might say, "Hey, Joel's Flower Delivery Service looks like it's doing really well. I'm going to set them up as a competitor and track their rankings specifically." Or you could use something like SEMrush and see specifically: what are all the keywords they rank for that you don't rank for?

This type of data, in my view, is still tremendously important to SEO, no matter what platform you're using. But if you're having these problems, or if these problems are being expressed to you, now you have some solutions. I look forward to your comments. We'll see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!

via The Moz Blog http://tracking.feedpress.it/link/9375/4003716
Based on AdWords cross-device conversion data, the new reports show device influence through the full conversion path. The post AdWords gains 3 new cross-device attribution reports appeared first on Search Engine Land.
Please visit Search Engine Land for the full article. via Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing http://searchengineland.com/adwords-gains-3-new-cross-device-attribution-reports-254725