Your readers download your mobile app? Well, Google won't be sending them to those deep links if you have AMP URLs active on your web site. The post Google will show AMP URLs before App deep link URLs in mobile results appeared first on Search Engine Land.
Please visit Search Engine Land for the full article. via Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing http://searchengineland.com/google-will-show-amp-urls-app-deep-link-urls-mobile-results-259204
Posted by Joe.Robison

A lot has changed in the five years since I first wrote about what was then Google Webmaster Tools, now named Google Search Console. Google has unleashed significantly more data that promises to be extremely useful for SEOs. Since we’ve long since lost sufficient keyword data in Google Analytics, we’ve come to rely on Search Console more than ever. The “Search Analytics” and “Links to Your Site” sections are two of the top features that did not exist in the old Webmaster Tools.

While we may never be completely satisfied with Google’s tools and may occasionally call their bluffs, they do release some helpful information (from time to time). To their credit, Google has developed more help docs and support resources to aid Search Console users in locating and fixing errors. Some of this isn’t as fun as creating 10x content or watching your keywords jump in the rankings, but this category of SEO is still extremely important. Looking at it through Portent’s epic visualization of how Internet marketing pieces fit together, fixing crawl errors in Search Console fits squarely into the "infrastructure" piece.

If you can develop good habits and practice preventative maintenance, weekly spot checks on crawl errors will be perfectly adequate to keep them under control. However, if you fully ignore these (pesky) errors, things can quickly go from bad to worse.

Crawl Errors layout

One change that has evolved over the last few years is the layout of the Crawl Errors view within Search Console. The Crawl Errors view is divided into two main sections: Site Errors and URL Errors. Categorizing errors this way is helpful because there’s a distinct difference between errors at the site level and errors at the page level. Site-level issues can be more catastrophic, with the potential to damage your site’s overall usability. URL errors, on the other hand, are specific to individual pages, and are therefore less urgent.
The quickest way to access Crawl Errors is from the dashboard. The main dashboard gives you a quick preview of your site, showing you three of the most important management tools: Crawl Errors, Search Analytics, and Sitemaps. You can get a quick look at your crawl errors from here. Even if you just glance at it daily, you’ll be much further ahead than most site managers.

1. Site Errors

The Site Errors section shows you errors from your website as a whole. These are the high-level errors that affect your site in its entirety, so don’t skip these. In the Crawl Errors dashboard, Google will show you these errors for the last 90 days. If you have some type of activity from the last 90 days, your snippet will look like this:

If you’ve been 100% error-free for the last 90 days with nothing to show, it will look like this:

That’s the goal — to get a “Nice!” from Google. As SEOs we don’t often get any validation from Google, so relish this rare moment of love.

How often should you check for site errors?

In an ideal world, you would log in daily to make sure there are no problems here. It may get monotonous since most days everything is fine, but wouldn’t you kick yourself if you missed some critical site errors? At the extreme minimum, you should check at least every 90 days to look for previous errors so you can keep an eye out for them in the future — but frequent, regular checks are best. We’ll talk about setting up alerts and automating this part later, but know that this section is critical: you should be 100% error-free here every day. There’s no gray area.

A) DNS Errors

What they mean

DNS errors are important, and the implications for your website are huge if you have severe versions of them. DNS (Domain Name System) errors are listed first and most prominently because if Googlebot is having DNS issues, whether a DNS timeout or a DNS lookup failure, it can’t connect with your domain at all.
Your domain is likely hosted with a common domain registrar, like Namecheap or GoDaddy, or with your web hosting company. Sometimes your domain is hosted separately from your website hosting company; other times the same company handles both.

Are they important?
While Google states that many DNS issues still allow Google to connect to your site, if you’re getting a severe DNS issue you should act immediately. There may be high-latency issues that allow Google to crawl the site but provide a poor user experience. A DNS issue is extremely important, as it’s the first step in accessing your website. You should take swift action if you’re running into DNS issues that prevent Google from connecting to your site in the first place.

How to fix
Other tools
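Alongside dedicated DNS tools, the lookup step itself can be spot-checked with a few lines of code. Here is a minimal sketch (Python standard library only; `check_dns` is an illustrative helper, not a Google tool) that mimics the resolution step Googlebot performs before anything else:

```python
import socket

def check_dns(domain, timeout=5.0):
    """Resolve a domain the way a crawler would before connecting.

    Returns a sorted list of resolved IP addresses, or an empty list
    if the lookup fails -- roughly analogous to the DNS lookup errors
    Search Console reports.
    """
    socket.setdefaulttimeout(timeout)
    try:
        infos = socket.getaddrinfo(domain, 80, proto=socket.IPPROTO_TCP)
        # Each entry's last element is (ip, port); dedupe the IPs.
        return sorted({info[4][0] for info in infos})
    except (socket.gaierror, socket.timeout):
        return []
```

If this returns an empty list for your domain when run from several different networks, the problem is likely at your DNS host rather than anything on Google's side.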
B) Server Errors

What they mean

A server error most often means that your server is taking too long to respond and the request times out. The Googlebot that's trying to crawl your site can only wait a certain amount of time for your website to load before it gives up.

Server errors are different from DNS errors. A DNS error means Googlebot can’t even look up your URL; a server error means that although Googlebot can connect to your site, it can’t load the page. Server errors may happen if your website gets overloaded with too much traffic for the server to handle. To avoid this, make sure your hosting provider can scale up to accommodate sudden bursts of website traffic. Everybody wants their website to go viral, but not everybody is ready!

Are they important?

Like DNS errors, a server error is extremely urgent. It’s a fundamental error that harms your site overall. You should take immediate action if you see server errors in Search Console. Making sure Googlebot can connect via DNS is an important first step, but you won’t get much further if your website doesn’t actually load. If you’re running into server errors, Googlebot won’t find anything to crawl and will give up after a certain amount of time.

How to fix

If your website is running fine by the time you encounter this error, it may mean there were server errors in the past. Though the error may have been resolved for now, you should still make some changes to prevent it from happening again. This is Google’s official direction for fixing server errors:

“Use Fetch as Google to check if Googlebot can currently crawl your site.
If Fetch as Google returns the content of your homepage without problems, you can assume that Google is generally able to access your site properly.”

Before you can fix your server errors, you need to diagnose which specific type of server error you’re getting, since there are many types:
Addressing how to fix each of these is beyond the scope of this article, but you should reference Google Search Console help to diagnose specific errors.

C) Robots failure

A robots failure means that Googlebot cannot retrieve your robots.txt file, located at [yourdomain.com]/robots.txt.

What they mean

One of the most surprising things about the robots.txt file is that it’s only necessary if you don’t want Google to crawl certain pages. From Search Console help, Google states:

“You need a robots.txt file only if your site includes content that you don't want search engines to index. If you want search engines to index everything in your site, you don't need a robots.txt file — not even an empty one. If you don't have a robots.txt file, your server will return a 404 when Googlebot requests it, and we will continue to crawl your site. No problem.”

Are they important?

This is a fairly important issue. For smaller, more static websites without many recent changes or new pages, it’s not particularly urgent, but the issue should still be fixed. If your site is publishing or changing content daily, however, this is urgent: if Googlebot cannot load your robots.txt, it’s not crawling your website, and it’s not indexing your new pages and changes.

How to fix

Ensure that your robots.txt file is properly configured. Double-check which pages you’re instructing Googlebot not to crawl, as all others will be crawled by default. Triple-check the all-powerful line “Disallow: /” and ensure that line DOES NOT exist unless for some reason you do not want your website to appear in Google search results. If your file seems to be in order and you’re still receiving errors, use a server header checker tool to see if your file is returning a 200 or a 404.

What’s interesting about this issue is that it’s better to have no robots.txt at all than to have one that’s improperly configured. If you have none at all, Google will crawl your site as usual.
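A server header check of robots.txt also lends itself to a small script. Below is a rough sketch (Python standard library; `audit_robots_txt` and the URL you pass it are illustrative, not a Google tool) covering both failure modes: a robots.txt that returns an error status, and one that accidentally disallows everything:

```python
import urllib.request
import urllib.error

def disallows_everything(robots_txt):
    """True if the file contains a blanket 'Disallow: /' rule."""
    for line in robots_txt.splitlines():
        rule = line.split("#")[0].strip().lower()  # drop comments/whitespace
        if rule.replace(" ", "") == "disallow:/":
            return True
    return False

def audit_robots_txt(base_url):
    """Fetch /robots.txt and report its status code and whether it
    blocks the whole site. A 404 here is fine; 5xx statuses are not."""
    url = base_url.rstrip("/") + "/robots.txt"
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            body = resp.read().decode("utf-8", "replace")
            return {"status": resp.status,
                    "blocks_all": disallows_everything(body)}
    except urllib.error.HTTPError as e:
        return {"status": e.code, "blocks_all": False}
```

A result of `{"status": 404, "blocks_all": False}` matches Google's "no problem" case above, while a 5xx status or `blocks_all: True` is exactly the misconfiguration to fix.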
If you have one returning errors, Google will stop crawling until you fix the file. For being only a few lines of text, the robots.txt file can have catastrophic consequences for your website. Make sure you’re checking it early and often.

2. URL Errors

URL errors are different from site errors because they only affect specific pages on your site, not your website as a whole. Google Search Console will show you the top URL errors per category: desktop, smartphone, and feature phone. For large sites, this may not be enough data to show all the errors, but for the majority of sites it will capture all known problems.

Tip: Going crazy with the amount of errors? Mark all as fixed.

Many site owners have run into the issue of seeing a large number of URL errors and getting freaked out. The important thing to remember is that a) Google ranks the most important errors first, and b) some of these errors may already be resolved. If you’ve made drastic changes to your site to fix errors, or believe a lot of the URL errors are no longer happening, one tactic is to mark all errors as fixed and check back on them in a few days. Your errors will be cleared out of the dashboard for now, but Google will bring them back the next time it crawls your site over the next few days. If you had truly fixed the errors, they won’t show up again; if they still exist, you’ll know they’re still affecting your site.

A) Soft 404

A soft 404 error is when a page returns 200 (found) when it should return 404 (not found).

What they mean

Just because your 404 page looks like a 404 page doesn’t mean it actually is one. The user-visible aspect of a 404 page is the content of the page: the visible message should let users know the page they requested is gone. Often, site owners will provide a helpful list of related links or a funny 404 response. The flipside of a 404 page is the crawler-visible response.
The header HTTP response code should be 404 (not found) or 410 (gone). A quick refresher on how HTTP requests and responses look:

If you’re returning a 404 page and it’s listed as a soft 404, it means the header HTTP response code is not 404 (not found). Google recommends “that you always return a 404 (not found) or a 410 (gone) response code in response to a request for a non-existing page.”

Another situation in which soft 404 errors may show up is when pages 301 redirect to non-related pages, such as the home page. Google doesn’t seem to state explicitly where the line is drawn, mentioning it only in vague terms. Officially, Google says this about soft 404s:

“Returning a code other than 404 or 410 for a non-existent page (or redirecting users to another page, such as the homepage, instead of returning a 404) can be problematic.”

Although this gives us some direction, it’s unclear when it’s appropriate to redirect an expired page to the home page and when it’s not. In my own experience, if you redirect large numbers of pages to the home page, Google can interpret those redirected URLs as soft 404s rather than true 301 redirects. Conversely, if you redirect an old page to a closely related page instead, it’s unlikely you’ll trigger the soft 404 warning in the same way.

Are they important?

If the pages listed as soft 404 errors aren’t critical pages, and you’re not eating up your crawl budget with them, these aren’t urgent to fix. But if crucial pages on your site are listed as soft 404s, you’ll want to take action. Important product, category, or lead-gen pages shouldn’t be listed as soft 404s if they’re live pages. Pay special attention to pages critical to your site’s moneymaking ability. If you have a large number of soft 404 errors relative to the total number of pages on your site, you should take swift action.
You can be eating up your precious Googlebot crawl budget by allowing these soft 404 errors to persist.

How to fix

For pages that no longer exist:
For live pages that are not supposed to be soft 404s:
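The key diagnostic in either case is the crawler-visible response code, not what the page looks like. A small sketch (Python standard library; the `classify` labels are my own shorthand, not Google's terminology, and the no-redirect handler exists so 3xx codes surface instead of being silently followed):

```python
import urllib.request
import urllib.error

class _NoRedirect(urllib.request.HTTPRedirectHandler):
    """Stop urllib from following redirects, so we see the raw 3xx code."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

def header_status(url, timeout=10):
    """Return the HTTP status code the server actually sends for a URL."""
    opener = urllib.request.build_opener(_NoRedirect())
    req = urllib.request.Request(url, method="HEAD")
    try:
        with opener.open(req, timeout=timeout) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code  # 3xx/4xx/5xx all land here with NoRedirect installed

def classify(status):
    """Rough triage of a response code when hunting soft 404s."""
    if status in (404, 410):
        return "gone"       # correctly reported as not found
    if status == 200:
        return "live"       # a soft-404 candidate if the page should be gone
    if 300 <= status < 400:
        return "redirect"   # check it points somewhere closely related
    return "other"
```

A page whose body says "not found" but whose header classifies as "live" is exactly the soft 404 case Google describes.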
Soft 404s are strange errors. They cause a lot of confusion because they tend to be a hybrid of 404 and normal pages, and what causes them isn’t always clear. Ensure the most critical pages on your site aren’t throwing soft 404 errors, and you’re off to a good start!

B) 404

A 404 error means that Googlebot tried to crawl a page that doesn’t exist on your site. Googlebot finds 404 pages when other sites or pages link to a non-existent page.

What they mean

404 errors are probably the most misunderstood crawl error. Whether it’s an intermediate SEO or the company CEO, the most common reaction is fear and loathing of 404 errors. Google clearly states in their guidelines:

“Generally, 404 errors don't affect your site's ranking in Google, so you can safely ignore them.”

I’ll be the first to admit that “you can safely ignore them” is a pretty misleading statement for beginners. No — you cannot ignore them if they are 404 errors for crucial pages on your site. (Google does practice what it preaches in this regard: going to google.com/searchconsole returns a 404 instead of a helpful redirect to google.com/webmasters.)

Distinguishing between times when you can ignore an error and times when you’ll need to stay late at the office to fix something comes from deep review and experience, but Rand offered some timeless advice on 404s back in 2009:

“When faced with 404s, my thinking is that unless the page:

The hard work comes in deciding what qualifies as important external links and a substantive quantity of traffic for your particular URL on your particular site. Annie Cushing also prefers Rand’s method, and recommends:

“Two of the most important metrics to look at are backlinks, to make sure you don’t lose the most valuable links, and total landing page visits in your analytics software. You may have others, like looking at social metrics.
Whatever you decide those metrics to be, you want to export them all from your tools du jour and wed them in Excel.”

One other thing to consider, not mentioned above, is offline marketing campaigns, podcasts, and other media that use memorable tracking URLs. It could be that your new magazine ad doesn’t come out until next month, and the marketing department forgot to tell you about an unimportant-looking URL (example.com/offer-20) that’s about to be plastered in tens of thousands of magazines. Another reason for cross-department synergy.

Are they important?

This is probably one of the trickiest of all the errors, yet also one of the simplest. The vast quantity of 404s that many medium-to-large sites accumulate is enough to deter action. 404 errors are very urgent if important pages on your site are showing up as 404s. Conversely, as Google says, if a page is long gone and doesn’t meet the quality criteria above, let it be. As painful as it might be to see hundreds of errors in your Search Console, you just have to ignore them; unless you get to the root of the problem, they’ll continue showing up.

How to fix 404 errors

If your important page is showing up as a 404 and you don’t want it to be, take these steps:
In short, if your page is dead, make the page live again. If you don’t want that page live, 301 redirect it to the correct page.

How to stop old 404s from showing up in your crawl errors report

If your 404 error URL is meant to be long gone, let it die. Just ignore it, as Google recommends. But to prevent it from showing up in your crawl errors report, you’ll need to do a few more things. As yet another indication of the power of links, Google will only show the 404 errors in the first place if your site or an external website is linking to the 404 page. In other words, if I type in your-website-name.com/unicorn-boogers, it won’t show up in your crawl errors dashboard unless I also link to it from my website.

To find the links to your 404 page, go to your Crawl Errors > URL Errors section, then click on the URL you want to fix. Search your page for the link; it’s often faster to view the source code of the page and find the link in question there. It’s painstaking work, but if you really want to stop old 404s from showing up in your dashboard, you’ll have to remove the links to that page from every page linking to it — even other websites.

What’s really fun (not) is if you’re getting links pointed to your URL from old sitemaps. You’ll have to let those old sitemaps 404 in order to totally remove them; don’t redirect them to your live sitemap.

C) Access denied

Access denied means Googlebot can’t crawl the page. Unlike a 404, Googlebot is prevented from crawling the page in the first place.

What they mean

Access denied errors commonly block Googlebot through these methods:
Are they important?

Similar to soft 404s and 404 errors, if the pages being blocked are important for Google to crawl and index, you should take immediate action. If you don’t want the page crawled and indexed, you can safely ignore the access denied errors.

How to fix

To fix access denied errors, you’ll need to remove the element that's blocking Googlebot's access:
While not as common as 404 errors, access denied issues can still harm your site's ranking ability if the wrong pages are blocked. Be sure to keep an eye on these errors and rapidly fix any urgent issues.

D) Not followed

What they mean

Not to be confused with a “nofollow” link directive, a “not followed” error means that Google couldn’t follow a particular URL. Most often these errors come from Google running into issues with Flash, JavaScript, or redirects.

Are they important?

If you’re dealing with not followed issues on a high-priority URL, then yes, these are important. If your issues stem from old URLs that are no longer active, or from parameters that aren't indexed and are just an extra feature, the priority is lower — but you should still analyze them.

How to fix

Google identifies the following as features that Googlebot and other search engines may have trouble crawling:
Use either the Lynx text browser or the Fetch as Google tool (using Fetch and Render) to view the site as Google would. You can also use a Chrome extension such as User-Agent Switcher to mimic Googlebot as you browse. If, as Googlebot, you don’t see the page load, or don’t see important content on the page because of some of the above technologies, you’ve found your issue. Without visible content and links to crawl on the page, some URLs can’t be followed. Be sure to dig in further and diagnose the issue to fix it.

For parameter crawling issues, review how Google is currently handling your parameters, and specify changes in the URL Parameters tool if you want Google to treat them differently. For not followed issues related to redirects, be sure to fix any of the following that apply:
Google used to include more detail in the Not Followed section, but as Vanessa Fox detailed in this post, a lot of extra data may be available in the Search Console API.

Other tools
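The user-agent comparison can also be scripted: fetch a page once as a normal browser and once as Googlebot, then compare the responses. A rough sketch (Python standard library; GOOGLEBOT_UA is Google's published user-agent string, while `looks_suspicious` is my own crude heuristic and nothing like Google's actual rendering pipeline):

```python
import urllib.request
import urllib.error

GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

def fetch_as(url, user_agent, timeout=10):
    """Fetch a URL with a given User-Agent; return (status, body)."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status, resp.read().decode("utf-8", "replace")
    except urllib.error.HTTPError as e:
        return e.code, ""

def looks_suspicious(normal, as_googlebot):
    """Flag big differences between the two fetches: a different status
    code, or a bot-visible body that shrinks by more than half."""
    (status_a, body_a), (status_b, body_b) = normal, as_googlebot
    if status_a != status_b:
        return True
    return len(body_b) < 0.5 * len(body_a)
```

This only catches crude serve-time differences; content injected by JavaScript after load still needs a rendering tool like Fetch and Render to diagnose.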
E) Server errors & DNS errors

Under URL errors, Google again lists server errors and DNS errors, the same categories as in the Site Errors report. Google's direction is to handle these the same way you would at the site level, so refer to those two sections above. The difference is that in the URL errors section the errors affect only individual URLs, not the site as a whole. If you have isolated configurations for individual URLs, such as minisites or a different configuration for certain URLs on your domain, they could show up here.

Now that you’re the expert on these URL errors, I’ve created a handy URL error table that you can print out and tape to your desktop or bathroom mirror.

Conclusion

I get it — some of this technical SEO stuff can bore you to tears. Nobody wants to individually inspect seemingly unimportant URL errors, or, conversely, have a panic attack over thousands of errors on your site. With experience and repetition, however, you will gain the mental muscle memory of knowing how to react to the errors: which are important and which can be safely ignored. It’ll be second nature pretty soon.

If you haven’t already, I encourage you to read up on Google’s official documentation for Search Console, and keep these URLs handy for future questions:
This post covers only the Crawl Errors section of Search Console. Search Console is a data beast on its own, so for further reading on how to make the best use of the tool in its entirety, check out these other guides:
Google has generously given us one of the most powerful (and free!) tools for diagnosing website errors. Not only will fixing these errors help you improve your rankings in Google, it will also provide a better user experience to your visitors and help you meet your business goals faster.

Your turn: What crawl error issues and wins have you experienced using Google Search Console?

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read! via The Moz Blog http://tracking.feedpress.it/link/9375/4462399
Below is what happened in search today, as reported on Search Engine Land and from other places across the web. The post SearchCap: AMP live in Google, Bing partners with CBS & EU copyright appeared first on Search Engine Land.
We’re only six weeks away from MarTech Europe in London, November 1-2, and I’m thrilled with the program — an incredible roster of speakers bringing deep insights and experience across the intersecting fields of marketing, technology, and management. I’m excited to give you a preview of what the...
The deal is aimed at boosting mobile search volume and share for the Bing network. The post Bing partners with CBS Interactive in syndication deal appeared first on Search Engine Land.
Google reiterated that the move does not include a rankings change. The post AMP — Accelerated Mobile Pages — begin global rollout in Google mobile search results appeared first on Search Engine Land.
In this helpful how-to, columnist Todd Saunders explains how to structure your AdWords account so as to glean useful insights about your target audience. The post The junior data scientist’s guide to AdWords search campaign structure: how to mine hidden gems for huge wins appeared first on...
Columnist Ryan Shelley believes that good long-tail keyword targeting is all about knowing your audience -- something Netflix excels at. The post What Netflix can teach us about long-tail keyword research appeared first on Search Engine Land.
Act now: your chance to learn actionable SEO and SEM tactics ends soon. SMX East is next week! (9/20/2016)
SMX East kicks off in just a few days! Why settle for flat SEO and SEM performance? Get actionable SEO and SEM tactics and best practices in retargeting, AdWords scripts, backlinks, adaptable content, and more. View the exceptional content in our agenda, then register for the ultimate search...
Posted by dohertyjf

I remember when I first started in SEO full-time back in 2010. It feels like forever ago and yesterday at the same time. I was constantly plugged into the SEO Twitter firehose of information. I subscribed to the popular SEO blogs of the day, soaking up information about SEO that wasn’t even relevant to my day job at the time, building links. While I read plenty of content about link acquisition, I also went deep into the geeky sides of technical SEO because it appealed to my web developer background.

Every week or two, Google was announcing something new: some new feature, some new snippet, some new ad type, some new way of getting your pages and sites indexed faster and making them stand out from the crowd. I remember SMX 2012 in New York City, where I sat in on a session in which now-former Mozzer Matt Brown spoke on Schema.org and counseled all of us to hop on the Schema bandwagon because it was the future of search. You can see that presentation here, and I’ll reference it a few times in this post.

Five years later, I can look back and say, “Yes, they were right. Schema has stuck around, proven to be a stronger and stronger part of search algorithms, and you should learn and implement it if you haven't already.” It works and we know that now in 2016, but back in 2012 it was new and took a lot of effort to implement. And so many people simply didn’t.

So how can you, as either a small business owner dabbling in SEO (while also doing all the things as the owner) or a professional SEO/digital marketer, know when you should implement something that's brand-new, and when you should wait until you have more data?
Is there a history of it?

Google is almost twenty years old, if you can believe that. They’ve been around a long time, built a huge business, and changed the way the world’s information is organized, found, and consumed. Google is a once-in-a-lifetime company, and I say that as someone with a love/hate relationship with them (alongside many other SEOs/digital marketers). In spite of their growth and current size, their mission has always been the same:

Google’s mission is to organize the world’s information and make it universally accessible and useful.

This is at Google’s core. Google has moved into other areas, such as social, but hasn't seen great success there because they're better at organizing content than creating it. Check out this slide from Matthew Brown’s talk:

The Authorship program was killed in 2014 (post here on SEL), though the idea behind it (identifying who wrote what, and where, online) lives on to help Google organize the world’s information better. This is a great example of something that everyone said you *should* do (and that maybe helped with clickthrough rates short-term), but which Google eventually killed because it was a new initiative. You would have been much better served spending your time writing around the Internet and marketing your company than just trying to get an image in the SERPs.

Are others already implementing it?

I hate the United States culture of consumerism and keeping up with the Joneses. Why do we feel the need to spend money we don’t have to buy things we don’t want to impress people we don’t really like (a paraphrase from here)? The same thing happens in digital marketing. If we see someone implementing something, we should rightly ask "Why are they doing that?", then make our own decisions.
The interesting thing — just like with impressing our neighbors — is that sometimes (but not always) they will have the inside line on something great that a) you can afford (aka can get done for your company) and b) is in line with your strategy and values (aka you’re true to yourself).

HTTPS is one such example. If you’re a business with customers (which all of you are, because how do you make money without customers? If you can, I’d like to speak with you), then you care about them and want them to be safe and happy. While HTTPS takes time to deploy on large websites, and can have very real challenges, as Wired is learning the hard way, on smaller sites it can be much simpler and implemented more quickly. You may not see a bump in rankings, traffic, or revenue right away, but you can be sure that HTTPS is something Google wants to reward, and is beginning to.

Finally, if you see something rolled out and not many people are implementing it, ask why. If it’s because it’s difficult technically, but you can get it done fast and it’s true to your strategy, then get it done — it'll help you get ahead of the pack. If it requires a huge undertaking, however, take your time and wait until the barrier to entry is lower or until the search engines finally start making good on their promises.

Is it a continuation or a new initiative for Google?

Earlier I mentioned Google’s core mission of organizing the world’s information. This is why Google was initially created, and it's what they still do incredibly well. Over time, they’ve (finally) taken the user into account and realized that offering a great user experience benefits their bottom line. User experience (and design!) has become part of their core. There are a few things Google is terrible at, such as social and content. They’re also terrible at launching software that works really well and can displace incumbents.
Google Flights is great, but online travel agencies (OTAs) like Expedia are still winning, even as Google puts itself above the organic results. That’s just one example.

If it’s a brand-new initiative that Google has not previously gone after, be very suspicious. I like the "hurry up and wait" approach here — hurry up to learn all that you can about it, but wait on implementing it, especially if you're a small company with a million things to do already. Stay true to your strategy.

If it’s a continuation of something they've already been doing and gotten traction on, then you should take more notice and seriously consider how you can implement it for your company. Take, for example, the recent rollout of AMP (Accelerated Mobile Pages), which essentially allows Google to display a cached version of your page to mobile users so that it loads quickly and makes users happy(er). Google has said for years that they want above-the-fold content on mobile sites to load in under one second. AMP is a continuation of something they've been conveying for quite a while, a promised initiative they're finally making good on. Within mobile search results, we now see that sites that load quickly tend to rank better than they otherwise would. I’ve witnessed this firsthand on some of the sites I've touched: when engineers care about speed, your site makes both search engines and users very happy.

Is it passive or active?

Sometimes Google creates new initiatives within search that require no implementation on your end. They run tests all the time (SERPtests, from Conrad O’Connell, is a great resource) that affect the way your site shows up. Don’t assume that just because they’ve changed something, it’s in your best interest. Google is a business, and they exist to make themselves money, not you. As an SEO, you are not Google’s friend. So, once again, we hurry up to learn and then decide whether we should take action (adjust your meta descriptions, add Schema, etc.)
or just sit back and let the data accumulate to inform better decisions. The answer will always be different depending on your business, and I can't tell you whether you'll benefit from specific changes or not. But you're empowered to make that decision. If a new feature requires active development from your end, take the time to figure out why Google's made the change, what it might mean for the future, and how much work it's going to take to achieve the expected outcome. If you're a consultant and not helping your clients prioritize their work based on the predicted impact and the amount of effort, you're not doing your job. And if you're an in-house SEO in this boat, same message: you're not doing your job.

Does it fit with your current SEO strategy?

I've touched on this point a few times, but I consider it so important that it merits its own section. I've been a consultant since 2011. I've worked with businesses of all kinds and run marketing directly for a few bigger brands as well. I've seen companies with zero SEO strategy where we built it from scratch, and I've seen companies with an SEO strategy that was set years ago and hasn't changed. Neither of these is good. An SEO strategy should be set, to a degree. You should know what your business needs to do in order to rank and drive the business results needed from organic search. However, your strategy should not be so set that you're unable to implement new things that are both true to your business and will move the metrics needle. Know where you're going with your strategy and what your metrics are. By having those goals in mind and by putting in place processes that allow you to grow passively, you can confidently say "yes" or "no" to new features that may move the needle or may be a distraction.

What about first-mover advantage?

Now, I know there are a lot of people who believe that being a first mover is a great thing. And when you're launching a new business, this mindset is incredibly pervasive.
Everyone wants to "find the niche where no one is and be there to be the first mover." The problem is that first-mover advantage doesn't always exist. From a Harvard Business Review article: First-mover status can confer advantages, but it does not do so categorically. Much depends on the circumstances. I don't really believe in first-mover advantage, and as an entrepreneur, going into a completely new realm where no one else has gone before feels too risky to me. I'd rather take my time to learn from others who are trying to do something similar, figure out the unique angle on the business (whether the vertical or the business model), and then build something that users really want. This is called being wise (listening to others), not just smart (figuring it all out on your own). SEO is a constantly shifting industry. We're built on the back of a computer algorithm, after all. Because of this, things will change constantly, and all digital marketers need to develop a rubric through which they can decide whether a new feature or opportunity is worth their time, effort, and change of strategy long-term.
Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read! via The Moz Blog http://tracking.feedpress.it/link/9375/4454176
Europe sees a fairer marketplace, Google sees a bonanza for lawyers. The post New EU copyright rules: basic fairness or punitive media subsidy? appeared first on Search Engine Land.
Please visit Search Engine Land for the full article. via Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing http://searchengineland.com/new-eu-copyright-rules-basic-fairness-punitive-media-subsidy-259042
Posted by Scott Huffman, VP of Engineering
We have been investing in the core machine learning technologies that enable natural language interfaces for years. To continue that investment, we're excited to welcome API.AI to Google! API.AI has a proven track record for helping developers design, build and continuously improve their conversational interfaces. Over 60,000 developers are using API.AI to build conversational experiences, for environments such as Slack, Facebook Messenger and Kik, to name just a few. API.AI offers one of the leading conversational user interface platforms and they'll help Google empower developers to continue building great natural language interfaces. Stay tuned for more details on integrations into Google. And if you're already using API.AI, keep building your conversational interfaces, and if you're not, start today! via Google Developers Blog http://developers.googleblog.com/2016/09/making-conversational-interfaces-easier-to-build.html
Below is what happened in search today, as reported on Search Engine Land and from other places across the web. The post SearchCap: SEO vectors, Google data studio & more appeared first on Search Engine Land.
Please visit Search Engine Land for the full article. via Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing http://searchengineland.com/searchcap-seo-vectors-google-data-studio-259018
Currently in beta, Google Data Studio allows you to create branded reports with data visualizations to share with your clients. Columnist Sherry Bonelli explains the benefits and how to try it out. The post What is Google Data Studio and how can you use it? appeared first on Search Engine Land.
Please visit Search Engine Land for the full article. via Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing http://searchengineland.com/google-data-studio-258871
Contributor JR Oakes takes a look at technology from the natural language processing and machine-learning community to see if it's useful for SEO. The post Using word vectors and applying them in SEO appeared first on Search Engine Land.
Please visit Search Engine Land for the full article. via Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing http://searchengineland.com/word-vectors-implication-seo-258599

Posted by BrianChilds

Coming up with blog titles and topics can be a struggle. Most small businesses aim to publish blogs 3-10 times a month and then use these blog articles to populate everything from newsletters to conversion funnels. When you publish content on a regular basis, it's easy to burn through your initial list of blog titles in a few months. Coming up with good titles also takes a lot of time, and when you work on a team, defining what's "good" becomes subjective. Because regular blogging has such a positive impact on inbound traffic, the process of coming up with ideas shouldn't be a burden. Never worry about blog topics again: I'll show you how to generate 100+ long-tail blog title ideas that include estimates of search volume and competitiveness.

What makes a good blog title?

Before jumping into how to generate 100+ blog topics quickly, let's discuss the importance of having good titles. I think of blog content development as having two parts: blog articles that form the core of my SEO or inbound marketing strategy, and a backup list of blog ideas I can pull from in a pinch. Both types benefit from having great titles. Good topics generally follow some basic rules, including:
When it comes to generating a great backup list of blog topics quickly, it can be hard to identify titles that meet those criteria without succumbing to clickbait. There are several blog title generator tools available, but I find that they tend toward clickbait or "catchy" titles that are more useful for paid channels than for the long-term value expected from organic search. Some of the more popular blog title generators are:

HubSpot's Blog Topic Generator
Impact's BlogAbout Title Generator
Portent's Content Idea Generator

It should come as no surprise that there's been a backlash against clickbait titles recently. I recommend against using traditionally clickbait titles since they often result in only one type of beneficial metric: page views. To positively impact both search rank position and on-site conversions, you need to focus on valuable content that delivers high engagement, measured by things like better-than-average time on page, good page depth, and low bounce rates. Clickbait titles and content generally do not provide this.

A better way to generate

Okay, so let's take a look at a quick way to generate blog titles. Read it, try it, and time it.
Boom! There you have it. Never hunt for blog titles again. You've created a list you can choose from in a pinch, knowing you have quality titles based on search volume, difficulty, and opportunity. See how fast you can create a great list of blog titles!

More tips for professional marketers

As you analyze results from the Keyword Suggestions feature in Keyword Explorer, here are some additional things you can do to learn about your target customers:

Look for trends in the questions people ask. Do most questions center on a specific pain point, such as cost, quality, or ease of use? Consider segmenting your users based on these different pain points and their associated value drivers.

Find the "best question." In your list of blog titles, look for the one question that best aligns with your target customer. Then run a Keyword Explorer query on that question by selecting the magnifying glass icon on the right side of the webpage. Often, these results will display an even longer, more targeted list of questions to choose from.

Hope this helps your blogging efforts! Tell us about your experience using Keyword Explorer to generate targeted blog titles. If you want to keep mastering keywords and blog titles after your Moz Pro free trial ends, check out Moz Pro Medium or Keyword Explorer standalone subscriptions. via The Moz Blog http://tracking.feedpress.it/link/9375/4447697

Posted by sam.nemzer

As of June this year, Google is now grouping keyword volumes for similar keywords in Keyword Planner. I wanted to investigate whether or not this is having an impact on the pages that rank for these similar, grouped keywords.
My hypothesis is that, given that Google is associating keywords closely enough to group their volumes, we should expect that the search results would be very similar too.

What has Google changed and why does it matter?

The grouping of keyword volumes is a problem for anyone working in search because Keyword Planner is the primary source for volume data that we use in keyword research, whether that be from Keyword Planner directly, or through a third-party tool that takes Keyword Planner data as its input—such as SEMRush, BrightEdge or SearchMetrics. By "grouping keyword volumes," we mean that keywords that are slightly different (but generally convey the same meaning) are given the same volume, which represents the combined volume of every variation. For example, if (hypothetically) [SEO] is searched 21,000 times per month in the UK, and [Search Engine Optimisation] is searched 12,100 times per month, once these keywords are combined, each will be reported as receiving the total of the two—33,100 searches per month. On top of this, in the last few weeks Google have also been reducing access to Keyword Planner data for some accounts. Earlier this month, it was announced that Keyword Planner data will be given only in very broad buckets for advertisers with "lower monthly spend" (although some ways around this have been found). This is a separate change from the volume grouping, which is the main focus of this article. The fact that Google is grouping keyword volumes in this way implies that they see these keywords as equivalent, at least to some extent. The questions that this raised for me were:
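The combined-volume arithmetic in the [SEO] / [Search Engine Optimisation] example above can be sketched in a few lines of Python. The keywords and figures are the hypothetical ones from that example, not real Keyword Planner data:

```python
# Hypothetical per-keyword monthly UK volumes, as in the example above.
raw_volumes = {
    "seo": 21000,
    "search engine optimisation": 12100,
}

# Once Keyword Planner groups the variants, each keyword in the group
# reports the combined total, and the individual split is lost.
group_total = sum(raw_volumes.values())
grouped_volumes = {keyword: group_total for keyword in raw_volumes}

for keyword, volume in grouped_volumes.items():
    print(keyword, volume)  # both keywords now report 33100
```

This is why third-party tools built on Keyword Planner data inherit the same inflated, duplicated numbers: the per-variant breakdown simply isn't exposed.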
There is further reason to think this way given the simple fact that Google is always getting smarter. As well as Parsey McParseface, the English-language parser that Google released to the public, much of the research output that we see in patents and journal articles from Google relates to natural language processing, so it is clear that this is an area that Google see as a priority for their research. One way to test whether or not Google does indeed consider grouped keywords to be identical is to look at search results. The theory is that if keywords are viewed identically, we should see exactly the same pages ranking for the keywords.

What's going on in the SERPs?

I did a similar analysis a few months ago, which was focused more on general distinctions between keywords within a topic. This analysis is much more focused on the types of variations of keywords that we are seeing being grouped. These types of variations were categorised by, among others, Jennifer Slegg at The SEM Post. The five types of variations that I've looked into for this analysis are the following:
For each of these five categories, I put together a list of 50-100 keywords, along with a variation for each. Within these keyword pairs, I investigated whether or not Keyword Planner reported the same volume, and also used the rank tracking tool STAT to see what pages are ranking for each keyword. From that analysis, I was able to measure the prevalence of grouping keyword volumes within each category (i.e. the percentage of keyword pairs that have grouped volumes), and the similarity of the SERPs (the number of top ten results that were shared between the two keywords) for grouped and ungrouped keyword pairs.

Results

The results for those metrics are the following:

I also looked at how common it is that SERPs are exactly identical, that is, that the top ten results are the same pages, in the same order. This showed an interesting pattern. There are only two categories with significant numbers of identical SERPs—Punctuation and Typos. In the case of keywords with and without punctuation, you are more likely to see identical SERPs (implying that Google sees the pair of keywords as identical) if keyword volumes are grouped than if they are not. This is not a hard-and-fast rule, though – there are still some ungrouped keywords which have identical SERPs. In the case of Typos, there are no grouped keyword pairs at all that have identical SERPs. Given also the low prevalence of grouped keywords in this category, it appears that the identical SERPs are coming from "showing results for" SERPs, where Google replaces results for the mistyped keyword with the correct one.

What conclusions can we draw?
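The two SERP-similarity measures used in this analysis (the number of top-ten results shared between a keyword pair, and the stricter check for fully identical ordering) can be sketched in Python. The URLs below are placeholders for illustration, not real ranking data:

```python
def shared_results(serp_a, serp_b):
    """Count how many URLs appear in both top-ten lists, regardless of order."""
    return len(set(serp_a) & set(serp_b))

def identical_serps(serp_a, serp_b):
    """True only if the same pages rank in the same order."""
    return list(serp_a) == list(serp_b)

# Placeholder top-ten results for a hypothetical keyword pair.
serp_keyword = ["example.com/page%d" % i for i in range(1, 11)]
serp_variant = serp_keyword[:8] + ["other-site.com/a", "other-site.com/b"]

print(shared_results(serp_keyword, serp_variant))   # 8 of the top ten are shared
print(identical_serps(serp_keyword, serp_variant))  # False: the last two differ
```

Averaging `shared_results` across many keyword pairs in a category gives the per-category similarity figures, while `identical_serps` flags the "exactly identical" cases discussed above.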
What should we take away from this?

What does this mean for SEOs doing keyword research? Rank tracking companies such as STAT are looking into ways of splitting keyword volumes between the constituent keywords, so there is hope for at least semi-accurate volume data. What it does mean is that we should ignore the grouped volumes when targeting keywords—just because keywords are given the same volume, it doesn't mean you shouldn't target them individually on your site. On a wider scale, this tells us something about how the anthropomorphised "Google" thinks and works. There are two very separate factors at work here—what Google tells us, and what we actually see. This is something Rand picked up on in his recent Whiteboard Friday, and it applies across all of search—Google tells us one thing, but search rankings don't necessarily behave the same way. This backs up my belief that you should never take anything at face value, and should always do your own research. Do these results surprise you as much as they do me? Let me know in the comments. via The Moz Blog http://tracking.feedpress.it/link/9375/4445461
Below is what happened in search today, as reported on Search Engine Land and from other places across the web. The post SearchCap: Google algorithm update, iOS10 widgets & images in mobile appeared first on Search Engine Land.
Please visit Search Engine Land for the full article. via Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing http://searchengineland.com/searchcap-google-algorithm-update-ios10-widgets-images-mobile-258948
While conventional wisdom and recently published studies may hold that link building takes a long time to have a positive impact, columnist Conrad Saam begs to differ and shares four case studies. The post The immediate results of link building appeared first on Search Engine Land.
Please visit Search Engine Land for the full article. via Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing http://searchengineland.com/immediacy-link-building-258239
Getting a handle on the data for multiple-location businesses can be a significant challenge. Columnist Brian Smith provides step-by-step guidelines to making it happen. The post How to manage local listings for enterprise brands appeared first on Search Engine Land.
Please visit Search Engine Land for the full article. via Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing http://searchengineland.com/manage-local-listings-enterprise-brands-258531
Was there a major Google algorithm change this week? Many webmasters believe so. The post Google downplays the Google algorithm ranking update this week as “normal fluctuations” appeared first on Search Engine Land.
Please visit Search Engine Land for the full article. via Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing http://searchengineland.com/google-downplays-google-algorithm-ranking-update-week-normal-fluctuations-258923
Google is now showing images in the mobile search results for product-like queries. Do you like the new mobile search snippets? The post Google mobile search results now showing images in the snippets appeared first on Search Engine Land.
Please visit Search Engine Land for the full article. via Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing http://searchengineland.com/google-mobile-search-results-now-showing-images-snippets-258919
In this week's Search In Pictures, here are the latest images culled from the web, showing what people eat at the search engine companies, how they play, who they meet, where they speak, what toys they have and more. Bing Ads agency awards event: Source: Twitter Google baby bib: Source:...
Please visit Search Engine Land for the full article. via Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing http://searchengineland.com/search-pics-bing-ads-agency-awards-google-baby-bib-crossroads-258912
Google search could be on the iOS Search screen with a widget, if it wanted. Why isn't it offering that option to its users? The post There’s no Google Search widget for the iOS 10 Search screen: why that matters appeared first on Search Engine Land.
Please visit Search Engine Land for the full article. via Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing http://searchengineland.com/google-search-widget-ios-258885

Posted by rcancino

With all the data that today's marketers can access, there's often still no substitute for the quality of information you can get from interviewing real people. In today's Whiteboard Friday, we welcome Rebekah Cancino -- a partner at Phoenix-based Onward and #MozCon 2016 speaker -- to teach us the whys and hows of great interviews.

Video Transcription

Hi, Moz fans. I'm Rebekah Cancino. I'm a partner at Onward, and I lead content strategy and user experience design. Today I'm here to talk to you about how to support the data you have -- your keyword data, data around search intent, analytics -- with real-life user interviews. So recently, Rand has been talking a little more about the relationship between user experience design and SEO, whether it's managing the tensions between the two or the importance of understanding the path to customer purchase. He said that in order to understand that path, we have to talk to real people. We have to do interviews, whether that's talking to actual users or maybe just people inside your company that have an understanding of the psychographics and the demographics of your target audience, so people like sales folks or customer service reps. Now, maybe you're a super data-driven marketer and you haven't felt the need to talk to real people and do interviews in the past, or maybe you have done user interviews and you found that you got a bunch of obvious insights and it was a huge waste of time and money. I'm here to tell you that coupling your data with real interviews is always going to give you better results. But having interviews that are useful can be a little bit tricky. The interviews that you do are only as good as the questions you ask and the approach that you take.
So I want to make sure that you're all set and prepared to have really good user interviews. All it takes is a little practice and preparation. It's helpful to think of it like this. So the data is kind of telling us what happened. It can tell us about online behaviors, things like keywords, keyword volume, search intent. We can use tools, like KeywordTool.io or Ubersuggest or even Moz's Keyword Explorer, to start to understand that. We can look at our analytics, entry and exit pages, bounces, pages that get a lot of views -- all of that stuff is really important, and we can learn a lot from it. But with our interviews, what we're learning about is the why. This is the stuff that online data just can't tell us. This is about those offline behaviors, the emotions, beliefs, attitudes that drive the behaviors and ultimately the purchase decisions. So these two things working together can help us get a really great picture of the whole story and make smarter decisions. So say, for example, you have an online retailer. They sell mainly chocolate-dipped berries. They've done their homework. They've seen that most of the keywords people are using tend to be something like "chocolate dipped strawberries gifts" or "chocolate dipped strawberries delivered." And they've done the work to make sure that they've done their on-page optimization, and they're doing a lot of other smart things with that too. But then they also noticed that their Mother's Day packages and their graduation gifts are not doing so well. They're starting to see a lot of drop-offs around that product description page and a higher cart abandonment rate than usual. Now, given the data they had, they might make decisions like, "Well, let's see if we can do a little more on-page keyword optimization to reflect what's special about the graduation and Mother's Day gifts, or maybe we can refine the user experience of the checkout process."
But if they talk to some real users -- which they did, this is a real story -- they might learn that people who send food gift items, they worry about: Is the person I'm sending the gift to, are they going to be home when this gift arrives? Because this is a perishable item, like chocolate-dipped berries, will it melt? Now, this company, they do a lot of work to protect the berries. The box that they arrive in is super insulated. It's like its own cooler. They have really great content that tells that story. The problem is that content is buried in the FAQs instead of on the pages in places it matters most -- the product detail, the checkout flow. So you can see here how there's an opportunity to use the data and the interview insights together to make smarter decisions. You can get to insights like that for your organization too. Let's talk about some tips that are going to help you make smarter interview decisions. So the first one is to talk to a spectrum of users who represent your ideal audience. Maybe, like with this berry example, their ideal customer tends to skew slightly female. You would want that group of people, that you're talking to, to skew that way too. Perhaps they have a little more disposable income. That should be reflected in the group of people that you're interviewing and so forth. You get it. The next one is to ask day-in-the-life, open-ended questions. This is really important. If you ask typical marketing questions like, "How likely are you to do this or that?" or, "Tell me on a scale of 1 to 10 how great this was," you'll get typical marketing answers. What we want is real nuanced answers that tell us about someone's real experience. So I'll ask questions like, "Tell me about the last time you bought a food gift online? What was that like?" We're trying to get that person to walk us through their journey from the minute they're considering something to how they vet the solutions to actually making that purchase decision. 
Next is don't influence the answers. You don't want to bias someone's response by introducing an idea. So I wouldn't say something like, "Tell me about the last time you bought a food gift online. Were you worried that it would spoil?" Now I've set them on a path that maybe they wouldn't have gone on to begin with. It's much better to let that story unfold naturally. Moving on, dig deeper. Uncover the why, really important. Maybe when you're talking to people you realize that they like to cook and by sharing a food item gift with someone who's far away, they can feel closer to them. Maybe they like gifts to reflect how thoughtful they are or what good taste they have. You always want to uncover the underlying motives behind the actions people are taking. So don't be too rushed in skipping to the next question. If you hear something that's a little bit vague or maybe you see a point that's interesting, follow up with some probes. Ask things like, "Tell me more about that," or, "Why is that? What did you like about it?" and so on. Next, listen more than you talk. You have maybe 30 to 45 minutes max with each one of these interviews. You don't want to waste time by inserting yourself into their story. If that happens, it's cool, totally natural. Just find a way to back yourself out of that and bring the focus back to the person you're interviewing as quickly and naturally as possible. Take note of phrases and words that they use. Do they say things like "dipped berries" instead of "chocolate-dipped strawberries?" You want to pay attention to the different ways and phrases that they use. Are there regional differences? What kinds of words do they use to describe your product or service or experience? Are the berries fun, decadent, luxurious? By learning what kind of language and vocabulary people use, you can have copy, meta descriptions, emails that take that into account and reflect that. Find the friction.
So in every experience that we have, there's always something that's kind of challenging. We want to get to the bottom of that with our users so we can find ways to mitigate that point of friction earlier on in the journey. So I might ask someone a question like, "What's the most challenging thing about the last time you bought a food gift?" If that doesn't kind of spark an idea with them, I might say something even a little more broad, like, "Tell me about a time you were really disappointed in a gift that you bought or a food gift that you bought," and see where that takes them. Be prepared. Great interviews don't happen by accident. Coming up with all these questions takes time and preparation. You want to put a lot of thought into them. While asking questions that tell you about the nature of the whole journey, you want to be clear about your priorities. Know which questions are most important to you and know which ones are must-have pieces of information. That way you can use your time wisely while you still let the conversation flow where it takes you. Finally, relax and breathe. The people you're interviewing are only going to be as relaxed as you are. If you're stiff or overly formal or treating this like it's a chore and you're bored, they're going to pick up on that energy and they're probably not going to feel comfortable sharing their thoughts with you, or there won't be space for that to happen. Make sure you let them know ahead of time, like, "Hey, feel free to be honest. These answers aren't going to be shared in a way that can be attributed directly to you, just in aggregate." And have fun with it. Be genuinely curious and excited about what you're going to learn. They'll appreciate that too. So once you've kind of finished and you've wrapped up those interviews, take a step back. Don't get too focused or caught up on just one of the results. You want to kind of look at the data in aggregate, the qualitative data, and let it talk to you.
What stories are there? Are you seeing any patterns or themes that you can take note of, kind of like the theme around people being worried about the berries melting? Then you can organize those findings and make sure you summarize it and synthesize it in a way that the people who have to use those insights that you've gotten can make sense of. Make sure that you tell real stories and humanize this information. Maybe you recorded the interviews, which is always a really good idea. You can go back and pull out little sound bites or clips of the people saying these really impactful things and use that when you're presenting the data. So going back to that berry example, if you recall, we had that data around: Hey, we're seeing a lot of drop-offs on the product description page. We're seeing a higher cart abandonment rate. But maybe during the user interviews, we noticed a theme of people talking about how they obsessively click the tracking link on the packages, or they wait for those gift recipients to send them a text message to say, "Hey, I got this present." As you kind of unraveled why, you noticed that it had to do with the fact that these berries might melt and they're worried about that. Well, now you can elevate the content that you have around how those berries are protected in a little cooler-like box on the pages and the places it matters most. So maybe there's a video or an animated GIF that shows people how the berries are protected, right there in the checkout flow. I hope that this encourages you to get out there and talk to real users, find out about their context and use that information to really elevate your search data. It's not about having a big sample size or a huge survey. It's much more about getting to real life experiences around your product or service that adds depth to the data that you have. 
In doing that, hopefully you'll be able to increase some conversions and maybe even improve behavioral metrics, so those UX metrics that, I don't know, theoretically could lead to higher organic visibility anyway. That's all for now. Thanks so much. Take care. Video transcription by Speechpad.com via The Moz Blog http://tracking.feedpress.it/link/9375/4423946