Users will occasionally come to a page that doesn't exist on your site, either by following a broken link or typing in the wrong URL. Having a custom 404 page that kindly guides users back to a working page on your site can greatly improve a user's experience. Your 404 page should probably have a link back to your root page and could also provide links to popular or related content on your site. You can use Google Search Console to find the sources of URLs causing "not found" errors.
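As a minimal sketch of what this might look like, here's a custom 404 handler in a Flask app (Flask and the route paths are assumptions; the same idea applies to any framework):

```python
from flask import Flask

app = Flask(__name__)

@app.errorhandler(404)
def page_not_found(error):
    # Serve a friendly page instead of the server default, with a link
    # back to the root page and to popular content.
    body = (
        "<h1>Page not found</h1>"
        "<p>Sorry, we couldn't find that page.</p>"
        '<p><a href="/">Back to the homepage</a> or try our '
        '<a href="/popular">most popular articles</a>.</p>'
    )
    # Keep the real 404 status code in the response tuple.
    return body, 404

if __name__ == "__main__":
    app.run()
```

Note that the handler still returns a real 404 status code; serving a "not found" page with a 200 status creates a "soft 404" that search engines may keep in their index.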
Social media is one of the most popular free marketing tools around, and plays a role in driving traffic to your website. Use Facebook, Instagram, and LinkedIn to promote blog posts and other useful pages on your website. This way you can turn your social media audience into website visitors, and draw traffic from their networks if you post shareable content.
On another note, we recently went through this same process with an entire site redesign. The executive team demanded we cut over 75% of the pages on our site because they were useless to the visitor. It's been 60 days since the launch of the new site, and I've still been able to increase rankings, long-tail keywords, and even organic traffic. It took a bit of a "cowboy" mentality to get some simple things done (like using 301s instead of blocking the old content with robots.txt! See the sketch below). I predicted we would lose a lot of our long-tail keywords... but we haven't... yet!
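For anyone wondering what the 301 approach looks like in practice, here's a rough sketch in Flask (the old and new paths are hypothetical; any server or framework can do the same):

```python
from flask import Flask, redirect

app = Flask(__name__)

# Hypothetical mapping from removed pages to their closest surviving pages.
REDIRECTS = {
    "/old-services-page": "/services",
    "/2012/legacy-post": "/blog/updated-post",
}

@app.route("/<path:old_path>")
def legacy_redirect(old_path):
    target = REDIRECTS.get("/" + old_path)
    if target:
        # A 301 tells crawlers the move is permanent, so link equity and
        # long-tail rankings can transfer to the new URL instead of
        # evaporating behind a robots.txt block.
        return redirect(target, code=301)
    return "Not found", 404
```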
Robots.txt is not an appropriate or effective way of blocking sensitive or confidential material. It only instructs well-behaved crawlers that the pages are not for them, but it does not prevent your server from delivering those pages to a browser that requests them. One reason is that search engines could still reference the URLs you block (showing just the URL, no title or snippet) if there happen to be links to those URLs somewhere on the Internet (like referrer logs). Also, non-compliant or rogue search engines that don't acknowledge the Robots Exclusion Standard could disobey the instructions of your robots.txt. Finally, a curious user could examine the directories or subdirectories in your robots.txt file and guess the URL of the content that you don't want seen.
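To make the point concrete, here's a small sketch (the URL is hypothetical): even if a site's robots.txt disallows /private/, nothing stops a client from requesting the page directly, because robots.txt is advisory, not access control.

```python
import urllib.request

# The server's robots.txt might say:
#   User-agent: *
#   Disallow: /private/
# ...but that is only a request to well-behaved crawlers.

# Any client that ignores robots.txt can still fetch the page directly:
req = urllib.request.Request(
    "https://example.com/private/report.html",  # hypothetical URL
    headers={"User-Agent": "curious-client/1.0"},
)
with urllib.request.urlopen(req) as resp:
    # The server returns the content unless it enforces real access
    # control (authentication, authorization, etc.).
    print(resp.status, len(resp.read()), "bytes served despite robots.txt")
```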
The intent behind “SEO agency” is obvious… The searcher is looking for an SEO agency. Most of these searchers aren’t looking for life lessons from an SEO agency owner. Instead, they are just looking for the best SEO agency to get them more traffic and customers from Google. Plain and simple. I knew this when I created that page, but my SEO ego was too big.
Use calls to action. Ask your readers to get involved. If readers feel like the site owner is interested in them, they will be more likely to keep coming back. Calls to action engage the reader and help keep them on the page. They may include asking the reader to write in with responses, sound off in the comments, or visit a site.
You may not want certain pages of your site crawled because they might not be useful to users if found in a search engine's search results. If you do want to prevent search engines from crawling your pages, Google Search Console has a friendly robots.txt generator to help you create this file. Note that if your site uses subdomains and you wish to have certain pages not crawled on a particular subdomain, you'll have to create a separate robots.txt file for that subdomain. For more information on robots.txt, we suggest this Webmaster Help Center guide on using robots.txt files.
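Once a robots.txt file is in place (whether hand-written or generated), you can sanity-check its rules. As a sketch, Python's standard library includes a parser for exactly this (the URLs below are placeholders):

```python
from urllib.robotparser import RobotFileParser

parser = RobotFileParser("https://www.example.com/robots.txt")  # placeholder URL
parser.read()  # fetches and parses the file

# Check whether a given user agent may crawl a given URL.
print(parser.can_fetch("Googlebot", "https://www.example.com/private/page.html"))
print(parser.can_fetch("*", "https://www.example.com/blog/post.html"))
```

Because each subdomain serves its own robots.txt, you'd point the parser at each subdomain's file separately to verify its rules.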
“Syndicate carefully: If you syndicate your content on other sites, Google will always show the version we think is most appropriate for users in each given search, which may or may not be the version you’d prefer. However, it is helpful to ensure that each site on which your content is syndicated includes a link back to your original article. You can also ask those who use your syndicated material to use the noindex meta tag to prevent search engines from indexing their version of the content.”
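For reference, the noindex directive mentioned in that quote normally lives in the page's HTML head, but the same directive can also be sent as an X-Robots-Tag HTTP header, which is handy for syndication partners who can't edit page markup. A minimal sketch, assuming Flask and a hypothetical route:

```python
from flask import Flask, make_response

app = Flask(__name__)

@app.route("/syndicated/article")
def syndicated_article():
    # Equivalent to <meta name="robots" content="noindex"> in the page's
    # <head>, applied at the HTTP layer instead.
    resp = make_response("<html>...syndicated copy of the article...</html>")
    resp.headers["X-Robots-Tag"] = "noindex"
    return resp
```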
The leading search engines, such as Google, Bing, and Yahoo!, use crawlers to find pages for their algorithmic search results. Pages that are linked from other search-engine-indexed pages do not need to be submitted because they are found automatically. The Yahoo! Directory and DMOZ, two major directories which closed in 2014 and 2017 respectively, both required manual submission and human editorial review. In addition to its URL submission console, Google offers Google Search Console, for which an XML sitemap feed can be created and submitted for free to ensure that all pages are found, especially pages that are not discoverable by automatically following links. Yahoo! formerly operated a paid submission service that guaranteed crawling for a cost per click; this practice was discontinued in 2009.
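As a rough sketch of what such a sitemap feed contains, here's a minimal generator for the standard sitemaps.org XML format (the page list and URLs are placeholders):

```python
import xml.etree.ElementTree as ET

# Placeholder page list; in practice you'd enumerate URLs from your CMS,
# database, or framework's routing table.
pages = [
    {"loc": "https://www.example.com/", "lastmod": "2024-01-15"},
    {"loc": "https://www.example.com/blog/first-post", "lastmod": "2024-01-10"},
]

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for page in pages:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = page["loc"]
    ET.SubElement(url, "lastmod").text = page["lastmod"]

# Write the sitemap file, ready to be submitted through Search Console.
ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```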
Hey Ted, thanks for the great questions! The peak times refer to your particular time zone, if you are targeting an audience that resides in the same zone as you. You can also use tools to find out when most of your audience is online. For example, Facebook has this built into their Page Insights. For Twitter, you can use https://followerwonk.com/. Many social posting tools also offer this functionality.