For some reason I had to delete some pages. These pages use the .html suffix, so I blocked them in robots.txt using Disallow: /*.html, but it's been almost a year and I've found that Googlebot still frequently crawls these pages. How can I quickly get Google to remove these pages completely? I have also removed these URLs from Google Webmaster Tools via Google Index -> Remove URLs, but Google still crawls these pages.
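A likely cause worth noting: Disallow in robots.txt only blocks crawling, not indexing, and it also prevents Googlebot from ever seeing that the pages are gone. A common remedy is to remove the robots.txt block for the deleted URLs and serve them with a 410 Gone status (or an X-Robots-Tag: noindex header) so Google can discover the removal. A minimal sketch, assuming a Python/Flask server; the catch-all route and message are illustrative:

```python
# Minimal sketch: serve 410 Gone for deleted .html pages so Googlebot can
# see that they are permanently removed. Flask and the catch-all route are
# assumptions for illustration -- adapt to your own server and URL scheme.
from flask import Flask, Response

app = Flask(__name__)

@app.route("/<name>.html")
def deleted_page(name):
    # 410 signals "gone for good"; Google typically drops such URLs from
    # its index faster than it does plain 404s.
    return Response("This page has been removed.", status=410)

if __name__ == "__main__":
    app.run()
```

Note that the Remove URLs tool only hides results temporarily; a crawlable 410 (or noindex) is what makes the removal stick.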

Provide full functionality on all devices. Mobile users expect the same functionality - such as commenting and check-out - and the same content on mobile as on all other devices that your website supports. In addition to textual content, make sure that all important images and videos are embedded and accessible on mobile devices. For search engines, provide all structured data and other metadata - such as titles, descriptions, link elements, and other meta tags - on all versions of the pages.
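One way to keep parity honest is to spot-check pages as both a desktop and a mobile client and compare what comes back. A minimal sketch using only the Python standard library; the URL and user-agent strings are illustrative, and the parser can be extended to cover meta descriptions, link elements, and structured data:

```python
# Hypothetical parity check: fetch a page as a desktop and as a mobile
# client and compare the <title> tags.
from html.parser import HTMLParser
from urllib.request import Request, urlopen

class TitleParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True
    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False
    def handle_data(self, data):
        if self.in_title:
            self.title += data

def fetch_title(url, user_agent):
    req = Request(url, headers={"User-Agent": user_agent})
    html = urlopen(req).read().decode("utf-8", errors="replace")
    parser = TitleParser()
    parser.feed(html)
    return parser.title.strip()

desktop = fetch_title("https://example.com/", "Mozilla/5.0 (Windows NT 10.0)")
mobile = fetch_title("https://example.com/", "Mozilla/5.0 (iPhone; CPU iPhone OS 16_0)")
print("Parity OK" if desktop == mobile else f"Mismatch: {desktop!r} vs {mobile!r}")
```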
I am a newbie in the blogging field and started a health blog a few months back. I read so many articles on SEO and gaining traffic to a blog. Some of the articles were very good, but your article is great. Your writing style is amazing. The way you described each and every point in the article is very simple, which makes it easy for a newbie to learn. Also, you mentioned numerous ways to get traffic to our blogs, which is very beneficial for us. I am highly thankful to you for sharing this information with us.
Search engines attempt to rank results for a given search based on their relevance to the topic, and the quality and reliability a site is judged to have. Google, the world’s most popular search engine, uses an ever-evolving algorithm that aims to evaluate sites in the way that a human reader would. This means that a key part of SEO involves ensuring that the website is a unique and relevant resource for readers.
Relevancy is the first qualifier of a quality link opportunity. The next qualifying factor is the authority of the opportunity. Since Google doesn't update PageRank (PR) anymore, you must rely on third-party metrics. I recommend you use Domain Authority (DA) from Moz's Open Site Explorer, Domain Rating (DR) from Ahrefs, or Trust Flow from Majestic to determine the quality of your link opportunities. You should use all three tools if you can.
Holy Engagement! This was an awesome post, full of great info… and then I realized that 3/4 of the actual page was comments… which is even better for shares, SEO and overall engagement. I was lucky enough to attend an event where Neil Patel was giving some great blogging training and a lot of what you covered was there. https://www.thatbloggingthing.com/69-blogging-secrets-i-stole-from-neil-patel/ The simple fact that you comment back is awesome.

Just a suggestion, but maybe you could write an article about generating traffic to a brand new blog. As you know, when you start out, you have only a couple of posts and very little credibility with other bloggers, and the search engines will take considerable time to be of any benefit initially. It would be interesting to know how Brian Dean approaches that dilemma!
When Googlebot crawls a page, it should see the page the same way an average user does. For optimal rendering and indexing, always allow Googlebot access to the JavaScript, CSS, and image files used by your website. If your site's robots.txt file disallows crawling of these assets, it directly harms how well our algorithms render and index your content. This can result in suboptimal rankings.
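You can audit this directly by testing your robots.txt rules against the asset URLs your pages load. A minimal sketch with Python's standard urllib.robotparser; the rules and URLs are illustrative:

```python
# Check whether Googlebot may fetch the CSS/JS/image assets a page needs.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: Googlebot
Disallow: /admin/
Allow: /
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

assets = [
    "https://example.com/css/site.css",
    "https://example.com/js/app.js",
    "https://example.com/images/hero.png",
]
for url in assets:
    print(url, "->", "allowed" if rp.can_fetch("Googlebot", url) else "BLOCKED")
```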
In December 2009, Google announced it would be using the web search history of all its users in order to populate search results.[33] On June 8, 2010, a new web indexing system called Google Caffeine was announced. Designed to allow users to find news results, forum posts, and other content much sooner after publishing, Google Caffeine was a change to the way Google updated its index, intended to make new content show up in results more quickly. According to Carrie Grimes, the software engineer who announced Caffeine for Google, "Caffeine provides 50 percent fresher results for web searches than our last index..."[34] Google Instant, real-time search, was introduced in late 2010 in an attempt to make search results more timely and relevant. Historically, site administrators have spent months or even years optimizing a website to increase search rankings. With the growth in popularity of social media sites and blogs, the leading engines made changes to their algorithms to allow fresh content to rank quickly within the search results.[35]
When referring to the homepage, a trailing slash after the hostname is optional since it leads to the same content ("https://example.com/" is the same as "https://example.com"). For the path and filename, a trailing slash would be seen as a different URL (signaling either a file or a directory), for example, "https://example.com/fish" is not the same as "https://example.com/fish/".
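The distinction is easy to verify programmatically; a small illustration using Python's standard urllib.parse:

```python
from urllib.parse import urlsplit

# Homepage: HTTP clients normalize an empty path to "/", so these two
# forms lead to the same content.
print(urlsplit("https://example.com").path)   # ""
print(urlsplit("https://example.com/").path)  # "/"

# Path and filename: the trailing slash makes a genuinely different URL.
a = urlsplit("https://example.com/fish")
b = urlsplit("https://example.com/fish/")
print(a.path == b.path)  # False: "/fish" vs "/fish/"
```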
Hi Brian, I absolutely love your content. My competitors and influencers are very strong - most of them are government bodies or supported by government, or travel guides known worldwide. I constantly follow them and engage with them - like, share, comment, etc. They share photos that are submitted to them, and I do photography myself, which takes hours, and I still can't reach a big audience... Any idea what I could create that my influencers would love to share? (It's hard to find out what they care about - they get hundreds of photos submitted daily and collaborate with other big names...) Please help me.
Fantastic stuff, as usual, Brian. The First Link Priority Rule is always one that causes me great angst. I often get torn between search engines and usability when it comes to the main navigation bar. And, I’ve never known what the heck to do about the “Home” link. You can hardly target your keywords with that one without it being anything but awkward.
So many great tips! There are a couple of things I’ve implemented recently to try and boost traffic. One is to make a pdf version of my post that people can download. It’s a great way to build a list:) Another way is to make a podcast out of my post. I can then take a snippet of it and place it on my Facebook page as well as syndicate it. As far as video I’ve started to create a video with just a few key points from the post. The suggestion about going back to past articles is a tip I am definitely going to use especially since long-form content is so important. Thanks!
Write a description that would both inform and interest users if they saw your description meta tag as a snippet in a search result. While there's no minimum or maximum length for the text in a description meta tag, we recommend making sure that it's long enough to be fully shown in Search (note that users may see different-sized snippets depending on how and where they search), and that it contains all the relevant information users would need to determine whether the page will be useful and relevant to them.
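As a rough sanity check, you can flag descriptions that are likely to be cut off in snippets. A minimal sketch; the 160-character budget is an assumed rule of thumb, not a documented Google limit, since actual snippet lengths vary by device and query:

```python
# Flag meta descriptions that may be truncated in search snippets.
# The 160-character budget is an assumed rule of thumb, not a hard limit.
SNIPPET_BUDGET = 160

descriptions = {
    "/": "Fresh, hand-roasted coffee beans shipped to your door within 48 "
         "hours of roasting.",
    "/about": "Learn about our team, our roastery, our sourcing trips, our "
              "certifications, our sustainability commitments, and the long "
              "story of how we got started.",
}

for path, desc in descriptions.items():
    status = "OK" if len(desc) <= SNIPPET_BUDGET else f"may truncate ({len(desc)} chars)"
    print(f"{path}: {status}")
```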
While most search engine companies try to keep their processes a secret, their criteria for high spots on SERPs isn't a complete mystery. Search engines are successful only if they provide a user links to the best Web sites related to the user's search terms. If your site is the best skydiving resource on the Web, it benefits search engines to list the site high up on their SERPs. You just have to find a way to show search engines that your site belongs at the top of the heap. That's where search engine optimization (SEO) comes in -- it's a collection of techniques a webmaster can use to improve his or her site's SERP position.
By 2004, search engines had incorporated a wide range of undisclosed factors in their ranking algorithms to reduce the impact of link manipulation. In June 2007, The New York Times' Saul Hansell stated Google ranks sites using more than 200 different signals.[26] The leading search engines, Google, Bing, and Yahoo, do not disclose the algorithms they use to rank pages. Some SEO practitioners have studied different approaches to search engine optimization and have shared their personal opinions.[27] Patents related to search engines can provide information to better understand search engines.[28] In 2005, Google began personalizing search results for each user. Depending on their history of previous searches, Google crafted results for logged-in users.[29]
Thanks for bringing up this point - I agree, Eric - competitive positioning can help you determine the value that you bring to the table that your competitors don't. I'm all for it. Nielsen does some reports that provide awareness, likelihood to recommend, sentiment, and other insights for your site/brand and your competitors. You can also pull some of that type of insight out of social listening platforms like NetBase, SM2, Radian6, Dow Jones, Nielsen, and so many others. I've even done some hacked-together competitive sentiment comparisons before using search: searching for [brand or feature] + "like", "love", "hate", "wish", etc.

In 2007, Google announced a campaign against paid links that transfer PageRank.[30] On June 15, 2009, Google disclosed that they had taken measures to mitigate the effects of PageRank sculpting by use of the nofollow attribute on links. Matt Cutts, a well-known software engineer at Google, announced that Googlebot would no longer treat nofollow links in the same way, in order to prevent SEO service providers from using nofollow for PageRank sculpting.[31] As a result of this change, the use of nofollow led to the evaporation of PageRank. To avoid this, SEO engineers developed alternative techniques that replace nofollowed tags with obfuscated JavaScript and thus permit PageRank sculpting. Additionally, several solutions have been suggested that include the use of iframes, Flash, and JavaScript.[32]

Hey Brian, this article is really, really awesome. Seriously, you have covered all 21 points, which I have never read anywhere else on the internet. Everyone shares the basics, but here you have shared awesome info, especially the Facebook keyword research and the 1000+ words per post, and the Wikipedia ideas are really good and helpful. I learned many things from this article. Keep sharing this kind of info. Thanks!
You understand and agree that all information, including, without limitation, text, images, audio material, video material, links, addresses, data, functionality and other materials (“Content”) that You or a third party allow, submit, post, obtain, email or transmit (or the like) to the Service (collectively, “Your Content”) is Your responsibility and not Our responsibility.
In 1998, two graduate students at Stanford University, Larry Page and Sergey Brin, developed "Backrub", a search engine that relied on a mathematical algorithm to rate the prominence of web pages. The number calculated by the algorithm, PageRank, is a function of the quantity and strength of inbound links.[22] PageRank estimates the likelihood that a given page will be reached by a web user who randomly surfs the web, and follows links from one page to another. In effect, this means that some links are stronger than others, as a higher PageRank page is more likely to be reached by the random web surfer.
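The random-surfer model is compact enough to sketch directly. Below is a toy power-iteration PageRank; the four-page link graph is illustrative, and 0.85 is the damping factor used in the original paper:

```python
import numpy as np

# Toy link graph: page -> pages it links to.
links = {"A": ["B", "C"], "B": ["C"], "C": ["A"], "D": ["C"]}
pages = sorted(links)
n = len(pages)
idx = {p: i for i, p in enumerate(pages)}

# Column-stochastic matrix: M[j, i] is the chance a surfer on page i
# follows a link to page j.
M = np.zeros((n, n))
for src, outs in links.items():
    for dst in outs:
        M[idx[dst], idx[src]] = 1.0 / len(outs)

d = 0.85  # damping: follow a link with probability d, else jump to a
          # random page.
rank = np.full(n, 1.0 / n)
for _ in range(100):
    rank = (1 - d) / n + d * (M @ rank)

print({p: round(r, 3) for p, r in zip(pages, rank)})
```

Page C ends up with the highest score here: it collects links from three pages, and its rank then "carries through" to A, the page it links to.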

Laura, great post. This touches something I wish more SEOs practiced: conversion optimization. I think most SEOs think of what they do as a service for, instead of a partnership with, clients. The end result should never be raw traffic, but value obtained through targeted, CONVERTING traffic. You make excellent points about market research, product input, content creation, and other functions many SEOs and SEMs neglect. More and more SEO providers focus only on assembly-line basics and worn-out techniques instead of challenging themselves to learn product marketing, usability, and conversion optimization. Your advice on market research is extremely valuable. Great start to a promising series. I look forward to more!
Incredible post and just what I needed! I'm actually kind of new to blogging (my first year coming around) and so far my expertise has been in copywriting/SEO copywriting. However, link building has become tedious for me. Your talk about influencing influencers makes perfect sense, but I find it difficult for my niche. My blog is built around gift ideas and holiday shopping, complete with social networks. I get shares and such from my target audience, but I find that my influencers (i.e. Etsy, Redbox, Vat19, etc.) don't allow dofollow links, and I usually can't find suitable sources. I guess my trouble is just prospecting in general.

Search engines use complex mathematical algorithms to interpret which websites a user seeks. In a diagram where each bubble represents a website, programs sometimes called spiders examine which sites link to which other sites, with arrows representing these links. Websites getting more inbound links, or stronger links, are presumed to be more important and what the user is searching for. In this example, since website B is the recipient of numerous inbound links, it ranks more highly in a web search. And the links "carry through", such that website C, even though it only has one inbound link, has an inbound link from a highly popular site (B) while site E does not.
WOW. I consider myself a total newbie to SEO, but I've been working on my Squarespace site for my small business for about 3 years and have read dozens of articles on how to improve SEO. So far, this has been the MOST USEFUL and information-packed resource I've found. I'm honestly shocked that this is free to access. I haven't even completely consumed this content yet (I've bookmarked it to come back to!) but I've already made some significant changes to my SEO strategy, including adding a couple of infographics to blog posts, changing my internal and external linking habits, editing meta descriptions, and a bunch more. Thanks for all the time and passion you've put into this.
To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a meta tag specific to robots (usually <meta name="robots" content="noindex">). When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed and will instruct the robot as to which pages are not to be crawled. As a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts and user-specific content such as search results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.[47]
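To confirm which of these exclusion signals a given page actually sends, you can inspect both the X-Robots-Tag response header and any robots meta tag in the HTML. A minimal sketch using only the Python standard library; the URL is illustrative:

```python
# Check a page's index-exclusion signals: the X-Robots-Tag HTTP header
# and any <meta name="robots" ...> tag in the HTML.
from html.parser import HTMLParser
from urllib.request import urlopen

class RobotsMetaParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.directives = []
    def handle_starttag(self, tag, attrs):
        d = dict(attrs)
        if tag == "meta" and d.get("name", "").lower() == "robots":
            self.directives.append(d.get("content", ""))

resp = urlopen("https://example.com/private-page")
header = resp.headers.get("X-Robots-Tag", "(none)")
parser = RobotsMetaParser()
parser.feed(resp.read().decode("utf-8", errors="replace"))

print("X-Robots-Tag:", header)
print("meta robots:", parser.directives or "(none)")
```

Keep in mind that a noindex directive only works if the page is crawlable: if robots.txt blocks the URL, the crawler never sees the tag.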
A user-feedback poll is one great, easy way to help you better understand your customers. Kline claims, “Done incorrectly, these can be annoying for a user. Done well, it’s an excellent opportunity to help the customer feel that their opinion matters, while also getting needed insights to better market the company. One poll we ran for an e-commerce client helped us learn that 80% of potential customers cared more about the performance of the product than the price. [So,] we added as much helpful performance information to the website as we could.”
Hack #1: Hook readers in from the beginning. People have low attention spans. If you don’t have a compelling “hook” at the beginning of your blogs, people will click off in seconds. You can hook them in by teasing the benefits of the article (see the intro to this article for example!), telling a story, or stating a common problem that your audience faces.
Great post, your knowledge and innovative approach never fail to amaze me! This is certainly the first time I've heard someone suggest the Wikipedia dead link technique. It's great that you're getting people to think outside of the box. Sites like Reddit are great for finding keywords and can also be used for link building, although this can be difficult to get right. Even if you don't succeed at using it to link build, it's still a really valuable platform for getting useful information. Thanks!

As an Internet marketing strategy, SEO considers how search engines work, the computer-programmed algorithms that dictate search engine behavior, what people search for, the actual search terms or keywords typed into search engines, and which search engines are preferred by their targeted audience. SEO is performed because a website will receive more visitors from a search engine the higher the website ranks in the search engine results page (SERP). These visitors can then be converted into customers.[4]