Creating high-quality content takes a significant amount of at least one of the following: time, effort, expertise, or talent/skill. Content should be factually accurate, clearly written, and comprehensive. So, for example, if you describe your page as a recipe, provide a complete recipe that is easy to follow, rather than just a set of ingredients or a basic description of the dish.
When Googlebot crawls a page, it should see the page the same way an average user does. For optimal rendering and indexing, always allow Googlebot access to the JavaScript, CSS, and image files used by your website. If your site's robots.txt file disallows crawling of these assets, it directly harms how well our algorithms render and index your content. This can result in suboptimal rankings.
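As a minimal sketch, a robots.txt that had been blocking an asset folder could be opened up for those resources with rules like the ones below; the paths are hypothetical placeholders, not taken from any real site:

    User-agent: Googlebot
    Disallow: /assets/
    Allow: /assets/css/
    Allow: /assets/js/
    Allow: /assets/img/

Googlebot honors Allow directives and follows the most specific matching rule, so the CSS, JavaScript, and image folders stay crawlable even though the rest of /assets/ remains blocked.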

Thanks for sharing these great tips last August! I’ve recently adopted them and I have a question (that’s kind of connected to the last post): how important would promoting content be when using this strategy? For example, through Google AdWords. I guess that would depend on the circumstances, but I’m trying to discover whether there’s a ‘formula’ here. Thanks in advance!
Google Analytics is free to use, and the insights gleaned from it can help you to drive further traffic to your website. Use tracked links for your marketing campaigns and regularly check your website analytics. This will enable you to identify which strategies and types of content work, which ones need improvement, and which ones you should not waste your time on.
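For instance, a tracked link for an email campaign usually just appends the standard UTM parameters that Google Analytics recognizes; the domain and campaign names below are invented for illustration:

    https://www.example.com/landing-page?utm_source=newsletter&utm_medium=email&utm_campaign=spring_sale

With utm_source, utm_medium, and utm_campaign filled in consistently, the traffic from each campaign shows up as its own segment in your analytics reports.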

Inclusion in Google's search results is free and easy; you don't even need to submit your site to Google. Google is a fully automated search engine that uses web crawlers to explore the web constantly, looking for sites to add to our index. In fact, the vast majority of sites listed in our results aren't manually submitted for inclusion, but found and added automatically when we crawl the web. Learn how Google discovers, crawls, and serves web pages.
Thank you so much for these great SEO techniques you posted on your blog. I also follow you on YouTube and have listened to almost all of your videos; sometimes I re-listen just to refresh my mind. Because of your techniques, we managed to bring our website to the first pages within a month. Adding external links was something I never imagined would work, but it seems like it is working. Anyway, please accept my personal thank you for coming up with and sharing these techniques. I look forward to your new blog posts and YouTube videos!

When Larry wrote about the kick in the proverbial teeth that eBay took from Google’s Panda update, we managed to secure a link from Ars Technica in the Editor’s Pick section alongside links to The New York Times and National Geographic. Not too shabby – and neither was the resulting spike in referral traffic. Learn what types of links send lots of referral traffic, and how to get them, in this post.
Guest post on other blogs, and invite other bloggers to guest post on your site. Guest posts are a great way to drive traffic between related blogs, and allow you to gain readers that might not normally make it to your site. Be sure to allow any guest posters to link back to their own site, and share any guest posts the same as you would your own posts.
Make it as easy as possible for website visitors to connect with you by adding a live chat box to your homepage. Include a name and photo in the chat box so that users know they are talking to a real, live person and not just an automated robot. When there is nobody to monitor the live chat, be sure to mention that, by saying something along the lines of, “Nobody is here right now but feel free to leave a message and we will get back to you shortly!”

2. Targeted Keyword Discovery: Ideally you’ll want to do keyword research based on what the audience wants, not solely on what content the site already has (or plans to have sans audience targeting), which may be limited. I can do keyword research on health conditions and drugs (content I have on my site) and determine what the general population is searching for and optimize my current content, or I can cast my net wide and look at what my target audience wants first, then do my keyword research. You may find there are needs that your site is not meeting. Knowing my senior audience is interested primarily in prescription drug plans and cheap blood pressure medication, I can first make sure I’m providing that content, and then determine the top keywords in these areas (in Step 2, covered in the next article) and use those terms in relevant, high-visibility areas on my site.


Getting traffic is always important, but one should not worry too much; nothing happens overnight. I read this article and genuinely tried to form my own impression of the post, which automatically creates a link to my blog. But don’t try too hard with backlinks in mind, or you’ll always get caught one way or another; Panda and Penguin are examples of that.


Brian hello! First off I want to THANK YOU for this fantastic post. I can’t emphasize that enough. I have this bookmarked and keep going through it to help boost our blog. I totally nerded out on this, especially the LSI keywords which made my day. I know, pathetic, right? But when so much changes in SEO all the time, these kinds of posts are so helpful. So thanks for this. So no question – just praise, hope that’s ok 😁
Social media. The algorithms have truly changed since social media first emerged. Many content websites are community-oriented -- Digg began allowing users to vote on which stories make the front page, and YouTube factors views and user ratings into its front page rankings. Therefore, e-commerce stores must establish a strong social media presence on sites like Facebook, Pinterest, Twitter, etc. These social media sites send search engines signals of influence and authority.

I can feel the excitement in your writing, and thanks for all this free info. You know how to get loyal subscribers; I believe you are one of the best in the business. No upselling, just honesty, and it’s so refreshing. I can’t keep up with you! I have only just finished the awesome piece of content you told me to write, and I’m just about to modify it and then finally start promoting. I will be looking at this also. THANK YOU. PS: I couldn’t make your last course, but I will get on board for the next one.

Traditionally, defining a target audience involves determining their age, sex, geographic locations, and especially their needs (aka pain points). Check out usability.gov’s description of personas and how to do task analysis & scenarios for more details, or better yet, read Vanessa Fox’s upcoming book about personas related to search and conversion.
Google is the most popular spider-driven search engine. Its database currently has about 4 billion pages indexed, and it is known for finding the most relevant information. When Google spiders the Web, it finds sites by traveling through links. The more sites that link to you, the more important the engines believe your content to be. You should focus on getting many important sites to link to your site. You can do this in many ways: submit to online directories, exchange links with business partners and industry-related sites, or participate in link building.

Hello Brian, I am planning to start my blog soon and I’m in the preparation phase (investigating, learning, etc…). I have read a lot of books and posts about SEO and I can say that this is the best post so far. It’s not even a book and you covered more than the books do. I would like to thank you for sharing your knowledge with me and the rest of the world; that’s one of the most appreciated things someone can do, and even if you do it for your own “good”, you shared it! As soon as I start my site I’ll write an article about you!!
Vary your article length. You should have long, comprehensive articles as well as short and to-the-point articles. Let the content dictate the size; don’t spend too long belaboring a simple point, but don’t be too brief when detail is called for. Research suggests the average length should be around 1,600 words, though feel free to vary as you see fit.
To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a meta tag specific to robots (usually <meta name="robots" content="noindex">). When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed and will instruct the robot as to which pages are not to be crawled. As a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts and user-specific content such as search results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.[47]
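To make the two mechanisms concrete, here is a minimal sketch; the paths and page are hypothetical, not drawn from any particular site. In robots.txt, blocking internal search results and cart pages looks like this:

    User-agent: *
    Disallow: /search
    Disallow: /cart/

And to keep an individual page out of the index even if it gets crawled, the page's head can carry the robots meta tag:

    <meta name="robots" content="noindex">

Note the difference: robots.txt controls crawling, while the noindex meta tag controls indexing, which is why a page blocked only by robots.txt can still show up in results if other sites link to it.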
There are many SEO tactics you can perform on each of your website pages to increase their rank in search engines and get more visitors. This includes producing high-quality content that your audience is searching for, and writing concise meta descriptions for your pages. The meta description appears below your URL in search results, and when users can tell what a page is about and what clicking will get them, they are much more likely to click through. On-page SEO tactics such as these are free, but do take some time. For more help with on-page SEO, check out this blog post: Google Ranking Factors: On-Page vs Off-Page SEO.
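As a simple illustration (the page and wording are invented), the meta description is a single tag in the page's head:

    <head>
      <title>How to Grow Basil Indoors</title>
      <meta name="description" content="A step-by-step guide to growing basil indoors, from choosing seeds to harvesting, with fixes for the most common problems.">
    </head>

Keeping it to roughly a sentence or two of plain, specific language gives search engines a snippet they can display in full.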
Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag or index files in engines like ALIWEB. Meta tags provide a guide to each page's content. Using metadata to index pages was found to be less than reliable, however, because the webmaster's choice of keywords in the meta tag could potentially be an inaccurate representation of the site's actual content. Inaccurate, incomplete, and inconsistent data in meta tags could and did cause pages to rank for irrelevant searches.[10] Web content providers also manipulated some attributes within the HTML source of a page in an attempt to rank well in search engines.[11] By 1997, search engine designers recognized that webmasters were making efforts to rank well in their search engine, and that some webmasters were even manipulating their rankings in search results by stuffing pages with excessive or irrelevant keywords. Early search engines, such as Altavista and Infoseek, adjusted their algorithms to prevent webmasters from manipulating rankings.[12]
Keep resources crawlable. Blocking page resources can give Google an incomplete picture of your website. This often happens when your robots.txt file is blocking access to some or all of your page resources. If Googlebot doesn't have access to a page's resources, such as CSS, JavaScript, or images, we may not detect that it's built to display and work well on a mobile browser. In other words, we may not detect that the page is "mobile-friendly," and therefore not properly serve it to mobile searchers.
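One way to spot-check this, sketched below in Python with the standard library's urllib.robotparser (the URLs are placeholders), is to ask whether Googlebot is allowed to fetch a page's CSS and JavaScript files:

    import urllib.robotparser

    # Load and parse the site's robots.txt (placeholder domain).
    rp = urllib.robotparser.RobotFileParser()
    rp.set_url("https://www.example.com/robots.txt")
    rp.read()

    # Check whether Googlebot may fetch the page's resources.
    resources = [
        "https://www.example.com/assets/css/site.css",
        "https://www.example.com/assets/js/app.js",
    ]
    for url in resources:
        status = "allowed" if rp.can_fetch("Googlebot", url) else "BLOCKED"
        print(status, url)

If a resource comes back blocked, the fix is usually a rule change in robots.txt rather than anything on the page itself; Google Search Console's URL Inspection tool reports similar information without any code.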
Whatever industry you’re in, chances are there are at least one or two major conventions and conferences that are relevant to your business. Attending these events is a good idea – speaking at them is even better. Even a halfway decent speaking engagement is an excellent way to establish yourself as a thought leader in your industry and gain significant exposure for your site.

If you own, manage, monetize, or promote online content via Google Search, this guide is meant for you. You might be the owner of a growing and thriving business, the webmaster of a dozen sites, the SEO specialist in a web agency, or a DIY SEO ninja passionate about the mechanics of Search: this guide is meant for you. If you're interested in having a complete overview of the basics of SEO according to our best practices, you are indeed in the right place. This guide won't provide any secrets that'll automatically rank your site first in Google (sorry!), but following the best practices outlined below will hopefully make it easier for search engines to crawl, index, and understand your content.


Write articles rich in content. Quality articles will get ranked better in search results. Make sure that your articles address the needs of your readers, and that they can find all of the information they need in one spot. This is the most effective means of increasing traffic to a website: offering people something that they cannot obtain elsewhere, or at least not at the level of quality that you are offering it.[1]
Thanks for the great post. I am confused about the #1 idea about Wikipedia dead links…it seems like you didn’t finish what you were supposed to do with the link once you found it. You indicated to put the dead link in Ahrefs and you found a bunch of links for you to contact…but then what? What do you contact them about, and how do you get your page as the link? I’m obviously not getting something 🙁
Website owners recognized the value of a high ranking and visibility in search engine results,[6] creating an opportunity for both white hat and black hat SEO practitioners. According to industry analyst Danny Sullivan, the phrase "search engine optimization" probably came into use in 1997. Sullivan credits Bruce Clay as one of the first people to popularize the term.[7] On May 2, 2007,[8] Jason Gambert attempted to trademark the term SEO by convincing the Trademark Office in Arizona[9] that SEO is a "process" involving manipulation of keywords and not a "marketing service."