SEO techniques can be classified into two broad categories: techniques that search engine companies recommend as part of good design ("white hat"), and those techniques of which search engines do not approve ("black hat"). The search engines attempt to minimize the effect of the latter, among them spamdexing. Industry commentators have classified these methods, and the practitioners who employ them, as either white hat SEO, or black hat SEO.[50] White hats tend to produce results that last a long time, whereas black hats anticipate that their sites may eventually be banned either temporarily or permanently once the search engines discover what they are doing.[51]
Brian hello! First off I want to THANK YOU for this fantastic post. I can’t emphasize that enough. I have this bookmarked and keep going through it to help boost our blog. I totally nerded out on this, especially the LSI keywords which made my day. I know, pathetic, right? But when so much changes in SEO all the time, these kinds of posts are so helpful. So thanks for this. So no question – just praise, hope that’s ok 😁

In addition to optimizing these six areas of your site, analyze your competitors and see what they are doing in terms of on-page optimization, off-page optimization (competitive link analysis) and social media. While you may be doing a lot of the same things they are, it’s incredibly important to think outside the box to get a leg up over the competition.

If you are using Responsive Web Design, use the meta name="viewport" tag to tell the browser how to adjust the content to the device. If you use Dynamic Serving, use the Vary HTTP header to signal that your content changes depending on the user agent. If you are using separate URLs, signal the relationship between the two URLs with <link> elements carrying rel="canonical" and rel="alternate" attributes.
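A minimal sketch of these three signals, assuming a hypothetical example.com domain with a separate m.example.com mobile site:

    <!-- Responsive Web Design: let the browser scale to the device width -->
    <meta name="viewport" content="width=device-width, initial-scale=1">

    <!-- Dynamic Serving: send this HTTP response header so caches and
         crawlers know the HTML varies by user agent -->
    Vary: User-Agent

    <!-- Separate URLs: on the desktop page (https://example.com/page) -->
    <link rel="alternate" media="only screen and (max-width: 640px)"
          href="https://m.example.com/page">
    <!-- ...and on the mobile page (https://m.example.com/page) -->
    <link rel="canonical" href="https://example.com/page">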
Link text is the visible text inside a link. This text tells users and Google something about the page you're linking to. Links on your page may be internal—pointing to other pages on your site—or external—leading to content on other sites. In either of these cases, the better your anchor text is, the easier it is for users to navigate and for Google to understand what the page you're linking to is about.
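As a rough illustration (the URL and wording here are made up), descriptive anchor text beats a generic label:

    <!-- Vague: tells users and Google little about the destination -->
    <a href="/guides/keyword-research">click here</a>

    <!-- Descriptive: the link text summarizes the page it points to -->
    <a href="/guides/keyword-research">beginner's guide to keyword research</a>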
Thanks for sharing the easiest methods for increasing website traffic. I was really confused about what my website was missing, but reading your blog thoroughly really helped me increase my website traffic. The most amazing part of your article is that you have covered every basic thing so deeply that there is no room for confusion.
Dig deep into everything you ever needed to know about links from anchor text to redirection. Read this series of pages to understand how and when to use nofollow and whether guest blogging is actually dead. If you're more into the link building side of things (working to improve the rankings on your site by earning links), go straight to the Beginner's Guide to Link Building.
Structured data is code that you can add to your site's pages to describe your content to search engines, so they can better understand what's on your pages. Search engines can use this understanding to display your content in useful (and eye-catching!) ways in search results. That, in turn, can help you attract just the right kind of customers for your business.
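For example, a product page might carry schema.org markup in JSON-LD; the values below are hypothetical:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Product",
      "name": "Example Trail Running Shoe",
      "image": "https://example.com/images/trail-shoe.jpg",
      "offers": {
        "@type": "Offer",
        "price": "79.99",
        "priceCurrency": "USD",
        "availability": "https://schema.org/InStock"
      }
    }
    </script>

Markup like this is what makes rich results (prices, availability, ratings) possible, though search engines decide whether to show them.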
So many great tips! There are a couple of things I’ve implemented recently to try and boost traffic. One is to make a pdf version of my post that people can download. It’s a great way to build a list:) Another way is to make a podcast out of my post. I can then take a snippet of it and place it on my Facebook page as well as syndicate it. As far as video I’ve started to create a video with just a few key points from the post. The suggestion about going back to past articles is a tip I am definitely going to use especially since long-form content is so important. Thanks!
To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a meta tag specific to robots (usually <meta name="robots" content="noindex">). When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed and will instruct the robot as to which pages are not to be crawled. As a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts and user-specific content such as search results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.[47]
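A bare-bones robots.txt along those lines might look like this (the paths are only placeholders for cart and internal-search pages):

    # https://example.com/robots.txt
    User-agent: *
    Disallow: /cart/
    Disallow: /search/

Keep in mind that robots.txt only discourages crawling; to keep an already-discovered page out of the index, the meta robots noindex tag mentioned above is the more reliable signal.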
Great post. I know most of the stuff experienced people read and think “I know that already”… but actually lots of things we tend to forget even though we know them. So it’s always good to read these. What I liked most was the broken link solution. Not only creating a substitute for the broken link but actually going beyond that. I know some people do this as an SEO technique, but it’s actually also useful for the internet as a whole, since you repair those broken links that others find somewhere else.
Hey Ted, thanks for the great questions! The peak times refer to your particular time zone, if you are targeting an audience that resides in the same zone as you. You can also use tools to find out when most of your audience is online. For example, Facebook has this built into their Page Insights. For Twitter, you can use https://followerwonk.com/. Many social posting tools also offer this functionality.
I’ve just started blogging and there’s a ton of useful information here. I was wondering how to use Reddit and you cleared that up for me, as well as when to post to social media. Quora I’m going to check out as I’ve never heard of it, thank you! In your opinion, would you also deal with any of the free traffic generators to have people come and engage, or would you skip that step? Would you use meta tags, and if yes, how? Thank you for your time and I look forward to hearing from you!
Fortunately, Google puts more weight on the anchor text of external links anyway. So as long as some of your external links have your target anchors, you’re probably OK with a “Home” button. In fact, I’ve ranked homepages with a “Home” anchor text nav button for some seriously competitive terms. So it’s not a make-or-break ranking signal by any means.
Another reason is that if you're using an image as a link, the alt text for that image will be treated similarly to the anchor text of a text link. However, we don't recommend using too many images for links in your site's navigation when text links could serve the same purpose. Lastly, optimizing your image filenames and alt text makes it easier for image search projects like Google Image Search to better understand your images.
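A short sketch with made-up filenames:

    <!-- Descriptive filename plus alt text -->
    <img src="/images/red-running-shoes.jpg" alt="Pair of red running shoes">

    <!-- Image used as a link: the alt text is treated much like anchor text -->
    <a href="/products/red-running-shoes">
      <img src="/images/red-running-shoes-thumb.jpg"
           alt="Red running shoes product page">
    </a>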
Google recommends that all websites use https:// when possible. The hostname is where your website is hosted, commonly using the same domain name that you'd use for email. Google differentiates between the "www" and "non-www" version (for example, "www.example.com" or just "example.com"). When adding your website to Search Console, we recommend adding both http:// and https:// versions, as well as the "www" and "non-www" versions.
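One common way to point search engines at your preferred version is a canonical link on every variant of a page, here assuming https://www.example.com is the version you want indexed:

    <link rel="canonical" href="https://www.example.com/page">

Pairing this with server-side 301 redirects from the http:// and non-www variants keeps users and crawlers on a single hostname.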

Over the next few posts, and starting with this one, I’m going to share with you a detailed 8-step process for creating your own SEO strategy (what I often refer to as an SRD (SEO Research Document)), beginning with defining target audiences and taking it all the way through some fairly comprehensive competitive research, search traffic projections, content strategies, and specific goals and prioritizations.
Getting free website traffic may not cost you monetarily, but it will require effort on your part. However, the effort you put in will equate to the quality of the traffic you generate. As mentioned above, there is no point in getting more traffic to your website if those visitors are not likely to engage with your pages, convert into leads, or become customers.
This philosophy is beautiful in its simplicity, and it serves to correct the “more, more, more” mentality of link building. We only want links from relevant sources. Often, this means that in order to scale our link-building efforts beyond the obvious tactics, we need to create something that deserves links. You have links where it makes sense for you to have links. Simple.

Vary your article length. You should have long, comprehensive articles as well as short, to-the-point articles. Let the content dictate the size; don’t spend too long belaboring a simple point, but don’t be too brief when detail is called for. Research suggests the average length should be around 1,600 words, though feel free to vary as you see fit.


Take the 10 pillar topics you came up with in Step 1 and create a web page for each one that outlines the topic at a high level -- using the long-tail keywords you came up with for each cluster in Step 2. A pillar page on SEO, for example, can describe SEO in brief sections that introduce keyword research, image optimization, SEO strategy, and other subtopics as they are identified. Think of each pillar page as a table of contents, where you're briefing your readers on subtopics you'll elaborate on in blog posts.
Fantastic stuff, as usual, Brian. The First Link Priority Rule is always one that causes me great angst. I often get torn between search engines and usability when it comes to the main navigation bar. And, I’ve never known what the heck to do about the “Home” link. You can hardly target your keywords with that one without it being anything but awkward.

What blog posts are generating the most views? What subjects are most popular? And how can you create more, similar content? These are some of the questions you’ll want to be asking yourself as you analyze your website data. Determine what pages are resulting in the most bounces (exit pages) and the pages through which people are entering your site the most (entry pages). For instance, if the majority of people are leaving your site after reaching the About page, that’s a pretty clear indication that something should be changed there.


I feel I have great content…but most of it is within my email marketing campaign instead of my blogs. I’ve used my blogs to include links to my email marketing campaigns to lead to my product. In your opinion, should my blog content be the priority? I find my marketing emails sound more like a blog than just a “tip” or a reason to grab people to my list.
He started by finding an offer that resonated with and is relevant to his audience. In his case, his blog was dedicated to teaching people how to use a software called “Sublime Text.” He simply offered a license to the software for the giveaway. By doing this, not only did he increase the chances of success of his giveaway since his incentive was relevant, but he also ensured the quality of subscribers since they were actually people interested in his content. It’s easy to give people an iPad or an iPhone, but how relevant will they be to you at the end of the day?
This was very interesting. I run a website that promotes sports entertainment amongst teenagers who are graphic designers or video editors. The foundation is in place (over 60 contributors), so my only focus is how to blog consistently about what goes on in the sports world with appeal to teenagers. I am confident I took a huge step today after learning these 4 steps!

After you have identified your target keywords, you need to create a page targeting that keyword. This is known as SEO content. In many cases, it makes sense to publish a blog post targeting keywords. However, you need to make decisions based on the search intent. If your target keyword phrase is “buy black Nike shoes”, then it doesn’t make sense to create a long-form piece of content.
Tip: Along with Delicious, I search on Scoop.it for similar opportunities. If they liked an article related to a year, say 2013, and you update the resource to 2014, chances are they’ll share it. Kind of a twist on your Delicious + Skyscraper technique. You don’t even have to make the content much different or better, just updated! Got some fantastic links recently because of it.
Yep, and sometimes it’s just being a little creative. I’ve started a little blog on SEO/WordPress just for fun actually… no great content on it like here, though… but because the competition is so tough in these niches I decided to take another approach. I created a few WordPress plugins that users can download for free from wordpress.org… and of course these link to my site, so this gets me visitors each day.

The leading search engines, such as Google, Bing and Yahoo!, use crawlers to find pages for their algorithmic search results. Pages that are linked from other search engine indexed pages do not need to be submitted because they are found automatically. The Yahoo! Directory and DMOZ, two major directories which closed in 2014 and 2017 respectively, both required manual submission and human editorial review.[40] Google offers Google Search Console, for which an XML Sitemap feed can be created and submitted for free to ensure that all pages are found, especially pages that are not discoverable by automatically following links[41] in addition to their URL submission console.[42] Yahoo! formerly operated a paid submission service that guaranteed crawling for a cost per click;[43] however, this practice was discontinued in 2009.
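An XML Sitemap is simply a list of URLs you want crawled; a minimal file (with hypothetical URLs and dates) looks like this:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://example.com/</loc>
        <lastmod>2017-06-01</lastmod>
      </url>
      <url>
        <loc>https://example.com/page-with-few-inbound-links</loc>
      </url>
    </urlset>

Once the file is live, its URL can be submitted in Google Search Console so pages that are not well linked internally still get discovered.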
Amazing article. From my point of view, the best source of traffic in today’s world is nothing but social networking sites. A huge number of people are using social media, so we can connect with our audience easily. While doing research, I found this article: https://www.blurbpointmedia.com/design-social-media-business-marketing-strategy/ which is about developing a community on social media. I think the best way to a successful social media account is simply posting different kinds of interesting content on a daily basis!
You can confer some of your site's reputation to another site when your site links to it. Sometimes users can take advantage of this by adding links to their own site in your comment sections or message boards. Or sometimes you might mention a site in a negative way and don't want to confer any of your reputation upon it. For example, imagine that you're writing a blog post on the topic of comment spamming and you want to call out a site that recently comment spammed your blog. You want to warn others of the site, so you include the link to it in your content; however, you certainly don't want to give the site some of your reputation from your link. This would be a good time to use nofollow.
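In practice that means adding rel="nofollow" to the link, for example (the URL is invented):

    <!-- Warn readers about the site without passing it your reputation -->
    <a href="http://comment-spammer-example.com" rel="nofollow">this site</a>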



By relying so much on factors such as keyword density which were exclusively within a webmaster's control, early search engines suffered from abuse and ranking manipulation. To provide better results to their users, search engines had to adapt to ensure their results pages showed the most relevant search results, rather than unrelated pages stuffed with numerous keywords by unscrupulous webmasters. This meant moving away from heavy reliance on term density to a more holistic process for scoring semantic signals.[13] Since the success and popularity of a search engine is determined by its ability to produce the most relevant results to any given search, poor quality or irrelevant search results could lead users to find other search sources. Search engines responded by developing more complex ranking algorithms, taking into account additional factors that were more difficult for webmasters to manipulate. In 2005, an annual conference, AIRWeb (Adversarial Information Retrieval on the Web), was created to bring together practitioners and researchers concerned with search engine optimization and related topics.[14]