Thanks for sharing the easiest methods for increasing website traffic. I was really confused about what my website was missing, but after reading your blog thoroughly it really helped me increase my website traffic. The most amazing part of your article is that you have covered every basic thing so deeply that there is no room for confusion.
I read your post on my mobile phone during a bus ride, and it struck me because I’ve been doing SEO the poor man’s way lately: blog commenting, social bookmarking, forum signatures, directory submission, etc. I don’t know if any of these things still work today, since I’ve been practicing them since 2008. These 25 SEO tactics that you have shared got my interest. Actually, I am planning to make a new site right now after reading this one. I realized that maybe I’ve been doing a lot of spamming lately, and that’s why my site is still not ranking for my desired keywords. You also pointed out that Keyword Planner is not the only way to find keywords, since there are other sources like, as you said, Wikipedia and the like. I am planning to use this article as my guide in starting a new one. I bookmarked it… honestly. 🙂 And since I have read a lot of articles with SEO tips from other sites, I can compare them to your tactics, and this is more interesting and exciting. I want to build a quality site that can generate income for me for years to come. THANK YOU FOR BEING GENEROUS WITH YOUR KNOWLEDGE. I will try to communicate with you through email and I hope you can coach me, Brian… please. 🙂
Backlinks. If content is king, then backlinks are queen. Remember, it's not about which site has the most links, but who has the most quality links pointing back to their website. Build backlinks by submitting monthly or bi-monthly press releases about any exciting company news, and by contacting popular blogs in your niche to see how you can work together to earn a backlink from their website. Create the best possible product site you can, so that people talking about the products you sell will link back to it. Try creating graphics or newsworthy content that will encourage bloggers and news websites to link to that content.
Expertise and authoritativeness of a site increase its quality. Be sure that content on your site is created or edited by people with expertise in the topic. For example, citing expert or experienced sources helps users trust an article’s expertise. Representing well-established consensus on scientific topics, where such consensus exists, is also good practice.
In February 2011, Google announced the Panda update, which penalizes websites containing content duplicated from other websites and sources. Historically, websites had copied content from one another and benefited in search engine rankings by engaging in this practice. However, Google implemented a new system that punishes sites whose content is not unique.[36] The 2012 Google Penguin update attempted to penalize websites that used manipulative techniques to improve their rankings on the search engine.[37] Although Google Penguin has been presented as an algorithm aimed at fighting web spam, it really focuses on spammy links[38] by gauging the quality of the sites the links come from. The 2013 Google Hummingbird update featured an algorithm change designed to improve Google's natural language processing and semantic understanding of web pages. Hummingbird's language processing system falls under the newly recognized term of 'conversational search', where the system pays more attention to each word in the query in order to match pages to the meaning of the whole query rather than a few words.[39] With regard to search engine optimization, for content publishers and writers Hummingbird is intended to resolve issues by getting rid of irrelevant content and spam, allowing Google to surface high-quality content and treat its writers as 'trusted' authors.
Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag or index files in engines like ALIWEB. Meta tags provide a guide to each page's content. Using metadata to index pages was found to be less than reliable, however, because the webmaster's choice of keywords in the meta tag could potentially be an inaccurate representation of the site's actual content. Inaccurate, incomplete, and inconsistent data in meta tags could and did cause pages to rank for irrelevant searches.[10] Web content providers also manipulated some attributes within the HTML source of a page in an attempt to rank well in search engines.[11] By 1997, search engine designers recognized that webmasters were making efforts to rank well in their search engine, and that some webmasters were even manipulating their rankings in search results by stuffing pages with excessive or irrelevant keywords. Early search engines, such as Altavista and Infoseek, adjusted their algorithms to prevent webmasters from manipulating rankings.[12]
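To make the mechanism concrete, here is a minimal sketch of how an early, keyword-based indexer might have read a webmaster-supplied meta tag. It uses only Python's standard library, and the page markup is made up; it is an illustration of the idea, not any particular engine's code.

```python
from html.parser import HTMLParser

# Hypothetical page markup; the meta keywords tag says whatever the webmaster claims.
PAGE = """
<html>
  <head>
    <title>Cheap Flights</title>
    <meta name="keywords" content="cheap flights, hotels, casino, free ringtones">
  </head>
  <body><p>We sell garden furniture.</p></body>
</html>
"""

class KeywordMetaExtractor(HTMLParser):
    """Collects the content of <meta name="keywords" ...> tags."""
    def __init__(self):
        super().__init__()
        self.keywords = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "keywords":
            self.keywords += [k.strip() for k in attrs.get("content", "").split(",")]

extractor = KeywordMetaExtractor()
extractor.feed(PAGE)
print(extractor.keywords)
# ['cheap flights', 'hotels', 'casino', 'free ringtones'] -- none of which describe
# the page body, which is exactly the kind of mismatch described above.
```

Because the indexer trusts the tag rather than the page body, the webmaster fully controls what the page ranks for, which is why engines eventually stopped relying on it.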

This is truly amazing and I’m gonna share it with like-minded people. I loved the part about Flippa. What a great source for ideas. Building links tends to be the hardest part, but a few good quality links is all you need nowadays to get ranked. I currently rank for a very high volume keyword with only 5 links, all with PR 3 or 4 and good DA and PA. Good links are hard to get, but you only need a few, which is encouraging! Props for this post!
Use clean backgrounds. The background textures and colors you choose can drastically affect the overall appeal of the website. Lots of texture and graphics in the background can be distracting. If you are going to use a color in the background, make sure there is significant contrast between the background color and the text. Be careful with very bright colors such as red or yellow: they cause visual fatigue, and the reader will lose focus on the text.
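One rough, programmatic way to sanity-check that contrast is the WCAG contrast-ratio formula; the sketch below computes it for two hex colors (the color values are just examples).

```python
def _channel(c: float) -> float:
    # sRGB channel linearization used by the WCAG relative-luminance formula.
    c /= 255.0
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(hex_color: str) -> float:
    r, g, b = (int(hex_color.lstrip("#")[i:i + 2], 16) for i in (0, 2, 4))
    return 0.2126 * _channel(r) + 0.7152 * _channel(g) + 0.0722 * _channel(b)

def contrast_ratio(fg: str, bg: str) -> float:
    # Ratio of the lighter to the darker luminance, offset per the WCAG definition.
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# Dark gray text on white vs. yellow text on white.
print(round(contrast_ratio("#333333", "#ffffff"), 2))  # ~12.63 -- comfortable to read
print(round(contrast_ratio("#ffff00", "#ffffff"), 2))  # ~1.07  -- nearly unreadable
```

Anything close to 1.0 means the text is blending into the background; higher ratios are easier on the eyes.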
Vary your article length. You should have long, comprehensive articles as well as short, to-the-point articles. Let the content dictate the size: don’t spend too long belaboring a simple point, but don’t be too brief when detail is called for. Research suggests the average length should be around 1,600 words, though feel free to vary as you see fit.
For some reason I had to delete some pages. These pages use the .html suffix, so I blocked them in robots.txt with Disallow: /*.html, but it’s been almost a year and I’ve found that Google’s robot still often crawls these pages. How can I quickly get Google to completely remove them? I have also removed these URLs in Google Webmaster Tools via Google Index -> Remove URLs, but Google still crawls these pages.
He started by finding an offer that resonated with and was relevant to his audience. In his case, his blog was dedicated to teaching people how to use a piece of software called “Sublime Text.” He simply offered a license to the software as the giveaway. By doing this, not only did he increase the chances of the giveaway succeeding, since his incentive was relevant, but he also ensured the quality of subscribers, since they were actually people interested in his content. It’s easy to give people an iPad or an iPhone, but how relevant will those subscribers be to you at the end of the day?
Advertiser Disclosure: Some of the products that appear on this site are from companies from which QuinStreet receives compensation. This compensation may impact how and where products appear on this site including, for example, the order in which they appear. QuinStreet does not include all companies or all types of products available in the marketplace.
In the real world, it’s not so easy. For example, I have 2 niches where I’m trying to use your technique. By keywords, it’s “software for moving” and “free moving quotes.” I have 2 websites related to each of them: emoversoftware.com (emover-software.com initially; they link together) and RealMoving.com (for the latter keyword). To begin with, neither of those niches has a Wikipedia article, so your first suggestion will not work. In your general suggestions, you advise getting backlinks (from authoritative sites, of course). But check this out: my site emover-software.com has only 4(!) backlinks (https://openlinkprofiler.org/r/emover-software.com#.VXTaOs9VhBc) and is nevertheless listed at #1 (or #2) for my keywords (moving software, software for moving, software for moving company). RealMoving.com has more than 600 backlinks and is way back in the rankings (170 and up) for my keyword. Even though those sites have different competition, it still makes no sense! It doesn’t seem like Google even cares about your backlinks at all! I also checked one of my competitors’ backlinks; he has more than 12,000, yet his rank for keywords related to moving quotes is even worse than mine!
In 1998, two graduate students at Stanford University, Larry Page and Sergey Brin, developed "Backrub", a search engine that relied on a mathematical algorithm to rate the prominence of web pages. The number calculated by the algorithm, PageRank, is a function of the quantity and strength of inbound links.[22] PageRank estimates the likelihood that a given page will be reached by a web user who randomly surfs the web, and follows links from one page to another. In effect, this means that some links are stronger than others, as a higher PageRank page is more likely to be reached by the random web surfer.
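As a back-of-the-envelope sketch of that random-surfer idea, the snippet below runs a simplified power iteration over a small, made-up link graph. It is an illustration of the published PageRank formulation, not Google's actual implementation.

```python
# Simplified PageRank by power iteration over a hypothetical link graph.
links = {                     # page -> pages it links to
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
    "D": ["C"],
}

damping = 0.85                # probability the surfer follows a link instead of jumping to a random page
pages = list(links)
rank = {p: 1.0 / len(pages) for p in pages}

for _ in range(50):           # iterate until the scores settle
    new_rank = {p: (1 - damping) / len(pages) for p in pages}
    for page, outlinks in links.items():
        share = damping * rank[page] / len(outlinks)   # each page splits its rank among its outlinks
        for target in outlinks:
            new_rank[target] += share
    rank = new_rank

for page in sorted(rank, key=rank.get, reverse=True):
    print(page, round(rank[page], 3))
# C ends up with the highest score because every other page links to it,
# and A benefits in turn because its only inbound link comes from C.
```

The same principle scales up: a page's score depends not just on how many links point to it, but on the scores of the pages doing the linking.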
Keywords. Keyword research is the first step to a successful SEO strategy. Those successful with SEO understand what people are searching for when they discover their business in a search engine. These are the keywords they use to drive targeted traffic to their products. Start brainstorming potential keywords, and see how the competition looks by using the Google AdWords Keyword Tool. If you notice that some keywords are too competitive in your niche, go with long-tail keywords (between two and five words), which will be easier for you to rank for. The longer the keyword, the less competition you will have for that phrase in the engines.
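As a trivial illustration of that two-to-five-word rule of thumb, a throwaway filter like the one below can split a brainstormed list into long-tail candidates and everything else (the keyword list here is invented; in practice it would come from a keyword tool).

```python
# Hypothetical brainstormed keywords; real lists would come from a keyword tool.
candidates = [
    "shoes",
    "running shoes",
    "waterproof hiking boots for women",
    "best trail running shoes for beginners in wet weather",
]

# Long-tail here = two to five words, per the rule of thumb above.
long_tail = [kw for kw in candidates if 2 <= len(kw.split()) <= 5]
print(long_tail)
# ['running shoes', 'waterproof hiking boots for women']
# 'shoes' is a one-word head term, and the last phrase is longer than five words.
```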

Hey Brian. Even though our own website has ranked consistently (for the last 3 years now) at number 1 on Google for “SEO Companies” (obviously when searching from London, UK or nearby), I still keep reading other people’s posts and sending my own out when I find a gold nugget. Within your clearly written article I noticed multiple golden nuggets, and I was very impressed by your ‘thinking outside the box’ approach and the choices you made for this article. Anytime you want a job as head of R&D for SEO at KD Web, you just let me know 😉
What kind of advice would you give if your site is growing but seems to be attracting the wrong kind of traffic? My visitor numbers are going up, but all the other indicators, such as bounce rate, time on page, and pages per visit, seem to be moving in the wrong direction. Not sure if that’s to be expected or if there is something I should be doing to counter that development?

Google is the most popular spider-driven search engine. Its database currently has about 4 billion pages indexed and is known for finding the most relevant information. When Google spiders the Web, it finds sites by traveling through links. The more sites that link to you, the more important the engines believe your content to be. You should focus on getting many important sites to link to your site. You can do this in many ways: submit to online directories, exchange links with business partners and industry-related sites, or participate in Link Building.
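As a toy illustration of that link-following behavior, here is a minimal breadth-first crawler sketch using only the Python standard library. The starting URL is just a placeholder, and a real crawler would also respect robots.txt, rate limits, and canonical URLs.

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class LinkExtractor(HTMLParser):
    """Collects href values from <a> tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

def crawl(start_url: str, max_urls: int = 10):
    """Breadth-first crawl: fetch a page, then queue the pages it links to."""
    seen, queue = {start_url}, deque([start_url])
    while queue and len(seen) <= max_urls:       # stop once enough URLs are discovered
        url = queue.popleft()
        try:
            html = urlopen(url, timeout=10).read().decode("utf-8", errors="replace")
        except OSError:
            continue                             # skip pages that fail to load
        parser = LinkExtractor()
        parser.feed(html)
        for href in parser.links:
            absolute = urljoin(url, href)        # resolve relative links against the current page
            if absolute.startswith("http") and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)
        print(f"crawled {url}, discovered {len(parser.links)} links")

crawl("https://example.com/")
```

The takeaway mirrors the paragraph above: a page with no inbound links from already-known pages never enters the queue, which is one reason links from other sites matter for discovery.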
You may not want certain pages of your site crawled because they might not be useful to users if found in a search engine's search results. If you do want to prevent search engines from crawling your pages, Google Search Console has a friendly robots.txt generator to help you create this file. Note that if your site uses subdomains and you wish to have certain pages not crawled on a particular subdomain, you'll have to create a separate robots.txt file for that subdomain. For more information on robots.txt, we suggest the Webmaster Help Center guide on using robots.txt files.
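For example, a hypothetical robots.txt like the one below, hosted at the root of that particular subdomain, tells compliant crawlers which paths to skip; Python's standard-library parser can be used to spot-check the rules.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules for blog.example.com; each subdomain needs its own robots.txt.
ROBOTS_TXT = """\
User-agent: *
Disallow: /drafts/
Disallow: /internal-search/
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

print(parser.can_fetch("*", "https://blog.example.com/drafts/unfinished-post"))  # False
print(parser.can_fetch("*", "https://blog.example.com/seo-starter-guide"))       # True
```

Keep in mind that Disallow only asks well-behaved crawlers not to fetch those URLs; it does not by itself remove pages that are already indexed.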
Creating high quality content takes a significant amount of at least one of the following: time, effort, expertise, and talent/skill. Content should be factually accurate, clearly written, and comprehensive. So, for example, if you describe your page as a recipe, provide a complete recipe that is easy to follow, rather than just a set of ingredients or a basic description of the dish.
Btw, I was always under the impression that Digg and Delicious were dying, but I’m really mistaken. Your (and Jason’s) thinking is foolproof, though. If these guys are already curating content, there’s no reason they wouldn’t want to do more of just that! SEO has become a lot of chasing and pestering… it’s good of you to remind us that there are people out there just waiting to share stuff, too. :)
Hello Brian, I am planning to start my blog soon and I’m in the preparation phase (investigating, learning, etc.). I have read a lot of books and posts about SEO, and I can say that this is the best post so far. It’s not even a book, and you covered more than the books do. I would like to thank you for sharing your knowledge with me and the rest of the world; that’s one of the most appreciated things someone can do, and even if you do it for your own “good,” you shared it! As soon as I start my site I’ll write an article about you!!
Instead, in this instance, we started at wireframe stage, plopping in keywords and meta tags. Of course, the site really needed those things, and although it launched technically “optimized”, it wasn’t enough to provide a better product than our top competitor(s). A product that people want to visit, revisit, email to friends, share on social networks, and link to more than our competitors. It wasn’t even enough to move up in the rankings.

5) Post at the right time. Let’s say you want to post in the r/Entrepreneur/ subreddit, but there’s already a post in the #1 spot with 200 upvotes, and it was posted 4 hours ago. If you post at that time, you probably won’t overtake that #1 spot, and you’ll get less traffic. However, if you wait a day, check back, and see that the new #1 spot only has 12-15 upvotes, you’ll have a golden opportunity. It will be much easier for you to hit the #1 spot and get hundreds of upvotes.
Thanks for this timely article. If I understand it correctly, are you saying that we would be better off looking at market data in our niche and making an article out of that for influencers to share, rather than actionable tips that the target clients would be interested in? Shouldn’t there be a double strategy: articles for the influencers to share and articles for the users to enjoy?
Well, the age of print media is coming to a close. But there’s no reason why some enterprising blogger couldn’t use the same tactic to get new subscribers. Let’s say you have a lifestyle blog targeting people in San Francisco. You could promote the giveaway through local media, posters, and many other tactics (we’ll get into these methods shortly).
Tip: Along with Delicious, I search on Scoop.it for similar opportunities. If they liked an article related to a year, say 2013, and you update the resource to 2014, chances are they’ll share it. It’s kind of a twist on your Delicious + Skyscraper technique. You don’t even have to make the content much different or better, just updated! I got some fantastic links recently because of it.
Tablet - We consider tablets as devices in their own class, so when we speak of mobile devices, we generally do not include tablets in the definition. Tablets tend to have larger screens, which means that, unless you offer tablet-optimized content, you can assume that users expect to see your site as it would look on a desktop browser rather than on a smartphone browser.
As of 2009, there are only a few large markets where Google is not the leading search engine. In most cases, when Google is not leading in a given market, it is lagging behind a local player. The most notable example markets are China, Japan, South Korea, Russia and the Czech Republic where respectively Baidu, Yahoo! Japan, Naver, Yandex and Seznam are market leaders.

Some features on the Service require payment of fees. If you elect to sign up for these features, you agree to pay Us the applicable fees and any taxes as described on the Service. All payments due are in U.S. dollars unless otherwise indicated. Upon payment, You will have access to the chosen features immediately. If Your use of the Service is terminated for any reason, whether by You or by Us, You will lose and forfeit any time remaining on Your account with Us.


Search engines use complex mathematical algorithms to interpret which websites a user seeks. In this diagram, each bubble represents a website, and programs sometimes called spiders examine which sites link to which other sites, with arrows representing these links. Websites that receive more inbound links, or stronger links, are presumed to be more important and more likely to be what the user is searching for. In this example, since website B is the recipient of numerous inbound links, it ranks highly in a web search. And the links "carry through": website C, even though it has only one inbound link, has a link from a highly popular site (B) while site E does not, so C ranks higher than E.