Just a suggestion, but maybe you could write an article about generating traffic to a brand-new blog. As you know, when you start out you have only a couple of posts and very little credibility with other bloggers, and the search engines take considerable time to be of any benefit. Would be interesting to know how Brian Dean approaches that dilemma!
You have also mentioned Quuu for article sharing and driving traffic. I have been using Quuu for quite some time now and I don’t think they’re worth it. While the content does get shared a lot, there are hardly any clicks to the site. Even for the clicks that do come through, the average time on page is something like 0.02 seconds, compared to more than 2 minutes for other sources of traffic on my website. I have heard a few guys describe a similar experience with Quuu, so I thought I should let you know.
Thank you so much for these great SEO techniques you posted on your blog. I also follow you on YouTube and have listened to almost all of your videos; sometimes I re-listen just to refresh my mind. Because of your techniques, we managed to bring our website onto the first pages within a month. Adding external links was something I never imagined would work, but it seems like it does. Anyway, please accept my personal thank-you for coming up with and sharing these techniques. I look forward to your new blog posts and YouTube videos!
Consider outsourcing article writing. If you hate the thought of generating content yourself, or your team is not writing-savvy, consider outsourcing this end of the task. Depending on the length, content, specialization and quality required, prices can start as low as US$5 per article. However, don't neglect attempting to write your own work - who better than you knows your own business, hobby or club and can express precisely what needs to be said?
Hi there, I’m interested in trying your Wikipedia trick, but I’m not sure how I should go about it, because I read some posts saying: “Please note that Wikipedia hates spam, so don’t spam them; if you do, they can block your IP and/or website URL. Check their blocking policy, and if they blacklist you, you can be sure that Google may find out about it.”
Inclusion in Google's search results is free and easy; you don't even need to submit your site to Google. Google is a fully automated search engine that uses web crawlers to explore the web constantly, looking for sites to add to our index. In fact, the vast majority of sites listed in our results aren't manually submitted for inclusion, but found and added automatically when we crawl the web. Learn how Google discovers, crawls, and serves web pages.

Hey Brian, I landed on this blog while browsing around blog land. I must appreciate your effort to put up such informative content. As an Internet Marketing Consultant, I would like to add a few thoughts of my own to your valuable content. There are many people who want a HUGE amount of traffic in no time at all. But in my experience, SEO has become a SLOW-BUT-STEADY process in recent times. After so many Google algorithm updates, I think that if we do anything wrong with our websites, we will pay for it. So, without taking any risks, we need to work ethically so that the website slowly gains authority and attracts the targeted traffic. What do you think, mate? I am eagerly looking forward to your reply and would love to see more valuable write-ups from your side. Why don’t you write about some important points on Google’s Hummingbird update? It would be a good read. Right, brother? 🙂
On another note, we recently went through this same process with an entire site redesign. The executive team demanded we cut over 75% of the pages on our site because they were useless to the visitor. It's been 60 days since the launch of the new site, and I've still been able to increase rankings, long-tail keywords, and even organic traffic. It took a bit of a "cowboy" mentality to get some simple things done (like using 301s instead of blocking the old content with robots.txt!). I predicted we would lose a lot of our long-tail keywords...but we haven't...yet!
He started by finding an offer that resonated with, and was relevant to, his audience. In his case, his blog was dedicated to teaching people how to use software called “Sublime Text,” so he simply offered a license to the software as the giveaway prize. By doing this, he not only increased the giveaway’s chances of success, since his incentive was relevant, but also ensured the quality of his subscribers, since they were people genuinely interested in his content. It’s easy to give people an iPad or an iPhone, but how relevant will those subscribers be to you at the end of the day?
Hey Brian, love your site + content. Really awesome stuff! I have a question about dead link building on Wikipedia. I actually got a “user talk” message from someone moderating a Wikipedia page I replaced a dead link on. They claimed that “Wikipedia uses nofollow tags” so “additions of links to Wikipedia will not alter search engine rankings.” Any thoughts here?
Fortunately, Google puts more weight on the anchor text of external links anyway. So as long as some of your external links have your target anchors, you’re probably OK with a “Home” button. In fact, I’ve ranked homepages with a “Home” anchor text nav button for some seriously competitive terms. So it’s not a make-or-break ranking signal by any means.
To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a meta tag specific to robots (usually <meta name="robots" content="noindex">). When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed and will instruct the robot as to which pages are not to be crawled. As a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts and user-specific content such as search results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.[47]
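To make the crawling rules above concrete, here is a minimal sketch using Python's standard-library `urllib.robotparser`, which implements the same robots.txt parsing logic a well-behaved crawler would apply. The rules and URLs below are hypothetical examples, chosen to mirror the kinds of pages the text says webmasters typically block (internal search results, shopping carts):

```python
from urllib import robotparser

# A hypothetical robots.txt blocking internal search results and the cart.
rules = """\
User-agent: *
Disallow: /search
Disallow: /cart
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# A compliant crawler checks each URL against the parsed rules before fetching.
print(rp.can_fetch("*", "https://example.com/search?q=shoes"))  # False: disallowed
print(rp.can_fetch("*", "https://example.com/blog/post-1"))     # True: crawlable
```

Note that, as the paragraph above explains, robots.txt only requests that pages not be *crawled*; to keep an already-discovered page out of the index, the `noindex` robots meta tag on the page itself is the appropriate mechanism.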
Great article. My site has been up for several years now, but I rebranded and switched from Blogger to WordPress about a year ago because I was told the reason my traffic was so low was that I was using the wrong platform. I still haven’t seen an increase in my traffic and am very frustrated. I write in the health, fitness, and parenting niche and have over 30 experts who write for me, but I still don’t have the page views I would like. My paychecks are small and I am very frustrated. How do I find out what influencers in my niche are talking about and what they would like to share? I read tons of blogs, but most of them just review products or write about their kids; there aren’t a whole lot of similar articles. Where do I begin to find shareable content in my niche?

In addition to optimizing these six areas of your site, analyze your competitors and see what they are doing in terms of on-page optimization, off-page optimization (competitive link analysis) and social media. While you may be doing a lot of the same things they are, it’s incredibly important to think outside the box to get a leg up over the competition.


By 2004, search engines had incorporated a wide range of undisclosed factors in their ranking algorithms to reduce the impact of link manipulation. In June 2007, The New York Times' Saul Hansell stated Google ranks sites using more than 200 different signals.[26] The leading search engines, Google, Bing, and Yahoo, do not disclose the algorithms they use to rank pages. Some SEO practitioners have studied different approaches to search engine optimization, and have shared their personal opinions.[27] Patents related to search engines can provide information to better understand search engines.[28] In 2005, Google began personalizing search results for each user. Depending on their history of previous searches, Google crafted results for logged in users.[29]
Black hat SEO involves techniques such as paying to post links to a website on link farms, stuffing the metadata with nonrelated keywords, and using text that is invisible to readers to attract search engines. These and many other black hat SEO tactics may boost traffic, but search engines frown on the use of such measures. Search engines may punish sites that employ these methods by reducing their page rank or delisting them from search results.