Many blogging software packages automatically nofollow user comments, but those that don't can most likely be manually edited to do this. This advice also goes for other areas of your site that may involve user-generated content, such as guest books, forums, shout-boards, referrer listings, etc. If you're willing to vouch for links added by third parties (for example, if a commenter is trusted on your site), then there's no need to use nofollow on links; however, linking to sites that Google considers spammy can affect the reputation of your own site. The Webmaster Help Center has more tips on avoiding comment spam, for example by using CAPTCHAs and turning on comment moderation.
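The nofollow mechanism described above is just an attribute on the link itself. A minimal sketch of a user-submitted comment link marked nofollow (the URL and anchor text are placeholders):

```html
<!-- A user-submitted link marked rel="nofollow", which asks search
     engines not to pass ranking credit through it -->
<p>Great post! Check out
  <a href="https://example.com/my-page" rel="nofollow">my site</a>.</p>
```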
Thanks for the great post. I am confused about the #1 idea about Wikipedia dead links…it seems like you didn’t finish what you were supposed to do with the link once you found it. You indicated to put the dead link in Ahrefs, and you found a bunch of links for you to contact…but then what? What do you contact them about, and how do you get your page as the link? I’m obviously not getting something 🙁

Laura, great post. This touches something I wish more SEOs practiced: conversion optimization. I think most SEOs think of what they do as a service for, instead of a partnership with, clients. The end result should never be raw traffic, but value obtained through targeted, CONVERTING traffic. You make excellent points about market research, product input, content creation, and other functions many SEOs and SEMs neglect. More and more SEO providers focus only on assembly-line basics and worn-out techniques instead of challenging themselves to learn product marketing, usability, and conversion optimization. Your advice on market research is extremely valuable. Great start to a promising series. I look forward to more!
It’s rare to come across new SEO tips worth trying. And this post has tons of them. I know that’s true BECAUSE…I actually read it all the way to the end and downloaded the PDF. What makes these great is that so many are a multiple step little strategy, not just the one-off things to do that clients often stumble across and ask if they are truly good for SEO. But there are also some nice one-off tips that I can easily start using without ramping up a new project.
Search engines find and catalog web pages through spidering (also known as webcrawling) software. Spidering software "crawls" through the internet and grabs information from websites which is used to build search engine indexes. Unfortunately, not all search engine spidering software works the same way, so what gives a page a high ranking on one search engine may not necessarily give it a high ranking on another. Note that rather than waiting for a search engine to discover a newly created page, web designers can submit the page directly to search engines for cataloging.
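The link-extraction step at the heart of spidering can be sketched with Python's standard library alone. This toy parser (names and the sample page are hypothetical) pulls outgoing links from a fetched page the way crawling software does before following them:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag, the way a crawler
    extracts outgoing links from a page it has downloaded."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# A stand-in for a page the spider has just fetched
page = """<html><body>
  <a href="https://example.com/about">About</a>
  <a href="https://example.com/contact">Contact</a>
</body></html>"""

parser = LinkExtractor()
parser.feed(page)
print(parser.links)
```

A real spider would fetch each discovered URL in turn and hand the page text to an indexer, but the parsing step looks much like this.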
Give customers an easy way to access the translated version of your website; if they can't find it, they will bounce without engaging. You can add the hreflang attribute to your website's code to ensure that the correctly translated version of a page appears in search results. Both Google and Yandex recognize it.
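A minimal sketch of hreflang annotations in a page's head, assuming hypothetical English and Spanish versions on example.com; each language version should list every alternate, plus an x-default fallback:

```html
<!-- Alternate-language annotations; search engines use these to serve
     the right language version to each user -->
<link rel="alternate" hreflang="en" href="https://example.com/en/" />
<link rel="alternate" hreflang="es" href="https://example.com/es/" />
<link rel="alternate" hreflang="x-default" href="https://example.com/" />
```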
I feel I have great content…but most of it is within my email marketing campaign instead of my blogs. I’ve used my blogs to include links to my email marketing campaigns to lead to my product. In your opinion, should my blog content be the priority? I find my marketing emails sound more like a blog than just a “tip” or a reason to grab people to my list.
This post and the Skyscraper Technique changed my mind about how I approach SEO. I’m not a marketing expert, and I haven’t ranked sites that monetize really well; I’m just a guy trying to get some projects moving, and I’m not even in the marketing business. I just wanted to say that the way you write makes the information accessible, even to someone who isn’t a native English speaker, like me.

I completely agree that definition of a target audience is a great first step, but would ask if adding competitors to the analysis (mentioned here as a later step) helps draw out who your target audience would be via comparisons, i.e. showing who you are and who you are not. I would be very interested to hear opinions on how this tactic can be used within the overall step in coordination with targeted keyword discovery.
Usually, search engines automatically crawl your articles if they are high quality, but you should also try to submit your blog to search engines like Google, Bing, and Ask. Search engines like Google have simplified the submission process: Google Webmaster Tools makes it easy for every webmaster to get their website crawled faster.
On October 17, 2002, SearchKing filed suit in the United States District Court, Western District of Oklahoma, against the search engine Google. SearchKing's claim was that Google's tactics to prevent spamdexing constituted a tortious interference with contractual relations. On May 27, 2003, the court granted Google's motion to dismiss the complaint because SearchKing "failed to state a claim upon which relief may be granted."[68][69]
Google Analytics is an invaluable source of data on just about every conceivable aspect of your site, from your most popular pages to visitor demographics. Keep a close eye on your Analytics data, and use this information to inform your promotional and content strategies. Pay attention to what posts and pages are proving the most popular. Inspect visitor data to see how, where and when your site traffic is coming from.

When Googlebot crawls a page, it should see the page the same way an average user does. For optimal rendering and indexing, always allow Googlebot access to the JavaScript, CSS, and image files used by your website. If your site's robots.txt file disallows crawling of these assets, it directly harms how well our algorithms render and index your content. This can result in suboptimal rankings.
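For example, a robots.txt along these lines (the /assets/ paths are hypothetical) would block the CSS and JavaScript Googlebot needs for rendering, while the corrected rules allow them:

```text
# Problematic: hides rendering assets from all crawlers
# User-agent: *
# Disallow: /assets/

# Better: let Googlebot fetch the resources used to render pages
User-agent: Googlebot
Allow: /assets/css/
Allow: /assets/js/
Allow: /assets/images/
```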


I am a newbie in the blogging field and started a health blog a few months back. I have read many articles on SEO and gaining traffic to a blog. Some of those articles were very good, but your article is great. Your writing style is amazing: the way you described each and every point is so simple that it becomes easy for a newbie to learn. You also mentioned numerous ways to get traffic to a blog, which is very beneficial for us. I am highly thankful to you for sharing this information with us.
SEO may generate an adequate return on investment. However, search engines are not paid for organic search traffic, their algorithms change, and there are no guarantees of continued referrals. Due to this lack of guarantees and certainty, a business that relies heavily on search engine traffic can suffer major losses if the search engines stop sending visitors.[61] Search engines can change their algorithms, impacting a website's placement, possibly resulting in a serious loss of traffic. According to Google's CEO, Eric Schmidt, in 2010, Google made over 500 algorithm changes – almost 1.5 per day.[62] It is considered a wise business practice for website operators to liberate themselves from dependence on search engine traffic.[63] In addition to accessibility in terms of web crawlers (addressed above), user web accessibility has become increasingly important for SEO.

Wow, I wish I had the comments you do. So you’re saying that re-visiting and re-writing old posts garnered 111% more traffic? I feel like I go back from time to time to do this, mostly to keep information current. This tip makes me want to revisit all my old posts to see what could be updated. That’s a lot of low-hanging fruit. Thanks for this one.
WebEngage is an effective tool for collecting customer insights through laser-targeted surveys. It offers a drag-and-drop form builder that supports different question types. Once you have set up the form properly, you can gather answers from your relevant audience. It also provides real-time data for every survey in the form of a report, which you can download easily.
Over the next few posts, and starting with this one, I’m going to share with you a detailed 8-step process for creating your own SEO strategy (what I often refer to as an SRD (SEO Research Document)), beginning with defining target audiences and taking it all the way through some fairly comprehensive competitive research, search traffic projections, content strategies, and specific goals and prioritizations.
A breadcrumb is a row of internal links at the top or bottom of the page that allows visitors to quickly navigate back to a previous section or the root page. Many breadcrumbs have the most general page (usually the root page) as the first, leftmost link and list the more specific sections out to the right. We recommend using breadcrumb structured data markup when showing breadcrumbs.
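A minimal sketch of that structured data, using schema.org's BreadcrumbList type with hypothetical page names and URLs:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Books",
      "item": "https://example.com/books" },
    { "@type": "ListItem", "position": 2, "name": "Science Fiction",
      "item": "https://example.com/books/sciencefiction" }
  ]
}
</script>
```

The positions run left to right, matching the visible breadcrumb trail from the most general page to the most specific.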
If you're looking to upload an image to a blog post, for example, examine the file for its file size first. If it's anywhere in megabyte (MB) territory, even just 1 MB, it's a good idea to use an image compression tool to reduce the file size before uploading it to your blog. Sites like TinyPNG make it easy to compress images in bulk, while Google's very own Squoosh has been known to shrink image file sizes to microscopic levels.
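The size check itself is easy to automate. A minimal Python sketch (the 1 MB threshold and function name are assumptions for illustration, not rules from the post) flags images worth compressing before upload:

```python
import os

# Assumed ~1 MB threshold, per the advice above
MAX_UPLOAD_BYTES = 1_000_000

def needs_compression(path):
    """Return True when an image file is large enough that it should
    be run through a compressor (e.g. TinyPNG, Squoosh) first."""
    return os.path.getsize(path) >= MAX_UPLOAD_BYTES
```

You could run this over a folder of images before a batch upload and compress only the files it flags.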
Create a navigation menu. For easy navigation, create a toolbar with links that are easy to follow, and position the toolbar in an area that makes sense. Web users often look for the toolbar across the top or down the left-hand side of the page. Don’t forget a link to your homepage; it’s often overlooked, but it’s very important to point your users back to it.
Tip: Along with Delicious, I search on Scoop.it for similar opportunities. If they liked an article related to a year, say 2013, and you update the resource to 2014, chances are they’ll share it. Kind of a twist on your Delicious + Skyscraper technique. You don’t even have to make the content much different or better, just updated! I got some fantastic links recently because of it.

A backlink is a link to your website from another website. Backlinks from complementary businesses or industry influencers will not only get your business in front of a larger audience but will also drive qualified traffic to your website. In addition, Google picks up on backlinks and will increase its trust in your business if it sees other trusted sites pointing to yours. More trust from Google leads to higher rankings, which leads to more traffic. Get noticed on Google for free with quality backlinks.
Thanks so much for this entry, Laura! I loved the way your post is so practical, straightforward, newbie-friendly - and most importantly, how it emphasizes the bottom line at all times. It's easy to get "lost in the fog" of SEO with so many looming tasks and forget the main purpose, so it's wonderful to have a straightforward outline of what to do and why certain tasks need to be done. I look forward to reading your future insights!
Brian, I recently found your blog by following OKDork.com. Just want to say you’re really amazing with the content you put out here. It’s so helpful, especially for someone like me who is just starting out. I’m currently writing posts for a blog I plan to launch later this year. I think my niche is a little too broad, and I have to figure out how to narrow it down. I essentially want to write about my current journey of overcoming my fears to start accomplishing the dreams I have for blogging, business, and travel. In doing so, I will share the best tips, tools, and tactics I can find, as well as what worked, what didn’t, and why.
Our products, including, but not limited to, themes and plugins, are created to be used by end users, including, but not limited to, designers, bloggers and developers for final work (personal and client websites). You can see what every license comes with on the Pricing Page. Our products only work on the self-hosted version of WordPress. You can’t use one of our themes or plugins on a WordPress.com blog. For more information on WordPress.com Vs WordPress.org, you can read here [http://en.support.wordpress.com/com-vs-org/].
Search engine optimisation, or SEO, has become a huge priority for marketers over the last few years. It’s easy to see why: higher search engine rankings result in more traffic, more leads, and higher sales and conversions. But how, exactly, does it work? How does adding keywords to various site elements improve your chances of ranking well in search engines?
Thanks for bringing up this point - I agree, Eric - competitive positioning can help you determine the value you bring to the table that your competitors don't. I'm all for it. Nielsen does some reports that provide awareness, likelihood to recommend, sentiment, and other insights for your site/brand and your competitors. You can also pull some of that type of insight out of social listening platforms like NetBase, SM2, Radian6, Dow Jones, Nielsen, and many others. I've even done some hacked competitive sentiment comparisons before using search: searching for [brand or feature] + "like", "love", "hate", "wish", etc.
Search engines use complex mathematical algorithms to interpret which websites a user seeks. In this diagram, if each bubble represents a website, programs sometimes called spiders examine which sites link to which other sites, with arrows representing these links. Websites getting more inbound links, or stronger links, are presumed to be more important and what the user is searching for. In this example, since website B is the recipient of numerous inbound links, it ranks more highly in a web search. And the links "carry through", such that website C, even though it only has one inbound link, has an inbound link from a highly popular site (B) while site E does not. Note: Percentages are rounded.
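The "links carry through" idea is the essence of PageRank-style scoring. A toy Python sketch (the five-page graph is hypothetical, loosely echoing the diagram: B collects most inbound links and passes weight on to C):

```python
def pagerank(links, damping=0.85, iterations=50):
    """Simplified PageRank: each page splits its current score evenly
    among the pages it links to, damped toward a uniform baseline."""
    pages = set(links)
    for targets in links.values():
        pages.update(targets)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / len(pages) for p in pages}
        for page, targets in links.items():
            for t in targets:
                # A page's score is shared equally among its outlinks
                new[t] += damping * rank[page] / len(targets)
        rank = new
    return rank

# Hypothetical graph: A, D, E all link to B; B links to C; C links back.
links = {"A": ["B"], "D": ["B"], "E": ["B"], "B": ["C"], "C": ["B"]}
ranks = pagerank(links)
for page in sorted(ranks, key=ranks.get, reverse=True):
    print(page, round(ranks[page], 3))
```

As in the diagram, B ends up with the highest score, and C outranks E despite having only one inbound link, because that single link comes from the highly ranked B.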
Just a suggestion, but maybe you could write an article about generating traffic to a brand new blog. As you know, when you start out, you have only a couple posts and very little credibility with other bloggers, also the search engines will take considerable time to be of any benefit initially. Would be interesting to know how Brian Dean approaches that dilemma!
How do you ask others for link opportunities? Most of the time people are only interested in either reciprocal links, or them providing guest posts on my site (when I reach out). And I can’t imagine if I did a round up post getting many inbound links. People would be thrilled that they had received a link, and wouldn’t create a reciprocal link to destroy the value.
Social media is one of the most popular free marketing tools around, and plays a role in driving traffic to your website. Use Facebook, Instagram, and LinkedIn to promote blog posts and other useful pages on your website. This way you can turn your social media audience into website visitors, and draw traffic from their networks if you post shareable content.
Amazing article. From my point of view, the best source of traffic in today’s world is social networking sites. A huge number of people use social media, so we can connect with our audience easily. While doing research, I found this article: https://www.blurbpointmedia.com/design-social-media-business-marketing-strategy/ which is about developing a community on social media. I think the best route to a successful social media account is posting different kinds of interesting content on a daily basis!
Tablet - We consider tablets as devices in their own class, so when we speak of mobile devices, we generally do not include tablets in the definition. Tablets tend to have larger screens, which means that, unless you offer tablet-optimized content, you can assume that users expect to see your site as it would look on a desktop browser rather than on a smartphone browser.
People want to speak their minds and weigh in on subjects they feel passionately about, so building a community into your site is a great way to start a conversation and increase traffic to your website. Implement a robust commenting system through third-party solutions such as Facebook comments or Disqus, or create a dedicated forum where visitors can ask questions. Don’t forget to manage your community to ensure that minimum standards of decorum are met, however.

You grant to Us a worldwide, irrevocable, non-exclusive, royalty-free license to use, reproduce, adapt, publish, translate and distribute Your Content in any existing or future media. You also grant to Us the right to sublicense these rights and the right to bring an action for infringement of these rights. If You delete Content, we will use reasonable efforts to remove it from the Service, but You acknowledge that caching or references to the Content may not be made immediately unavailable.
Webmasters and content providers began optimizing websites for search engines in the mid-1990s, as the first search engines were cataloging the early Web. Initially, all webmasters only needed to submit the address of a page, or URL, to the various engines which would send a "spider" to "crawl" that page, extract links to other pages from it, and return information found on the page to be indexed.[5] The process involves a search engine spider downloading a page and storing it on the search engine's own server. A second program, known as an indexer, extracts information about the page, such as the words it contains, where they are located, and any weight for specific words, as well as all links the page contains. All of this information is then placed into a scheduler for crawling at a later date.