Like the hundreds of people who have already commented, I thought this was an amazing post. You have a great way of breaking things down so the average reader can understand them and act on them. I think this is a great resource for our readers, so I included it in my monthly roundup of the best SEO, social media, and content marketing articles. https://www.northcutt.com/blog/2014/02/january-resource-round-up-the-best-of-seo-social-media-and-content-marketing/
Another excellent guide is Google’s “Search Engine Optimization Starter Guide.” This is a free PDF download that covers basic tips that Google provides to its own employees on how to get listed. You’ll find it here. Also well worth checking out is Moz’s “Beginner’s Guide To SEO,” which you’ll find here, and the SEO Success Pyramid from Small Business Search Marketing.
We now have a dedicated SEO strategist who, among other things, develops 90-day plans for our websites. Ninety days isn't long-term planning, but at least we have a strategic objective for the quarter. He also works closely with our UX team to identify the target audience: the crew that does the persona research and focus groups prior to the wireframe stage.
It’s an awesome post, and I’m commenting here for the first time. I’m Abhishek, founder of CouponMaal, and I want to know more about one of the points you made above: relaunching your old posts. When we relaunch an old post, is there any difference between changing the date, time, and year versus relaunching it with the previous date, time, and year? I mean, does it matter or not?
“Syndicate carefully: If you syndicate your content on other sites, Google will always show the version we think is most appropriate for users in each given search, which may or may not be the version you’d prefer. However, it is helpful to ensure that each site on which your content is syndicated includes a link back to your original article. You can also ask those who use your syndicated material to use the noindex meta tag to prevent search engines from indexing their version of the content.”
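In markup terms, the two safeguards Google mentions look roughly like this on the syndicating site (a minimal sketch; the URLs are placeholders):

```
<head>
  <!-- keep the duplicate copy out of search indexes -->
  <meta name="robots" content="noindex">
</head>
<body>
  <!-- link back to the original article -->
  <p>This article originally appeared on
    <a href="https://example.com/original-article">example.com</a>.</p>
</body>
```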
The strength of your link profile isn’t solely determined by how many sites link back to you – it can also be affected by your internal linking structure. When creating and publishing content, be sure to keep an eye out for opportunities for internal links. This not only helps with SEO, but also results in a better, more useful experience for the user – the cornerstone of increasing traffic to your website.
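For illustration, an internal link is just an ordinary anchor pointing to another page on your own domain, ideally with descriptive anchor text (the path below is hypothetical):

```
<p>For more detail, see our
  <a href="/blog/keyword-research-guide">guide to keyword research</a>.</p>
```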
That second link will still help you, because it will pass extra PageRank to that page. But in terms of anchor text, most of the experiments I’ve seen suggest that the second link’s anchor text probably doesn’t count. That said, Google is more sophisticated than it was when a lot of those experiments were run, so it may count both anchors. To stay on the safe side, I recommend adding keywords to navigation links where possible.

Hi Brian, I am a young business owner who has had 4 different websites in the last 2 years, but none of them were as successful as I would have liked, due to a lack of SEO. Now I am in the process of starting another business, and I felt it was time to learn about SEO myself. I must say the information you have provided is invaluable and extremely helpful!! I am learning on the go, and you are my biggest contributor. Thank you, sir!


Google is the most popular spider-driven search engine. Its database currently has about 4 billion pages indexed, and it is known for finding the most relevant information. When Google spiders the Web, it finds sites by traveling through links. The more sites that link to you, the more important the engines believe your content to be. You should focus on getting many important sites to link to your site. You can do this in many ways: submit to online directories, exchange links with business partners and industry-related sites, or participate in link building.
When Googlebot crawls a page, it should see the page the same way an average user does. For optimal rendering and indexing, always allow Googlebot access to the JavaScript, CSS, and image files used by your website. If your site's robots.txt file disallows crawling of these assets, it directly harms how well our algorithms render and index your content. This can result in suboptimal rankings.
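As a sanity check, look at your robots.txt for rules like the ones below. This is a hedged sketch with made-up paths, but it shows the pattern that causes the problem:

```
# Problematic robots.txt -- paths are illustrative:
User-agent: Googlebot
Disallow: /css/     # hides stylesheets from Googlebot
Disallow: /js/      # hides scripts
Disallow: /images/  # hides images
```

Removing those Disallow rules (or adding explicit Allow rules for the asset directories) lets Googlebot fetch everything it needs to render the page.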
Another way to increase traffic to your website is to get listed in free online directories and review sites. For most of these sites, your profile will have a link to your website, so actively updating these listings and getting positive reviews is likely to result in more website traffic. In addition, many directories like Yelp have strong domain authority on Google. There’s a chance that your business’s free Yelp page could rank high for relevant searches.

Since heading tags typically make text contained in them larger than normal text on the page, this is a visual cue to users that this text is important and could help them understand something about the type of content underneath the heading text. Multiple heading sizes used in order create a hierarchical structure for your content, making it easier for users to navigate through your document.
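In markup, that hierarchy is simply nested heading levels; the section names here are invented for illustration:

```
<h1>Beginner's Guide to SEO</h1>
  <h2>On-Page Optimization</h2>
    <h3>Title tags</h3>
    <h3>Heading tags</h3>
  <h2>Link Building</h2>
```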
There are many times when you post a small quote or phrase in your blog post that you believe people would love to tweet. ClickToTweet helps you do just that. Simply create a pre-made tweet on ClickToTweet.com, generate a unique link, and put it on your website so that people can click it to tweet it. Sounds simple? It is, and it is one of the most popular strategies for generating buzz on Twitter.
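Under the hood, a click-to-tweet link typically resolves to Twitter's web intent URL with your text pre-filled, roughly like this (the quote is made up):

```
<a href="https://twitter.com/intent/tweet?text=SEO%20is%20a%20marathon%2C%20not%20a%20sprint"
   target="_blank" rel="noopener">Tweet this quote</a>
```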
There were some great tips in this article. I notice that many people make the mistake of putting too many distracting images in the header and the sidebar, which can quickly turn people off the content. I particularly dislike Google ads anchored in the centre of a piece of text. I understand that people want to make revenue from ads, but there are right ways and wrong ways of going about it. The writing is the important part of the content; why would you bury it under a load of conflicting media in the sides?

I love your post. I keep coming back because you always have great content I can use in my business as well as share. Since I own my own digital marketing company, I guess you would be one of THE influencers in the Internet marketing field. I just started my business, and because most influencers on Twitter are talking about content marketing, that is what I have been writing about. But my site is only about a month old, so I will just stay consistent in my writing. I’m also in the process of changing my navigation bar so people know how to get to what they want faster, such as “What is SEO”, etc. Thanks, and I would love any advice you can give me.

If you are using Responsive Web Design, use the meta name="viewport" tag to tell the browser how to adjust the content. If you use Dynamic Serving, use the Vary HTTP header to signal that your content changes depending on the user-agent. If you are using separate URLs, signal the relationship between the two URLs by adding a <link> tag with rel="canonical" and rel="alternate" elements.
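Side by side, the three configurations look roughly like this (all URLs are placeholders; the Vary header is sent in the HTTP response and is shown here as a comment):

```
<!-- Responsive Web Design: one URL, viewport meta tag -->
<meta name="viewport" content="width=device-width, initial-scale=1">

<!-- Dynamic Serving: same URL, HTTP response header
     Vary: User-Agent -->

<!-- Separate URLs: on the desktop page -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="https://m.example.com/page">
<!-- ...and on the mobile page -->
<link rel="canonical" href="https://www.example.com/page">
```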

“In conclusion, this research illuminates how content characteristics shape whether it becomes viral. When attempting to generate word of mouth, marketers often try targeting “influentials,” or opinion leaders (i.e., some small set of special people who, whether through having more social ties or being more persuasive, theoretically have more influence than others). Although this approach is pervasive, recent research has cast doubt on its value (Bakshy et al. 2011; Watts 2007) and suggests that it is far from cost effective. Rather than targeting “special” people, the current research suggests that it may be more beneficial to focus on crafting contagious content. By considering how psychological processes shape social transmission, it is possible to gain deeper insight into collective outcomes, such as what becomes viral.”
Robots.txt is not an appropriate or effective way of blocking sensitive or confidential material. It only instructs well-behaved crawlers that the pages are not for them, but it does not prevent your server from delivering those pages to a browser that requests them. One reason is that search engines could still reference the URLs you block (showing just the URL, no title or snippet) if there happen to be links to those URLs somewhere on the Internet (like referrer logs). Also, non-compliant or rogue search engines that don't acknowledge the Robots Exclusion Standard could disobey the instructions of your robots.txt. Finally, a curious user could examine the directories or subdirectories in your robots.txt file and guess the URL of the content that you don't want seen.
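To make the last point concrete: anyone can fetch your robots.txt in a browser and read something like this (paths invented for illustration):

```
User-agent: *
Disallow: /internal-reports/
Disallow: /customer-exports/
```

Those two lines tell a curious visitor exactly which URLs to try next. For genuinely sensitive content, use authentication (or at least a noindex directive) instead.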
A user-feedback poll is one great, easy way to help you better understand your customers. Kline claims, “Done incorrectly, these can be annoying for a user. Done well, it’s an excellent opportunity to help the customer feel that their opinion matters, while also getting needed insights to better market the company. One poll we ran for an e-commerce client helped us learn that 80% of potential customers cared more about the performance of the product than the price. [So,] we added as much helpful performance information to the website as we could.”

Do not be fooled by those traffic sellers promising thousands of hits an hour. What they really do is load up your URL in a program, along with a list of proxies. Then they run the program for a few hours. It looks like someone is on your site because your logs show visitors from thousands of different IPs. What happens in reality is your website is just pinged by the proxy, no one really sees your site. It is a waste of money.
Website owners recognized the value of a high ranking and visibility in search engine results,[6] creating an opportunity for both white hat and black hat SEO practitioners. According to industry analyst Danny Sullivan, the phrase "search engine optimization" probably came into use in 1997. Sullivan credits Bruce Clay as one of the first people to popularize the term.[7] On May 2, 2007,[8] Jason Gambert attempted to trademark the term SEO by convincing the Trademark Office in Arizona[9] that SEO is a "process" involving manipulation of keywords and not a "marketing service."
Hi Brian, I absolutely love your content. My competitors and influencers are very strong: most of them are government bodies or government-supported, or travel guides known worldwide. I constantly follow them and engage with them (like, share, comment, etc.). They share photos that are submitted to them, and I do photography myself, which takes hours, and I still can’t reach a big audience… Any idea what I could create that my influencers would love to share? (It’s hard to find out what they care about; they get hundreds of photos submitted daily and collaborate with other big names…) Please help me.
Wow, I wish I had the comments you do. So you’re saying that revisiting and rewriting old posts garnered 111% more traffic? I feel like I go back from time to time to do this, mostly to keep information current. This tip makes me want to revisit all my old posts to see what could be updated. That’s a lot of low-hanging fruit. Thanks for this one.
Getting more website visitors does not happen overnight. It takes some effort but we’ve eliminated the hard part for you: knowing what to do in the first place. By using Google My Business and the other safe channels listed above, you can get the right visitors coming to your site and more importantly, more of those visitors converting into customers.
Give customers an easy way to access the translated version of your website; if they can't find it, they will bounce without engaging. You can add the “hreflang” attribute to your website's code to ensure that the correctly translated version appears in search results. Both Google and Yandex recognize it.
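A minimal hreflang sketch, assuming English and Spanish versions on placeholder URLs; the annotations go in each page's <head>, and every language version should list all the alternates:

```
<link rel="alternate" hreflang="en" href="https://example.com/en/page" />
<link rel="alternate" hreflang="es" href="https://example.com/es/page" />
<link rel="alternate" hreflang="x-default" href="https://example.com/" />
```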
You may not want certain pages of your site crawled because they might not be useful to users if found in a search engine's search results. If you do want to prevent search engines from crawling your pages, Google Search Console has a friendly robots.txt generator to help you create this file. Note that if your site uses subdomains and you wish to have certain pages not crawled on a particular subdomain, you'll have to create a separate robots.txt file for that subdomain. For more information on robots.txt, we suggest this Webmaster Help Center guide on using robots.txt files.
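Keep in mind that the file is per-host: each subdomain serves its own robots.txt from its own root. The hostnames below are placeholders:

```
# https://www.example.com/robots.txt   applies only to www.example.com
# https://blog.example.com/robots.txt  the blog subdomain needs its own file
```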
Well, the age of print media is coming to a close. But there’s no reason why some enterprising blogger couldn’t use the same tactic to get new subscribers. Let’s say you have a lifestyle blog targeting people in San Francisco. You could promote the giveaway through local media, posters, and many other tactics (we’ll get into these methods shortly).
Keep resources crawlable. Blocking page resources can give Google an incomplete picture of your website. This often happens when your robots.txt file is blocking access to some or all of your page resources. If Googlebot doesn't have access to a page's resources, such as CSS, JavaScript, or images, we may not detect that it's built to display and work well on a mobile browser. In other words, we may not detect that the page is "mobile-friendly," and therefore not properly serve it to mobile searchers.
Having a different description meta tag for each page helps both users and Google, especially in searches where users may bring up multiple pages on your domain (for example, searches using the site: operator). If your site has thousands or even millions of pages, hand-crafting description meta tags probably isn't feasible. In this case, you could automatically generate description meta tags based on each page's content.
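For example, a hand-crafted tag is a single line (the content here is invented), and on large sites a CMS template can fill it from each page's summary field instead:

```
<meta name="description"
      content="Learn three proven ways to grow blog traffic, from updating old posts to internal linking.">
<!-- Templated variant -- exact syntax depends on your CMS:
     <meta name="description" content="{{ page.summary }}"> -->
```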
If you're looking to upload an image to a blog post, for example, examine the file for its file size first. If it's anywhere in megabyte (MB) territory, even just 1 MB, it's a good idea to use an image compression tool to reduce the file size before uploading it to your blog. Sites like TinyPNG make it easy to compress images in bulk, while Google's very own Squoosh has been known to shrink image file sizes to microscopic levels.