If you own, manage, monetize, or promote online content via Google Search, this guide is meant for you. You might be the owner of a growing and thriving business, the webmaster of a dozen sites, the SEO specialist in a web agency, or a DIY SEO ninja passionate about the mechanics of Search: this guide is meant for you. If you're interested in a complete overview of the basics of SEO according to our best practices, you are indeed in the right place. This guide won't provide any secrets that'll automatically rank your site first in Google (sorry!), but following the best practices outlined below will hopefully make it easier for search engines to crawl, index, and understand your content.
To keep undesirable content out of the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a robots-specific meta tag (usually <meta name="robots" content="noindex">). When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed, and it instructs the robot as to which pages are not to be crawled. Because a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts and user-specific content such as results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.[47]
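To make the crawl rules concrete, here is a minimal sketch using Python's standard-library robots.txt parser; the Disallow paths and example.com URLs are invented for illustration, not taken from any real site.

```python
# Sketch: how a well-behaved crawler interprets a robots.txt file.
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt, blocking a cart directory and internal search.
robots_txt = """\
User-agent: *
Disallow: /cart/
Disallow: /search
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# The crawler checks each URL against the rules before fetching it.
print(parser.can_fetch("*", "https://example.com/cart/checkout"))  # False: excluded
print(parser.can_fetch("*", "https://example.com/products"))       # True: crawlable
```

Note that robots.txt only controls crawling; a page that must stay out of the index entirely is better served by the noindex meta tag mentioned above, since blocked-but-linked URLs can still appear in results.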

Structured data is code that you can add to your site's pages to describe your content to search engines, so they can better understand what's on your pages. Search engines can use this understanding to display your content in useful (and eye-catching!) ways in search results. That, in turn, can help you attract just the right kind of customers for your business.
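To show what this looks like in practice, here is a hedged sketch that builds a schema.org Product description as JSON-LD, one of the formats search engines accept for structured data; every field value is invented, and real markup should be checked with a validator such as Google's Rich Results Test.

```python
# A minimal sketch, assuming a schema.org Product; all values are made up.
import json

product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Hand-Painted Ceramic Mug",
    "description": "A 350 ml stoneware mug, glazed and painted by hand.",
    "offers": {
        "@type": "Offer",
        "price": "24.00",
        "priceCurrency": "USD",
    },
}

# Structured data is embedded in the page as a JSON-LD <script> block
# inside the HTML <head>.
snippet = '<script type="application/ld+json">\n{}\n</script>'.format(
    json.dumps(product, indent=2)
)
print(snippet)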
SEO is not an appropriate strategy for every website, and other Internet marketing strategies can be more effective, such as paid advertising through pay-per-click (PPC) campaigns, depending on the site operator's goals. Search engine marketing (SEM) is the practice of designing, running, and optimizing search engine ad campaigns.[56] Its difference from SEO is most simply depicted as the difference between paid and unpaid priority ranking in search results. Its purpose regards prominence more than relevance; website developers should regard SEM with the utmost importance with consideration to visibility, as most users navigate to the primary listings of their search.[57] A successful Internet marketing campaign may also depend upon building high-quality web pages to engage and persuade, setting up analytics programs to enable site owners to measure results, and improving a site's conversion rate.[58] In November 2015, Google released a full 160-page version of its Search Quality Rating Guidelines to the public,[59] which revealed a shift in its focus towards "usefulness" and mobile search. In recent years the mobile market has exploded, overtaking the use of desktops, as shown by StatCounter in October 2016, when it analyzed 2.5 million websites and found that 51.3% of pages were loaded by a mobile device.[60] Google has been one of the companies capitalizing on the popularity of mobile usage by encouraging websites to use its Google Search Console and Mobile-Friendly Test, which let companies check how their site performs in search results and how user-friendly it is.

Like the hundreds of people who have already commented, I thought this was an amazing post. You have a great way of breaking things down so that the average reader can understand them and act on them. I think this is a great resource for our readers, so I included it in my monthly roundup of the best SEO, social media, and content marketing articles. https://www.northcutt.com/blog/2014/02/january-resource-round-up-the-best-of-seo-social-media-and-content-marketing/


Expertise and authoritativeness of a site increase its quality. Be sure that content on your site is created or edited by people with expertise in the topic. For example, citing expert or experienced sources can help users recognize an article's expertise. Representing well-established consensus in pages on scientific topics is a good practice if such consensus exists.
I’d add one thing to number 5: Writing good copy is crucial not just for your Title/snippet, but for your whole page, especially your landing page. You want people to stay on your page for a while and (hopefully) even navigate to other pages you have. Google looks at bounce rate and where visitors go after they hit your page. Learning to write good copy can not only increase conversions (if you’re selling something) but also make your content more impactful and engaging. There are free books at most libraries or online to help.
On October 17, 2002, SearchKing filed suit in the United States District Court, Western District of Oklahoma, against the search engine Google. SearchKing's claim was that Google's tactics to prevent spamdexing constituted a tortious interference with contractual relations. On May 27, 2003, the court granted Google's motion to dismiss the complaint because SearchKing "failed to state a claim upon which relief may be granted."[68][69]
In February 2011, Google announced the Panda update, which penalizes websites containing content duplicated from other websites and sources. Historically, websites had copied content from one another and benefited in search engine rankings by engaging in this practice. However, Google implemented a new system that punishes sites whose content is not unique.[36] The 2012 Google Penguin update attempted to penalize websites that used manipulative techniques to improve their rankings on the search engine.[37] Although Google Penguin has been presented as an algorithm aimed at fighting web spam, it really focuses on spammy links[38] by gauging the quality of the sites the links are coming from. The 2013 Google Hummingbird update featured an algorithm change designed to improve Google's natural language processing and semantic understanding of web pages. Hummingbird's language processing system falls under the newly recognized term of 'conversational search', where the system pays more attention to each word in the query in order to match pages to the meaning of the whole query rather than to a few individual words.[39] With regard to the changes made to search engine optimization, for content publishers and writers, Hummingbird is intended to resolve issues by getting rid of irrelevant content and spam, allowing Google to rely on high-quality content from 'trusted' authors.
Hello Brian, I am planning to start my blog soon and I'm in the preparation phase (investigating, learning, etc.). I have read a lot of books and posts about SEO and I can say that this is the best post so far. It's not even a book, and you covered more than the books do. I would like to thank you for sharing your knowledge with me and the rest of the world; that's one of the most appreciated things someone can do, even if you also do it for your own "good". As soon as I start my site I'll write an article about you!!

Hack #1: Hook readers in from the beginning. People have short attention spans. If you don’t have a compelling “hook” at the beginning of your blog posts, people will click away in seconds. You can hook them in by teasing the benefits of the article (see the intro to this article for an example!), telling a story, or stating a common problem that your audience faces.


Link text is the visible text inside a link. This text tells users and Google something about the page you're linking to. Links on your page may be internal—pointing to other pages on your site—or external—leading to content on other sites. In either of these cases, the better your anchor text is, the easier it is for users to navigate and for Google to understand what the page you're linking to is about.
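Because vague labels like "click here" are easy to let slip through, here is a small standard-library sketch that lists every link's anchor text on a page so it can be audited; the AnchorTextAuditor class and the sample HTML are made up for illustration.

```python
# Sketch: collect each link's anchor text and href for a quick audit.
from html.parser import HTMLParser

class AnchorTextAuditor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_link = False
        self.href = None
        self.text = []
        self.links = []  # (anchor text, href) pairs found in the page

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.in_link = True
            self.href = dict(attrs).get("href")
            self.text = []

    def handle_data(self, data):
        if self.in_link:
            self.text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self.in_link:
            self.links.append(("".join(self.text).strip(), self.href))
            self.in_link = False

# Hypothetical page fragment: one descriptive link, one vague one.
page = ('<p>Read <a href="/seo-basics">our beginner SEO guide</a> '
        'or <a href="/contact">click here</a>.</p>')

auditor = AnchorTextAuditor()
auditor.feed(page)
for text, href in auditor.links:
    print(f"{href}: '{text}'")
```

Running this surfaces "/contact: 'click here'" as a candidate for more descriptive anchor text.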
Write a description that would both inform and interest users if they saw your description meta tag as a snippet in a search result. While there's no minimum or maximum length for the text in a description meta tag, we recommend making sure that it's long enough to be fully shown in Search (note that users may see different-sized snippets depending on how and where they search), and contains all the relevant information users would need to determine whether the page will be useful and relevant to them.
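As a rough aid, the sketch below flags descriptions that are likely to be truncated in a snippet. Google does not publish a fixed limit, so the 160-character threshold here is only a common rule of thumb, not an official number.

```python
# Sketch: warn when a description meta tag may be cut off in search snippets.
def check_description(desc: str, soft_limit: int = 160) -> str:
    # soft_limit is an assumed rule of thumb, not a documented Google limit.
    if len(desc) > soft_limit:
        return f"May be truncated ({len(desc)} chars); consider tightening it."
    return f"OK ({len(desc)} chars)."

print(check_description(
    "Fresh-roasted coffee beans, shipped within 48 hours of roasting. "
    "Free US shipping on orders over $30."
))
```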

Keywords. Keyword research is the first step to a successful SEO strategy. Those who succeed with SEO understand what people are searching for when they discover their business in a search engine. These are the keywords they use to drive targeted traffic to their products. Start brainstorming potential keywords, and see how the competition looks by using the Google AdWords Keyword Tool. If you notice that some keywords are too competitive in your niche, go with long-tail keywords (between two and five words), which will be easier for you to rank for. The longer the keyword, the less competition you will have for that phrase in the engines.
2. Targeted Keyword Discovery: Ideally you’ll want to do keyword research based on what the audience wants, not solely on what content the site already has (or plans to have sans audience targeting), which may be limited. I can do keyword research on health conditions and drugs (content I have on my site) and determine what the general population is searching for and optimize my current content, or I can cast my net wide and look at what my target audience wants first, then do my keyword research. You may find there are needs that your site is not meeting. Knowing my senior audience is interested primarily in prescription drug plans and cheap blood pressure medication, I can first make sure I’m providing that content, and then determine the top keywords in these areas (in the next article, Step 2) and use those terms in relevant and high-visibility areas on my site.
This was all free information I found online in less than an hour, that gives me some great ideas for content, partnerships and potential tools to build into my site to be relevant and useful to my target audience. Of course this is just some quick loose data, so I'll emphasize again: be careful where your data comes from (try to validate when possible), and think about how to use your data wisely.
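As a toy illustration of the long-tail idea discussed above, the sketch below filters a brainstormed keyword list down to two-to-five-word phrases. The keywords are invented; in practice the list, along with volume and competition data, would come from a tool like the Google AdWords Keyword Tool.

```python
# Sketch: separate long-tail candidates (2-5 words) from broad head terms.
ideas = [
    "shoes",                                   # head term: very competitive
    "running shoes",
    "waterproof running shoes women",
    "best trail running shoes for beginners",  # classic long tail
]

long_tail = [kw for kw in ideas if 2 <= len(kw.split()) <= 5]
print(long_tail)
```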
Inclusion in Google's search results is free and easy; you don't even need to submit your site to Google. Google is a fully automated search engine that uses web crawlers to explore the web constantly, looking for sites to add to our index. In fact, the vast majority of sites listed in our results aren't manually submitted for inclusion, but found and added automatically when we crawl the web. Learn how Google discovers, crawls, and serves web pages.
On another note, we recently went through this same process with an entire site redesign. The executive team demanded we cut over 75% of the pages on our site because they were useless to the visitor. It's been 60 days since the launch of the new site, and I've still been able to increase rankings, long-tail keywords, and even organic traffic. It took a bit of a "cowboy" mentality to get some simple things done (like using 301s instead of blocking the old content with robots.txt!). I predicted we would lose a lot of our long-tail keywords... but we haven't... yet!
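For readers wondering what the 301 approach looks like, here is a minimal standard-library sketch that permanently redirects removed URLs to their replacements. The URL map is hypothetical, and in production you would normally configure these redirects in your web server or CMS rather than in application code.

```python
# Sketch: serve 301 (permanent) redirects for removed pages instead of
# blocking them in robots.txt, so link equity passes to the new URLs.
from http.server import BaseHTTPRequestHandler, HTTPServer

REDIRECTS = {"/old-page": "/new-page"}  # hypothetical removed -> replacement map

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path in REDIRECTS:
            self.send_response(301)  # 301 = moved permanently
            self.send_header("Location", REDIRECTS[self.path])
            self.end_headers()
        else:
            self.send_response(404)
            self.end_headers()

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), RedirectHandler).serve_forever()
```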
Influencers: Government Contracting Officers, Other GovCon (Government Contracting) consultants, Sellers of professional services for small businesses (certain CPAs, bonding companies, financial institutions, contract attorneys), large contracting firms (who need to hire small business subcontractors), Union/trade organizations, Construction and Engineering trade publications

While most search engine companies try to keep their processes a secret, their criteria for high spots on SERPs isn't a complete mystery. Search engines are successful only if they provide a user links to the best Web sites related to the user's search terms. If your site is the best skydiving resource on the Web, it benefits search engines to list the site high up on their SERPs. You just have to find a way to show search engines that your site belongs at the top of the heap. That's where search engine optimization (SEO) comes in -- it's a collection of techniques a webmaster can use to improve his or her site's SERP position.

People love to learn, and webinars are an excellent way to impart your wisdom to your eagerly waiting audience. Combined with an effective social promotion campaign, webinars are a great way to increase traffic to your website. Send out an email a week or so ahead of time, as well as a “last chance to register” reminder the day before the webinar. Make sure to archive the presentation for later viewing, and promote your webinars widely through social media. If you're wondering how to do a webinar, click the link for some tips.


By 2004, search engines had incorporated a wide range of undisclosed factors in their ranking algorithms to reduce the impact of link manipulation. In June 2007, The New York Times' Saul Hansell stated Google ranks sites using more than 200 different signals.[26] The leading search engines, Google, Bing, and Yahoo, do not disclose the algorithms they use to rank pages. Some SEO practitioners have studied different approaches to search engine optimization, and have shared their personal opinions.[27] Patents related to search engines can provide information to better understand search engines.[28] In 2005, Google began personalizing search results for each user: depending on their history of previous searches, Google crafted results for logged-in users.[29]
Write articles rich in content. Quality articles will get ranked better in search results. Make sure that your articles address the needs of your readers, and that they can find all of the information they need in one spot. This is the most effective means for increasing traffic to a website; offering people something that they cannot obtain elsewhere, or at least, not to the level of quality that you are offering it.[1]
Do not be fooled by those traffic sellers promising thousands of hits an hour. What they really do is load up your URL in a program, along with a list of proxies. Then they run the program for a few hours. It looks like someone is on your site because your logs show visitors from thousands of different IPs. What happens in reality is your website is just pinged by the proxy, no one really sees your site. It is a waste of money.
Having a different description meta tag for each page helps both users and Google, especially in searches where users may bring up multiple pages on your domain (for example, searches using the site: operator). If your site has thousands or even millions of pages, hand-crafting description meta tags probably isn't feasible. In this case, you could automatically generate description meta tags based on each page's content.
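As a sketch of that last idea, the snippet below derives a description meta tag from a page's body text. The 155-character cutoff and the word-boundary truncation rule are assumptions chosen for illustration, not a Google recommendation.

```python
# Sketch: auto-generate a description meta tag from page copy, for sites
# too large to hand-craft every description.
import html

def make_description(page_text: str, limit: int = 155) -> str:
    text = " ".join(page_text.split())  # collapse whitespace and newlines
    if len(text) > limit:
        text = text[:limit].rsplit(" ", 1)[0] + "…"  # cut at a word boundary
    return f'<meta name="description" content="{html.escape(text, quote=True)}">'

# Hypothetical page copy.
body = """Our hand-thrown ceramic mugs are made in small batches.
Each one is glazed, fired twice, and painted by hand in our studio."""
print(make_description(body))
```

A real pipeline would usually prefer an existing human-written summary field when one exists and fall back to generated text only for pages that lack one.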
Hi, my name is Dimitrios and I am responsible for Crave Culinaire’s digital marketing. I would like to drive more traffic to Crave’s blog. Since Crave Culinaire is the only catering company that provides molecular cuisine, I thought about crafting a blog post about that. The influencers in this niche have great success in utilizing recipes on their blogs. I will share some recipes of Brian Roland, owner and head chef of Crave Culinaire.