Write a description that would both inform and interest users if they saw your description meta tag as a snippet in a search result. While there's no minimum or maximum length for the text in a description meta tag, we recommend making it long enough to be fully shown in Search (note that users may see different-sized snippets depending on how and where they search) and including all the relevant information users would need to determine whether the page will be useful and relevant to them.
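For instance, a page on the hypothetical seniors' health site described below might use a description meta tag like this (the title, wording, and page are illustrative only):

    <head>
      <title>Comparing Prescription Drug Plans</title>
      <meta name="description" content="A plain-language guide for seniors:
        how to compare prescription drug plans and find lower-cost blood
        pressure medication, with questions to ask your pharmacist.">
    </head>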
For example, let’s say I have a health site. I have several types of articles on health, drug information, and information on types of diseases and conditions. My angle on the site is that I’m targeting seniors. If I find out seniors are primarily interested in information on prescription drug plans and cheap blood pressure medication, then I know I want to provide information specifically on those things. This allows me to home in on that market’s needs and de-prioritize or bypass other content.

Structured data is code that you can add to your site's pages to describe your content to search engines, so they can better understand what's on your pages. Search engines can use this understanding to display your content in useful (and eye-catching!) ways in search results. That, in turn, can help you attract just the right kind of customers for your business.
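As a rough sketch, structured data is commonly added as a JSON-LD script in the page's HTML using schema.org vocabulary; the example below marks up a hypothetical article (all names and dates are placeholders):

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Article",
      "headline": "Understanding Prescription Drug Plans",
      "author": {
        "@type": "Person",
        "name": "Jane Doe"
      },
      "datePublished": "2017-06-01"
    }
    </script>

Search engines that understand this markup can use it to display richer results, such as article cards, for the page.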
Hi Brian, I am a young business owner who has had 4 different websites in the last 2 years, but none of them were as successful as I would have liked due to a lack of SEO. Now I am in the process of starting another business, and I felt it was time for me to learn about SEO myself. I must say the information you have provided is invaluable and extremely helpful!! I am learning on the go and you are my biggest contributor. Thank you, Sir!
When would this be useful? If your site has a blog with public commenting turned on, links within those comments could pass your reputation to pages that you may not be comfortable vouching for. Blog comment areas on pages are highly susceptible to comment spam. Nofollowing these user-added links ensures that you're not giving your page's hard-earned reputation to a spammy site.
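In practice, nofollowing a comment link just means adding rel="nofollow" to the anchor tag your commenting system renders; a minimal, hypothetical example:

    <!-- A user-submitted comment link; rel="nofollow" tells search
         engines not to pass this page's reputation to the target -->
    <p class="comment">
      Interesting post! More tips at
      <a href="http://www.example.com/" rel="nofollow">my site</a>.
    </p>

Most blog platforms add this attribute to comment links automatically, but it's worth confirming in your theme or template.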
Creating high-quality content takes a significant amount of at least one of the following: time, effort, expertise, or talent/skill. Content should be factually accurate, clearly written, and comprehensive. So, for example, if you describe your page as a recipe, provide a complete recipe that is easy to follow, rather than just a set of ingredients or a basic description of the dish.
In December 2009, Google announced it would be using the web search history of all its users in order to populate search results.[33] On June 8, 2010 a new web indexing system called Google Caffeine was announced. Designed to allow users to find news results, forum posts, and other content much sooner after publishing, Google Caffeine changed the way Google updated its index so that new content would show up in results more quickly. According to Carrie Grimes, the software engineer who announced Caffeine for Google, "Caffeine provides 50 percent fresher results for web searches than our last index..."[34] Google Instant, real-time search, was introduced in late 2010 in an attempt to make search results more timely and relevant. Historically, site administrators had spent months or even years optimizing a website to increase search rankings. With the growth in popularity of social media sites and blogs, the leading engines made changes to their algorithms to allow fresh content to rank quickly within the search results.[35]
By 2004, search engines had incorporated a wide range of undisclosed factors in their ranking algorithms to reduce the impact of link manipulation. In June 2007, The New York Times' Saul Hansell stated that Google ranks sites using more than 200 different signals.[26] The leading search engines, Google, Bing, and Yahoo, do not disclose the algorithms they use to rank pages. Some SEO practitioners have studied different approaches to search engine optimization and have shared their personal opinions.[27] Patents related to search engines can provide information to better understand search engines.[28] In 2005, Google began personalizing search results for each user: depending on their history of previous searches, Google crafted results for logged-in users.[29]
When Googlebot crawls a page, it should see the page the same way an average user does. For optimal rendering and indexing, always allow Googlebot access to the JavaScript, CSS, and image files used by your website. If your site's robots.txt file disallows crawling of these assets, it directly harms how well our algorithms render and index your content. This can result in suboptimal rankings.
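Concretely, check your robots.txt for rules that block script, stylesheet, or image directories. A sketch of the kind of file to avoid versus allow (the directory names here are hypothetical):

    # Problematic: blocking assets prevents Googlebot from rendering pages properly
    # User-agent: *
    # Disallow: /css/
    # Disallow: /js/

    # Better: block only what truly shouldn't be crawled; leave assets accessible
    User-agent: *
    Disallow: /admin/

You can test individual pages in Google Search Console to confirm that the assets they depend on are crawlable.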
Add relevant links back to your site. Throughout your answer, sprinkle a few relevant links back to your website. The more relevant they are to the question, the more clicks and traffic they will generate. You can also finish your answers with a link to your lead magnet, concluding with something like this: “Want to know more about how to start a business? Check out my free checklist with 10 steps for starting your first business!” and a link to the lead magnet (in this example, the checklist).
This was all free information I found online in less than an hour, and it gives me some great ideas for content, partnerships, and potential tools to build into my site to be relevant and useful to my target audience. Of course, this is just some quick, loose data, so I'll emphasize again: be careful where your data comes from (try to validate when possible), and think about how to use your data wisely.
Users will occasionally come to a page that doesn't exist on your site, either by following a broken link or typing in the wrong URL. Having a custom 404 page that kindly guides users back to a working page on your site can greatly improve a user's experience. Your 404 page should probably have a link back to your root page and could also provide links to popular or related content on your site. You can use Google Search Console to find the sources of URLs causing "not found" errors.
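A minimal custom 404 page might look like the sketch below (the links are placeholders); note that the server should still return a real 404 HTTP status code for missing URLs, so search engines don't index the error page itself:

    <!DOCTYPE html>
    <html>
    <head><title>Page not found</title></head>
    <body>
      <h1>Sorry, we couldn't find that page.</h1>
      <p>Try the <a href="/">homepage</a>, or one of our popular pages:</p>
      <ul>
        <li><a href="/blog/">Blog</a></li>
        <li><a href="/drug-information/">Drug information</a></li>
      </ul>
    </body>
    </html>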
This information hits the mark. “If you want your content to go viral, write content that influencers in your niche will want to share.” I love the information about share triggers too. I’m wondering, though, if you could share your insights on how influencers manage to build such vast followings. At some point, they had to start without the support of other influencers. It would seem that they found a way to take their passion directly to a “ready” world. Excellent insights. Thanks for sharing.

LinkedIn has become much more than a means of finding another job. The world’s largest professional social network is now a valuable publishing platform in its own right, which means you should be posting content to LinkedIn on a regular basis. Doing so can boost traffic to your site, as well as increase your profile within your industry – especially if you have a moderate to large following.
Vary your article length. You should have long, comprehensive articles as well as short, to-the-point articles. Let the content dictate the size; don’t spend too long belaboring a simple point, but don’t be too brief when detail is called for. Research suggests the average length should be around 1,600 words, though feel free to vary as you see fit.

What kind of advice would you give if your site is growing but seems to be attracting the wrong kind of traffic? My visitor numbers are going up, but all other indicators, such as bounce rate, time on page, and pages per visit, seem to be developing in the wrong direction. Not sure if that’s to be expected or if there is something I should be doing to counter that development?
Use your keyword list to determine how many different pillar pages you should create. Ultimately, the number of topics for which you create pillar pages should coincide with how many different products, offerings, and locations your business has. This will make it much easier for your prospects and customers to find you in search engines no matter what keywords they use.
For some reason I had to delete some pages. These pages use the .html suffix, so I blocked them in robots.txt using Disallow: /*.html, but it’s been almost a year and I’ve found that Googlebot still crawls these pages often. How can I quickly get Google to completely remove these pages? I have also removed these URLs from Google Webmaster Tools (Google Index -> Remove URLs), but Google still crawls them.