Hi! I found this article really valuable and helpful for improving our SEO techniques. But I'm just wondering about the dead links: does that mean we can contact the sites that have dead links and ask them to recreate the page? How does that improve SEO for my website? Could they add a citations or acknowledgements section that links to our website?
Robots.txt is not an appropriate or effective way of blocking sensitive or confidential material. It only instructs well-behaved crawlers that the pages are not for them; it does not prevent your server from delivering those pages to any browser that requests them. One reason is that search engines could still reference the URLs you block (showing just the URL, with no title or snippet) if links to those URLs happen to exist somewhere on the Internet (for example, in referrer logs). Also, non-compliant or rogue search engines that don't acknowledge the Robots Exclusion Standard may disobey the instructions in your robots.txt. Finally, a curious user could examine the directories or subdirectories listed in your robots.txt file and guess the URLs of the content you don't want seen.

Hi Brian, I absolutely love your content. My competitors and influencers are very strong: most of them are government bodies, supported by government, or travel guides known worldwide. I constantly follow them and engage with them (like, share, comment, etc.). They share photos that are submitted to them, and I do photography myself, which takes hours, and I still can't reach a big audience… Any idea what I could create that my influencers would love to share? (It's hard to find out what they care about; they get hundreds of photos submitted daily and collaborate with other big names…) Please help me.
I consulted a few years ago, before Yahoo and CNET, and my clients were all small businesses, even friends' sites. No matter the size of the project, you can still try to get some insight into your target audiences and what they need or want. I mentioned in a previous comment that I once used search to gauge sentiment toward a site versus its competitors by searching for a feature the site and its competitors all had, along with "like", "love", "hate", "wish", etc. I also took note of who the people were who said those things and where they were talking (forums, Twitter, etc.). It's a hacked manual approach, and although nowhere near the quality of a good market research report, at least I have a little insight before going out to make site recommendations based solely on tags and links. If you're recommending that the site build things people want (and fix or remove things they don't), you're more likely to gain links and traffic naturally.
Hey Brian, I must say it's awesome content you're sharing. My question to you is: how did you transform from a nutrition expert into an SEO master? The two subjects are poles apart, so how did you learn SEO? Can you share your story? I find myself in a similar situation: I'm an engineer by profession, and I'm starting an ecommerce business (the niche is apparel) with no experience whatsoever in blog writing or SEO. If you could point me to some resources where I can improve my skills, that would be a huge help.
Facebook is keen to promote streaming video; the success of Twitch.tv has them drooling. This means streaming videos are given massive "reach": more people see them. Facebook will show your video to more of your page subscribers, more group members, and so on. If the video is good, you'll get lots of shares and likes, and this can build your audience rapidly.

As of 2009, there are only a few large markets where Google is not the leading search engine. In most cases, when Google is not leading in a given market, it is lagging behind a local player. The most notable example markets are China, Japan, South Korea, Russia and the Czech Republic where respectively Baidu, Yahoo! Japan, Naver, Yandex and Seznam are market leaders.


You may not want certain pages of your site crawled because they might not be useful to users if found in a search engine's search results. If you do want to prevent search engines from crawling your pages, Google Search Console has a friendly robots.txt generator to help you create this file. Note that if your site uses subdomains and you wish to have certain pages not crawled on a particular subdomain, you'll have to create a separate robots.txt file for that subdomain. For more information on robots.txt, we suggest the Webmaster Help Center guide on using robots.txt files.
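As an illustration of the format such a file takes, here is a minimal sketch of a robots.txt (the directory names are hypothetical, not taken from any particular site). It asks all compliant crawlers to skip two directories while leaving the rest of the site crawlable:

```
# robots.txt — placed at the root of the (sub)domain, e.g. https://example.com/robots.txt
# "User-agent: *" applies the rules below to all compliant crawlers.
User-agent: *
Disallow: /internal-search/   # hypothetical: low-value internal search result pages
Disallow: /print/             # hypothetical: duplicate printer-friendly pages

# Everything not disallowed above remains crawlable by default.
```

Remember the caveat above: this only requests that well-behaved crawlers stay away; it does not block access, and each subdomain (e.g. blog.example.com) needs its own robots.txt at its own root.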
Black hat SEO attempts to improve rankings in ways that are disapproved of by the search engines or that involve deception. One black hat technique uses hidden text, either colored similarly to the background, placed in an invisible div, or positioned off screen. Another method serves a different page depending on whether it is being requested by a human visitor or a search engine, a technique known as cloaking. Another category sometimes used is grey hat SEO. This falls between the black hat and white hat approaches: the methods employed avoid getting the site penalized but do not actively produce the best content for users. Grey hat SEO is entirely focused on improving search engine rankings.
While most search engine companies try to keep their processes a secret, their criteria for high spots on SERPs isn't a complete mystery. Search engines are successful only if they provide a user links to the best Web sites related to the user's search terms. If your site is the best skydiving resource on the Web, it benefits search engines to list the site high up on their SERPs. You just have to find a way to show search engines that your site belongs at the top of the heap. That's where search engine optimization (SEO) comes in -- it's a collection of techniques a webmaster can use to improve his or her site's SERP position.