Provide full functionality on all devices. Mobile users expect the same functionality, such as commenting and check-out, and the same content on mobile as on every other device your website supports. In addition to textual content, make sure that all important images and videos are embedded and accessible on mobile devices. For search engines, provide all structured data and other metadata, such as titles, descriptions, link elements, and other meta tags, on all versions of the pages.
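As an illustration, the head markup below is the kind of metadata that should be identical across the desktop and mobile versions of a page. All URLs, names, and values here are placeholders, not a prescribed template:

```html
<!-- Illustrative only: serve the same metadata on every version of a page. -->
<head>
  <title>Blue Widgets - Example Store</title>
  <meta name="description" content="Hand-made blue widgets, shipped worldwide.">
  <link rel="canonical" href="https://www.example.com/widgets/blue">
  <script type="application/ld+json">
  { "@context": "https://schema.org", "@type": "Product", "name": "Blue Widget" }
  </script>
</head>
```

If the mobile version omits the description, canonical link, or structured data that the desktop version carries, search engines indexing the mobile version will see an impoverished page.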
I second Rand's comment!  Congrats on moving from the corporate world to independent consulting.  This is my goal for the near future.  I too have been testing the waters of independent consulting, but it doesn't quite pay the bills yet!  Sometimes I feel like I should find a mentor who has been where I am now and is where I want to go.  Perhaps I'll find a few in this community over time!
This is excellent and should be intuitive for marketers (and SEO pros are marketers!), but we often take the shortcut and neglect critical details. What would also reinforce the strategy is a way of providing solid projections for SEO (these could be based on industry trends and statistics). Clients now ask for ways to calculate ROI, and they need numbers to get budget approvals: an increase in traffic by X, an increase in qualified traffic and leads, conversions, etc. In short, some way of quantifying the expected return.
Google has recently changed how you can use the Google Keyword Planner. Before, everyone who signed up could see the search volume for keywords. Now, it only shows estimates. There is a way to get around this: create a Google AdWords campaign. The amount you spend doesn't matter. After you do that, you will regain access to the search volume data.
People want to speak their minds and weigh in on subjects they feel passionately about, so building a community into your site is a great way to start a conversation and increase traffic to your website. Implement a robust commenting system through third-party solutions such as Facebook comments or Disqus, or create a dedicated forum where visitors can ask questions. Don’t forget to manage your community to ensure that minimum standards of decorum are met, however.
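As a sketch of what "implement a robust commenting system" looks like in practice, Disqus's universal embed code is roughly the following. EXAMPLE-SHORTNAME is a placeholder for your site's Disqus shortname; check Disqus's own installation instructions for the current snippet:

```html
<!-- Placeholder container that Disqus replaces with the comment thread -->
<div id="disqus_thread"></div>
<script>
  (function() {
    var d = document, s = d.createElement('script');
    // EXAMPLE-SHORTNAME is hypothetical; use your own Disqus shortname.
    s.src = 'https://EXAMPLE-SHORTNAME.disqus.com/embed.js';
    s.setAttribute('data-timestamp', +new Date());
    (d.head || d.body).appendChild(s);
  })();
</script>
```

Third-party widgets like this load asynchronously, so they do not block the rest of your page from rendering.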
On October 17, 2002, SearchKing filed suit in the United States District Court, Western District of Oklahoma, against the search engine Google. SearchKing's claim was that Google's tactics to prevent spamdexing constituted a tortious interference with contractual relations. On May 27, 2003, the court granted Google's motion to dismiss the complaint because SearchKing "failed to state a claim upon which relief may be granted."[68][69]
Link text is the visible text inside a link. This text tells users and Google something about the page you're linking to. Links on your page may be internal—pointing to other pages on your site—or external—leading to content on other sites. In either of these cases, the better your anchor text is, the easier it is for users to navigate and for Google to understand what the page you're linking to is about.
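To make the contrast concrete, here is a hedged example with hypothetical URLs. The first link tells users and search engines nothing about the target; the descriptive versions do:

```html
<!-- Vague anchor text: conveys nothing about the target page -->
<a href="/guides/keyword-research">Click here</a>

<!-- Descriptive anchor text for an internal link -->
<a href="/guides/keyword-research">beginner's guide to keyword research</a>

<!-- The same principle applies to external links -->
<a href="https://www.example.com/widget-specs">widget specifications at Example.com</a>
```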
Keep resources crawlable. Blocking page resources can give Google an incomplete picture of your website. This often happens when your robots.txt file is blocking access to some or all of your page resources. If Googlebot doesn't have access to a page's resources, such as CSS, JavaScript, or images, we may not detect that it's built to display and work well on a mobile browser. In other words, we may not detect that the page is "mobile-friendly," and therefore not properly serve it to mobile searchers.
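For example, a blanket rule like `Disallow: /assets/` would hide your CSS and JavaScript from Googlebot. A robots.txt along the following lines (all paths here are hypothetical) keeps rendering resources crawlable while still blocking what needs to stay out; note that Google gives the most specific (longest) matching rule precedence:

```
# Hypothetical robots.txt: rendering resources stay crawlable so Googlebot
# can see that pages display and work well on mobile.
User-agent: *
Disallow: /assets/
Allow: /assets/css/
Allow: /assets/js/
Allow: /assets/img/
```

You can confirm what Googlebot can and cannot fetch with the robots.txt tester in Google's webmaster tools.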
Robots.txt is not an appropriate or effective way of blocking sensitive or confidential material. It only instructs well-behaved crawlers that the pages are not for them, but it does not prevent your server from delivering those pages to a browser that requests them. One reason is that search engines could still reference the URLs you block (showing just the URL, no title or snippet) if there happen to be links to those URLs somewhere on the Internet (like referrer logs). Also, non-compliant or rogue search engines that don't acknowledge the Robots Exclusion Standard could disobey the instructions of your robots.txt. Finally, a curious user could examine the directories or subdirectories in your robots.txt file and guess the URL of the content that you don't want seen.
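The point that robots.txt is advisory rather than enforced can be demonstrated with Python's standard-library robots.txt parser. A compliant crawler consults the parser before fetching; a rogue client simply skips that check and requests the URL anyway, and the server will serve it unless access control is enforced server-side. The robots.txt body and URLs below are hypothetical:

```python
# Sketch: robots.txt only tells well-behaved crawlers what to skip.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# A well-behaved crawler checks before fetching:
print(parser.can_fetch("*", "https://example.com/private/report.html"))  # False
print(parser.can_fetch("*", "https://example.com/public/index.html"))    # True

# A non-compliant client never calls can_fetch(); it just issues the HTTP
# request, and nothing in robots.txt stops the server from responding.
```

Note also that the `Disallow: /private/` line itself tells any curious reader exactly which directory you are trying to hide, which is the guessing problem described above.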
Search engines attempt to rank results for a given search based on their relevance to the topic, and the quality and reliability a site is judged to have. Google, the world’s most popular search engine, uses an ever-evolving algorithm that aims to evaluate sites in the way that a human reader would. This means that a key part of SEO involves ensuring that the website is a unique and relevant resource for readers.
Having an industry influencer publish a blog post on your site, or turning an interview with them into a blog post, can help to drive traffic both through organic search and via that influencer promoting the content to their audience (see the backlinks section above). This can also add more variety to your content and show your visitors that you are active in your field.

Getting more website visitors does not happen overnight. It takes some effort, but we’ve eliminated the hard part for you: knowing what to do in the first place. By using Google My Business and the other safe channels listed above, you can get the right visitors coming to your site and, more importantly, more of those visitors converting into customers.
Firstly, really think about what your audience is interested in and what their needs are. As SUCCESS agency CEO Avin Kline states, “It’s so easy to forget, but the heart of increasing user engagement is to put yourself in their shoes and add undeniable value to the user. Keep in mind, what marketers think is valuable and what users think is valuable are often two different things.”
I’d add one thing to number 5: writing good copy is crucial not just for your title/snippet, but for your whole page, especially your landing page. You want people to stay on your page for a while and (hopefully) even navigate to other pages you have. Google looks at bounce rate and where visitors go after they hit your page. Learning to write good copy can not only increase conversions (if you’re selling something) but also make your content more impactful and engaging. There are free books at most libraries or online to help.


For various reasons I had to delete some pages. These pages use the .html suffix, so I blocked them in robots.txt with Disallow: /*.html, but it’s been almost a year and I find that Googlebot still crawls these pages. How can I quickly get Google to remove them completely? I have also removed these URLs in Google Webmaster Tools (Google Index → Remove URLs), but Google still crawls the pages.
Brian, I’ve drunk your Kool-Aid! Thank you for your honesty and transparency – it really gives me hope. Quick question: I am beyond passionate about a niche (UFOs, extraterrestrials, free energy) and know in my bones that an authority site is a long-term opportunity. The problem today is that not many products are attached to this niche, so it becomes a subscriber / info-product play. However, after 25+ years as an entrepreneur with a financial background and a marketing MBA, am I Internet-naive to believe that my passion and creativity will win profitability in the end? The target audience is highly passionate too. Feedback?
“In conclusion, this research illuminates how content characteristics shape whether it becomes viral. When attempting to generate word of mouth, marketers often try targeting “influentials,” or opinion leaders (i.e., some small set of special people who, whether through having more social ties or being more persuasive, theoretically have more influence than others). Although this approach is pervasive, recent research has cast doubt on its value (Bakshy et al. 2011; Watts 2007) and suggests that it is far from cost effective. Rather than targeting “special” people, the current research suggests that it may be more beneficial to focus on crafting contagious content. By considering how psychological processes shape social transmission, it is possible to gain deeper insight into collective outcomes, such as what becomes viral.”