Search Engine Optimization (SEO) is one of the most talked-about terms in online marketing. Google is constantly changing its algorithms, and SEO professionals have to rack their brains to keep up with the latest changes. Good SEO professionals always suggest using white hat methods, since a site optimized that way will remain in the search engine results longer. Here are some white hat SEO tips that follow the latest search engine algorithms in 2011.
- Search for and choose the right keywords that drive quality traffic, leads and conversions
- Analyze competitors who rank well for similar terms, and study the methods and keywords they have optimized for
- Choose the right domain name based on geo-targeting: '.in' for businesses targeting India, '.com' for the U.S., '.co.uk' for England, and so on
- Plan a user-friendly and search-engine-friendly design, with good navigation links and site architecture
- Hire a professional writer to produce good quality content that holds the user's attention as well as the search engine's
- Make sure the site has no broken URLs, connection timeouts, or long load times; tools such as Xenu and extensions such as PageSpeed and YSlow will help keep the site technically clean
- Use the robots.txt file to block unnecessary pages from being indexed by search engines
- Submit an XML sitemap through the search engines' Webmaster Tools accounts and correct any errors reported
- Build quality links using white hat link building methods
- Analyze traffic and plan or change strategies to generate quality traffic that converts into sales
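The XML sitemap step above can be sketched with a minimal sitemap.xml (the URL and dates are hypothetical placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2011-05-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
</urlset>
```

Each `<url>` entry lists one page; you submit the file's location in Webmaster Tools so the engines know where to fetch it.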
Shortened URLs are widely used, particularly on social media sites, to share web pages and links. People spend hours on Facebook, Twitter, etc., so URL shortener services are increasing day by day. Many people wonder whether these short URLs are SEO friendly or not.
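Whether a short URL is SEO friendly mostly comes down to the kind of redirect the shortener issues: a 301 (permanent) redirect passes link equity to the destination, while temporary redirects historically have not. A minimal sketch of that rule (the function name is my own, not from any library):

```python
# Classify an HTTP redirect status code by whether it passes
# link equity to the destination page (a simplified view).
def is_seo_friendly_redirect(status_code):
    # 301 Moved Permanently: search engines transfer most link
    # value to the target, so shorteners that use it are SEO friendly.
    # 302 and 307 are temporary redirects and generally do not
    # pass link value the same way.
    return status_code == 301

print(is_seo_friendly_redirect(301))  # permanent redirect -> True
print(is_seo_friendly_redirect(302))  # temporary redirect -> False
```

You can check what a given shortener returns by requesting a short link and looking at the status code before the redirect is followed.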
I’m talking about building your website so that search engines can find your products, services and all the content you have published.
Here are eight ways to ensure that search engines have no problem finding and indexing your web pages:
1. Avoid Flash: Flash is not inherently bad. When used correctly, it can improve the visitor experience. But your site should not be built entirely in Flash, nor should your site navigation be done only in Flash. Search engines have stated for a few years now that they are getting better at crawling Flash, but it is still no substitute for good, crawlable HTML menus and searchable site content.
2. Avoid AJAX: The same ideas mentioned above for Flash apply here to AJAX. It can add to the user experience of your site, but AJAX content has, historically, not been visible to search engines. Google offers a guide to help make AJAX-based content searchable, but it is complicated, and the SEO "best practice" recommendation remains the same: do not put important content in AJAX.
3. Use simple HTML navigation: It remains the best practice today to present your site navigation as simple, easy-to-crawl HTML links.
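In practice, that means a navigation block like this plain-HTML sketch (the page names are placeholders), which any crawler can follow without executing Flash or JavaScript:

```html
<ul id="nav">
  <li><a href="/">Home</a></li>
  <li><a href="/products.html">Products</a></li>
  <li><a href="/services.html">Services</a></li>
  <li><a href="/contact.html">Contact</a></li>
</ul>
```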
4. Avoid long dynamic URLs: A "dynamic URL" is defined simply as one that has a "?" in it. Very simple dynamic URLs like that are something search engines index routinely now. But as dynamic URLs grow longer and more complex, search engines may be less likely to index them (for various reasons, one of which is that research shows searchers prefer short URLs). So if a URL carries a long trail of parameters, you may have crawlability problems with it.
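One common fix is to expose short, static-looking URLs to visitors and search engines and rewrite them internally to the dynamic ones, for example with an Apache mod_rewrite rule (the paths and parameter names here are hypothetical):

```apache
# .htaccess: serve /products/42 from the underlying dynamic URL
RewriteEngine On
RewriteRule ^products/([0-9]+)$ /index.php?page=products&id=$1 [L]
```

The visitor and the crawler only ever see the clean URL; the long query string stays on the server side.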
This echoes a Google Webmaster Help page, which reads: "…be aware that not every search engine crawls dynamic pages as well as static pages. It helps to keep the parameters short and the number of them few."
5. Avoid session IDs in URLs: This is an offshoot of the previous item, but it deserves to be listed separately. Search engines do not like to crawl and index URLs that have a session ID in them. Why? Because even though the session ID makes the URL different each time the spider visits, the content of the page is the same. If they indexed URLs with session IDs, a ton of duplicate content would appear in search results.
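If your platform must append a session ID, one mitigation is to make sure the URLs you publish in links and sitemaps have it stripped. A sketch using Python's standard library (the `sid` parameter name is an assumption; use whatever your platform calls it):

```python
from urllib.parse import urlparse, urlunparse, parse_qsl, urlencode

def strip_session_id(url, param="sid"):
    """Return the URL with the given session-ID parameter removed."""
    parts = urlparse(url)
    # Keep every query parameter except the session ID.
    query = [(k, v) for k, v in parse_qsl(parts.query) if k != param]
    return urlunparse(parts._replace(query=urlencode(query)))

print(strip_session_id("http://example.com/page?id=7&sid=abc123"))
# http://example.com/page?id=7
```

Every spider then sees one stable URL per page instead of a new one per visit.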
6. Avoid robots.txt blocking: First of all, there is no requirement to have a robots.txt file on a web site; millions of sites do very well without one. But if you use one (perhaps because you want to make sure your admin or members-only pages are not indexed), be careful not to block the robots from the entire website.
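A robots.txt that keeps a members-only directory (the path here is hypothetical) out of the index while leaving the rest of the site crawlable looks like:

```
User-agent: *
Disallow: /members/
```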
If, on the other hand, your robots.txt file looks something like this:
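```
User-agent: *
Disallow: /
```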
that blocks all spiders from accessing your entire site. If you ever have any questions about using a robots.txt file, visit robotstxt.org.
If you take care of all the above issues, you can be sure you have made it as easy as possible for search engines to crawl and index your site.
Facebook profiles are indexed by search engines like Google, Yahoo, MSN and Ask. Facebook is the leading brand among all social networking sites, with an Alexa rank of #3 and a Google PageRank (PR) of 10. The major search engines, Google above all, have indexed almost all of Facebook's pages, including profile pages, and often place them on the first or second results page. So anyone can find friends, family or strangers with ease by entering their names into Google. But many people do not want to expose their profile in Google search results, and do not want unsolicited friend requests.
You can prevent your Facebook profile from getting listed in Google. Here is how to do it with a simple step-by-step procedure.
- Sign in to your Facebook account.
- Choose Privacy Settings.
- Go to Apps, Games and Websites.
- Find the Public Search section.
- Click Edit Settings under Public Search, and see the preview of how your profile appears.
- Uncheck Enable public search.
Click Confirm. It will take some time for your Facebook profile to be hidden from search engines.
If you change your domain name and redirect the old pages with a 301 redirect from your old site to your new one, link authority will be transferred to your new domain name, but the overall effect of those links decreases: 301 redirects do not pass full PageRank.
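Such a redirect can be set up in an Apache .htaccess file like this (the domain and page names are placeholders):

```apache
# Permanently redirect an old page to its new home.
Redirect 301 /old-page.html http://www.newdomain.com/new-page.html
```

One line per moved page tells both visitors and search engines that the move is permanent.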