One of the three major pillars of Search Engine Optimization is a website's content and onsite content optimization. All of the major search engine ranking algorithms have components that relate to the content contained on the website. Typically these components relate to keyword density, word count, content location, and sometimes the age of the content. The code that the content is wrapped in falls under the topic of structure rather than content, and will not be discussed in this article.
Asynchronous JavaScript and XML (AJAX) is a web development technique used to create more responsive, interactive, and dynamic websites. AJAX accomplishes this by making requests back to the web server without refreshing the browser; the responses are then typically used to update the content of the page currently being viewed. For the sake of this article I'm going to ignore the XML component of AJAX, since the search engines never see any of the XML data. Websites that use JavaScript to manipulate content without using AJAX suffer from the same issues described here.
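As a minimal sketch of the pattern being described, the snippet below fetches a fragment of content from the server and swaps it into the page without a refresh. The URL and element id are invented for illustration; the point is that a crawler reading only the initial markup sees the placeholder text, never the loaded content.

    <div id="product-details">Loading...</div>
    <script type="text/javascript">
      // Hypothetical example: load a fragment of content and inject it into the page.
      var xhr = new XMLHttpRequest();
      xhr.open("GET", "/fragments/product-details.html", true); // assumed URL
      xhr.onreadystatechange = function () {
        if (xhr.readyState === 4 && xhr.status === 200) {
          // Only visitors running JavaScript ever see this content.
          document.getElementById("product-details").innerHTML = xhr.responseText;
        }
      };
      xhr.send();
    </script>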
When a search engine sends out a robot (spider) to visit your website with the goal of indexing your content, it only looks at what is presented in the markup. A search engine generally does not behave like a user when indexing your website: it doesn't click buttons or links, it simply makes note of the URLs associated with each page and then visits those pages individually to index them. This largely works against the goal of AJAX, which is to have as few pages as possible by interacting with the web server in a smarter way as users interact with the website.
To put the last paragraph simply: any content that is changed via AJAX or JavaScript and is not hardcoded in a page won't be cached by the search engines. This means that if you have great content the search engines may love, but you deliver it with AJAX, you may be missing out on traffic. There are two approaches to rectifying this, and they may even give you an advantage over sites that don't use JavaScript or AJAX.
The first approach:
Make sure that your website degrades to normal, flat markup for search engines and browsers that are not JavaScript capable. Essentially, every time you would have used an AJAX call, make sure there is a regular page with the same content. Unfortunately, for a lot of people this can mean a lot of work; for those using a database with PHP or ASP, it is not too hard to build a site that builds itself with some effective web programming.
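One common way to do this is progressive enhancement: give every link a real href pointing to a crawlable page with the same content, and let JavaScript intercept the click for visitors who can handle it. The URLs, the loadSection function, and the "content" element below are all hypothetical; this is only a sketch of the degradation pattern.

    <!-- The href points to a full, indexable page; JavaScript-capable browsers
         cancel the navigation and load the same content via AJAX instead. -->
    <a href="/reviews.html" onclick="return loadSection('/fragments/reviews.html');">Customer reviews</a>

    <script type="text/javascript">
      function loadSection(url) {
        var xhr = new XMLHttpRequest();
        xhr.open("GET", url, true);
        xhr.onreadystatechange = function () {
          if (xhr.readyState === 4 && xhr.status === 200) {
            document.getElementById("content").innerHTML = xhr.responseText;
          }
        };
        xhr.send();
        return false; // suppress the normal page load only when JavaScript runs
      }
    </script>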
The second approach:
Use AJAX in a more minimalist fashion. The goal here is to present the search engines with your optimized content while making sure that any AJAX calls a user triggers have no bearing on what you want the search engines to see. In fact, this can be used to remove content from your website that may negatively affect your rankings, such as testimonials. I've seen very few testimonials that actually do good things for a site's keyword density; I've even been known to optimize testimonials on clients' websites. With JavaScript or AJAX you could insert a random testimonial into a page without affecting that page's keyword density. The only downside to this approach is that some offsite keyword density tools use web browser rendering engines, so they may report false results because they take the JavaScript into account.
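A minimal sketch of that idea, with made-up testimonial text and element id: the testimonials live only in JavaScript, so they never appear in the static markup the engines index and never dilute the page's keyword density.

    <div id="testimonial"></div>
    <script type="text/javascript">
      // Hypothetical example: pick a random testimonial and insert it client-side.
      var testimonials = [
        "Great service and fast shipping!",
        "The team went above and beyond for us."
      ];
      var pick = testimonials[Math.floor(Math.random() * testimonials.length)];
      document.getElementById("testimonial").innerHTML = pick;
    </script>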
Now you may think that I'm anti-AJAX from everything I've said, but there is a time and place for AJAX, provided it doesn't affect how the search engines see the beautiful, relevant content you're trying to rank. AJAX is great for member sections of your website, interactive forms, slideshows, and much more; it just needs to be leveraged correctly to avoid missing out on search engine visitors. The final thing to keep in mind is that most search engines like to see more than a single-page website, which is what many AJAX websites appear to be. Always strive for at least five indexable pages, as internal links and anchor text can carry a lot of value.
May 21, 2010
SEO For Google
Virtually every webmaster and website owner is primarily concerned with attaining Google rankings, due to the significantly higher number of searchers using it. Provided that you are building your website following the best practices of SEO (i.e. unique content, a sizable amount of content, and a good number of incoming links), your rankings are sure to follow; however, due to Google's aging delay this will likely take longer than on MSN or Yahoo! Google considers the age of your links, your domain, and even the individual page to be a factor, and the longer your page has been online the better.
Essentially, ranking a site on Google requires that you take the same actions as for Yahoo! and MSN: continue your link building efforts on an ongoing basis to ensure that you end up with more and more links, and update and add content on a fairly regular basis (through the addition of a blog, for example).
May 14, 2010
The Best Top Ten Tips for SEO to Improve Search Engine Placement
Read Website Optimization in Ten Easy Steps
1. Do not purchase a new domain unless you have to.
Due to Google's aging delay for all new domains, your best bet is to use your existing domain/website if at all possible. If you're redesigning or starting from scratch and you have to use a brand-new domain for some reason, you can expect to wait a good 9-12 months before your site will show up in Google for any keyword phrases that are important to you.
2. Optimize your site for your target audience, not for the search engines.
This may sound counterintuitive, but hear me out. The search engines are looking for pages that best fit the keyword phrase someone types into their little search box. If those "someones" are typing in search words that relate to what your site offers, then they are most likely members of your target audience. You need to optimize your site to meet target audience needs. If you don't know who your target audience is, then you need to find out one way or another. Look for studies online that might provide demographic information, and visit other sites, communities, or forums where your target audience might hang out and listen to what they discuss. This information will be crucial to your resulting website design, keyword research, and copywriting.
3. Research your keyword phrases extensively.
The phrases you think your target market might be searching for may very well be incorrect. To find the optimal phrases to optimize for, use research tools such as our Keyword Search tool, Google AdWords, and Yahoo Search Marketing data. Compile lists of the most relevant phrases for your site, and choose a few different ones for every page. Never shoot for general keywords such as "travel" or "vacation," as they are rarely (if ever) indicative of what your site is really about.
4. Design and categorize your site architecture and navigation based on your keyword research.
Your research may uncover undiscovered areas of interest or ways of categorizing your products/services that you may wish to add to your site. For instance, let's say your site sells toys. There are numerous ways you could categorize and lay out your site so that people will find the toys they're looking for. Are people looking for toys to fit their child's stage of development? (Look for keyword phrases such as "preschool toys.") Or are they more likely to be seeking specific brands of toys? Most likely, your keyword research will show you that people are looking for toys in many different ways. Your job is to make sure that your site's navigation showcases the various ways of searching. Make sure you have links to specific-brand pages as well as specific age ranges, specific types of toys, etc.
5. Program your site to be "crawler-friendly."
The search engines can't fill out forms, can't search your site, can't read JavaScript links and menus, and can't interpret graphics and Flash. This doesn't mean that you can't use these things on your site; you most certainly can! However, you do need to provide alternate means of navigating your site as necessary. If you have only a drop-down sequence of menus to choose a category or a brand of something, the search engine crawlers will never find those resulting pages. You'll need to make sure that you always have some form of HTML links in the main navigation on every page which link to the top-level pages of your site. From those pages, you'll need to have further HTML links to the individual product/service pages. (Please note that HTML links do NOT have to be text-only links. There's nothing wrong with graphical image navigation that is wrapped in standard tags, as the search engines can follow image links just fine.)
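To make the point concrete, here is a hypothetical navigation fragment (URLs and image paths invented) that spiders can follow: ordinary HTML links to the top-level category pages, including a graphical link wrapped in a standard anchor tag.

    <ul id="main-nav">
      <li><a href="/toys/preschool/">Preschool Toys</a></li>
      <li><a href="/toys/brands/">Shop by Brand</a></li>
      <li><a href="/toys/outdoor/"><img src="/images/outdoor-toys.gif" alt="Outdoor Toys" /></a></li>
    </ul>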
6. Label your internal text links and clickable image alt attributes as clearly and descriptively as possible.
Your site visitors and the search engines look at the clickable portion of your links (aka the anchor text) to help them understand what they're going to find once they click through. Don't make them guess what's at the other end with links that say "click here" or other non-descriptive words. Be as descriptive as possible with every text and graphical link on your site. The cool thing about writing your anchor text and alt attributes to be descriptive is that you can almost always describe the page you're pointing to by using its main keyword phrase.
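For example (hypothetical page and image paths), compare a vague link with one whose anchor text and alt attribute describe the destination using its main keyword phrase:

    <!-- Vague: tells visitors and engines nothing about the destination. -->
    <a href="/wooden-train-sets.html">Click here</a>

    <!-- Descriptive: the anchor text and alt attribute carry the page's keyword phrase. -->
    <a href="/wooden-train-sets.html">Wooden train sets for toddlers</a>
    <a href="/wooden-train-sets.html"><img src="/images/train-set.jpg" alt="Wooden train sets for toddlers" /></a>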
7. Write compelling copy for the key pages of your site based on your chosen keyword phrases and your target market's needs, and make sure it's copy that the search engines can "see."
This is a crucial component to having a successful website. The search engines need to read keyword-rich copy on your pages so they can understand how to classify your site. This copy shouldn't be buried in graphics or hidden in Flash. Write your copy based on your most relevant keyword phrases while also making an emotional connection with your site visitor. (This is where that target audience analysis comes in handy!) Understand that there is no magical number of words per page or number of times to use your phrases in your copy. The important thing is to use your keyword phrases only when and where it makes sense to do so for the real people reading your pages. Simply sticking keyword phrases at the top of the page for no apparent reason isn't going to cut it, and it just looks silly.
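As a simple illustration (invented page copy and image name), the same sentence is readable to the engines when it sits in the HTML, and invisible to them when it is baked into a graphic:

    <!-- Copy the engines can read: plain HTML text on the page. -->
    <h1>Preschool Toys</h1>
    <p>Our preschool toys help two- to five-year-olds learn through play.</p>

    <!-- Copy the engines cannot read: the same sentence rendered inside an image. -->
    <img src="/images/preschool-copy.jpg" alt="" />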
8. Incorporate your keyword phrases into each page's unique Title tag.
Title tags are critical because they're given a lot of weight with every search engine. Whatever keyword phrases you've written your copy around should also be used in your Title tag. Remember that the information that you place in this tag is what will show up as the clickable link to your site at the search engines. Make sure that it accurately reflects the content of the page it's on, while also using the keyword phrases people might be using at a search engine to find your stuff.
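A hypothetical example of a unique, keyword-focused Title tag for a single category page (the phrase and site name are invented):

    <head>
      <title>Preschool Toys - Educational Wooden Toys for Ages 2 to 5 | ExampleToyShop</title>
    </head>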
9. Make sure your site is "link-worthy."
Other sites linking to yours is a critical component of a successful search engine optimization campaign, as all of the major search engines place a good deal of emphasis on your site's overall link popularity. You can go out and request hundreds or thousands of links, but if your site stinks, why would anyone want to link to it? On the other hand, if your site is full of wonderful, useful information, other sites will naturally link to it without your even asking. It's fine to trade links; just make sure you are providing your site visitors with only the highest quality of related sites. When you link to lousy sites, keep in mind what this says to your site visitors as well as to the search engines.
10. Don't be married to any one keyword phrase or worried too much about rankings.
If you've done the above 9 things correctly, you will start to see an increase in targeted search engine visitors to your site fairly quickly. Forget about where you rank for any specific keyword phrase and instead measure your results in increased traffic, sales, and conversions. It certainly won't hurt to add new content to your site if it will really make your site more useful, but don't simply add a load of fluff just for the sake of adding something.
It really is okay to have a business site that is just a business site and not a diatribe on the history of your products. Neither your site visitors nor the engines really give a hoot!
Black Hat SEO
Black Hat SEO (search engine optimization) is customarily defined as the use of techniques to get higher search rankings in an unethical manner. These black hat techniques usually include one or more of the following characteristics:
- Breaks search engine rules and regulations
- Creates a poor user experience directly because of the black hat SEO techniques utilized on the Web site
- Unethically presents content in a different visual or non-visual way to search engine spiders and search engine users.
Black Hat SEO Techniques To Avoid
- Keyword stuffing: Packing long lists of keywords and nothing else onto your site will get you penalized eventually by search engines. Learn how to find and place keywords and phrases the right way on your Web site with my article titled Learn Where And How To Put Keywords In Your Site Pages.
- Invisible text: This is putting lists of keywords in white text on a white background in hopes of attracting more search engine spiders. Again, not a good way to attract searchers or search engine crawlers.
- Doorway Pages: A doorway page is basically a “fake” page that the user will never see. It is purely for search engine spiders, and attempts to trick them into indexing the site higher. Read more about doorway pages.
Black Hat SEO is tempting; after all, these tricks actually do work, temporarily. They do end up getting sites higher search rankings; that is, until these same sites get banned for using unethical practices. It’s just not worth the risk. Use efficient search engine optimization techniques to get your site ranked higher, and stay away from anything that even looks like Black Hat SEO.
May 13, 2010
Measuring Relevance and Popularity
Modern commercial search engines rely on the science of information retrieval (IR). That science has existed since the middle of the 20th century, when retrieval systems powered computers in libraries, research facilities, and government labs. Early in the development of search systems, IR scientists realized that two critical components made up the majority of search functionality:
Relevance - the degree to which the content of the documents returned in a search matched the user's query intention and terms. The relevance of a document increases if the terms or phrase queried by the user occurs multiple times and shows up in the title of the work or in important headlines or subheaders.
Popularity - the relative importance, measured via citation (the act of one work referencing another, as often occurs in academic and business documents) of a given document that matches the user's query. The popularity of a given document increases with every other document that references it.
These two items were translated to web search 40 years later and manifest themselves in the form of document analysis and link analysis.
In document analysis, search engines look at whether the search terms are found in important areas of the document - the title, the meta data, the heading tags, and the body of text content. They also attempt to automatically measure the quality of the document (through complex systems beyond the scope of this guide).
In link analysis, search engines measure not only who is linking to a site or page, but what they are saying about that page/site. They also have a good grasp on who is affiliated with whom (through historical link data, the site's registration records, and other sources), who is worthy of being trusted (links from .edu and .gov pages are generally more valuable for this reason), and contextual data about the site the page is hosted on (who links to that site, what they say about the site, etc.).
Link and document analysis combine and overlap hundreds of factors that can be individually measured and filtered through the search engine algorithms (the set of instructions that tells the engines what importance to assign to each factor). The algorithm then determines scoring for the documents and (ideally) lists results in decreasing order of importance (rankings).
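To make the idea of scoring concrete, here is a toy JavaScript sketch of a relevance-only score in the spirit described above: query terms are counted in the body text, and matches in the title are weighted more heavily. The function name and weighting factor are invented for illustration; real engines combine hundreds of relevance and popularity factors.

    // Toy illustration only -- not any engine's actual algorithm.
    function toyRelevanceScore(query, doc) {
      var terms = query.toLowerCase().split(/\s+/);
      var score = 0;
      for (var i = 0; i < terms.length; i++) {
        var bodyMatches = doc.body.toLowerCase().split(terms[i]).length - 1;
        var titleMatches = doc.title.toLowerCase().split(terms[i]).length - 1;
        score += bodyMatches + 3 * titleMatches; // assumed extra weight for title matches
      }
      return score;
    }

    // Example call:
    // toyRelevanceScore("preschool toys",
    //   { title: "Preschool Toys", body: "Our preschool toys help children learn." });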