Aggressive Internet Marketing Made Possible

Aggressive internet marketing means full-blown marketing and promotions that exceed any businessman’s expectations.

A business needs fierce internet marketing. No more, no less.

But to make it low cost? Is that even possible?

How can something so aggressive be affordable?


Run iPad Web Application Full Page (Bookmark)

Building an iPhone or iPad web app is not an easy endeavor. However, if you have any experience with HTML5, JavaScript, and CSS, it's a lot easier to accomplish than you would think. Apple has built a lot of tags and templates into its Mobile Safari browser. I am going to go over many of the things I have found that have really helped me build my web app.

Meta Tags Used By Mobile Safari

To make your app full screen when it is added to the iPhone/iPad home screen using the + button in the browser, the meta tag below will remove all buttons and the URL bar and give it the "native" app look we are going after. To use any of the meta tags below, just add them somewhere in your head tag.

<meta name="apple-mobile-web-app-capable" content="yes" />

Now, when you pinch the app it will still zoom and scale. Native apps do not zoom or scale, so we need to do something to stop that. Luckily, Apple has designed a meta tag for that too.

<meta name="viewport" content="user-scalable=no, width=device-width" />

The last thing you may notice is that when you tug the app up or down, it bounces and you can see the "background" of the browser. To get rid of that elastic scrolling, you need to cancel the touchmove event on your body tag.
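A minimal sketch of that fix, assuming the intended technique is cancelling the touchmove event on the body (the helper function here is my own wrapping, so the attachment is easy to exercise off-device):

```javascript
// Cancel touchmove so the page no longer "rubber-bands" past its edges.
// The target parameter is anything with addEventListener (normally
// document.body), which keeps the helper easy to test outside a browser.
function preventOverscroll(target) {
  var handler = function (event) {
    event.preventDefault(); // stops the elastic scroll/zoom-down effect
  };
  target.addEventListener('touchmove', handler, false);
  return handler; // handed back so it can be removed later if needed
}

// In the web app itself you would call: preventOverscroll(document.body);
```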

Making Links Open Fullscreen and Not In Safari

Now, this was one of the most frustrating problems I had when making my web app in the beginning. If you use…

<a href="link.html">Your Link</a>

that tag will take you straight out of your full-screen app and into normal Safari, and we don't want that. The solution is to use a little bit of JavaScript, like so:

<a ontouchstart="window.location='yourlink.html'">Your Link</a>

Some of you familiar with JavaScript will notice that we are using ontouchstart instead of onclick. This is an event trigger that only fires on iOS devices; it is a lot smoother there, but will not work on desktop computers. You can use this event trigger anywhere in your web app. Since onclick adds a noticeable delay (roughly 300 ms) before the device responds, ontouchstart is a great asset to have on our side.

iOS 7 and Above – Show Links in App

if (("standalone" in window.navigator) && window.navigator.standalone) {
    var noddy, remotes = false;
    document.addEventListener('click', function (event) {
        noddy = event.target;
        while (noddy.nodeName !== "A" && noddy.nodeName !== "HTML") {
            noddy = noddy.parentNode;
        }
        if ('href' in noddy && noddy.href.indexOf('http') !== -1 &&
            (noddy.href.indexOf(document.location.host) !== -1 || remotes)) {
            event.preventDefault();
            document.location.href = noddy.href;
        }
    }, false);
}
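The if condition in that listener packs a lot in; pulled out into a small pure helper, it is easier to read and test (the function name and shape are mine, not part of the original snippet):

```javascript
// Should a clicked link be kept inside the full-screen web app?
// href    - the anchor's href attribute
// host    - the current document.location.host
// remotes - set true to also keep off-site links inside the app
function keepInApp(href, host, remotes) {
  if (href.indexOf('http') === -1) return false;        // not an http(s) URL
  return href.indexOf(host) !== -1 || remotes === true; // same host, or remotes on
}
```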

Specifying a Webpage Icon for Web Clip

You may want users to be able to add your web application or webpage link to the Home screen. These links, represented by an icon, are called Web Clips. Follow these simple steps to specify an icon to represent your web application or webpage on iOS.

  • To specify an icon for the entire website (every page on the website), place an icon file in PNG format in the root document folder called apple-touch-icon.png
  • To specify an icon for a single webpage or replace the website icon with a webpage-specific icon, add a link element to the webpage, as in:
    <link rel="apple-touch-icon" href="/custom_icon.png">

    In the above example, replace custom_icon.png with your icon filename.

  • To specify multiple icons for different device resolutions—for example, support both iPhone and iPad devices—add a sizes attribute to each link element as follows:
    <link rel="apple-touch-icon" href="touch-icon-iphone.png">
    <link rel="apple-touch-icon" sizes="76x76" href="touch-icon-ipad.png">
    <link rel="apple-touch-icon" sizes="120x120" href="touch-icon-iphone-retina.png">
    <link rel="apple-touch-icon" sizes="152x152" href="touch-icon-ipad-retina.png">

    The icon that is the most appropriate size for the device is used. If no sizes attribute is set, the element’s size defaults to 60 x 60.

If there is no icon that matches the recommended size for the device, the smallest icon larger than the recommended size is used. If there are no icons larger than the recommended size, the largest icon is used.

If no icons are specified using a link element, the website root directory is searched for icons with the apple-touch-icon... prefix. For example, if the appropriate icon size for the device is 60 x 60, the system searches for filenames in the following order:

  1. apple-touch-icon-76x76.png
  2. apple-touch-icon.png
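The fallback rules above can be expressed as a small selection function. This is only a sketch of the documented behaviour, not Apple's actual implementation:

```javascript
// Pick a touch-icon size following the documented fallback rules:
// prefer an exact match for the recommended size, otherwise the smallest
// icon larger than the recommended size, otherwise the largest available.
function pickIconSize(availableSizes, recommended) {
  if (availableSizes.indexOf(recommended) !== -1) return recommended;
  var larger = availableSizes.filter(function (s) { return s > recommended; });
  if (larger.length > 0) return Math.min.apply(null, larger);
  return Math.max.apply(null, availableSizes);
}

// e.g. a device that wants 60x60, with only larger icons declared,
// falls back to the smallest of the larger ones.
```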

See “Icon and Image Sizes” for webpage icon metrics.

Now, what if your user wants to take a certain element and drag it around with their finger? How are we going to do this? The technique I am using is called the Drag Drop Library. It's very simple to use: go to http://www.gotproject.com/blog/post2.html and download the JavaScript library using the green button. Then link to that JavaScript file on any page where you want to use drag and drop. The syntax for making an element draggable is below…

// To make an element draggable
var D = new webkit_draggable('id_of_element', {options});
// To stop an element from being draggable
D.destroy();

So what you are going to do is give the element you want to drag a unique ID in your HTML. Then pass that ID into the code above where it reads 'id_of_element'. The sample code will find the HTML element with id="id_of_element" and make it draggable by the user's finger. It's fairly simple and creates an awesome "wow" effect in your apps.

iPhone and iPad web apps are slowly becoming more widespread around the world, especially now that Apple is finally giving us web designers and developers the required tools. Not every iOS developer has to be an Objective-C wizard with a Mac; anyone with a little HTML know-how can create easy, good-looking apps.

Getting iPad Web App to Work Offline Using Localhost, not Cache-Manifest

I spent a lot of time trying to figure out how to get my web app to work offline. My client will be using the app at large conferences to showcase their product and give presentations, and because Wi-Fi is very unstable at large conferences, working offline was a necessity.

What I finally decided to do was jailbreak each iPad and install a server on the iPad's localhost, using a program on Cydia called PHPod. This creates a server on the iPad and gives me access to files stored directly on the device. I then SSHed into the iPad and copied my entire web app into the WWW directory. Next, I went to localhost in Safari on the iPad, clicked my web app and, boom, there it is, right on the iPad. It runs fast, my jQuery still works, all my forms work; it's perfect.

Now I am looking to make my own Cydia repo where I can host the web app, so that clients won't have to SSH into the iPad every time there is a change; the repo will just update the package and the site will be updated with it. Yes, there is the option to use a cache manifest, but it doesn't really work all the time, and there's a size limit, so I couldn't cache all my videos and larger files.

Meta tags

Static Meta Tags

Static meta tags should be used where there is either no dynamic on-page content or the end result is better than using dynamic meta tags. Using static meta tags gives you more control over the end result, but also increases the workload significantly. Use the following set of META controls when editing the page source.

<TITLE>title here</TITLE>
<META NAME="Description" CONTENT="content here">
<META NAME="Keywords" CONTENT="content here">
<META NAME="author" content="author here">
<META NAME="copyright" content="company here">
<META NAME="language" content="en-us">
<META NAME="rating" content="General">
<META NAME="robots" content="index,follow">
<META NAME="revisit-after" content="7 Days">
<META http-equiv="pragma" content="no-cache">

Dynamic Meta Tags (Using PHP)

Dynamic meta tags are used when there is a dynamic element on the page, such as a title, deck, or page body. In the example below, the fields Profile_Name, Profile_Symbol, MetaDesc, and MetaKeywords are used to populate the title with a public company name and exchange symbol. The description is populated with the first 150 characters of the deck, and the keywords are populated with the company name and the first 100 characters of the deck. Use the following set of META controls when editing the page source. (Edit to your requirements.)
<?php
$Profile_ID = $company_profile['ID'];
$Profile_Name = $company_profile['Name'];
$Profile_Symbol = $company_profile['Symbol'];
$Profile_Exchange = $company_profile['Exchange'];
$Profile_Description = $company_profile['Description'];
$MetaDesc = substr($Profile_Description, 0, 150);
$MetaKeywords = substr($Profile_Description, 0, 100);
?>
<title><?php echo $Profile_Name; ?> – <?php echo $Profile_Symbol; ?></title>
<META NAME="Description" CONTENT="<?php echo $MetaDesc; ?>">
<META NAME="Keywords" CONTENT="<?php echo $Profile_Name; ?>, <?php echo $MetaKeywords; ?>">
<META NAME="author" content="author here">
<META NAME="copyright" content="company here">
<META NAME="language" content="en-us">
<META NAME="rating" content="General">
<META NAME="robots" content="index,follow">
<META NAME="revisit-after" content="7 Days">
<META http-equiv="pragma" content="no-cache">

Meta Tag Analyzer

Use a meta tag analyzer to analyze the relevancy of every page in the site. The goal is to achieve as close to 100% keyword relevancy between all page elements as possible. As mentioned earlier, assign a theme to each page and keep the theme flowing through the Title, Description, Keywords, Headline, Deck, Page Body, Links, and Page Name. When you are certain the site has been developed in this manner, run each page through a meta tag analyzer and remove or add words and phrases, focusing on strengthening the theme.

Keyword Density

Keyword density is the measurement of how many times a single word or phrase appears within all the text on a page. The example below shows that the keyword "madden" has been found 39 times, measured at 2.53% keyword density. The goal is 4%–6% keyword density for a single keyword, 2%–3% for a two-word phrase, and 1.5%–2% for a three-word phrase. When you keep keyword density between these percentages, you can be sure not to over-stuff or under-utilize your keywords.
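To sanity-check those percentages, keyword density is easy to compute yourself. Here is a quick sketch (my own illustration, using naive whitespace tokenization; a real analyzer would also strip punctuation):

```javascript
// Keyword density: occurrences of a word or phrase divided by the total
// word count of the page text, expressed as a percentage. The article's
// example (39 hits of "madden" at 2.53%) implies roughly a 1,540-word page.
function keywordDensity(text, phrase) {
  var words = text.toLowerCase().split(/\s+/).filter(Boolean);
  var target = phrase.toLowerCase().split(/\s+/).filter(Boolean);
  var hits = 0;
  // slide a window of the phrase's length across the page words
  for (var i = 0; i + target.length <= words.length; i++) {
    var match = true;
    for (var j = 0; j < target.length; j++) {
      if (words[i + j] !== target[j]) { match = false; break; }
    }
    if (match) hits++;
  }
  return (hits / words.length) * 100;
}
```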

META NAME Robots

The content of the Robots META tag contains directives separated by commas. The currently defined directives are [NO]INDEX and [NO]FOLLOW. The INDEX directive specifies if an indexing robot should index the page. The FOLLOW directive specifies if a robot is to follow links on the page. The defaults are INDEX and FOLLOW. The values ALL and NONE set all directives on or off: ALL=INDEX,FOLLOW and NONE=NOINDEX,NOFOLLOW. Some examples (more examples can be found by Googling the phrase ‘Robots META tag’):

<meta name="robots" content="index,follow">
<meta name="robots" content="noindex,follow">
<meta name="robots" content="index,nofollow">
<meta name="robots" content="noindex,nofollow">
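Since ALL and NONE are only shorthand for the directive pairs, normalizing a Robots META content value is straightforward; a small sketch:

```javascript
// Expand a Robots META content value into explicit directives.
// ALL => index,follow; NONE => noindex,nofollow; the defaults (an empty
// or missing value) are index,follow.
function robotsDirectives(content) {
  var value = (content || '').toLowerCase().replace(/\s+/g, '');
  if (value === 'all' || value === '') return ['index', 'follow'];
  if (value === 'none') return ['noindex', 'nofollow'];
  var parts = value.split(',');
  var index = parts.indexOf('noindex') !== -1 ? 'noindex' : 'index';
  var follow = parts.indexOf('nofollow') !== -1 ? 'nofollow' : 'follow';
  return [index, follow];
}
```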

How do you find a page's relevancy against a search query?

There are a number of ways to establish a page’s relevancy in the eyes of search engines. One of these methods is called the “on-page factor”. On-page factors involve placing your keywords in strategic locations throughout the pages on your site, so that search engines know to associate those keywords with a specific web page.

Important on-page locations include Header Tags, Internal Links, External Links, Anchor Text, Bold and Italicized Text, HTML Lists, ALT Tags, Image Names, Dynamic Bread Crumb, Title, Description, Keywords, Headline, Deck, Page Body, and Page Name.

Search engines make money by showing ads. In most cases, that's their entire profit model. This means that in order to make money, they need to show those ads to as many people as possible. The way they get the largest number of people to use their search engine is by giving them the most relevant search results. If the search query "Make Money" were typed into a search engine, one would expect to see the very best pages about that specific topic. If the results returned pages about investment opportunities, vacation rentals, or even worse, pages about Viagra and online casinos, the user would probably decide to use a different search engine that could provide more relevant results. Search engines have a vested interest in providing the best, most relevant search results possible.

Otherwise, people could stop using them, and as a result, they would have no one to show their ads to, and eventually go out of business.

Another method of establishing relevancy is "off-page factors".

These are the factors related to the pages that link to the site from other sites. Off-page factors include the inbound link anchor text, the text in the paragraphs surrounding that anchor text, the titles of the pages linking to the page, the other on page factors of the pages that link to the page, the directory categories the site is found in, the directory categories of the sites linking to the page, and many other factors.

Of the off-page factors, inbound link anchor text is the most important, but they all play a role. Some search engines are more advanced than others and make more complete use of this data; however, all of the major search engines are moving toward applying this data to increase the quality and relevancy of their search results. Simply put: the topic and theme of the page MUST be built around the keywords and key phrases you are targeting. If you are writing long sales copy, this is a very difficult task to perform without making the content read awkwardly; in that case, focus only on the first 5–10 paragraphs, the eye-catcher. A webmaster should always follow the acceptability guidelines for each search engine. Review these guidelines and become familiar with them.

Google Guidelines: http://www.google.com/Webmasters/guidelines.html

Yahoo Guidelines: http://help.yahoo.com/help/us/ysearch/basics/basics-18.html

Ask Guidelines: http://about.ask.com/en/docs/about/editorial_guidelines.shtml

DMOZ Guidelines: http://www.dmoz.org/help/submit.html

Their advice is generally to create content for the user, not the search engines; to make content easily accessible to their spiders; and not to try to trick their system. Webmasters often make critical mistakes when designing or setting up their websites, inadvertently "poisoning" them so that they will not rank well. Coding guidelines published by the World Wide Web Consortium (http://www.w3.org/) should be followed as well, and pages should be checked with the W3C's free validator, which tests the markup validity of web documents in HTML, XHTML, SMIL, MathML, etc. (http://validator.w3.org/). If the acceptability and coding guidelines are followed, and the site presents frequently updated, useful, original content, and a few meaningful, useful inbound links are established, it is very possible to obtain a significant amount of organic search traffic. When a site has useful content, other webmasters will naturally place links to it, increasing its PageRank and flow of visitors.

When visitors discover a useful web site, they tend to refer other visitors by emailing or instant messaging links. As a result, practices that improve web site quality are likely to outlive short term practices that simply seek to manipulate search rankings. Relevant, useful content will ensure you will always come out on top!

Headline Relevancy

The headline of the article should be created as a keyword phrase using as few junk words as possible. A junk word is a word that gets in the way of the message and dilutes its relevancy. A few examples of junk words are: the, and, if, at, about, and of. If someone is searching for the phrase "Corn Commodities", an article or page that has been optimized for that exact phrase will rank higher in the search engines than "Coy About Corn Commodities", because the words "Coy About" dilute the exact search term "Corn Commodities".
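One way to see the dilution described above is to score a headline by how much of it is the target phrase. The scoring function and names below are my own illustration; the junk-word list comes from the examples above:

```javascript
// Junk words from the examples in the article.
var JUNK_WORDS = ['the', 'and', 'if', 'at', 'about', 'of'];

// How concentrated is a headline around its target phrase?
// Ratio of phrase words to total headline words: 1.0 is a perfect match,
// and every extra word dilutes the score.
function headlineFocus(headline, phrase) {
  var words = headline.toLowerCase().split(/\s+/).filter(Boolean);
  var target = phrase.toLowerCase().split(/\s+/).filter(Boolean);
  return target.length / words.length;
}

// List the junk words a headline contains.
function junkWords(headline) {
  return headline.toLowerCase().split(/\s+/).filter(function (w) {
    return JUNK_WORDS.indexOf(w) !== -1;
  });
}
```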

Deck Relevancy

The deck is a summary of the title; somewhere inside this summary should be your keyword phrase without using too many junk words. If you are using the words “Corn Commodities” in the title, use it in the deck as well.

Creating a Relevant Headline and Deck

(WRONG)

Page Headline: The Easy Way To Create Your OWN Money Making Internet Business

Deck: Get access to absolutely ALL of the tools, resources, and expert support you need to start your own BOOMING, fully-automated Internet business from scratch.

(RIGHT)

Page Headline: Create Your OWN Money Making Internet Business

Deck: Access to ALL of the tools, resources, and expert support needed to start your own BOOMING, fully-automated Internet business.

Body Relevancy

The body of the page is a very important part of search engine optimization. Many search engines, like Google, "strip" the body text and use it not only to display sections in their search results, but also to determine the subject matter of each page, determine which words and phrases are used the most, and assign a theme to the page. This way, the most relevant sites are found in their search results.

Content Theme

The content writer needs to apply a keyword theme to each page. In order to apply a keyword theme you must keep a very distinct set of keywords to use on that page.

The theme must flow through the Title, Description, Keywords, Headline, Deck, Page Body, Links, and Page Name.

The following wireframe is an example of a highly optimized page which includes keywords in the website title, slogan, breadcrumb, page title, page deck, page body, and page footer. Other keyword elements can be added to navigation, news headlines, blog headlines, external links, and advertising. When combined with a highly relevant meta title, description, and keywords, the relevancy score will increase and listings will appear higher in search results.

How to organize keywords on the page

Advanced SEO Techniques for Beginners

This article is brought to you by Carra Lucia Limited, a UK-based company specializing in SEO, e-marketing, website design, and web application software creation. This article can be downloaded in PDF format from the company's website.

This is a hard-hitting guide that gives you the information you need to make the adjustments to your site right away to help improve your search rankings and benefit from the increase in organic search traffic.

Search Engine Optimization, or SEO, is simply the act of adjusting the pages of your website so they are easily accessible to search engine spiders and can be readily spidered and indexed. A spider is a robot that search engines use to check millions of web pages very quickly and sort them by relevance. A page is indexed when it is spidered and deemed appropriate content to be placed in the search engine's results for people to click on.

The art and science of understanding how search engines identify pages that are relevant to a query made by a visitor and designing marketing strategies based on this is called search engine optimization. Search engines offer the most cost effective mechanism to acquire “real” and “live” business leads. It is found that in most cases, search engine optimization delivers a better ROI than other forms such as online advertisements, e-mail marketing and newsletters, affiliate and pay per click advertising, and digital campaigns and promotions.

What On Earth Is An Algorithm?

Each search engine has something called an algorithm which is the formula that each search engine uses to evaluate web pages and determine their relevance and value when crawling them for possible inclusion in their search engine.  A crawler is the robot that browses all of these pages for the search engine.

GOOGLE Algorithm Is Key

Google has a comprehensive and highly developed technology, a straightforward interface and a wide-ranging array of search tools which enable the users to easily access a variety of information online.

Google users can browse the web and find information in various languages; retrieve maps, stock quotes, and news; search for a long-lost friend using the phonebook listings available on Google for all US cities; and surf the 3-billion-odd web pages on the internet. Google boasts the world's largest archive of Usenet messages, dating all the way back to 1981. Google's technology can be accessed from any conventional desktop PC as well as from various wireless platforms such as WAP and i-mode phones, handheld devices, and other internet-equipped gadgets.

Page Rank Based On Popularity

The web search technology offered by Google is often the technology of choice of the world’s leading portals and websites. It has also benefited the advertisers with its unique advertising program that does not hamper the web surfing experience of its users but still brings revenues to the advertisers.

 

When you search for a particular keyword or phrase, most search engines return a list of pages in order of the number of times the keyword or phrase appears on the website. Google's web search technology instead involves its indigenously designed PageRank technology and hypertext-matching analysis, which perform several instantaneous calculations without any human intervention. Google's structural design also expands as the internet expands.

PageRank technology uses an equation that determines a factual measurement of the significance of web pages, calculated by solving an equation of 500 million variables and more than 3 billion terms. Unlike some other search engines, Google does not simply count links, but utilizes the extensive link structure of the web as an organizational tool. When Page A links to Page B, that link is attributed as a vote for Page B on behalf of Page A.

Back Links Are Considered Popularity Votes

Quintessentially, Google calculates the importance of a page by the number of such 'votes' it receives. Google also assesses the importance of the pages casting those votes; consequently, pages that are themselves highly ranked help to make the pages they link to important. One thing to note is that Google's technology does not involve human intervention in any way; it uses the inherent intelligence of the internet and its resources to determine the ranking and importance of any page.
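The voting idea can be sketched as a tiny power iteration over a toy link graph. This is only an illustration of the concept, using the commonly cited 0.85 damping factor; Google's actual equation is vastly larger:

```javascript
// Toy PageRank: links[i] lists the pages that page i links to.
// Each page splits its current rank among its outgoing links ("votes"),
// the votes are damped by d = 0.85, and the process is iterated.
// Assumes every page has at least one outgoing link (no dangling pages).
function pageRank(links, iterations) {
  var n = links.length;
  var d = 0.85; // damping factor
  var ranks = links.map(function () { return 1 / n; }); // start uniform
  for (var it = 0; it < iterations; it++) {
    // every page keeps a small baseline rank of (1 - d) / n
    var next = links.map(function () { return (1 - d) / n; });
    for (var i = 0; i < n; i++) {
      var share = (d * ranks[i]) / links[i].length; // this page's vote per link
      links[i].forEach(function (to) { next[to] += share; });
    }
    ranks = next;
  }
  return ranks;
}

// Example: pages 0 and 1 both link to page 2; page 2 links back to page 0.
// After iterating, page 2 ends up with the highest rank because it
// receives the most votes, and page 0 beats page 1 because an important
// page (page 2) votes for it.
```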

Hypertext-Matching Analysis

Unlike its conventional counterparts, Google is a search engine which is hypertext-based. This means that it analyzes all the content on each web page and factors in fonts, subdivisions, and the exact positions of all terms on the page. Not only that, Google also evaluates the content of its nearest web pages. This policy of not disregarding any subject matter pays off in the end and enables Google to return results that are closest to user queries.

Google has a very simple 3-step procedure in handling a query submitted in its search box:

  1. When the query is submitted and the Enter key is pressed, the web server sends the query to the index servers. An index server is exactly what its name suggests: it holds an index, much like the index of a book, showing where the particular page containing the queried term is located.
  2. After this, the query proceeds to the doc servers, which actually retrieve the stored documents. Page descriptions, or "snippets," are then generated to describe each search result.
  3. These results are then returned to the user, normally in less than one second.

Approximately once a month, Google updates their index by recalculating the Page Ranks of each of the web pages that they have crawled. The period during the update is known as the Google dance.

Do You Know The GOOGLE Dance?

The Algorithm Shuffle

Because of the nature of PageRank, the calculations need to be performed about 40 times and, because the index is so large, they take several days to complete. During this period the search results fluctuate, sometimes minute by minute. It is because of these fluctuations that the term "Google Dance" was coined. The dance usually takes place sometime during the last third of each month.

Google has two other servers that can be used for searching. The search results on them also change during the monthly update, so they are part of the Google dance. For the rest of the month, fluctuations sometimes occur in the search results, but they should not be confused with the actual dance; they are due to Google's fresh crawl and to what is known as "Everflux".

Google has two other searchable servers apart from http://www.google.com. They are www2.google.com and www3.google.com. Most of the time, the results on all 3 servers are the same, but during the dance, they are different.

For most of the dance, the rankings that can be seen on www2 and www3 are the new rankings that will transfer to www when the dance is over. Even though the calculations are done about 40 times, the final rankings can be seen from very early on. This is because, during the first few iterations, the calculated figures converge close to their final values. You can see this with the Page Rank Calculator by checking the Data box and performing some calculations. After the first few iterations, the search results on www2 and www3 may still change, but only slightly.

During the dance, the results from www2 and www3 will sometimes show on the www server, but only briefly. Also, new results on www2 and www3 can disappear for short periods. At the end of the dance, the results on www will match those on www2 and www3.

GOOGLE Dance Tool

This Google Dance Tool allows you to check your rankings on all three servers (www, www2, and www3) and on all nine datacenters simultaneously. The Google Web Directory works in combination with Google search technology and the Netscape Open Directory Project, which makes it possible to search the internet organized by topic. Google displays pages in order of the rank given to them by the PageRank technology. It searches not only the titles and descriptions of websites, but the entire content of sites within a related category, which ultimately delivers a comprehensive search to the users. Google also has a fully functional web directory which categorizes all the searches in order.

 

Submitting your URL to Google

Google is primarily a fully automatic search engine with no human intervention involved in the search process. It utilizes robots known as 'spiders' to crawl the web on a regular basis for updates and new websites to be included in the Google index. This robot software follows hyperlinks from site to site. Google does not require you to submit your URL to its database for inclusion in the index, as the spiders do this automatically; however, you can submit a URL manually by going to the Google website and clicking the related link. One important thing to note is that Google does not accept payment of any sort for site submission or for improving the page rank of your website. Also, submitting your site through the Google website does not guarantee listing in the index.

Cloaking

 

Sometimes, a webmaster might program the server in such a way that it returns different content to Google than it returns to regular users, which is often done to misrepresent search engine rankings. This process is referred to as cloaking as it conceals the actual website and returns distorted web pages to search engines crawling the site. This can mislead users about what they’ll find when they click on a search result. Google highly disapproves of any such practice and might place a ban on the website which is found guilty of cloaking.

Google Guidelines

Here are some of the important tips and tricks that can be employed while dealing with Google.

Do’s

  • A website should have crystal clear hierarchy and links and should preferably be easy to navigate.
  • A site map is required to help users find their way around your site; if the site map has more than 100 links, it is advisable to break it into several pages to avoid clutter.
  • Come up with essential and precise keywords and make sure that your website features relevant and informative content.
  • The Google crawler will not recognize text hidden in images, so when describing important names, keywords, or links, stick with plain text.
  • The TITLE and ALT tags should be descriptive and accurate and the website should have no broken links or incorrect HTML.
  • Dynamic pages (where the URL contains a '?' character) should be kept to a minimum, as not every search engine spider is able to crawl them.
  • The robots.txt file on your web server should be current and should not block the Googlebot crawler. This file tells crawlers which directories can or cannot be crawled.
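For reference, a minimal robots.txt along those lines might look like this (the directory name is illustrative); a per-agent group for Googlebot overrides the general rules:

```
User-agent: *
Disallow: /private/

User-agent: Googlebot
Disallow:
```

An empty Disallow line means nothing is blocked for that agent, so Googlebot can crawl everything while other crawlers are kept out of /private/.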

Don’ts

  • When making a site, do not cheat your users, i.e., the people who will surf your website. Do not provide them with irrelevant content or present them with fraudulent schemes.
  • Avoid tricks or link schemes designed to increase your site’s ranking.
  • Do not employ hidden texts or hidden links.
  • Google frowns upon websites using cloaking techniques, so it is advisable to avoid them.
  • Automated queries should not be sent to Google.
  • Avoid stuffing pages with irrelevant words and content.  Also don’t create multiple pages, sub-domains, or domains with significantly duplicate content.
  • Avoid “doorway” pages created just for search engines or other “cookie cutter” approaches such as affiliate programs with hardly any original content.

Crawler/Spider Considerations

Also, consider technical factors. If a site has a slow connection, it might time-out for the crawler. Very complex pages, too, can time out before the crawler is able to harvest the text.

If you have a hierarchy of directories at your site, put the most important information high, not deep.  Some search engines will presume that the higher you placed the information, the more important it is. And crawlers may not venture deeper than three or four or five directory levels.

Above all, remember the obvious: full-text search engines index text. You may well be tempted to use fancy and expensive design techniques that either block search engine crawlers or leave your pages with very little plain text that can be indexed. Don't fall prey to that temptation.

Ranking Rules Of Thumb

The simple rule of thumb is that content counts, and that content near the top of a page counts for more than content at the end. In particular, the HTML title and the first couple of lines of text are the most important parts of your pages. If the words and phrases that match a query happen to appear in the HTML title or the first couple of lines of text of one of your pages, chances are very good that that page will appear high in the list of search results.

A crawler/spider search engine can base its ranking on both static factors (a computation of the value of page independent of any particular query) and query-dependent factors.

Values

  •  Long pages, which are rich in meaningful text (not randomly generated letters and words).
  • Pages that serve as good hubs, with lots of links to pages that have related content (topic similarity, rather than random meaningless links, such as those generated by link exchange programs or intended to generate a false impression of “popularity”).
  • The connectivity of pages, including not just how many links there are to a page but where the links come from: the number of distinct domains and the “quality” ranking of those particular sites. This is calculated for the site and also for individual pages. A site or a page is “good” if many pages at many different sites point to it, and especially if many “good” sites point to it.
  • The level of the directory in which the page is found. Higher is considered more important. If a page is buried too deep, the crawler simply won’t go that far and will never find it.

These static factors are recomputed about once a week, and new good pages slowly percolate upward in the rankings. Note that there are advantages to having a simple address and sticking to it, so others can build links to it, and so you know that it’s in the index.

Query-Dependent Factors

  •  The HTML title.
  • The first lines of text.
  • Query words and phrases appearing early in a page rather than late.
  • Meta tags, which are treated as ordinary words in the text, but like words that appear early in the text (unless the meta tags are patently unrelated to the content on the page itself, in which case the page will be penalized).
  • Words mentioned in the “anchor” text associated with hyperlinks to your pages. (E.g., if lots of good sites link to your site with anchor text “breast cancer” and the query is “breast cancer,” chances are good that you will appear high in the list of matches.)

Blanket Policy on Doorway Pages and Cloaking

Many search engines are opposed to doorway pages and cloaking. They consider doorway and cloaked pages to be spam and encourage people to use other avenues to increase the relevancy of their pages. We’ll talk about doorway pages and cloaking a bit later.

Meta Tags (Ask.Com as an Example)

Though Meta tags are indexed and considered to be regular text, Ask.com claims it doesn’t give them priority over HTML titles and other text. Though you should use Meta tags in all your pages, some webmasters claim their doorway pages for Ask.com rank better when they don’t use them. If you do use Meta tags, make your description tag no more than 150 characters and your keywords tag no more than 1,024 characters long.
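A sketch of what such tags might look like in a page head (the site, wording, and keyword choices are placeholders, not recommendations):

```html
<head>
  <title>Online Education Courses | Example School</title>
  <!-- Description kept under the 150-character guideline -->
  <meta name="description" content="Accredited online education courses in business, technology and design, with flexible schedules and instructor support.">
  <!-- Keywords tag kept well under the 1,024-character guideline -->
  <meta name="keywords" content="online education, online courses, distance learning, accredited courses">
</head>
```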

Keywords in the URL and File Names

It’s generally believed that Ask.com gives some weight to keywords in filenames and URL names. If you’re creating a file, try to name it with keywords.

Keywords in the ALT Tags

Ask.com indexes ALT tags, so if you use images on your site, make sure to add them. ALT tags should contain more than the image’s description. They should include keywords, especially if the image is at the top of the page. ALT tags are explained later.
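For example, an image near the top of a page might carry ALT text like this (the file name and wording are placeholders):

```html
<!-- Descriptive, keyword-bearing ALT text rather than a bare file name -->
<img src="online-education-classroom.jpg"
     alt="Students in an online education virtual classroom"
     width="400" height="300">
```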

Page Length

There’s been some debate about how long doorway pages for AltaVista should be. Some webmasters say short pages rank higher, while others argue that long pages are the way to go. According to AltaVista’s help section, it prefers long and informative pages. We’ve found that pages with 600-900 words are most likely to rank well.

Frame Support

AltaVista has the ability to index frames, but it sometimes indexes and links to pages intended only as navigation. To keep this from happening to you, submit a frame-free site map containing the pages that you want indexed. You may also want to include a “robots.txt” file to prohibit AltaVista from indexing certain pages.
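A minimal robots.txt, placed at the site root, might keep navigation-only frame pages out of the index. The paths below are placeholders:

```
User-agent: *
# Keep navigation-only frame pages out of the index
Disallow: /frames/nav-menu.html
Disallow: /frames/banner.html
```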

What Your Website Absolutely Needs

This section will go over some of the most important elements a page needs in order to earn high search engine rankings.  Make sure that you go through this whole section very carefully, as each of these can have a dramatic impact on the rankings that your website will ultimately achieve. Don’t focus solely on the home page, keywords and titles.

The first step to sales is ensuring that customers who visit your site find the products they are looking for. Of course, search engine optimization and better rankings can’t keep your customer on your site or make them buy. Once the customer has visited your site, ensure that he gets interested in your products or services and stays around. Motivate him to buy by providing clear and unambiguous information. If you sell more than one product or service, provide all the necessary information about each, perhaps on separate pages. By providing suitable and easily visible links, the customer can navigate to these pages and get the details.

Understanding Your Target Customer

If you design a website you think will attract clients, but you don’t really know who your customers are and what they want to buy, it is unlikely you will make much money. A website business is an extension of, or replacement for, a standard storefront. You can send email to your existing clients and ask them to complete a survey, or survey them while they are browsing your website. Ask them about their choices. Why do they like your products? Do you discount prices or offer coupons? Are your prices consistently lower than others’? Is your shipping cheaper? Do you respond faster to client questions? Are your product descriptions better? Are your return policies and guarantees better than your competitors’? To know your customer, you can check credit card records or ask your customers to complete a simple contact form with name, address, age, gender, etc. when they purchase a product.

Does Your Website Give Enough Contact Information?

When you sell from a website, your customers can buy your products 24 hours a day, and they may come from states thousands of miles away. Always provide contact information, preferably on every page of your website, complete with mailing address, telephone number and an email address that reaches you. People may need to contact you about sales, general information or technical problems on your site. Have your email forwarded to another address if you do not check your website mailbox often. When a customer wants to buy online, provide enough payment options, such as credit card, PayPal or another online payment service.

In the field of search engine optimization (SEO), writing a strong homepage that will rank high in the engines and will read well with your site visitors can sometimes present a challenge, even to some seasoned SEO professionals. Once you have clearly identified your exact keywords and key phrases, the exact location on your homepage where you will place those carefully researched keywords will have a drastic impact in the end results of your homepage optimization.

One thing we keep hearing most people say is that they don’t want to change the look, or more especially the wording, of their homepage. Understandably, some of them went to great lengths and invested a lot of time and/or money to make it the best it can be. Being the best it can be for your site visitors is one thing. But is it the best it can be for the search engines, in terms of how your site will rank?

If you need powerful rankings in the major search engines and at the same time you want to successfully convert your visitors and prospects into real buyers, it’s important to effectively write your homepage the proper way the first time! You should always remember that a powerfully optimized homepage pleases both the search engines and your prospects.

In randomly inserting keywords and key phrases into your old homepage, you might get good rankings, but at the same time you might jeopardize your marketing flow. That is a mistake nobody wants to make with their homepage.

Even today, there are still some people that will say you can edit your homepage for key phrases, without re-writing the whole page. There are important reasons why that strategy might not work.

The Home Page

Your homepage is the most important page on your web site. If you concentrate your most important keywords and key phrases in your homepage many times, the search engines will surely notice and index it accordingly. But will it still read easily, and will the sentences flow freely, to your real human visitors? There is a good chance that it might not. As a primer, having just 40 or 50 words on your homepage will not deliver the message effectively. To be powerful and effective, a homepage needs at least 300 to 400 words for maximum search engine throughput and effectiveness.

One way to do that is to increase your word count with more value-added content. This often means rewriting your whole homepage all over again. The main reason for this is that you will probably never have enough room to skillfully work your important keywords and key phrases into the body text of your existing homepage. This may not please your boss or marketing department, but a full re-write is often necessary and highly advisable to achieve high rankings in the engines, while at the same time having a homepage that will please your site visitors and convert a good proportion of them into real buyers.

The Acid Test

Here is the acid test that will prove what we just said is right: Carefully examine the body text of your existing homepage. Then, attempt to insert three to five different keywords and key phrases three to four times each, somewhere within the actual body of your existing page. In doing that, chances are you will end up with a homepage that is next to impossible to understand and read.

One mistake some people make is to force their prospects to wade through endless key-phrase lists or paragraphs in an attempt to describe their features and benefits, often while trying to please the search engines at the same time. Writing a powerful and effective homepage around carefully defined keywords and key phrases is a sure way to drive targeted traffic to your web site and keep it there once you do.

If some people still say re-writing a homepage takes too much time and costs too much money, think of the cost of losing prospective clients and the real cost of lost sales and lost opportunities. In the end, writing a strong homepage that will achieve all your desired goals will largely justify your time invested and the efforts you will have placed in the re-writing of your homepage.

This section presents a recommended layout for your homepage in order to make it as search engine friendly as possible. This is where you set the theme of your site. Let’s suppose the primary focus of your site is online education. You also have secondary content that is there as alternative content for those not interested in online education. There is also other content that you would like to share with your visitors. For example, this might include book reviews, humor, and links.

The top of your homepage, as discussed earlier, is the most important. This is where you set the keywords and theme for the most important part of your site, the thing you really want to be found for.

Step By Step Page Optimization

Starting at the top of your index/home page something like this:

(After your logo or header graphic)

1)   A heading tag that includes a keyword(s) or keyword phrases. A heading tag is bigger and bolder text than normal body text, so a search engine places more importance on it because you emphasize it.

2)   Heading sizes range from h1 – h6 with h1 being the largest text. If you learn to use just a little Cascading Style Sheet code you can control the size of your headings. You could set an h1 sized heading to be only slightly larger than your normal text if you choose, and the search engine will still see it as an important heading.

3)   Next would be an introduction that describes your main theme. This would include several of your top keywords and keyword phrases. Repeat your top 1 or 2 keywords several times, and include other keyword search terms too, but make it read in sentences that make sense to your visitors.

4)   A second paragraph could be added that gets more specific, using other words related to online education.

5)   Next, you could put a smaller heading.

6)   Then you’d list the links to your pages, ideally with a brief description of each link using keywords and keyword phrases in the text. You also want to have several pages of quality content to link to. Repeat that procedure for all the links that relate to your theme.

7)   Next you might include a closing, keyword-laden paragraph. More is not necessarily better when it comes to keywords, at least after a certain point. Writing “online education” fifty times across your page would probably get you caught for trying to cheat. Ideally, somewhere from 3% to 20% of your page text would be keywords. The percentage changes often and is different at each search engine. The 3-20 rule is a general guideline, and you can go higher if it makes sense and isn’t redundant.

8)   Finally, you can list your secondary content of book reviews, humor, and links. Skip the descriptions if they aren’t necessary, or they may water down your theme too much. If you must include descriptions for these non-theme related links, keep them short and sweet. You also might include all the other site sections as simply a link to another index that lists them all. You could call it Entertainment, Miscellaneous, or whatever. These can be sub-indexes that can be optimized toward their own theme, which is the ideal way to go.
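Steps 1 and 2 above can be sketched as follows; the heading text and the 120% size are placeholders, not recommendations:

```html
<style>
  /* Render the h1 only slightly larger than body text;
     the engine still sees it as an important heading */
  h1 { font-size: 120%; }
</style>

<h1>Online Education Degrees and Courses</h1>
<p>Welcome to our online education resource, with courses in...</p>
```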

Now you’ve set up the all-important top of your page with a strong theme. So far so good, but this isn’t the only way to create a strong theme, so don’t feel compelled to follow this exact formula. This was just an example to show you one way to set up a strong site theme. Use your imagination; you may come up with an even better way.
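As a rough illustration of the density guideline in step 7, the script below computes the share of a page’s words that match one keyword. It is a sketch for intuition only; real engines also weigh position, headings and phrases, not raw counts, and the sample text is invented:

```javascript
// Rough keyword-density estimate: percentage of words on a page that
// match a single keyword (punctuation stripped, case ignored).
function keywordDensity(text, keyword) {
  const words = text.toLowerCase().split(/\s+/).filter(Boolean);
  const target = keyword.toLowerCase();
  const hits = words.filter(w => w.replace(/[^a-z0-9]/g, "") === target).length;
  return words.length ? (hits / words.length) * 100 : 0;
}

const page = "Online education courses give working adults flexible schedules and support.";
console.log(keywordDensity(page, "education").toFixed(1) + "%"); // prints "10.0%"
```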

One Site – One Theme

It’s important to note that you shouldn’t try to optimize your home page for more than one theme. They just end up weakening each other’s strength when you do that. By using simple links to your alternative content, a link to your humor page can get folks where they want to go, and then you can write your humor page as a secondary index optimized toward a humor theme. In the end, each page should be optimized for search engines for the main topic of that page or site section.

Search engine optimization is made up of many simple techniques that work together to create a comprehensive overall strategy. This combination of techniques is greater as a whole than the sum of the parts. While you can skip any small technique that is a part of the overall strategy, it will subtract from the edge you’d gain by employing all the tactics.

Affiliate Sites & Dynamic URLs

In affiliate programs, sites that send you traffic and visitors have to be paid on a per-click or other basis (such as number of pages visited on your site, duration spent, transactions, etc.). The most common contractual understanding revolves around payment per click, or click-through. Affiliates use tracking software that monitors such clicks using a redirection measurement system. The value of affiliate programs in boosting your link analysis is doubtful. Nevertheless, it is felt that they do no actual harm. They do provide you visitors, and that is important. In the case of some search engines, redirects may even count in favor of your link analysis. Use affiliate programs, but not as a major strategy for optimization.

Several pages in e-commerce and other functional sites are generated dynamically and have “?” or “&” signs in their dynamic URLs. These signs separate the CGI variables. While Google will crawl these pages, many other engines will not. One inconvenient solution is to develop static equivalents of the dynamic pages and have them on your site.

Another way to avoid such dynamic URLs is to rewrite these URLs using a syntax that is accepted by the crawler and also understood as equivalent to the dynamic URL by the application server. The Amazon site shows dynamic URLs in such syntax. If you are using Apache web server, you can use Apache rewrite rules to enable this conversion.
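On Apache, such a rewrite can be sketched with mod_rewrite in an .htaccess file. The paths, script name, and parameter below are placeholders, and mod_rewrite must be enabled on the server:

```apache
RewriteEngine On
# Expose a crawler-friendly URL like /products/123 and map it
# internally to the underlying dynamic script
RewriteRule ^products/([0-9]+)$ /catalog.php?item=$1 [L]
```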

One good tip is to prepare a crawler page (or pages) and submit it to the search engines. This page should have no text or content except for links to all the important pages that you wish to be crawled. When the spider reaches this page, it will crawl all the links and pull all the desired pages into its index. You can also break up the main crawler page into several smaller pages if it becomes too large. The crawler will not reject smaller pages, whereas larger pages may get bypassed if the crawler finds them too slow to spider.
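A bare crawler page of this kind might look like the sketch below; the file names and link text are placeholders:

```html
<!-- A "crawler page": nothing but links to the pages you want indexed -->
<html>
  <body>
    <a href="/courses/business.html">Business Courses</a>
    <a href="/courses/technology.html">Technology Courses</a>
    <a href="/reviews/index.html">Book Reviews</a>
  </body>
</html>
```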

You do not have to be concerned that the results may show this “site-map” page and disappoint the visitor. This will not happen, as the “site-map” has no searchable content and will not get included in the results; all the other pages will. We found that the site wired.com had published hierarchical sets of crawler pages. The first crawler page lists all the category headlines; these links lead to a set of links with all story headlines, which in turn lead to the news stories.

Page Size Can Be a Factor

We noted above that spiders may bypass long and “difficult” pages. They have their own time-out characteristics or other controls that help them come unstuck from such pages. So you do not want such a page to become your “gateway” page. One tip is to keep the page size below 200 KB.

How Many Pages to Submit?

You do not have to submit all the pages of your site. As stated earlier, many sites have restrictions on the number of pages you may submit. A key page or a page that has links to many inner pages is ideal, but you must submit some inner pages. This ensures that even if the first page is missed, the crawler still gets to access all the important pages through them. Submit at least your key 3 to 4 pages. Choose the ones that have the most relevant content and keywords to suit your target search string, and verify that they link to the other pages properly.

Should You Use Frames?

Many websites make use of frames on their web pages. In some cases, more than two frames are used on a single web page. The reason most websites use frames is that each frame’s content has a different source. A master page known as a “frameset” controls the process of combining content from different sources into a single web page. Such frames make it easier for webmasters to combine multiple sources into a single page. This, however, has a huge disadvantage when it comes to search engines.

Some of the older search engines do not have the capability to read content from frames. These only crawl the frameset page instead of all the web pages, so pages built from multiple frames are ignored by the spider. There is a tag known as “NOFRAMES” (its content is ignored by frames-capable browsers) that can be inserted into the HTML of these web pages, and spiders are able to read the information within the NOFRAMES block. Without it, search engines only see the frameset. Moreover, if there are no links to other web pages in the NOFRAMES block, the search engines won’t crawl past the frameset, thus ignoring all the content-rich web pages that it controls.
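A sketch of a frameset carrying a NOFRAMES block with readable text and links (the file names and wording are placeholders):

```html
<frameset cols="200,*">
  <frame src="menu.html" name="menu">
  <frame src="content.html" name="content">
  <noframes>
    <!-- Readable by spiders (and browsers) that cannot handle frames -->
    <p>Online education courses and book reviews.</p>
    <a href="content.html">Course Content</a>
    <a href="reviews.html">Book Reviews</a>
  </noframes>
</frameset>
```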

Hence, it is always advisable to have web pages without frames as these could easily make your website invisible to Search Engines.

Making Frames Visible To Search Engines

We discussed earlier the prominence of frames-based websites. Many amateur web designers do not understand the drastic effect frames can have on search engine visibility. Such ignorance is compounded by the fact that some search engines, such as Google and Ask.com, are actually frames-capable. Ask.com spiders can crawl through frames and index all the web pages of a website. However, this is only true for a few search engines.

The best solution, as stated above, is to avoid frames altogether.  If you still decide to use frames, another remedy to this problem is using JavaScript. JavaScript can be added anywhere and is visible to search engines. This would enable spiders to crawl to other web pages, even if they do not recognize frames.
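One commonly used script along these lines restores the frameset when a content page is reached directly, for example from a search result. This is a sketch; the frameset file name is a placeholder:

```html
<script type="text/javascript">
  // If this content page is loaded outside its frameset (e.g. from a
  // search result), send the visitor back to the framed version.
  if (window.top === window.self) {
    window.top.location.href = "frameset.html";
  }
</script>
```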

With a little trial and error, you can make your frame sites accessible to both types of search engines.

STOP Words

Stop words are common words that are ignored by search engines when a key phrase is searched. This is done in order to save space on their servers and to accelerate the search process. When a search is conducted, the engine excludes the stop words from the search query, replacing each with a marker, a symbol substituted for the stop word. The intention is to save space. This way, the search engines are able to store more web pages in that extra space while retaining the relevancy of the search query. Omitting a few words also speeds up the search process. For instance, if a query consists of three words, the search engine would generally make three runs, one for each word, and display the listings. If one of the words is such that omitting it does not change the search results, it can be excluded from the query, and the search process becomes faster. Some commonly excluded stop words are:

after, also, an, and, as, at, be, because, before, between, but, for, however, from, if, in, into, of, or, other, out, since, such, than, that, the, these, there, this, those, to, under, upon, when, where, whether, which, with, within, without
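A toy sketch of how a query might be filtered against such a list before matching; the word list is a small sample and the function is illustrative, not any engine’s actual code:

```javascript
// Strip stop words from a query before matching against the index.
const STOP_WORDS = new Set(["after", "and", "the", "of", "in", "to", "that", "for"]);

function stripStopWords(query) {
  return query.toLowerCase().split(/\s+/).filter(w => !STOP_WORDS.has(w));
}

console.log(stripStopWords("the history of online education")); // keeps: history, online, education
```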

Image Alt Tag Descriptions

Search engines are unable to view graphics or distinguish text that might be contained within them. For this reason, most engines will read the content of the image ALT tags to determine the purpose of a graphic. By taking the time to craft relevant, yet keyword rich ALT tags for the images on your web site, you increase the keyword density of your site.

Although many search engines read and index the text contained within ALT tags, it’s important NOT to go overboard in using these tags as part of your SEO campaign. Most engines will not give this text any more weight than the text within the body of your site.

Invisible & Tiny Text

Invisible text is content on a web site that is coded in a manner that makes it invisible to human visitors, but readable by search engine spiders. This is done in order to artificially inflate the keyword density of a web site without affecting the visual appearance of it. Hidden text is a recognized spam tactic and nearly all of the major search engines recognize and penalize sites that use this tactic.

This is the technique of placing text on a page in a small font size. Pages that are predominantly heavy in tiny text may be dismissed as spam. Or, the tiny text may not be indexed. As a general guideline, try to avoid pages where the font size is predominantly smaller than normal. Make sure that you’re not spamming the engine by using keyword after keyword in a very small font size. Your tiny text may be a copyright notice at the very bottom of the page, or even your contact information. If so, that’s fine.

Keyword Stuffing & Spamming

Important keywords and descriptions should be used in your visible content and in your Meta tags; choose the words carefully, position them near the top, and use them with proper frequency. However, it is very important to exercise moderation here. Keyword stuffing, or spamming, is a no-no today. Most search engine algorithms can spot it and bypass the spam, and some may even penalize it.

Re-Direct Pages

Sometimes pages have a Meta refresh tag that redirects any visitor automatically to another page. Some search engines refuse to index a page that has a high refresh rate. The Meta refresh tag, however, does not affect Google.
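This is the kind of tag in question; the target URL is a placeholder, and a zero-second delay like this is exactly what some engines refuse to index:

```html
<!-- Redirects the visitor immediately to another page -->
<meta http-equiv="refresh" content="0; url=http://www.example.com/new-page.html">
```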

Image Maps without ALT Text

Avoid image maps whose links lack ALT text. Image maps should have ALT text (as also required under the Americans with Disabilities Act for public websites), and the home page should not use images as links; HTML links should be used instead. This is because search engines will not read image links, and the linked pages may not get crawled.
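A sketch of an image map whose areas carry ALT text, backed up by plain HTML links; the file names, coordinates, and link text are placeholders:

```html
<img src="nav-map.gif" usemap="#mainnav" alt="Site navigation map">
<map name="mainnav">
  <area shape="rect" coords="0,0,100,30" href="courses.html" alt="Online Courses">
  <area shape="rect" coords="0,31,100,60" href="reviews.html" alt="Book Reviews">
</map>
<!-- Text links that spiders can always follow -->
<a href="courses.html">Online Courses</a> | <a href="reviews.html">Book Reviews</a>
```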

Frames

There are some engines whose spiders won’t work with frames on your site. A web page built using frames is actually a combination of content from separate “pages” that have been blended into a single page through a “frameset” instruction page. The frameset page has no content or links that would promote spidering, and it can block the spider’s movement. The workaround is to place a summary of the page content and a relevant description in the frameset page, and also to place a link to the home page on it.

Tables

When you use tables on key pages, and some columns have descriptions while others have numbers, the table may push your keywords down the page. Search engines break up a table and read the columns in order: the first column is read first, then the next, and so on. Thus if the first column has numbers and the next one has useful descriptions, the positioning of those descriptions will suffer. The strategy is to avoid such tables near the top of key pages. Large sections of JavaScript will have the same effect on the search engines: the HTML part will be pushed down. So again, place long JavaScript lower down on key pages.

Link Spamming

Realizing the importance of links and link analysis in search engine results, several link farms and Free-for-All (FFA) sites have appeared that offer to provide links to your site. This is also referred to as link spamming. Most search engines are wise to this obvious tactic and know how to spot it. Such FFA sites do not provide link quality or link context, two factors that are important in link analysis. The correct strategy is to avoid link spamming and not get carried away by what seems too simple a solution.

Conclusion

If you’re looking for some simple things you can do to increase your site’s position in the search engines or directories, this section will give you some hard-hitting and simple tips that you can put into action right away.

 

What Should You Do Now?

 

It is worth cataloging the basic principles to follow to increase website traffic and search engine rankings.

  • Create a site with valuable content, products or services.
  • Place primary and secondary keywords within the first 25 words in your page content and spread them evenly throughout the document.
  • Research and use the right keywords/phrases to attract your target customers.
  • Use your keywords in the right fields and references within your web page, such as the Title, META tags, Headers, etc.
  • Keep your site design simple so that your customers can navigate easily between web pages, find what they want and buy products and services.
  • Submit your web pages, i.e. every web page and not just the home page, to the most popular search engines and directory services. Hire someone to do so if required. Be sure this is a manual submission; do not engage an automated submission service.
  • Keep track of changes in search engine algorithms and processes and accordingly modify your web pages so your search engine ranking remains high. Use online tools and utilities to keep track of how your website is doing.
  • Monitor your competitors and the top ranked websites to see what they are doing right in the way of design, navigation, content, keywords, etc.
  • Use reports and logs from your web hosting company to see where your traffic is coming from. Analyze your visitor location and their incoming sources whether search engines or links from other sites and the keywords they used to find you.
  • Make your customers’ visits easy and give them plenty of ways to remember you, in the form of newsletters, free reports, discount coupons, etc.
  • Demonstrate your industry and product or service expertise by writing and submitting articles for your website or for article banks so you are perceived as an expert in your field.
  • When selling products online, use simple payment and shipment methods to make your customer’s experience fast and easy.
  • When not sure, hire professionals. Though it may seem costly, it is a lot less expensive than spending your money on a website which no one visits.
  • Don’t look at your website as a static brochure. Treat it as a dynamic, ever-changing sales tool and location, just like your real store, and treat your customers with the same seriousness.