Showing posts with label SEO. Show all posts

Friday, 19 December 2008

Measuring the competitiveness of keywords

When writing a new website and choosing keywords, it's easy to make the mistake of choosing ones which are far too competitive. For a new website, it will be difficult to rank well for competitive keywords unless you can get some very high quality links from high PR sites.

Competitiveness is a difficult thing to measure and it has to be balanced against the search volume for any particular keywords. It might be worth optimising for competitive keywords if the search volume is very high and you don't mind taking a longer term view of ranking well. Taking the opposite view, it isn't worth optimising for uncompetitive keywords if the search volume is very low.

One way of investigating the competitiveness of keywords is by using Google's own keyword research tool at https://adwords.google.com/select/KeywordToolExternal. This tool shows monthly search volumes for keywords together with a rough gauge of the competition for the keywords.

Another rough way of gauging the competition is to use the allintitle: operator when doing a search. The allintitle: operator restricts the SERPs to pages which have all the search words in their title. Since the page title is a key SEO factor, the number of results returned is a rough and ready gauge of competitiveness. For example, if you want to measure the competition for web design, do the following search on google.co.uk:

allintitle:web design

This returns 9,800,000 results. Trying:

allintitle:seo

returns 13,800,000 results.

Contrast these numbers with a set of keywords that we can guess are pretty uncompetitive:

allintitle:british vineyards

This returns 639 results, or ...

allintitle:web design worcester

which returns 1050 results.

Of course, the number of pages that include keywords in their title doesn't tell you the full story of how competitive a set of keywords is, but it is a start. For one thing, allintitle: doesn't tell you how well optimised the competing pages are, e.g. the first 50 pages in the results might have good links and content and be hard to beat without a lot of work. Still, allintitle: is a useful tool to add to your SEO arsenal.
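The volume-versus-competition trade-off above can be boiled down to a simple score. Here's a minimal sketch in Python - the search volumes are made-up examples, and the volume-divided-by-competition formula is just one common heuristic (similar in spirit to the KEI measure some tools use), not anything Google publishes:

```python
# Rough keyword attractiveness: monthly search volume (from a keyword
# tool) divided by the number of allintitle: results (checked by hand
# in Google). A higher score suggests a more attractive keyword.
def keyword_score(monthly_searches, allintitle_results):
    # Guard against keywords with no competing titled pages.
    return monthly_searches / max(allintitle_results, 1)

# Hypothetical volumes; the allintitle: counts are the ones from the
# manual searches above.
keywords = {
    "web design": (250_000, 9_800_000),
    "seo": (450_000, 13_800_000),
    "british vineyards": (1_300, 639),
    "web design worcester": (800, 1_050),
}

for phrase, (volume, competition) in sorted(
        keywords.items(),
        key=lambda kv: keyword_score(*kv[1]),
        reverse=True):
    print(f"{phrase}: {keyword_score(volume, competition):.3f}")
```

Ranked this way, the two local/niche phrases come out far ahead of the generic ones, which matches the gut feel from the allintitle: counts.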

Wednesday, 3 December 2008

SEO: Doing it professionally

After helping out a few friends and acquaintances with website optimisation, I've been approached by a web designer about doing SEO work for them on an ongoing basis. They would like me to propose a service or set of services I could offer together with a set of prices.

The easiest and cheapest service I could offer is sets of directory submissions. To do these I could use the development version of my professional directory submitter, SliQ Submitter Pro. This should allow me to do a hundred or so submissions an hour.

Of course there are a lot of other techniques I could use to do link-building. The more I think about it though, the more I feel a fixed price service won't do the job. SEO is a long-haul activity and needs to be spread over a number of months. Ideally I would spend 6 or so hours a month doing offsite optimisation for a website using directory submissions, articles where appropriate plus other link-building techniques I've become familiar with.

Spreading the SEO work over a few months should give better value and satisfaction to the customer. With a one-off hit at link-building, there won't be time to see any results before the work is completed, and it's also likely to be unsuccessful. To do optimisation, you have to be able to monitor the results and make changes over a period of weeks. With newer sites this is especially important as the sites tend to perform well for a period before dropping back.

The other aspect I've got to price up is the on-page optimisation. Do I charge per page? Do I have a minimum charge that makes it worthwhile doing the job in the first place? If I think back to when I was looking for SEO help, I would often get quoted £350 a site or £100 per page. I never felt entirely comfortable with quotes like that since they didn't quantify what work was being done. Now I've got more experience, I can also see that it's pretty hard - or at least less optimal - to optimise a single page on a website.

I'll also have to think through whether I offer any PPC (e.g. Google Adwords) advice. My feeling right now is that I shouldn't, since I don't think it's a good medium to long term way of getting traffic/sales - or rather, I think organic SEO will be the most cost-effective over a 6 month to 1 year period.

Wednesday, 12 November 2008

Free Directory Submission Software

It's about 3 months since I made a new release of my free directory submission tool, SliQ Submitter. Since I made the release, I've been busy on other projects. One of those projects is a faster directory submitter that should make the whole submission process much quicker - perhaps as little as 1 or 2 seconds if the directory doesn't have a captcha.

SliQ Submitter was my first attempt at writing directory submission software. Initially I made 3 releases very soon after each other - first with a free web directory list containing 450 directories, quickly followed by 2 more releases until the package listed over 2000 web directories. I initially tested submissions to all the listed directories and was confident that all directories worked and would accept submissions.

Soon after the last release though, I realised that web directories don't stand still. Before long the PR of the web directories changed, with a lot going to PR0. Whether this caused a number to give up I don't know, but quite a few of the 2000 went offline. As the months passed, a number of the domains expired and a good percentage of the directories switched to paid listings.

In the last few days, I've rechecked the directories, removing those which are dead or have switched to being paid. Of the original 2250, there are now about 1250 left. As of today though, all of these are free and if a submitted website gets accepted by a good proportion of the 1250 directories, the site should get a good boost in PR and performance in SERPs.

Getting more Visitors and Page Views

I've been helping a friend optimise his software archive site SoftTester. The site is nearly 5 years old and has about 100,000 pages as well as being listed in DMOZ. Over the last couple of years his site had been slowly losing visitors. By June he was down to only a few hundred a day. Needless to say, his income from Adsense had fallen away to almost nothing.

In June, we decided to do some SEO on the site. We mainly concentrated on on-page SEO and improved page titles and descriptions as well as adding good h1 and h2 tags. His site is database-driven, with most of the content coming from PAD files submitted by software authors.

We changed some of the data used to display info as well as shuffling the position of some the displayed items. Whatever we did, it seems to have paid off. Within a couple of weeks, search engines started sending more traffic to the site. In particular traffic from Google began to grow steadily.

As well as on-page optimisation, we set about getting new links to the site. One of the main ways software download sites get links is by reviewing and making awards to listed software packages. Software authors can then use a nice award graphic on their own websites and link back to the archive. The existing graphics were a bit tired, so I encouraged my friend to buy classy new ones and before long he began to get extra links to his site.

After waiting 4 or 5 months, the number of visitors and page views had grown by a factor of nearly 5 and the income from Adsense had grown along with the traffic. Not a bad result for a few hours' work spread over a few days.

Monday, 10 November 2008

SEO’ing webpages using precise Keywords

A friend of mine has been trying to optimize his webpages. His site is an online shop selling jewellery. On each webpage, he's added a set of links to each product page. These links aid the user in navigating around the site and also attempt to improve SERPs performance as the anchor text for each link includes the keywords for each product page. For example, on one page he's trying to sell some Choker Jewellery, so he made Choker Jewellery the anchor text of the link to the page.

All the links and anchor text are chosen to reinforce the keywords used on the linked page. He's taken things one stage further and dynamically parsed the page description from the backend database and generated the anchor text for the links automatically. This will make it much easier to add product pages in the future and is a good example of using a database to make management of a website easier.

To give the links extra value he's added the navigation near the top of each webpage on the site. This should show Google that these links are important. To make the placement of the links useful to visitors he's also added the text “Recent Searches”, so the links look like phrases people have used to search for items on his site, while more importantly providing Google with an important set of links.

He wasn't sure whether to have these links at the top of the page as they do look odd. However, his biggest problem was deciding what keywords to use for his home page. He finally decided on Cheap Jewellery. Having developed several websites in the past that got little traffic, he was keen to do better this time and used the Adwords keyword tool to find keywords with a good expected level of traffic. He then matched the best keywords against the products on his shop site. An example is Jewelry, which is a misspelling but a good keyword from a volume point of view, with a good, i.e. low, level of keyword competition. This was a difficult process, and he found that keywords with a good expected level of traffic aren't necessarily the keywords people use when searching for things to buy from his site. The whole strategy is therefore quite risky, but definitely worth trying.

Monday, 6 October 2008

Losing a PageRank Value

In the mini-update toolbar export on Sept 26th this blog lost its PR, i.e. the value went to N/A. Previously the PR had been 3. Why this has happened I don't know. I haven't linked to any silly sites or reduced the frequency of posting.

Today I also noticed that a few inner pages on the blog have PR. I haven't noticed this before. It's strange that the older posts have PR but the blog home page is back to PR N/A. Google Analytics isn't showing any change in the level of traffic to the blog.

Friday, 12 September 2008

How to keep on the good side of Google

When trying to improve ranking in search engine results pages, many people will follow any advice they can find. Be wary which advice you follow though: some of it will get you penalised by Google.

If a page on your website is penalised, it will not perform as well as it might in search results. In the case of a new website, it may never perform well in the first place. Here are some tips to help avoid penalties.

Remember that outside Google, no-one really knows what counts as good or bad in terms of SEO. There is some general advice from Google about having good, unique content and quality backlinks. Other than that, people are just using their experience and guesswork to find out what works and what doesn't. A lot of SEO information online is copied and spread as online myths.

With this proviso in mind, here's a fairly non-controversial list of things to avoid.

  • Avoid Exchanging Links
    Excessive link exchanging should be avoided as Google may see this as an attempt to artificially improve rank. A few link exchanges will be OK but avoid large numbers. Link farms - where a large group of sites hyperlink to all the other sites in the group - should always be avoided.
  • Do not Sell links
    Selling links is a no-no - unless the hrefs use the nofollow attribute. If your site sells dofollow links and Google becomes aware it may well be penalised. Google's WebMasters site allows people to report paid links. Rumour has it that Google may use this information to adjust its algorithms to improve detection of paid links.
  • Do not buy links
    Recently, Google has threatened to penalise sites they discover have purchased dofollow links from another site. Thinking logically though, this does not seem possible - or at least it would be extremely unfair! If it were the case, it would be easy to penalise a competitor by purchasing links to their site and then reporting them to Google.
  • Avoid duplicate content
    If possible, avoid duplicate content. For example, don't make the same post to two different blogs. Google will ignore copies of content. A few copies will make it into Google's index but lots of copies will be ignored. Even on different pages within a website, try to keep the textual content unique and avoid repeating whole paragraphs of text.
  • Don't stuff keywords
    If you want to perform well for a certain keyword, stuffing your webpages full of that keyword will not help. Write your copy in a natural way so that it reads well. If you are writing a web page about Google penalties (for example), the phrase "Google penalty" will naturally appear a number of times - you don't need to repeat it scores or hundreds of times.
  • Don't include hidden text
    Make the contents of the webpage visible to the user. For example, don't include extra content such as white text on a white background that the user cannot see.

Monday, 25 August 2008

Different sites, different visitor profiles

I thought I'd make a record of the share of visitors two of my sites get from different search engines. The profiles are very different and not really what I'd expected. The first site gets over 90% of its traffic from direct addresses/bookmarks but here are its search engine stats:-

  1. Windows Live - 53%
  2. Google - 32%
  3. Yahoo - 10%
  4. AltaVista - 2%
  5. Alexa, Google Images, Tiscali, Dogpile, AllTheWeb - 3%

The second site gets less than 5% of visitors from direct traffic. Its search engine stats are:-

  1. Google - 77%
  2. Yahoo - 10%
  3. Unknown - 8%
  4. AOL - 2%
  5. MSN, Tiscali, Ask Jeeves, Alexa, AltaVista - 3%

The search engine stats are very different between the sites. The second site is my oldest. For both sites Google is very important but I'm surprised to see Windows Live heading up the list for the first site. The first site is quite new - less than 2 months old. In another couple of months I'll see how the search engine stats are looking then.

Saturday, 23 August 2008

Google PageRank - Second Update

I'd been confused by the visible Pagerank of this blog as shown on the Google toolbar. After the last export of rank values at the end of July this blog had still not been assigned a rank value by Google and was still showing PR N/A. However today - Aug 23rd - the rank is showing as PR 3! This is great news and much more in line with what I was expecting. I'd been doing a mixture of different types of link-building and had also been doing directory submissions using SliQ Submitter, my directory submission software. This now looks as though it has paid off - at least in terms of PR - and ties up with the increase in the number of visitors to the blog over the past 4 weeks.

Thursday, 21 August 2008

Tracking performance in SERPS for keywords

A few days ago I was speaking to a friend who was testing out SliQ Submitter for me and providing some useful, constructive feedback. He was wondering what tools I used to check the ranking of my site on Google for different keywords. I don't use anything other than Google searches by hand but I thought I'd look to see if there was anything on the market.

I soon found a tool called WebPosition Gold, or WPG for short. This tool allows you to automatically track the performance of sites on Google and shows nice graphs of the historical trend. The trouble is that, with a bit more reading online, it turns out that use of WPG is against Google's terms of service. The terms specifically name WPG and also say that similar tools are not to be used.

To use WPG you have to have a Google API key for their SOAP search API. Google no longer issue such keys and there are stories online saying that WPG users are now finding that WPG gets blocked. The future doesn't look good for WPG.

When I think about it, it seems fairly clear that tracking a few keywords is interesting but not necessarily informative anyway. What really matters is traffic to your site and conversion of visitors to sales. You could be performing really well for certain keywords - and improving in performance too - but really you need to measure traffic. Google itself provides a better measurement tool - Google Analytics. If you have the time, Google Analytics can provide valuable data about the performance of your website - traffic figures, how people got to your site (not just search engine performance) and where they went when they landed on your site.

Sunday, 27 July 2008

Google Pagerank update - Strange results

Two or three days ago Google exported the current Pagerank values to the Google toolbar. I always find this interesting - not because I put a huge amount of faith in a higher Pagerank value, but simply because I take a higher Pagerank value to mean I'm probably doing something right in terms of SEO.

This time round I had some confusing results. On the main SliQTools website, 6 inner pages moved from PR2 to PR3 but one page dropped from PR2 to PR0. The homepage itself remained at PR3 while another page (relatively new) that does really well in SERPs for certain keywords had PR0. I find it hard to explain how so many inner pages can be PR3 while the homepage is only PR3, and why other pages are PR0 but still do well in SERPs. I guess PageRank isn't as important as it used to be.

The other confusing aspect of the recent export is that another of my websites has gained PR2 from PR0. This is hard to explain since it only has links from a forum. Its inner pages are PR1 even though one of them is virtually blank.

I collaborate on another blog with a friend. This blog is still PR0 but he has a new website - registered less than 2 months ago that only has links from the PR0 blog - and the new site has a PR of 1 which is higher than the only external site giving incoming links.

Strange!

New Directory Submission Tool

SliQTools have just released a beta of a new directory submission tool called SliQ Submitter.



SliQ Submitter helps speed up manual submissions to web directories. Submitting to web directories can still be an important way of gaining backlinks for a website in order to help with positioning in search engine result pages.

SliQ Submitter is available for free. The tool currently handles submission to about 450 directories and will be updated in a future release to handle many more.

The software allows up to 5 titles and descriptions to be specified together with up to 6 categories. SliQ Submitter automatically fills out the fields in web directory submission forms. All you have to do is review the submitted details, enter any captcha and click the directory's Submit or Continue button to complete the process.

To find out more about SliQ Submitter, go to http://www.sliqtools.co.uk/directory-submission-tool.aspx.

Friday, 20 June 2008

4 Tips for Making the Most of a Hyperlink

When getting a link to one of your websites you need to make the most of the link - not all links are equal. Here are my 4 tips for getting the most benefit from a link in terms of SEO:

Place the link on a page with higher PageRank

This means the link will have more PageRank to pass to you and will hopefully help push you up the ranking in search results.

Try to place the link on a page with only a small number of other links

When passing PageRank to other pages, the rank of a page is divided up between all the outgoing links on the page. If there are a lot of links on a page, the benefit passed by each one is reduced.

Place the link on a page on a similar topic to your own

For example a link from a page talking about shoes to one with a topic of finance is probably worth less.

Make sure the link text reinforces your keywords

For example, if one of your keywords is "Greek Holidays", make this the anchor text of the link.

Wednesday, 18 June 2008

Keyword Research Tools for SEO

The following links point to great (and free!) tools to research popular keywords and longtail keywords for web pages.

http://freekeywords.wordtracker.com
http://inventory.overture.com/d/searchinventory/suggestion/

Monday, 16 June 2008

Tool for creating a sitemap for a website

As my website grows larger and acquires more pages I want to give Google all the help I can indexing my pages so I submit a sitemap using Google's webmaster tools.

I've been looking for a good tool to create a sitemap for my website for a few months on and off.
I found a few shareware programs but the majority of these would only scan the root directory of my site or cost more than I was prepared to spend for something I would only use 3 or 4 times a year.

However, I've now found a site with a great, free sitemap generator:

http://www.xml-sitemaps.com/

It scans subfolders and also works more quickly than the shareware programs I've been using. It also exports sitemaps in a number of different formats, including compressed and uncompressed sitemap format as well as ROR sitemap format.
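For a small site with a known list of pages, the sitemap XML itself is simple enough to generate with a short script. Here's a minimal sketch following the sitemaps.org protocol - the page URLs are hypothetical, and a real generator like the one above would crawl the site to find them:

```python
# Build a minimal XML sitemap (sitemaps.org protocol) from a list of
# page URLs.
from xml.sax.saxutils import escape

def build_sitemap(urls):
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for url in urls:
        # Escape &, < and > so the XML stays well-formed.
        lines.append("  <url><loc>%s</loc></url>" % escape(url))
    lines.append("</urlset>")
    return "\n".join(lines)

# Hypothetical pages on a small site.
pages = [
    "http://www.example.com/",
    "http://www.example.com/about.aspx",
    "http://www.example.com/products.aspx",
]
print(build_sitemap(pages))
```

The output can be saved as sitemap.xml and submitted through Google's webmaster tools. Optional tags such as lastmod and priority can be added per URL, but the bare loc entries above are enough for a valid sitemap.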

Thursday, 12 June 2008

What is Google Pagerank?

Put simply, it's Google's measure of the importance of a web page or, put another way, a measure of how likely you are to find a web page by randomly clicking on hyperlinks online. Google gives a web page a PageRank value between 0 and 10, with 0 meaning least important and 10 meaning most important.

This is the Google definition of PageRank:

PageRank relies on the uniquely democratic nature of the web by using its vast link structure as an indicator of an individual page's value. In essence, Google interprets a link from page A to page B as a vote, by page A, for page B. But, Google looks at more than the sheer volume of votes, or links a page receives; it also analyzes the page that casts the vote. Votes cast by pages that are themselves "important" weigh more heavily and help to make other pages "important".


Basically, having more incoming hyperlinks makes a page more important in Google's eyes. Incoming links from important pages count for more than links from less important pages. In a simple sense, if two web pages contain the words "Red Bananas" and a user searches for Red Bananas on Google, the page with the higher rank will appear first in the search results.

However, things are not quite this simple. If they were, the web would be full of pages stuffed with multiple links to other pages in an attempt to improve placings in Google's search results.

Apart from Google itself, no-one knows exactly how PageRank is calculated. Google now say that their ranking algorithms are more sophisticated than in years past and that the content of pages is more important than previously. However, Google still assign and publish PageRanks for websites 3 or 4 times a year, so PageRank must still have some relevance.

These factors are likely to affect the importance or weight passed by a hyperlink:

  • Multiple links from one page are devalued, i.e. the second link probably counts for less than the first link.
  • Site-wide links are devalued, e.g. a link from every page in a 10 page website probably counts for less than links from 10 pages on separate websites.
  • Reciprocal links are devalued. Many people now consider these worthless. This gets around the mutual voting scenario.
  • A link from a page containing unrelated content counts for less, e.g. a link from a page talking about holidays to a page talking about nuclear physics, counts for less than a link from a page on another site talking about holidays.
  • A link needs to say why it links to another page to give more credit, i.e. the link text needs to be appropriate to the topic of the page being linked.
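The "votes weighted by importance" idea can be made concrete with the classic iterative PageRank calculation from the original PageRank paper. This is the textbook algorithm, not Google's current (unpublished) ranking, and the four-page link graph is invented purely for illustration:

```python
# Iterative PageRank over a small link graph: each page shares its
# rank equally among its outgoing links, and a damping factor models
# the chance of a surfer jumping to a random page instead.
def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start with equal rank
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / n for p in pages}
        for page, outgoing in links.items():
            if not outgoing:                     # dangling page: spread
                for p in pages:                  # its rank over all pages
                    new_rank[p] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outgoing)
                for target in outgoing:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Hypothetical four-page site: every other page links to home.
links = {
    "home":     ["products", "contact"],
    "products": ["home", "contact"],
    "contact":  ["home"],
    "blog":     ["home"],
}
for page, value in sorted(pagerank(links).items(), key=lambda kv: -kv[1]):
    print(f"{page}: {value:.3f}")
```

Running this, "home" comes out top because every other page votes for it, while "blog", which nothing links to, gets only the baseline rank. It also shows why a link from a page with few outgoing links is worth more: each extra outgoing link shrinks the share passed to every target.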

Tuesday, 10 June 2008

Choosing Keywords for Web Pages

Consider these factors when choosing keywords:-
  1. The keywords should accurately describe the products or services you offer.
  2. The keywords should be ones people actually search for.
  3. The best keywords are ones people search for a lot.
  4. The keywords should be ones you can compete on, i.e. ones that aren't already used on too many other sites.

Some of these factors conflict.

As an example, if you are a travel agent selling holidays you might think holiday is a good keyword. After all, you are in the business of selling holidays and lots of people will search online for holidays. However, there are so many web pages containing the word holiday that you are unlikely to appear high in any search results. So although "holiday" will be included in lots of search terms, it is a poor choice since you will be competing against masses of other websites.

Instead, choose a few more specific terms, such as:

  • french holiday specialist
  • travel agent shrewsbury
  • discount package holidays
  • last minute package holidays
  • adventure holidays in spain

Build up a list of keywords and phrases you think people will type into search engines then include these in the text of your web pages. The best way to include the words is in a way that seems natural to the human reader.

Having read this far you might think ...

"What's the big deal? I'm a travel agent; of course I'm going to include phrases like adventure holidays in France. I don't need to think too much about keywords; all I need to do is write about my products."

... and you would be correct, except there are methods to highlight your keywords so that they are emphasised to search engines. This will potentially raise your pages up the search results. If a competitor website is optimised to reinforce a certain keyword and your site is not, then the odds are you will appear lower in the search results.

Monday, 9 June 2008

Favourite Tools for SEO

This post contains links to some of my favourite SEO tools.

Keyword Density Checking

These tools check for the occurrence and density of keywords in websites.

www.googlerankings.com/ultimate_seo_tool.php
www.seochat.com/seo-tools/keyword-density/
www.keyworddensity.com/

Measuring your PageRank

Tools that can be used for measuring the current PageRank of a web page.

http://www.top25web.com/pagerank.php
Google Toolbar - installs a toolbar into your browser and includes the option of displaying the PageRank of pages you visit.
http://www.smartpagerank.com/

Predicting your future PageRank

These links provide tools that attempt to predict your PageRank. Note: The tools are for curiosity only and the predicted values are sometimes wide of the mark - they aren't really SEO aids.

http://www.rustybrick.com/pagerank-prediction.php

Google Results for Multiple Countries

This tool allows you to see how a website performs in multiple countries. There is no single "Google", and the results from Google data centres for different countries can be very different.

www.link.ezer.com/tools/google_serps_rank_checker.asp

Find out who owns a competitor domain

This tool allows you to find out who owns a domain. The report also provides an SEO rating.

http://whois.domaintools.com/

Find Blogs on a particular topic

This tool allows you to find blogs on particular topics. Posting comments on blogs can give you free backlinks.

http://www.commenthunt.com/

Tips on Writing Web Page Copy for SEO

The copy or text you write in a web page is vitally important for good search engine results. You need to include a good range of keywords - the ones you have calculated people will use when searching on Google, MSN or Yahoo for the services you offer. There are a number of strategies you can use to choose keywords and to write your web pages so that the keywords are emphasised or highlighted to search engines.

There is no penalty for having too much text on a page, and if you have written less than 300 words it is unlikely you will have included a good range of keywords for SEO. When writing the text on the page, always write for a human reader but also bear SEO in mind. Include more text and also include long tail keywords. Long tails are keywords that are searched for less often, but if you include enough of them you are likely to get just as many hits, if not more, from the long tails as from your main keywords.

A good way to look for keywords is to analyse competitor sites. A set of keyword analysis tools can be found at www.googlerankings.com. Use the tools to find what 1, 2 and 3 word phrases appear on your competitor's web pages. Include the same phrases, where appropriate, on your own pages.
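The same sort of phrase analysis can be done locally with a few lines of code. This sketch counts 1-, 2- and 3-word phrases in a block of page text, which is essentially what the keyword-density tools do to competitor pages; the sample copy is invented:

```python
# Count 1-, 2- and 3-word phrases in page copy to see which keywords
# a page actually emphasises.
import re
from collections import Counter

def phrase_counts(text, max_words=3):
    # Lowercase and split into words, keeping apostrophes.
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter()
    for n in range(1, max_words + 1):
        for i in range(len(words) - n + 1):
            counts[" ".join(words[i:i + n])] += 1
    return counts

copy = ("Invoice software for small businesses. Our invoice software "
        "makes billing easy, and the billing software is free to try.")
for phrase, count in phrase_counts(copy).most_common(5):
    print(f"{count}x {phrase}")
```

Run this over your own page and a competitor's page (e.g. text pasted from the browser) and compare the top phrases; gaps in your list suggest phrases worth working into your copy.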

Here are my tips for writing text on a web page:

1. Include alternative ways of saying the same thing, e.g. invoice software/ invoicing software/ software to create invoices.

2. Include the words in different orders, e.g. a man’s hat/ a hat for a man.

3. One method of adding keywords is to include a column listing benefits or features.

Good places to include these alternative phrasings are:-

a. alt text on images

b. title text on a hyperlink.

4. title tag

Use the title tag to hold as many as 3 keywords/phrases separated by | or -. For example:

invoice software | billing software

The page title is probably the most important on-page SEO element, so ensure your most important keywords are at the beginning of the title.

5. h1 tag

This is the most important header tag. Place an h1 tag near the top of each of your pages and make sure its keywords agree with the page title, e.g.

title = Invoice Software | Billing Software
h1 = Invoice Software

6. h2, 3, 4, 5, 6 tags

Structure your pages by separating the text into topics. Use an h2 title for each topic. h3, h4, h5 and h6 can also be used to emphasise keywords.

7. Place important text near the top of the page, e.g. the first paragraph after the h1 tag should contain your main keywords.

8. Use bold or strong to emphasise keywords - Google picks this up as well as a human reader.
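Checks like tips 4 and 5 - that the title and h1 exist and share keywords - are easy to automate. A minimal sketch using Python's standard html.parser (the sample HTML page is invented):

```python
# Extract a page's <title> and first <h1> and check the h1 keywords
# also appear in the title, as the tips above recommend.
from html.parser import HTMLParser

class TitleH1Parser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.title = ""
        self.h1 = ""
        self._current = None

    def handle_starttag(self, tag, attrs):
        # Capture only the first title and first h1 on the page.
        if tag in ("title", "h1") and not getattr(self, tag):
            self._current = tag

    def handle_endtag(self, tag):
        if tag == self._current:
            self._current = None

    def handle_data(self, data):
        if self._current:
            setattr(self, self._current, getattr(self, self._current) + data)

html = ('<html><head><title>Invoice Software | Billing Software</title>'
        '</head><body><h1>Invoice Software</h1><p>...</p></body></html>')

parser = TitleH1Parser()
parser.feed(html)
title_words = set(parser.title.lower().split())
h1_words = set(parser.h1.lower().split())
print("h1 keywords in title:", h1_words <= title_words)
```

Pointed at a real page (e.g. the HTML fetched with urllib), the same check quickly flags pages where the title and h1 have drifted apart.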

6 Things to Consider when Selling Software Products online

If you want to sell a software product online, the obvious route is to sell via your own website. To sell software this way, you need to optimise your website to make it perform well in searches.

There is no easy solution for optimising a website to get good rankings in search engine results. No quick fix or magic trick is possible that will guarantee frequent visitors to a site and convert large percentages of your visitors into paying customers. The task of optimising a website for rankings, visitors and sales is an ongoing story of continual refinement and updates.

To sell a product online, at least 5 factors need to be considered:-

  1. Do you have a product that has a reasonable market?
  2. Determine how people search for your product or type of product. Do they use Yahoo or Google, and what keywords do they look for?
  3. Make sure the copy on your website emphasises the most frequently used search terms - plus long tail terms.
  4. Write the copy for your website to convince customers to buy your product.
  5. Images are also important - or at least the names of your images are.

Point 5 can be very important. Google searches images and uses the image filename. Lots of people look for example images of invoices and I get a few hundred hits a month on my website for images of invoices and invoice templates.

Long-tail search terms are very important. For example, on my main website the search terms are invoicing software, invoice software and billing software. I’ve done research and found that these are the main search terms in the US and UK for my type of product. However, these search terms account for only 15% of my visitors. The rest of my visitors come from long tail terms that may only be mentioned once in the whole website. I don’t always deliberately put long tails into the text, I just write text, e.g. in the Support page or the Release History page that simply gives a good variety of words and phrases. I also examine competitor sites to see if they have combinations of phrases that I haven't included. I don't do any keyword stuffing, I just modify an existing phrase to include the long tails.

Due to the way search engines work, the people who find your website will already be interested in your product, or at least in the problem your product helps with. My feeling is that your website should then describe the benefits your specific product offers, i.e. why using your product will make the task it addresses easier, quicker and cheaper for users. Be as specific as possible about the benefits so the visitor can easily understand how they can take advantage of your product.

The other thing to remember is that Google Pagerank isn’t everything. Pagerank doesn’t guarantee a page gets visitors. Monitor your web stats and see if any changes produce a rise in the number of visitors. Make refinements and see what effect they have.

Also consider whether you should rely entirely on search engines to get customers. Are there alternative methods of advertising your software? Can you get resellers for your software in other countries? For my invoicing software, virtual assistants make good resellers as they do invoicing for clients, and I add features to make the software more suitable for their use.

Read Sales Strategy for more information.