Thursday, 8 January 2009

ADO.Net, DataAdapters & DataSets: What are they?

ADO.Net is Microsoft's .Net interface to databases. Traditionally, to work with databases like Access, SQL Server and the like, you needed to know a fair bit about SQL. With ADO.Net you still need to know SQL commands, but some pretty neat classes are provided that let you hive off the SQL stuff and work with a much easier set of objects when adding, deleting or updating items in a database.

The two major classes described in this post are the DataAdapter and DataSet classes. It wasn't until I'd actually coded some examples that the ease of use of these classes became clear to me. I'll expand on the classes in later posts and give some code examples, but for now I'm just going to give an overview of their purpose. This picture gives an idea of how the classes interact to allow you (the programmer/user) to work with a database.

The four items in the picture are:-

The Database

This is something like an Access or SQL Server database. ADO.Net provides classes to handle many different types of database. All the classes inherit from a set of base classes so to a degree you can hide the details of the specific database type from your code.

The DataAdapter

The DataAdapter class is the SQL workhorse. There are a number of different DataAdapter classes for different databases, e.g. OleDbDataAdapter for working with an Access database. It's the DataAdapter class that does all the work - reading, inserting, deleting and updating - in interacting with the database. All you have to do is build the SQL commands for the DataAdapter to use and then let it get on with its job.

The DataSet

The DataSet is the class you interact with when manipulating data values in the database. The data within a table in the actual database can be thought of as a collection of rows, with each row containing a number of named field values. The DataSet mirrors this view of the database. The DataSet holds one or more DataTable objects, each of which is a collection of DataRow objects, with each DataRow being like a dictionary of the values in the row where the dictionary keys are the field names.
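As a rough sketch of that row/field view, here's how you might read values out of a DataSet that has already been filled by a DataAdapter. The table and column names ("Customers", "Name", "Balance") are just illustrative:

```
using System;
using System.Data;

// Assume 'dataSet' has already been filled by a DataAdapter and
// contains a table called "Customers" with Name and Balance columns.
DataTable table = dataSet.Tables["Customers"];

foreach (DataRow row in table.Rows)
{
    // Each DataRow is indexed by field name, much like a dictionary.
    string name = (string)row["Name"];
    decimal balance = (decimal)row["Balance"];
    Console.WriteLine("{0}: {1}", name, balance);
}
```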

How do the classes interact?

To work with the data, you configure a DataAdapter instance with the SQL commands to read, insert, update and delete data in the database. You then ask the DataAdapter to fill a DataSet. You can then change values in rows in the DataSet, add new rows or delete rows. When you've made the changes, you pass the DataSet back to the DataAdapter and ask it to write your changes into the actual database.
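The fill/modify/update cycle can be sketched like this for an Access database. This is only a sketch - the connection string, file name and table/column names are all illustrative, and in practice you'd add error handling:

```
using System;
using System.Data;
using System.Data.OleDb;

string connString =
    @"Provider=Microsoft.Jet.OLEDB.4.0;Data Source=invoices.mdb";

using (OleDbConnection conn = new OleDbConnection(connString))
{
    OleDbDataAdapter adapter =
        new OleDbDataAdapter("SELECT Id, Name FROM Customers", conn);

    // Let a command builder generate the INSERT, UPDATE and DELETE
    // commands from the SELECT, rather than writing them by hand.
    OleDbCommandBuilder builder = new OleDbCommandBuilder(adapter);

    // Ask the adapter to fill a DataSet from the database.
    DataSet dataSet = new DataSet();
    adapter.Fill(dataSet, "Customers");

    // Change a value and add a new row in the DataSet...
    DataTable customers = dataSet.Tables["Customers"];
    customers.Rows[0]["Name"] = "New name";
    customers.Rows.Add(new object[] { DBNull.Value, "Another customer" });

    // ...then ask the adapter to reflect the changes into the database.
    adapter.Update(dataSet, "Customers");
}
```

Note that Fill and Update open and close the connection for you if it isn't already open, which is part of what makes the classes so convenient.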

In my next post, I'll give some code examples using the DataSet and DataAdapter classes.

Wednesday, 7 January 2009

Error reading setup initialization file: InstallShield Problem

Yesterday I had a support mail from a user who was unable to install on Windows Vista Home Premium. I was rather worried about the report since we'd just released SliQ 1.5.1 the evening before and I always get a bit nervous when we make a new software release.

The error the user was getting was "Error reading setup initialization file". Googling for info, I found reports saying that the error was sometimes seen if the InstallShield package had been corrupted. I tried downloading the latest installer from SliQTools and tested it successfully on Vista and XP machines here in the office, so I knew the live release wasn't corrupt. More research on Google indicated that the problem sometimes occurred if the installer took a long time to download - it was taking over 20 minutes on the user's Vista machine. Luckily the user was very technically aware and was very helpful in trying out different things.

I asked the user to download the installer to a Windows XP machine. This time the download took only 5 minutes over the same office broadband connection as with the Vista machine. The installation ran perfectly. The user then copied the installer onto a flash drive and installed correctly on the original Vista machine. That evening the user downloaded and installed correctly on his home Vista machine.

I find it hard to believe, but the finger is pointing at the installer package being corrupted during the download process. There's not a lot I can do about people's broadband connections, but I may have to think up some strategies for reducing the size of the download.

Tuesday, 6 January 2009

Getting out of the supplemental index

Back in late summer I reevaluated the linking strategy between my websites. Up until then I'd used my main site to feed link juice into my newer sites - softwarelode and so on. I decided this was a bad thing to keep on doing, since my original intention was to feed link juice back into my main money-earning site and not do things the other way around.

It was interesting though to see how well sites like softwarelode responded to getting a few links from my main site. Basically within 3 weeks softwarelode started getting a few hundred visitors a day even though it was a new site. Predictably, when I removed the links the visitor numbers began to fall but at a much slower pace than the visitor numbers grew in the first place. Rather than the 3 weeks or so for the visitor numbers to peak, it took 2 to 3 months for the visitor numbers to fall away. During those 3 months, Google did some mini toolbar PageRank exports and some of the inner pages of softwarelode started showing PRs of 2 or 3. By early December though all pages apart from the homepage were showing PR N/A and visitor numbers were 20% of the peak.

As the visitors fell away, more of the softwarelode pages were falling into Google's supplemental index. When a page is in the supplemental index it's not going to turn up in SERPs except for very specific/obscure search phrases. The supplemental index is purgatory for web pages. I had a look around the web to see what advice I could find. As was to be expected, the advice was that old chestnut - build backlinks. So, before Christmas I did a spurt of backlink building and I'm pleased to say that since the New Year visitors are returning to softwarelode and the Adsense income is beginning to climb again. Since yesterday (5th Jan), an extra 350 pages are marked as being in the main index. I know of similar sites to softwarelode with about 2500 pages in the main index that make a decent amount of Adsense income (a few hundred dollars a month), so hopefully I'm on target to make softwarelode an earning website by the middle of 2009.

Friday, 19 December 2008

Measuring the competitiveness of keywords

When writing a new website and choosing keywords, it's easy to make the mistake of choosing ones which are far too competitive. For a new website, it will be difficult to rank well for competitive keywords unless you can get some very high quality links from high PR sites.

Competitiveness is a difficult thing to measure and it has to be balanced against the search volume for any particular keywords. It might be worth optimising for competitive keywords if the search volume is very high and you don't mind taking a longer term view of ranking well. Taking the opposite view, it isn't worth optimising for uncompetitive keywords if the search volume is very low.

One way of investigating the competitiveness of keywords is by using Google's own keyword research tool at https://adwords.google.com/select/KeywordToolExternal. This tool shows monthly search volumes for keywords together with a rough gauge of the competition for the keywords.

Another rough way of gauging the competition is by using the allintitle: operator when doing a search. Using the allintitle operator restricts the SERPs to only those pages which have the search words in their title. Since a page title is a key SEO factor, the number of results returned is a rough and ready gauge of competitiveness. For example, if you want to measure the competition for web design, do the following search on google.co.uk:

allintitle:web design

This returns 9,800,000 results. Trying:

allintitle:seo

returns 13,800,000 results.

Contrast these numbers with a set of keywords that we can guess are pretty uncompetitive:

allintitle:british vineyards

This returns 639 results, or ...

allintitle:web design worcester

which returns 1050 results.

Of course, the number of pages that include keywords in their title doesn't tell you the full story of how competitive a set of keywords is, but it is a start. For one thing, allintitle doesn't tell you how well optimised the pages are, e.g. the first 50 pages in the results might have good links and content and be hard to beat without a lot of work. allintitle, though, is a useful tool to add to your SEO arsenal.

Monday, 15 December 2008

SliQ 1.5 Released

After a few months' development, SliQ 1.5 has been released and includes a new recurring invoice feature. The recurring/automatic invoice feature has been in development for the last 3 months. During this time, feedback from a number of users/potential users was used to guide the implementation. In line with earlier feature additions, the emphasis has been on making recurring invoices as easy to set up as possible.

With two or three mouse-clicks you can now make SliQ Invoicing automatically raise repeat copies of invoices. All you need to do is select an invoice and check the Recurring? box in the toolbar.

... then confirm the frequency for raising the invoices ...



This will save loads of time for anyone regularly raising repeat copies of invoices, e.g. website designers charging monthly for SEO or website maintenance.

SliQ 1.5 also includes a bulk printing facility. SliQ now tracks which invoices have been printed and allows the user to print all un-printed invoices with a single menu click. This should greatly speed up the monthly billing process for SliQ's users, especially if most of the user's invoices are automatically raised by SliQ using the recurring invoice feature.

Wednesday, 3 December 2008

SEO: Doing it professionally

After helping out a few friends and acquaintances with website optimisation, I've been approached by a web designer about doing SEO work for them on an ongoing basis. They would like me to propose a service or set of services I could offer together with a set of prices.

The easiest and cheapest service I could offer is sets of directory submissions. To do these I could use my development version of a professional directory submitter, SliQ Submitter Pro. This should allow me to do a hundred or so submissions an hour.

Of course there are a lot of other techniques I could use to do link-building. The more I think about it though, the more I feel a fixed price service won't do the job. SEO is a long-haul activity and needs to be spread over a number of months. Ideally I would spend 6 or so hours a month doing offsite optimisation for a website using directory submissions, articles where appropriate plus other link-building techniques I've become familiar with.

Spreading the SEO work over a few months should give better value and satisfaction to the customer. With a one-off hit at link-building, there won't be time to see any results before the work is completed, and the work is less likely to succeed. To do optimisation, you have to be able to monitor the results and make changes over a period of weeks. With newer sites this is especially important as the sites tend to perform well for a period before dropping back.

The other aspect I've got to price up is the on-page optimisation. Do I charge per page? Do I have a minimum charge that makes it worthwhile doing the job in the first place? If I think back to when I was looking for SEO help, I would often get quoted £350 a site or £100 per page. I never felt entirely comfortable with quotes like that since they didn't quantify what work was being done. Now I've got more experience, I can also see that it's pretty hard - or at least less optimal - to optimise a single page on a website.

I'll also have to think through whether I offer any PPC, e.g. Google Adwords advice. My feeling right now is that I shouldn't since I don't think it's a good medium to long term way of getting traffic/ sales, or rather I think that organic SEO will be the most cost-effective after a 6 month to 1 year period.

Sunday, 30 November 2008

Software Trial Periods: How long before customers buy?

With the November releases of SliQ Invoicing and Quoting (Standard and MC), I made a change to the format of the product and unlock codes. The idea behind this was to simplify the process for users, making it easier to check if a product code was correct. The new format also makes it easier to generate an unlock code. The new unlock code is also longer, meaning that people will be less likely to try to type the code in by hand, which should reduce the chance of the code being mistyped. On the advice of a fellow software vendor, I now use the customer's identity - postal and email addresses - in the code, making it easier to match codes to customers in the future.

I've always wondered how long people use my software before purchasing. People have up to 30 days' free use before they need to buy, but until now I've had no way of gauging how long people try before buying on average. With the change in the code format, I've been able to tell whether someone downloaded the software before or after the change. Previously, I'd read posts from other shareware authors or marketing people advising that people tend to buy more or less immediately - within hours - if they are going to buy at all. The longer people leave between trying and buying, the less chance of a purchase. Although not a scientific test, in the three or so weeks since the last release, 90% of purchasers were still using the old-format code. I'm taking this to mean that, at least with my products, most people take pretty much full advantage of the 30 day trial period.

Of course, I could get worried by purchasers still registering with the old product codes. With the credit crunch I could assume that I’m not getting any new customers and I’m just exhausting the supply of people who downloaded a trial a month ago. However Google Analytics is actually showing an increase in traffic over the past 3 weeks and my download bandwidth has increased too. This means I'm probably getting proportionately more new trial users. The sales haven't dropped off either, which I was kind of expecting for business-related software in the run-up to Christmas.

If all this means that most people take advantage of the trial period then I’m glad. I want people to use the full trial period to make sure they are happy to purchase. Hopefully it reduces the support overhead in the long-term since those people who do buy will be more happy with the features the software provides.

Friday, 28 November 2008

Remote Support Access

For a while, I've been looking for a way of improving support to customers. If a customer is confused by a feature, or we can't understand the problem they are trying to describe, things can be difficult. The only real way to move forward in such situations is to see what the customer is actually doing on their PC. Site visits are not really possible - for cost reasons if nothing else - so I've been looking for a way of sharing PC desktops remotely over the internet.

Discussions with friends raised a number of possibilities - Webex, Windows Invite a Friend and NetViewer were mentioned. The cheapest option is Windows Invite a Friend - it comes free with Windows XP and Windows Vista. I tried it out on a pair of PCs in our office but found that:

  1. You have to explain to the client/ customer how to get the service going and send an invite for support.
  2. The help pages linked from XP's help are no longer present on Microsoft's website.

Both of these points make me wary of using Invite a Friend - they wouldn't make SliQTools look professional.

So I took a look at NetViewer. This seems a reasonable service - the cost is good and the service works well. The support technician sends an invite to the customer, the customer downloads a small client program (linked from the support invite email) and gives access to his PC to the support person.

To see an alternative, I took a look at LogMeIn Rescue. This turned out to be the Rolls-Royce of remote support services. It's a really good package, working more smoothly and with a more professional, friendly feel for both the technician and the customer. The only downside is the cost - 4 times that of NetViewer. Overall though, I think you get what you pay for and LogMeIn Rescue seems like a good choice.

Wednesday, 12 November 2008

Free Directory Submission Software

It's about 3 months since I made a new release of my free directory submission tool, SliQ Submitter. Since I made the release, I've been busy on other projects. One of those projects is a faster directory submitter that should make the whole submission process much quicker - perhaps as little as 1 or 2 seconds if the directory doesn't have a captcha.

SliQ Submitter was my first attempt at writing directory submission software. Initially I made 3 releases very soon after each other - first with a free web directory list containing 450 directories, quickly followed by 2 more releases until the package listed over 2000 web directories. I initially tested submissions to all the listed directories and was confident that all directories worked and would accept submissions.

Soon after the last release though, I realised that web directories don't stand still. Before long the PR of the web directories changed, with a lot going to PR0. Whether this caused a number to give up I don't know, but quite a few of the 2000 went offline. As the months have passed, a number of the domains expired and a good percentage of the directories switched to paid.

In the last few days, I've rechecked the directories, removing those which are dead or have switched to being paid. Of the original 2250, there are now about 1250 left. As of today though, all of these are free and if a submitted website gets accepted by a good proportion of the 1250 directories, the site should get a good boost in PR and performance in SERPs.

Getting more Visitors and Page Views

I've been helping a friend optimise his software archive site SoftTester. The site is nearly 5 years old and has about 100,000 pages as well as being listed in DMOZ. Over the last couple of years his site had been slowly losing visitors. By June he was down to only a few hundred a day. Needless to say, his income from Adsense had fallen away to almost nothing.

In June, we decided to do some SEO on the site. We mainly concentrated on on-page SEO and improved page titles and descriptions as well as adding good h1 and h2 tags. His site is database-driven, with most of the content coming from PAD files submitted by software authors.

We changed some of the data used to display info as well as shuffling the position of some of the displayed items. Whatever we did, it seems to have paid off. Within a couple of weeks, search engines started sending more traffic to the site. In particular, traffic from Google began to grow steadily.

As well as on-page optimisation, we set about getting new links to the site. One of the main ways software download sites get links is by reviewing and making awards to listed software packages. Software authors can then use a nice award graphic on their own websites and link back to the archive. The existing graphics were a bit tired, so I encouraged my friend to buy classy new ones and before long he began to get extra links to his site.

After waiting 4 or 5 months, the number of visitors and page views had grown by a factor of nearly 5 and the income from Adsense had grown along with the traffic. Not a bad result for a few hours work spread over a few days.