Wednesday, November 18, 2009

Testing SaaS, all parties included?

This week I was at a meeting in The Hague where a SaaS solution in e-government (GovUnited) was discussed by people from academia, industry and the Dutch government.
The idea is to develop and maintain a standardized website for Dutch cities, which can be customized to each city's particular needs. The maintenance will be done from a central place in the Netherlands, and the cooperating cities (the customers) will pay the service provider GovUnited a yearly fee for the development and maintenance of their websites.
In addition, GovUnited can act as an intermediary between the cities and other parties, such as e-payment services (e.g. Ogone) or other government services, and facilitate the connection between these e-services.

As a test professional, I find this interesting: with all these different parties involved, who is solely responsible for the quality of the SaaS product?
If the website is running but one of the links to another party (like Ogone) is malfunctioning, who is responsible for this: Ogone or GovUnited? Or perhaps even the party hired to develop the website?
Each party can develop and test its own component of the SaaS product, but who is responsible for testing the SaaS product as a whole? This multisystem integration test must be considered during development and maintenance, and can't just be planned and executed at the end of development, because if things go wrong then (and most of the time they will), it gets nasty and dirty for all parties involved.
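To make the idea concrete, here is a minimal sketch of such a multisystem integration check. The party names come from the post, but the check functions are hypothetical stubs standing in for real health checks against each party's interface; the point is that a failing check is traced to a component, and hence to a party, instead of being debated afterwards.

```python
def check_website():        # GovUnited's own responsibility
    return True             # stub: the city website itself is up

def check_payment_link():   # boundary between GovUnited and Ogone
    return False            # stub: simulate a broken e-payment link

def check_gov_services():   # links to other government e-services
    return True

CHECKS = {
    "website (GovUnited)": check_website,
    "e-payment link (Ogone)": check_payment_link,
    "government services": check_gov_services,
}

def integration_report():
    """Run every cross-party check and record per component whether it
    passed, so responsibility can be pinpointed to one party."""
    return {name: check() for name, check in CHECKS.items()}

failures = [name for name, ok in integration_report().items() if not ok]
print(failures)  # -> ['e-payment link (Ogone)']
```

Run periodically during development and maintenance, a report like this turns "whose fault is it?" into a question the test results already answer.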

So, development and testing should be carefully planned together by all stakeholders, to ensure the deadline can be met with the likely risks taken care of.

Sunday, April 5, 2009

Einstein 2.0 and BC, a SaaS vendor and its client

In an earlier blog I said I would illustrate model-based testing of a SaaS application.
I invented a company named Einstein 2.0, which develops ERP SaaS applications for companies.
This blog entry will give more information about a client of Einstein 2.0: Beta Computing Inc. (BC).
BC is a global commercial enterprise specialized in selling computer hardware. Its main reason for choosing Einstein 2.0 was that BC wanted to outsource the development and maintenance of its ERP software to a specialized company that could also develop it web-based, a discipline not present within BC.
The agreement between Einstein 2.0 and BC is recorded in an SLA, which describes agreements on different levels (e.g. performance, payment, warranties) between both parties. This SLA is very important for testing because it outlines the boundaries of the test scope.

As already said, BC has a global presence (Europe, Asia and the USA).
Einstein 2.0, on the other hand, is Dutch and based in the Netherlands. This is no problem, because Einstein 2.0 can develop and maintain its SaaS product locally but distribute it worldwide over the Web. No local installation of 'ERP On Demand!' is necessary.

This was just a short description of the fictional world of Einstein 2.0 and its client BC.
The next blog entry will discuss one of the most important characteristics of 'ERP On Demand!': the way BC can access the software (a security issue!) and how to test this in a model-based manner.

Sunday, February 15, 2009

Testing SaaS, a necessity for both vendor and client

Last week I read Phil Wainewright's blog about a SaaS application with a very serious security breach.
Now you might think: 'What has that to do with testing SaaS applications?'.
Well, just read this part of his story and you will know:

'I suspect the root of the problem in Sage’s case was an unthinking assumption that Aqualogic was such an established Web platform that basic security would just be built in as standard. This is typical of the blind-leading-the-blind nature of the on-premise software model, in which customers blithely believe that vendors have built everything they’ll need into the platform, while vendors naively assume that anything they’ve missed will be easily spotted and corrected by customers during the implementation process. It’s bad enough when it results in catastrophic roll-outs at just a single company, but when the application is being deployed as a service to multiple downstream customers, a far higher duty-of-care is required, because the risk exposure is massively amplified.'

If Mr. Wainewright's hunch is correct, this shows it is once again all about communication between a software (read: SaaS) vendor and its client(s). Both parties rely so much on each other's testing process that they are blind to both, believing everything is covered and 'OK'.

This example is bad publicity for SaaS, but the fault lies not with SaaS itself: it is a typical communication mistake between vendor and client, and a risky shortcut to market that damages all parties connected to the SaaS application.
A solution for such a mistake? YES!
Get rid of the barrier between the testing teams of client and vendor, and let both teams develop a strategy together: how to plan their tests and who covers what.
This reduces testing time because each party knows what it and the other test team have to test, and when.
This allows a more efficient test process, covering all risks in a shorter time, which shortens time to market, improves the economic position of both SaaS vendor and client, and gives SaaS a best practice.
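The "who covers what" agreement above can be made checkable. The sketch below is a minimal illustration with invented risks and an invented team split; the point is the check itself, which flags any risk tested twice or not at all.

```python
# Invented product risks and an invented split between the two test teams.
RISKS = {"security breach", "data loss", "poor performance", "usability"}

VENDOR_COVERS = {"security breach", "poor performance"}  # system-level tests
CLIENT_COVERS = {"data loss", "usability"}               # acceptance tests

def coverage_gaps(risks, vendor, client):
    """Return the risks no team tests and the risks both teams test."""
    untested = risks - (vendor | client)
    duplicated = vendor & client
    return untested, duplicated

untested, duplicated = coverage_gaps(RISKS, VENDOR_COVERS, CLIENT_COVERS)
print(sorted(untested), sorted(duplicated))  # -> [] []
```

An empty result on both sides means the shared strategy covers every risk exactly once, which is exactly what saves the testing time.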
It's a waste when innovation does not succeed due to bad communication.

Saturday, February 7, 2009

Model based testing and SaaS, an example

In one of my earlier posts, I discussed the possibility of using model-based testing (MBT) as a methodology for testing SaaS applications.
I already argued that MBT is possible at the system-test level, not at the acceptance-test level. MBT for complex software systems like SaaS is still an evolving area, and it could be a good idea to explore what the current possibilities are.

Let's say a crack test team from the fictional (!) company 'Einstein 2.0' has been assigned to perform a system integration test (SIT) for the company's SaaS solution: 'ERP On Demand!'.

But first, a short introduction to 'ERP On Demand!'.
This innovative product is an ERP suite designed as an online ERP dashboard for the end user, with all the benefits of Web 2.0 (!):
using the dashboard, the end user has secure access to various ERP resources (e.g. CRM, HRM) over the internet and can change settings by choosing from the modules 'ERP On Demand!' offers.

The online dashboard 'ERP On Demand!' is provided as a service by 'Einstein 2.0' to customers on demand.
Installation of the software is not necessary; a good internet connection is enough, enabling the customer to use the application effectively from day one.
For all this, the customer pays a monthly fee to the software vendor 'Einstein 2.0', licensing it to use 'ERP On Demand!' as a service, with the vendor obliged to provide secure 24-hour service and maintenance.

This obligation is essential for the test team of 'Einstein 2.0': 'ERP On Demand!' should be online 24 hours a day, with excellent performance and high security.
This addresses one of the key issues of SaaS: how do you deliver a secure B2B application over the internet, 24 hours a day?

From a tester's point of view, these issues are nonfunctional: performance and security.
A model-based testing approach could be an option, alongside the available load-testing and security-testing methods.
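To give a flavour of what such an approach could look like, here is a minimal MBT sketch: the dashboard's login behaviour is modelled as a finite state machine, and test steps are derived by covering every transition. The states and actions are invented for illustration; a real model of 'ERP On Demand!' would be far richer.

```python
# Each key is (current state, action); each value is the expected next
# state. The 'login_bad' transition encodes a security expectation: bad
# credentials must leave the user logged out.
MODEL = {
    ("logged_out", "login_ok"):  "dashboard",
    ("logged_out", "login_bad"): "logged_out",
    ("dashboard",  "open_crm"):  "crm_module",
    ("crm_module", "back"):      "dashboard",
    ("dashboard",  "logout"):    "logged_out",
}

def generate_tests(model):
    """Derive one test step per transition: (state, action, expected state).
    A test executor would drive the real application with each action and
    compare the observed state with the expected one."""
    return [(state, action, target)
            for (state, action), target in sorted(model.items())]

for state, action, expected in generate_tests(MODEL):
    print(f"in {state!r}, do {action!r}, expect {expected!r}")
```

Because the tests are generated from the model, extending the model (say, with a session-timeout transition for the 24-hour requirement) automatically extends the test set.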

In the coming weeks I will discuss this MBT approach to performance and security testing of 'ERP On Demand!' on this blog.
Feel free to share your thoughts with me about MBT as a testing method for SaaS.