Good practices when deploying a QlikView™ BI platform: Dev-Test-Prod

In any software development effort it is wise to adopt a 3-stage approach:

  • Development;

  • Testing;

  • Production.

The following diagram depicts how a QVW application moves between stages and the role each stage plays.

QlikView™ 3 Stages Diagram (Development-Testing-Production)

While it is obvious that we need the Production stage (or else we’ve missed the point, right? ;-) ), for the first two stages some aspects deserve attention:

1. For the Development stage, it is sufficient, from the licensing perspective, to have one (or more, if you have several developers) QVLC license (QVLC = QlikView™ Local Client, the former Developer license). It provides a developer with everything needed to build new QlikView™ applications. The license can even be borrowed from a Named CAL of a QvS (in case you have one), but the safest way is to have a dedicated QVLC in any situation. The same reasoning applies when development is an outsourced service.
For the Development stage it is also recommended to have a mechanism (automatic, or at least manual) for taking snapshots of the various development phases; in other words, a way to keep older versions.
One option that is always available is the “Save As” approach, of course. But QlikView™ offers wiser options as well: the QlikView™ Local Client can keep previous versions of QVW files (see Menu – Settings – User Preferences – Save – Use Backup), or it can export all the metadata from a QVW file to a set of XML files. One way to do the latter is manually, via Menu – File – Export.

But the best option here is to have a sub-folder named after the application with a “-prj” suffix, in the same folder as the application. In that sub-folder, all the XML files with the metadata of the QVW file (script, variables, interface definitions, actions, macros, etc.) are saved automatically. This is the most recent option available, and it brings additional value, especially in a multi-developer environment, through integration with CVS, SVN, or another source-versioning solution.

This allows not only version comparison, but also check-in / check-out facilities (preventing two developers from working on the same area, with the inherent risk of generating conflicting versions).
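As a minimal sketch of the manual snapshot idea above (a simple stand-in until a real source-control workflow is in place), a small helper could archive the QVW file together with its “-prj” metadata folder into a timestamped backup folder. The folder layout and names here are assumptions, not part of any QlikView™ API:

```python
import datetime
import pathlib
import shutil

def snapshot_qvw(app_path, backup_dir):
    """Copy a QVW file (and its '-prj' metadata sub-folder, if present)
    into a timestamped sub-folder of backup_dir, as a manual snapshot."""
    app = pathlib.Path(app_path)
    stamp = datetime.datetime.now().strftime("%Y%m%d-%H%M%S")
    target = pathlib.Path(backup_dir) / f"{app.stem}-{stamp}"
    target.mkdir(parents=True, exist_ok=True)
    shutil.copy2(app, target / app.name)          # the binary QVW itself
    prj = app.with_name(app.stem + "-prj")        # XML metadata folder
    if prj.is_dir():
        shutil.copytree(prj, target / prj.name)   # script, layout, variables...
    return target
```

Running `snapshot_qvw("Sales.qvw", "backups")` before each risky change gives you a poor man’s version history; the “-prj” XML files are also exactly what you would commit to CVS or SVN instead.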

2. For the Testing stage, at least 3 perspectives need to be covered:

  • testing the new local developments (whole new QVW applications or new parts of existing ones) before launching them into Production (real-life usage), including extensive tests of the correct cascading of data-staging processes and of incremental loads;

  • testing the new versions and builds of the QlikView™ platform released (several times a year) by QlikTech, before migrating our Production platform to the newest version/build;

  • testing the performance capabilities of our platform and, especially, of our new reports.

For the first perspective of the Testing stage, the most common practice we have encountered is to add a separate folder on the production server where the “under testing” files are hosted. (Sometimes, as a variation of this approach, it even makes sense to add special tags to the titles of objects under validation within the existing files.)
For the second perspective, having a Testing environment in which to check that all your QVW applications are compatible with the new version of the platform is a wise, preventive approach.
More than that, by doing the migration first in the Testing environment, we can also make sure that the migration process is well documented and works as expected in our particular environment.
Let’s not forget that migrating the Production environment is usually constrained to late-night or weekend hours (not so funny to do it over and over). Why not migrate the Testing environment during regular daytime hours instead, and then simply switch the Production role over to the newly migrated Testing environment, while keeping the option to revert to the old one in case a major migration failure occurs? It is easier and safer at the same time! Afterwards, just assign the role of the new Testing environment to the old Production one.
By the way, this can also be considered a pretty neat way of providing a real backup option for the whole Production environment (some limitations do apply, that’s true).
For the third perspective, unfortunately most of the deployments we know of are reactive only (which makes some sense, given the high performance QlikView™ usually provides).
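The role-swap idea described above is essentially a blue-green switch: the two environments stay intact and only a pointer saying which one is “Production” changes, so rollback is just swapping back. A minimal sketch, assuming a hypothetical `environments.json` file records which server carries which role (in practice the pointer might be a DNS alias or load-balancer target instead):

```python
import json
import pathlib

def swap_roles(state_file):
    """Promote the freshly migrated Testing environment to Production and
    demote the old Production to Testing, keeping it as a rollback target."""
    state_file = pathlib.Path(state_file)
    state = json.loads(state_file.read_text())
    state["production"], state["testing"] = state["testing"], state["production"]
    state_file.write_text(json.dumps(state, indent=2))
    return state
```

Calling `swap_roles` a second time is the rollback, which is exactly why this pattern is safer than migrating Production in place.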
Still, a better way is to do performance testing in a separate “sandbox” and push the limits, so that we can predict in advance when we will face performance issues and will need to upgrade the hardware (or fine-tune the applications) to support a larger number of users or an increased level of complexity in our set of applications.
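The sandbox load test above can be sketched with plain threads: simulate a number of concurrent users, each issuing several requests, and record latency percentiles. The `fetch` callable here is a placeholder, not a QlikView™ API; for a real run you would replace it with an HTTP request against your sandbox AccessPoint URL:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def stress_test(fetch, concurrent_users, requests_per_user):
    """Run `fetch` from many simulated users; return latency percentiles."""
    def user_session(_):
        latencies = []
        for _ in range(requests_per_user):
            start = time.perf_counter()
            fetch()  # e.g. open a report page in the sandbox environment
            latencies.append(time.perf_counter() - start)
        return latencies

    with ThreadPoolExecutor(max_workers=concurrent_users) as pool:
        sessions = pool.map(user_session, range(concurrent_users))
    all_lat = sorted(l for session in sessions for l in session)
    return {
        "requests": len(all_lat),
        "median_s": all_lat[len(all_lat) // 2],
        "p95_s": all_lat[int(len(all_lat) * 0.95)],
    }

if __name__ == "__main__":
    # Stand-in for a real HTTP call (e.g. urllib.request.urlopen on the
    # sandbox AccessPoint); here we just simulate server response time.
    def fake_fetch():
        time.sleep(0.01)

    print(stress_test(fake_fetch, concurrent_users=5, requests_per_user=4))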


Assuming the reasons mentioned above are enough to justify the 3-stage approach, budgetary constraints are still, especially nowadays, a major issue. It is not easy for any kind of organization to double its licensing budget just to provide this kind of redundancy.

But even under all these constraints, there is good news: the QlikView™ licensing model provides a special testing license, named QlikView™ Test Server, which costs roughly half the price of the equivalent regular QlikView™ Server and provides the same number of CALs as the associated QlikView™ Server (without extra charge for the CALs!).

For the QvS Enterprise Edition this has been available for some time already, but there is now also a brand new one, called QvT SBE (Small Business Edition), available since this summer at a fairly reasonable price point!


Procedural approach
Recently, speaking with some of our customers on this matter, we realized that a clear procedural approach to Testing is also needed, defining testing checklists and expected results for various scenarios.
For this, it would also be useful to have a place to collect previously encountered errors and possible ways to prevent and test for them during the next testing stage of a new deployment.
More on that in a future post… ;-)