Beyond the dedicated business solutions you can find in QQsolutions, over time we have also developed a set of tools and principles for processing data and building generic apps, regardless of industry or department.
The first QQtools for QlikView™ were developed in 2006. Ever since, we have been adding to, improving, and adapting them continuously.
Generic Solutions to Increase Trust, Security and Flexibility
Rather than doing the same thing twice, we prefer to build a library of basic processing and visualizations that allows us to offer each beneficiary similar, dedicated solutions at a fraction of the usual effort and cost.
Here are some of the generally applicable approaches we have already prepared (some are usable in any application, industry, or department; others are more relevant to financial analysis):
A package of QQinfo tools, plus methodologies and good practices, that minimizes the risks of using incorrect analytics in the decision-making process.
Read more
It is one of the most important components brought together in the QQtrust™ concept mentioned above.
Read more
The QQdata.quality™ engine complements the QQvalidator, leveraging similar principles, but applied to assess the quality of source (or processed) data and to identify problems that require alerts, manual correction interventions in the source information systems, automatic corrections, or alerts with manual correction proposals sent to the Data Quality Steward.
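As a rough illustration of the kind of rule such an engine can evaluate, here is a minimal Qlik™ load-script sketch; the field names, rules, and file path are hypothetical, not the actual QQdata.quality™ implementation:

// Minimal sketch with hypothetical fields: flag customer records that break
// simple quality rules, so they can later be routed to alerts or correction
// proposals for the Data Quality Steward.
QualityFlags:
LOAD
    CustomerID,
    If(Len(Trim(VATNumber)) = 0, 'Missing VAT number',
      If(Not IsNum(CreditLimit), 'Non-numeric credit limit',
        'OK')) as QualityIssue
FROM [lib://QVD/Customers.qvd] (qvd);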
Read more
When data confidentiality is extremely important to your organization, you may want certain subsets of data to be available only to certain people, while other users have access rights to the entire data set.
Differentiated access to sensitive data within the same Qlik™ application, for heterogeneous levels of data access, can be defined in several ways, and QQsecurity™ provides everything an inexperienced Qlik™ administrator needs to control the applicable security rules.
Dedicated applications or dedicated streams for different security levels are no longer needed, which makes administration easier, and QQsecurity™ is ready to manage differentiated access rights for an entire set of Qlik™ analytics applications, including rules that apply simultaneously to a group of Qlik™ applications (files).
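For reference, the native Qlik™ mechanism for this kind of in-application data reduction is Section Access; whether and how QQsecurity™ builds on it is not detailed here, but a minimal sketch of the native feature, with hypothetical users and a hypothetical reduction field, looks like this:

// Minimal Section Access sketch with hypothetical user names: the REGION
// field reduces the data each user can see inside the same application.
Section Access;
AccessRights:
LOAD * INLINE [
ACCESS, USERID, REGION
ADMIN, DOMAIN\QLIKADMIN, *
USER, DOMAIN\NORTH_SALES, NORTH
USER, DOMAIN\SOUTH_SALES, SOUTH
];
Section Application;

// The application data must contain the matching reduction field (uppercase values).
Sales:
LOAD * INLINE [
REGION, Amount
NORTH, 100
SOUTH, 200
];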
Read more (coming soon !)
This is a special page, associated with certain generic processing, which allows intermediate and advanced Qlik™ users (especially Qlik™ script and interface creators) to navigate the data space available within the Qlik™ application and better understand the structure and content of the data.
Read more (coming soon !)
This concept, borrowed from world-renowned Qlik™ experts and then expanded and improved by us, allows the reuse of already created interface pages, with any calculation formulas already defined, as well as the quick and systematic addition of new calculation formulas across the entire analysis set of the respective application.
In addition, this mechanism gives beneficiaries, from the first delivery, a huge number of options for reconfiguring the available views. Thus, most requirements for small changes to views are addressed and resolved a priori, without additional interventions and costs.
The number of combinations available at the user’s fingertips is unimaginable: on the order of the number of atoms on Earth. Hence the name Qosmic.interface™.
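A much-simplified illustration of the underlying idea, with hypothetical fields and formulas (the real mechanism covers far more dimensions of configuration):

// Minimal sketch: measure formulas are kept as data, so new formulas can be
// added as rows instead of new charts.
MeasureSelector:
LOAD * INLINE [
MeasureName, MeasureFormula
Turnover, Sum(SalesValue)
Quantity, Sum(SalesQty)
];
// Generic chart measure in the front end, evaluated for the selected MeasureName:
//   $(=Only(MeasureFormula))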
Read more
In some of our recent projects, we faced a special situation: the need to consolidate information located on several thousand servers spread across multiple locations. Beyond the huge number of servers (and the implied databases, tables, fields, etc.), the challenge also consisted in identifying the meaning, logic, quality, age, relevance, and redundancy of the data in all these containers. This is how QQdata.inventory™ was born.
Once configured with the addresses and credentials for connecting to all of these databases, the QQdata.inventory™ engine handles (a simplified sketch follows the list):
- connecting to all data sources;
- saving the login result;
- scanning related metadata;
- identifying the age of the data contained;
- if necessary, copying all data in compressed format (this is where backup options are born!);
- analyzing data structures and identifying potential logics for connecting tables to each other;
- building a data catalog.
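A much-simplified sketch of the scanning step in Qlik™ load script; the connection names are hypothetical and each must already exist as an ODBC/OLEDB data connection:

// Minimal sketch: loop over the configured data sources and collect
// table- and column-level metadata into one inventory.
Connections:
LOAD * INLINE [
ConnectionName
SRV01_ERP
SRV02_CRM
];

FOR Each vConn in FieldValueList('ConnectionName')
    LIB CONNECT TO '$(vConn)';

    TablesMeta:
    SQLTables;                 // table-level metadata of the current source

    ColumnsMeta:
    SQLColumns;                // column-level metadata of the current source
NEXT vConn

STORE TablesMeta INTO [lib://QVD/Inventory_Tables.qvd] (qvd);
STORE ColumnsMeta INTO [lib://QVD/Inventory_Columns.qvd] (qvd);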
Read more
When an extremely clear separation is required between tax-accounting records and additional data for managerial analytics, each with a different accounting and financial treatment, it is wise to avoid keeping this additional information in the same system (ERP). At the same time, however, maintaining the data in two different systems brings its own problems: doubled data-entry effort, data synchronization, and understanding the differences.
QQadjustments™ applies the principles of exception management, taking the basic data mostly from the tax-accounting system and providing additional space for introducing alteration exceptions (add / delete / modify, etc.).
Finally, the included Qlik™ analytics offer, by a simple selection, options to view:
- the original tax-accounting version read from ERP,
- or the adjusted version,
- but also the immediate highlighting of the applied adjustments (for maximum consistency and transparency).
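A minimal sketch of this exception mechanism in Qlik™ load script; the table, field, and file names are hypothetical:

// Base records from the ERP and adjustment exceptions are tagged with their
// source, so the front end can show either version, or only the differences.
Facts:
LOAD AccountID, Period, Amount, 'ERP' as RecordSource
FROM [lib://QVD/ERP_GeneralLedger.qvd] (qvd);

Concatenate(Facts)
LOAD AccountID, Period, AdjustmentAmount as Amount, 'Adjustment' as RecordSource
FROM [lib://QVD/ManagerialAdjustments.qvd] (qvd);

// Front-end measure Sum(Amount): selecting RecordSource = 'ERP' shows the
// original tax-accounting version, selecting both values shows the adjusted
// version, and selecting 'Adjustment' highlights only the applied adjustments.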
Read more (coming soon !)
Generic Data Exploring & Evaluating Solutions
The early stages of reading and exploring data for new analytics often require a separate set of tools from the ones used day-to-day.
This is how these QQdev.tools came to be.
Fast identification of the data structures within a database is the fundamental starting point of any BI project.
Read more (coming soon !)
Read more (coming soon !)
QQkey.profiler&preparator™ is a data processing and visualization engine and has 4 user variants.
Read more
Read more (coming soon !)
Read more (coming soon !)
Read more (coming soon !)
Generic Data Processing and Integration Solutions
Retrieving data, processing it, storing it, and sending it to other systems is becoming an increasing challenge, especially as the data sets involved become larger and more heterogeneous.
We address this challenge with generic, reusable, and reconfigurable tools.
We have encountered many situations in which we were asked to distribute certain value records, of a certain kind, into many smaller slices, in proportion to other related values.
Possible areas of application: ABC (Activity Based Costing), CCC (Cash Collection Cycle) and Working Capital analytics on products and/ or suppliers.
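In Qlik™ load script, such a proportional distribution can be sketched roughly like this; the field and file names are hypothetical:

// Minimal sketch: distribute a document-level cost to its lines,
// in proportion to each line's value.
Lines:
LOAD DocumentID, LineID, LineValue
FROM [lib://QVD/InvoiceLines.qvd] (qvd);

LEFT JOIN (Lines)
LOAD DocumentID, HeaderCost
FROM [lib://QVD/InvoiceHeaders.qvd] (qvd);

LEFT JOIN (Lines)
LOAD DocumentID, Sum(LineValue) as DocumentTotal
RESIDENT Lines
GROUP BY DocumentID;

Allocated:
LOAD DocumentID, LineID,
     HeaderCost * LineValue / DocumentTotal as AllocatedCost
RESIDENT Lines;

DROP TABLE Lines;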
Read more (coming soon !)
SIRQ™ was first created in 2006 to give us the opportunity to repeat similar introductory projects (SIBs) with minimal effort and budget. Since then, we have been continuously improving, expanding, and transforming it.
SIRQ™ Gen4 is currently under development, aiming to bring into Qlik™ the final configuration decisions regarding the data processing to be executed in Qlik™ apps.
Read more
For some time now, we have been replacing Excel satellite tables with more powerful alternatives, capitalizing on online no-code database solutions. Such solutions bring more robustness together with the versatility of online, multi-user editing.
Among the options we have tried and tested, we concluded that AirTable offers a major package of benefits, sometimes even in the basic, free plan.
We have already prepared a quick configuration package (AirTable Data Base & Qlik Scripting) for reading tables from an AirTable online database (or any other REST API data source).
Read more (coming soon !)
Provides incremental, partitioned reading of large fact tables, such as:
- stock movements
- accounting entries
- history of invoices issued and receipts
- even detailed sales on invoice lines or products
- manufacturing logs
- quality logs
without a significant impact on the speed and frequency of refreshing the analytical applications, while protecting the servers (SQL, etc.) from which the data is read against overloading.
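The underlying idea is the classic incremental-load pattern, sketched below with hypothetical names; the data connection and QVD file are assumed to exist, and the date-literal format depends on the source database:

// Minimal sketch: only rows newer than the last loaded timestamp are read from
// the source, then the refreshed partition is written back to its QVD file.
StockMovements:
LOAD MovementID, MovementDate, Quantity
FROM [lib://QVD/StockMovements_2024.qvd] (qvd);

MaxDate:
LOAD Max(MovementDate) as LastLoaded RESIDENT StockMovements;
LET vLastLoaded = Timestamp(Peek('LastLoaded', 0, 'MaxDate'), 'YYYY-MM-DD hh:mm:ss');
DROP TABLE MaxDate;

LIB CONNECT TO 'ERP_SQL';          // hypothetical data connection

Concatenate(StockMovements)
LOAD MovementID, MovementDate, Quantity;
SQL SELECT MovementID, MovementDate, Quantity
FROM dbo.StockMovements
WHERE MovementDate > '$(vLastLoaded)';

STORE StockMovements INTO [lib://QVD/StockMovements_2024.qvd] (qvd);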
Read more (coming soon !)
It offers a simplified approach to the challenges mentioned above, under QQpartitioning™, for situations where the data sources are large in volume but a little less overwhelming.
Read more (coming soon !)
Transforming and delivering data sets in JSON format can be a major challenge, in particular:
- if we are talking about large data sets,
- if we must integrate data from multiple data sources, including sources that are heterogeneous in their underlying technology,
- if we want a solution that allows rapid subsequent changes, using selections of options rather than rewriting the whole processing code.
Immediate applications:
- generating JSON sets that define automatic invoice creation based on incoming payments.
- reporting commercial activity or inventory to suppliers & vendors
Read more (coming soon !)
Transforming and delivering data sets into complex XLS and XLSX formats can be a major challenge, especially:
- if we are talking about large data sets,
- if we must integrate data from multiple data sources, including sources that are heterogeneous in their underlying technology,
- if we want a solution that allows rapid subsequent changes, using selections of options rather than rewriting the processing code.
Immediate applications:
- reporting commercial activity or inventory to suppliers & vendors
Read more (coming soon !)
Transforming and delivering data sets in XML format can be a major challenge, in particular:
- if we are talking about large data sets,
- if we must integrate data from multiple data sources, including sources that are heterogeneous in their underlying technology,
- if we want a solution that allows rapid subsequent changes, using selections of options rather than rewriting the processing code.
Immediate applications:
- export in SAF-T format
- generating XML sets that define automatic invoice creation based on incoming payments.
Read more
Repeatedly reading an XML data structure and mapping that data to a structured destination (a database, an analytics system, etc.) is very challenging when the XML structure is likely to change over time.
Our approach adds value through:
- robust processing that survives the appearance and disappearance of data substructures in subsequently read XML instances;
- identifying, signalling, and exhaustively exemplifying the appearance of any new, not yet mapped data structures.
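A minimal sketch of the second point in Qlik™ load script; the file, element, and field names are hypothetical:

// Load one XML instance and log any fields that are not yet covered by the
// known mapping, so new substructures are signalled instead of silently lost.
Invoices:
LOAD *
FROM [lib://Import/invoices.xml]
(XmlSimple, table is [Invoices/Invoice]);

KnownFields:
LOAD * INLINE [
FieldName
InvoiceID
InvoiceDate
Amount
];

FOR i = 1 to NoOfFields('Invoices')
    LET vField = FieldName($(i), 'Invoices');
    IF Not Exists(FieldName, '$(vField)') THEN
        TRACE Unmapped XML field: $(vField);
    END IF
NEXT i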
Simultaneously associating a database record with a whole package of hierarchical memberships, where the numerical value is included only as a component, possibly with a different arithmetic sign in each, is a real data structuring and processing challenge. Most common area of application: P&L (profit and loss) statements and the Balance Sheet.
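One common way to model this in Qlik™ is a link table carrying the sign of each membership; a minimal sketch with hypothetical account groups and report rows:

// Each account group can feed several report rows, each membership with its
// own arithmetic sign; the front-end measure is then Sum(Amount * Sign).
Facts:
LOAD AccountGroup, Amount
FROM [lib://QVD/GeneralLedger.qvd] (qvd);

ReportStructure:
LOAD * INLINE [
AccountGroup, ReportRow, Sign
Revenue, Total Revenue, 1
Revenue, Gross Margin, 1
COGS, Gross Margin, -1
];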
Read more (coming soon !)
QQfallback™ is a tool that applies a set of configurable parameters defined through a fallback procedure. It is very useful in situations where many entities require a set of additional, manually managed parameters. We developed a way to ease the effort by applying management by exception.
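The fallback idea can be sketched in Qlik™ load script with nested ApplyMap() calls; the file and field names are hypothetical:

// Resolve a parameter per entity: use the entity-specific value if present,
// otherwise the group-level value, otherwise a global default.
EntityParamMap:
MAPPING LOAD EntityID, ParamValue
FROM [lib://QVD/EntityParameters.qvd] (qvd);

GroupParamMap:
MAPPING LOAD GroupID, ParamValue
FROM [lib://QVD/GroupParameters.qvd] (qvd);

Entities:
LOAD EntityID, GroupID,
     ApplyMap('EntityParamMap', EntityID,
         ApplyMap('GroupParamMap', GroupID, 'GlobalDefault')) as ResolvedParam
FROM [lib://QVD/Entities.qvd] (qvd);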
Read more (coming soon !)
Read more (coming soon !)
Read more (coming soon !)
QQconfig.tracker™ allows the entire set of currently valid processing configurations to be captured with each new data processing run.
This saving of the complete set of current configurations is highly relevant in the context of using SIRQ™ and other QQtools modules that use configuration tables, which basically change the processing logic (in the Qlik™ script or interface).
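The basic mechanism can be sketched as a timestamped snapshot taken at each reload; the file and sheet names are hypothetical:

// Read the current configuration and store a timestamped copy of it, so every
// data processing run keeps the exact configuration it was executed with.
CurrentConfig:
LOAD *
FROM [lib://Config/ProcessingConfig.xlsx]
(ooxml, embedded labels, table is Config);

LET vStamp = Timestamp(Now(), 'YYYYMMDD_hhmmss');
STORE CurrentConfig INTO [lib://QVD/ConfigHistory/Config_$(vStamp).qvd] (qvd);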
Read more
QQsemaphors™ materializes the concept of modularized Qlik™ processing, allowing quick activation and deactivation of processing modules in order to speed up the testing and (modular) execution of Qlik™ processing scripts.
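Conceptually, the semaphores are simple on/off flags wrapped around script modules; a minimal sketch with hypothetical subroutine and file names:

// Each module is wrapped in a flag check, so individual parts of the script
// can be switched on or off without editing the processing code itself.
SET vLoadSales = 1;
SET vLoadStock = 0;

SUB LoadSales
    Sales:
    LOAD OrderID, Amount FROM [lib://QVD/Sales.qvd] (qvd);
END SUB

SUB LoadStock
    Stock:
    LOAD ProductID, Quantity FROM [lib://QVD/Stock.qvd] (qvd);
END SUB

IF $(vLoadSales) = 1 THEN
    CALL LoadSales
END IF

IF $(vLoadStock) = 1 THEN
    CALL LoadStock
END IF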
Read more
QQparallel.processing™ is a module that increases processing speed when hardware resources have significant spare capacity and shorter data processing times are required.
Read more
QQalerts™ offers a minimal set of alerting features that work the same in any version of Qlik Sense™ server, whether on-premise or SaaS.
Read more (coming soon !)
Generic Solutions for collecting decisions directly in Qlik
Qlik™ is a decision-support tool, and most of the time the decisions taken should be saved for further use and transmission.
We have strongly believed in this statement since 2005, when we started working with Qlik™. And since the solutions brought over time by Qlik™ have not been enough, we decided to develop these functionalities further.
After many attempts to use various borrowed solutions to enter the decisions (numerical or of any other nature) supported by the analytical interfaces we built over time directly into the Qlik™ interface, we decided to build our own Qlik™ extension, which addresses this growing challenge and need.
Read more (coming soon !)
For more details about how these products and solutions can help you, or for the development of solutions tailored to your specific needs, please contact us !
If you want QQinfo products, please fill in the form here.
To keep in touch with the latest news in the field, with unique solutions explained, and with our personal perspectives on the world of management, data, and analytics, we recommend the
QQblog !
For information about Qlik™, please visit this site: qlik.com.