Script Inline tables and JSON files in Qlik Sense™

Did you know you can use Script Inline tables and you can read JSON files in Qlik Sense™?

How to use Script Inline tables in Qlik Sense™?

Almost every application load script contains one or another inline table. This is an inline table as taught in Qlik Sense™ classes:

    LOAD * INLINE [
        name, age
        Christof, 49
        Dhruv, 27
    ];

This is the simplest and most commonly known syntax. Field content is automatically trimmed (no leading/trailing spaces), the type is auto-detected (numbers as numbers, text as text), and the standard delimiter is the comma. Did you know that you can use ' ' or " " quotation if the delimiter (comma) is part of the text itself?

  LOAD * INLINE [
    id, age, name
    PDR, 10, "De Redelijkheid, Paul"
    CSW, 20, 'Schwarz, Christof "Blacky"'
  ];

However, there is much more to it.

  • You can alter the opening and closing quotation of the inline table: instead of [ ] you can use ` ` or " " (helpful especially if a square bracket is part of the field content)
  • You can change the delimiter to something other than the comma (e.g. ";" or "^")
  • You can work without a header line and comment out rows with the help of a WHERE clause
  • You can immediately add calculated columns right after the LOAD *

The example below combines everything explained above:

    LOAD
        @1 AS types,
        If(@2 LIKE '[*]', @2, '[' & @2 & ']') AS fields
    INLINE `
    numFields ^ [D],[E]
    //    textFields ^ [F],[G]
    textFields ^ [F],[G],[H]
    dateFields ^ J
    ` (delimiter is '^', no labels)
    WHERE NOT @1 LIKE '//*';

The outcome is the following table:

    types       fields
    numFields   [D],[E]
    textFields  [F],[G],[H]
    dateFields  [J]

The table has "auto-headers" called @1 and @2 (a consequence of the "no labels" option). One line was "commented out" using the WHERE clause, because we needed "[" and "]" to be part of the field content itself; we exchanged the table quotation to "`" and the field separator to "^". We also added a formula that manipulates the @2 column to wrap the field value in "[" and "]" if they are not contained already.

Here is a final observation:

  • You can even change the type of the inline block

Typically, we provide txt content (csv), but the type parameter can be altered to XmlSimple or, at least already on Qlik™ Cloud, to Json!

  LOAD * INLINE `
  { "name": "Christof", "age": 49, "partner": true },
  { "name": "Dhruv", "age": 27 }
  ` (json);

How can you read JSON files in Qlik Sense™?

Recently, Qlik added script support for .parquet, but the common .json files are still not officially supported. You can read them anyway; it is just a bit tricky and slower for large files, and an import wizard that would create the script blocks is missing.

Note: Qlik™ may still change the described behavior of json import at any time.

The hassle starts when you click on Add data in the connection pane. The file dialog for the folder has no specific filter for .json files:

Choose "All files" and then select your .json file. Next you will see that the wizard attempts to interpret it as csv. Pointless, but at least you get the FROM construct properly filled. Forget the rest, as we will replace it immediately.

Just hit "Insert script", then put LOAD * in front of FROM, and instead of (txt …) put (json):
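A minimal sketch of the result (the connection name MyData and the file name sample.json are assumptions for illustration):

    LOAD *
    FROM [lib://MyData/sample.json]
    (json);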

Not too complicated so far. If your .json file has only one root object (or one root array of objects), you will already get the keys from the top level, and you are done.

Next, we present more details about parsing.

Simple JSON Object

A single json file containing only one object …

… will result in one line of data

Simple keys are loaded 1:1

  • keys with simple values will be loaded 1:1 into a Qlik Sense™ field
  • arrays and sub-objects will get a 16-char hash value (this will be explained below)
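As a small illustration (the file name, connection name, and content are assumptions), a person.json containing

    { "name": "Christof", "age": 49, "address": { "city": "Vienna" } }

loaded with

    LOAD * FROM [lib://MyData/person.json] (json);

yields one row: name and age are loaded 1:1, while address comes back as a 16-char hash value.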

Multi-element JSON File

If the json file has multiple "rows" (objects in an outer array [ … ]), you will automatically get multiple rows in Qlik.

The objects can have different keys. Common keys will concatenate into the same field, whereas keys missing in a given "row" will read "null" (unfortunately as text, not as Null()) in the result.

Note that "arr" gets a hash value back because it is not a basic value.
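To make this concrete (file content and names are assumptions in line with the description), a people.json like

    [
      { "name": "Christof", "age": 49, "arr": [ 1, 2 ] },
      { "name": "Dhruv", "age": 27 }
    ]

loaded with LOAD * FROM [lib://MyData/people.json] (json); yields two rows: name and age concatenate into common fields, while arr holds a hash in the first row and the text "null" in the second.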

How to deal with sub-objects and arrays? For this we need to understand the "table is" construct and the hash values.

  • The root element has the pseudo-table name "Root"

Every element in an array has an offset, starting with 0, even the outermost array. Hence, you would get the first "row" above with "Root/0".

  • An object or an array will come back as a hash value
  • To get the object or array contents itself, alter the "table is" construct to address this key, e.g. in the above example "Root/1/arr"
  • If it is an array, you need to use the pseudo-key NoName to get the values, and the key itself to get the hash value that links it to its parent
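A sketch of addressing such a nested array directly (the path and names are assumptions, following the rules above):

    // contents of the "arr" array in the 2nd row (offset 1)
    LOAD
        NoName AS arrValue,  // the array values themselves
        arr                  // the hash value, linking back to the parent row
    FROM [lib://MyData/people.json] (json, table is [Root/1/arr]);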

The good thing is that the whole import logic can simply be done by a recursive sub procedure: Json in, data-model table(s) out.

Importing JSON sub-arrays

That is fine for one row, but if you have a json file with many rows, you don't want to call every line with "Root/0/arr", "Root/1/arr", "Root/2/arr" …

Assuming a .json file with this content:

We will introduce a FOR loop now. Note: if an array is empty [] or is null, the LOAD command inside the loop would fail on that "row", so we set ErrorMode to 0 (= continue script upon error) and reset it to the default 1 (= fail on error) after the NEXT command.
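A sketch of such a loop (the outer table is assumed to have been loaded under the name Root, and vFile is assumed to hold the path to the .json file):

    SET ErrorMode = 0;  // continue the script upon error (empty [] or null arrays)
    FOR i = 0 TO NoOfRows('Root') - 1
        KidsAges:
        LOAD
            kidsAge,           // the hash value, linking to the parent row
            NoName AS kidAge   // the array values themselves
        FROM [$(vFile)] (json, table is [Root/$(i)/kidsAge]);
    NEXT i
    SET ErrorMode = 1;  // back to the default: fail on error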

The result is a nice 1:N relation resolved.

The field name "kidsAge" also exists in the inner LOAD (next to "NoName") and it returns, yes, the same hash value as in the outer, first LOAD.

Importing JSON sub-objects

The last example is for a sub-object (instead of the sub-array): similarly, you get hashes on the parent level and the content itself when addressing it correctly with the "table is" position. A FOR loop will ease things again, plus some more logic.

In this .json file we have one row where "kids" is null, and the sub-object does not always contain the key "happy".

We can't count on auto-concatenation any more for the sub-objects (row 4 would end up in a different table than rows 1 and 2). Also, without error tolerance, the import of the 3rd row would fail. So we go for this script, where we create an empty table for the "kids" and then explicitly concatenate into it. We have to use the LOAD * functionality, because the structure of the sub-objects is not identical throughout the main array. (Assuming you set the vFile yourself.)
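A sketch of that script (table and field names are assumptions; vFile is expected to hold the path to the .json file and the outer table to be named Root):

    Kids:
    LOAD * INLINE [kids];  // empty "anchor" table, so the loop has a target to concatenate into

    SET ErrorMode = 0;  // tolerate rows where "kids" is null
    FOR i = 0 TO NoOfRows('Root') - 1
        Concatenate(Kids)
        LOAD * FROM [$(vFile)] (json, table is [Root/$(i)/kids]);
    NEXT i
    SET ErrorMode = 1;  // back to the default: fail on error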

We now get two tables that are 1:1 related (or rather 1:0..1, to be precise).

Note: the inner LOAD * will also return "kids" as a field, containing the hash that links it to the first table.
The result can now easily be used in the client.

Unfortunately, it can get tricky, but we will be back very soon with an article in which we'll show that all the manual steps (loop, field aliasing, concatenating) can be solved by a recursive SUB.

Hopefully, however, Qlik™ will also add a wizard that does this parsing on import, like the one we have for JSON objects received via the REST Connection or, likewise, for XML files.



Pros:

  • A surprise and already a good starting point for importing .json natively
  • Everything can be imported using the "table is" directive


Cons:

  • Quite some script logic is needed to parse nested JSON objects
  • Can be slow
  • Still undocumented and may change in the future

Source of the article:

Other articles from our blog on this theme:

For information about Qlik™, please visit this site:
For specific and specialized solutions from QQinfo, please visit this page: QQsolutions.
To keep in touch with the latest news in the field, with unique solutions explained, and with our personal perspectives on the world of management, data and analytics, we recommend the QQblog!
