I am working on a Qlik Sense mashup that contains charts from multiple Qlik Sense applications. These applications share common fields such as Date and Department.
I noticed that when I apply a filter to one chart, other charts from the same application respond correctly, which is expected. However, charts coming from different applications do not respond to the same filter, even though the field names and values are the same.
What is the recommended or best‑practice approach to create common/global filters in a mashup so that selections are applied consistently across charts coming from different Qlik Sense apps?
Any guidance, examples, or architectural recommendations would be appreciated.
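One common pattern is to sync selections in the mashup layer: open each app with the Capability API, listen to one app's selection state via a SelectionObject, and replay selections on the shared fields into the other apps. The sketch below assumes this approach; the app IDs, connection config, and the `syncApps` wrapper are placeholders, not a Qlik-provided helper.

```javascript
// Sketch: mirror selections on shared fields from one Qlik Sense app to
// another in a mashup. App IDs and field names below are placeholders.
var SHARED_FIELDS = ['Date', 'Department'];

// Pure helper: turn a SelectionObject layout into {fieldName: [values]}
// for the shared fields only, so the mapping logic can be unit-tested.
function extractSharedSelections(layout, sharedFields) {
  var result = {};
  (layout.qSelectionObject.qSelections || []).forEach(function (sel) {
    if (sharedFields.indexOf(sel.qField) !== -1) {
      result[sel.qField] = (sel.qSelectedFieldSelectionInfo || [])
        .map(function (info) { return info.qName; });
    }
  });
  return result;
}

// Wire-up sketch (assumes the Capability API "qlik" module is available):
function syncApps(qlik, config) {
  var appA = qlik.openApp('APP_ID_A', config); // placeholder app IDs
  var appB = qlik.openApp('APP_ID_B', config);
  var applying = false; // guard against selection feedback loops

  appA.getList('SelectionObject', function (reply) {
    if (applying) { return; }
    var selections = extractSharedSelections(reply, SHARED_FIELDS);
    applying = true;
    Object.keys(selections).forEach(function (field) {
      // Replay each shared-field selection into the second app
      appB.field(field).selectValues(selections[field], false, true);
    });
    applying = false;
  });
}
```

For full two-way sync you would register the same listener on the second app as well, and you may want to clear the target field before reapplying values so deselections propagate too.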
Hi everyone,
I’m currently working on extracting app metadata using the Qlik Repository API, specifically this endpoint:
/api/v1/apps/{appId}/data/metadata
My goal is to retrieve reload performance metrics such as:
cpu_time_spent_ms
peak_memory_bytes
fullReloadPeakMemoryBytes
for example, on around 600 apps (latest reload only).
I’m looping over all app IDs and calling the API via the REST connector.
Here is the script I’m using:
SUB app_metadata
    // Connect once, before the loop, instead of reconnecting for every app
    LIB CONNECT TO 'monitor_apps_REST_app';

    For Each vAppGUID in FieldValueList('AppLoopID')
        LET vGetAppMetadataUrl = 'https://$(vs_host_and_virtual_proxy_prefix)/api/v1/apps/$(vAppGUID)/data/metadata';

        [TempAppMetadata]:
        SQL SELECT
            "static_byte_size",
            "__KEY_root",
            (SELECT
                "cpu_time_spent_ms",
                "peak_memory_bytes",
                "fullReloadPeakMemoryBytes",
                "__FK_reload_meta",
                "__KEY_reload_meta"
            FROM "reload_meta" PK "__KEY_reload_meta" FK "__FK_reload_meta")
        FROM JSON (wrap on) "root" PK "__KEY_root"
        WITH CONNECTION (
            Url "$(vGetAppMetadataUrl)"
        );

        [AppReloadMetadata]:
        LOAD
            '$(vAppGUID)' AS [AppID],
            [cpu_time_spent_ms],
            [peak_memory_bytes],
            [fullReloadPeakMemoryBytes]
        RESIDENT [TempAppMetadata]
        WHERE NOT IsNull([__FK_reload_meta]);

        DROP TABLE [TempAppMetadata];
    Next vAppGUID
ENDSUB
I frequently get the following error:
HTTP protocol error 504 (Gateway Timeout)
/api/v1/apps/{appId}/data/metadata
Processing ~600 apps takes around 20 minutes
Some calls time out (504), likely because the endpoint is slow
Is /data/metadata the right endpoint for retrieving reload performance metrics?
Is there a more efficient API endpoint to get:
cpu_time_spent_ms
peak_memory_bytes
fullReloadPeakMemoryBytes
(ideally only for the latest reload)
Are there best practices to avoid 504 errors when looping over many apps?
Has anyone faced similar performance issues with this endpoint?
I'm looping over the apps sequentially. Even though I get the 504 error, the script still manages to retrieve the data.
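One way to keep a single 504 from aborting the whole reload is to wrap each call in a retry loop using ErrorMode and ScriptErrorCount. This is a hypothetical sketch, assuming the variables and connection from the script above; it retries a simplified version of the call up to three times with a pause between attempts:

// Sketch: retry a flaky REST call instead of failing the whole reload.
SET ErrorMode = 0;                      // keep going on error so we can retry
For vTry = 1 To 3
    LET vErrBefore = ScriptErrorCount;  // errors seen so far
    [TempAppMetadata]:
    SQL SELECT "static_byte_size", "__KEY_root"
    FROM JSON (wrap on) "root" PK "__KEY_root"
    WITH CONNECTION ( Url "$(vGetAppMetadataUrl)" );
    If ScriptErrorCount = vErrBefore Then
        Exit For                        // call succeeded, stop retrying
    End If
    Sleep 5000;                         // wait 5 seconds before the next attempt
Next vTry
SET ErrorMode = 1;                      // restore default abort-on-error

Note that with ErrorMode 0 a permanently failing app is silently skipped, so you may want to log ScriptErrorCount per app to spot apps that never succeeded.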
Any advice or feedback would be greatly appreciated
Thanks!
Welcome to all new members and hello to our OGs!
We’re happy to have you here and excited for you to be part of this growing, global community.
Women Who Qlik exists to connect, support, and empower women and allies across the Qlik ecosystem through shared experiences, learning opportunities, career growth, and access to expertise and resources. Our mission is to create an inclusive space where women can learn from one another, amplify their voices, and grow their careers while deepening their impact with Qlik.
The Women Who Qlik Community Group is the place to access all the exciting events and content around the Women Who Qlik program.
Throughout the year, you can look forward to:
All members will receive a Women Who Qlik badge to share on social media shortly, as well as one to share on Community. We will send your Credly badge via email. If you have any objections, please let us know!
How to get started:
Reply to this post and tell us:
This helps us shape future events, speakers, and topics around you.
Subscribe so you don’t miss event invites, discussions, and announcements.
Join us for an exclusive conversation on mentorship and leadership with Sadie St. Lawrence, tech influencer, founder of Women in Data, and founder/CEO of the Human Machine Collaboration Institute. On March 26 at 10am EST / 4pm CET, get ready to share your thoughts and experiences in a discussion with Women Who Qlik leaders right here in the Women Who Qlik Community!
I’m excited to get to know you all throughout the year!
Warm regards,
Sarah
Our developers struggle to keep up with changes to our ETL scripts in Qlik Sense, constantly having to refer back from the Transformer to the Extractor for field names, aliases, etc. How does the Community document their work? Do you tediously put everything in an Excel spreadsheet, or is there a way to grab the scripts via the API and bring them into Excel?
Do you use Source control like Git or some other method of tracking changes?
Appreciate any and all ideas from the Community!
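On the "grab the scripts via the API" part: the load script itself is exposed through the QIX Engine API (JSON-RPC over WebSocket) via OpenDoc followed by GetScript. The Node.js sketch below assumes the third-party `ws` package, a valid WebSocket URL, and that authentication is already handled; the wiring function and URL are placeholders to adapt to your deployment.

```javascript
// Sketch: pull an app's load script over the QIX Engine API so it can be
// dumped to text files for diffing or checking into source control.

let nextId = 0;

// Pure helper: build one QIX JSON-RPC request frame.
function rpc(handle, method, params) {
  return { jsonrpc: '2.0', id: ++nextId, handle: handle,
           method: method, params: params || [] };
}

// Wiring sketch (requires the 'ws' package and valid auth on wsUrl):
function fetchScript(wsUrl, appId, onScript) {
  const WebSocket = require('ws'); // lazy require: only needed at runtime
  const sock = new WebSocket(wsUrl);
  const pending = {};

  function send(frame, cb) {
    pending[frame.id] = cb;
    sock.send(JSON.stringify(frame));
  }

  sock.on('open', function () {
    // The Global handle is -1; OpenDoc returns the app handle
    send(rpc(-1, 'OpenDoc', [appId]), function (res) {
      const appHandle = res.qReturn.qHandle;
      send(rpc(appHandle, 'GetScript'), function (res2) {
        onScript(res2.qScript); // the full load script as one string
        sock.close();
      });
    });
  });

  sock.on('message', function (data) {
    const msg = JSON.parse(data);
    if (msg.id && pending[msg.id]) { pending[msg.id](msg.result); }
  });
}
```

Looping that over your app IDs and writing each qScript to a file gives you something Git can diff, which tends to scale better than maintaining a spreadsheet by hand.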
Hi,
We would like insert data from Qlik Sense app into the MS SQL Database.
Unfortunately, we cannot use Qlik Application Automation to insert/update these records in the MS SQL database, as the database is behind a firewall. Can we use Qlik scripting (SQL INSERT) to insert these rows into our MS SQL database via Qlik Data Gateway - Direct Access?
KR
R
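In principle, statements after the SELECT keyword pass through to the database, so a load script can issue INSERTs over an existing connection. This is a hypothetical sketch (connection, table, and field names are placeholders), and it only works if the connector and gateway permit non-SELECT statements, which many Qlik connectors block, so test it on your setup first:

// Sketch: push rows from a resident table [Output] into SQL Server by
// issuing pass-through INSERT statements over a Direct Access connection.
LIB CONNECT TO 'MSSQL_via_DirectAccessGateway';

For i = 0 To NoOfRows('Output') - 1
    LET vOrderId = Peek('OrderId', $(i), 'Output');
    LET vAmount  = Peek('Amount',  $(i), 'Output');
    SQL INSERT INTO dbo.TargetTable (OrderId, Amount)
        VALUES ('$(vOrderId)', $(vAmount));
Next i

Row-by-row INSERTs are slow for large volumes; if the connector refuses DML, the usual fallback is to export a file and have a database-side job import it.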
Good day
We are trying to load a big table from on-premises to AWS Kinesis.
At some point during the initial full load, we get the following errors and the table is reloaded.
03449616: 2026-04-14T07:05:29 [TARGET_LOAD ]E: Failed to produce kinesis message with record id <663349006> to partition <0> in stream 'xxx'. : Encountered network error when sending http request [1022601] (queue_utils.c:120)
03449616: 2026-04-14T07:05:29 [TARGET_LOAD ]E: Failed to send message [1022601] (queue_utils.c:1877)
03449616: 2026-04-14T07:05:29 [TARGET_LOAD ]E: Failed to produce message. [1022601] (queue_load.c:74)
03449544: 2026-04-14T07:05:29 [TASK_MANAGER ]I: Task error notification received from subtask 1, thread 1, status 1022601 (replicationtask.c:3653)
03449616: 2026-04-14T07:05:29 [TARGET_LOAD ]E: Error executing data handler [1022601] (streamcomponent.c:2034)
03449616: 2026-04-14T07:05:29 [TASK_MANAGER ]E: Stream component failed at subtask 1, component st_1_xxx [1022601] (subtask.c:1506)
03449616: 2026-04-14T07:05:29 [TARGET_LOAD ]E: Stream component 'st_1_xxx' terminated [1022601] (subtask.c:1675)
03449544: 2026-04-14T07:05:33 [TASK_MANAGER ]I: Subtask #1 ended (replicationtask_util.c:591)
03449544: 2026-04-14T07:05:33 [TASK_MANAGER ]I: Reloading table 14 because subtask #1 finished with error (replicationtask.c:2761)
I've already added the following internal parameters:
resultsWaitMaxTimes : 3000
resultsWaitTimeoutMS : 2500
Task settings: Commit rate during full load = 150000
There is no throttling on the Kinesis side.
I suspect it might be either a break in the network or slow response.
Running Qlik Replicate V2025.5.0.134 on Microsoft Windows Server 2019 Standard
Any suggestions on what other parameters I can set to do retries / wait longer, instead of reloading from start?
Thanx
Hi Team,
We're trying to set up a Qlik Replicate connector to write to BigQuery.
Our service account (SA) credentials and the Qlik Replicate instance exist in project_x, while the dataset we're trying to write to exists in project_y.
The SA has the correct permissions to write to project_y.
How can I configure the task and endpoint to write to project_y when by default it is trying to write to a dataset in project_x?
Hi everyone,
I'm encountering a compilation error when trying to use the tIcebergOutput component in a Talend Job (Job Name: Talend_AWS).
The Problem: The Job fails to build/run with a ProcessorException. Upon inspecting the generated Java code, I found that a local variable named schemaArray is being declared multiple times within the same method, causing a "Duplicate local variable" error.
Steps taken:
I checked the "Code" tab in Talend and confirmed that schemaArray is indeed duplicated in the generated source.
I tried to refresh the schema and propagate changes again, but the issue persists.
Has anyone encountered this bug with the Iceberg components? Is there a known workaround?
Stream Qlik Connect Live on April 14–15 and learn how to succeed with data and AI by daring to be different.
Data heroes, suit up. Hackathons and Late-Night Lunacy events are back. Seating is limited — register now to lock in your spot.
Introducing a new agentic experience in Qlik Answers and open AI access through MCP. Later this March: Discovery Agent and trusted data products embedded into analytics.
Think you’ve got game? Step into the Performance Zone — a data-driven sports arena on the show floor featuring Qlik customers Topgolf, the Malmö Redhawks, and Pinarello-Q36.5 Pro Cycling.
Data Restores Confidence: How SEFAZ-RS Used Qlik to Support the Rebuilding of a State.
AFIP revolutionizes management, optimizes resource allocation and strengthens data-driven culture
Global container shipping giant delivers data transparency to enable confident decision-making and operational efficiency.
Qlik turns raw data into valuable learning experiences, empowering both students and faculty at TTUHSC.
MillerKnoll partnered with Qlik Talend® data solutions to unify data across its diverse systems, accelerating critical processes and nearly eliminating data integration issues organization-wide.
Migration to Qlik Cloud Analytics optimizes analytics applications within months.
With Qlik Cloud Analytics, Rexel boosts performance by enabling self-service analysis, reducing prep time, and giving 3,000 users access to trusted insights.
Join one of our Location and Language groups. Find one that suits you today!
Join the conversation with Qlik users across Mexico: share ideas, ask questions, and connect in Spanish.
This is the Japanese-language group of the Qlik Community. You can download Japanese-language materials about Qlik products and post questions in Japanese.
Connect with French-speaking Qlik users to collaborate, ask questions, and share ideas.
Join us April 13–15, 2026 in Kissimmee, Florida for 2+ immersive days of innovation, hands-on learning, and powerful connections shaping the future of data and AI.
Did you know you can test-drive Qlik for free? Try Qlik Talend Cloud to integrate and clean data without code, or explore Qlik Cloud Analytics to create AI-powered visualizations and uncover insights hands-on.
Did you know you can move beyond batch processing and harness real-time data to power AI and faster decisions? Discover how in our new eBook, Mastering Change Data Capture.