This section contains solutions, fixes, hints and tips to help you solve the most common issues encountered when using Quix.

Data is not being received into a Topic

  • Ensure the Topic Id is correct. See here for how to find your Topic Id.

  • You can check the data in / out rates on the Topics tab.

  • If you want to see the data in the Data Catalogue, make sure you are persisting the data to the Topic; otherwise it may appear that there is no data.

  • If you are using a consumer group, check that no other services are using the same group. If your code is running both locally and in a deployment with the same consumer group, one instance may consume all of the data.
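To see why sharing a consumer group can starve one consumer, here is a minimal, library-free sketch of how Kafka-style partition assignment splits a topic's partitions among the members of a group. The partition count and round-robin strategy are illustrative assumptions, not the broker's exact algorithm:

```python
# Illustrative sketch: partitions are divided among members of the SAME
# consumer group, so a second member takes partitions away from the first.
def assign_partitions(partitions, consumers):
    """Round-robin the partition list across the group's consumers."""
    assignment = {c: [] for c in consumers}
    for i, p in enumerate(partitions):
        assignment[consumers[i % len(consumers)]].append(p)
    return assignment

partitions = list(range(2))  # a topic with 2 partitions

# Only your deployed service in the group: it reads everything.
print(assign_partitions(partitions, ["deployed"]))

# A local run joins the same group: each instance now gets only part of
# the data -- and with fewer partitions than consumers, one instance may
# receive nothing at all.
print(assign_partitions(partitions, ["deployed", "local"]))
```

Running your local copy under a different consumer group avoids the split entirely.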

Topic Authentication Error

If you see errors like these in your service or job logs, you may have used the wrong credentials or specified the wrong Topic Id.

Authentication failed during authentication due to invalid credentials with SASL mechanism SCRAM-SHA-256
Exception receiving package from Kafka
3/3 brokers are down
Broker: Topic authorization failed

Check each of these details very carefully.

The following must be correct:

  • Topic Id

  • Username (or Workspace Id)

  • Password

  • Certificates

These can all be found here.

Broker Transport Failure

If you have deployed a service or job and the logs mention 'broker transport failure', check the workspace name and password in the SecurityOptions.

Also check the broker address list; it should match the default broker addresses provided for your workspace.

401 Error

When attempting to access the web APIs you may encounter a 401 error. Check that the bearer token is correct and has not expired; if necessary, generate a new one. See: How to generate access tokens
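As a rough sketch of the pattern (the endpoint URL is a placeholder assumption; the real URL depends on your workspace), attaching a bearer token to a request with the standard library and detecting a 401 might look like:

```python
import urllib.request
import urllib.error

def make_request(url, token):
    """Build a request carrying the bearer token in the Authorization header."""
    return urllib.request.Request(
        url,
        headers={"Authorization": f"Bearer {token}"},
    )

req = make_request("https://reader-YOUR_WORKSPACE_ID.example.com/topics", "MY_TOKEN")
print(req.get_header("Authorization"))  # Bearer MY_TOKEN

# When the request is actually sent, an expired token surfaces as an HTTPError:
# try:
#     urllib.request.urlopen(req)
# except urllib.error.HTTPError as e:
#     if e.code == 401:
#         print("Token rejected -- generate a new bearer token")
```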

Example of the error received when trying to connect to the Streaming Reader API with an expired bearer token


The APIs that require a valid bearer token are:

YOUR_WORKSPACE_ID can be found on the Topic Information page.

Error Handling in the SDK callbacks

Errors generated in SDK callbacks can be swallowed or hard to read. To surface them and make the root cause easier to determine, use the traceback module.

Begin by importing traceback:

import traceback

Then, inside the SDK callback where you might have an issue place code similar to this:

def read_stream(new_stream: StreamReader):

    def on_parameter_data_handler(data: ParameterData):
        try:
            data.timestamps[19191919] # this does not exist
        except Exception:
            traceback.print_exc()

    new_stream.parameters.create_buffer().on_read += on_parameter_data_handler

input_topic.on_stream_received += read_stream

Notice that the try clause is inside the handler and the except clause prints a formatted exception like the one below:

Traceback (most recent call last):
  File "", line 20, in on_parameter_data_handler
  File "/usr/local/lib/python3.8/dist-packages/quixstreaming/models/", line 22, in __getitem__
    item = self.__wrapped[key]
IndexError: list index out of range
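The same pattern can be tried outside the SDK. This standalone sketch (no Quix dependency; the handler and data are stand-ins) shows how traceback turns a swallowed error into a readable report:

```python
import traceback

def on_data(timestamps):
    """Stand-in for an SDK callback; the bad index triggers an IndexError."""
    try:
        timestamps[19191919]  # this index does not exist
    except Exception:
        # format_exc() returns the same text that print_exc() prints,
        # so the report can also be logged or written elsewhere.
        report = traceback.format_exc()
        print(report)
        return report

on_data([1, 2, 3])
# The printed report names the exception type and the exact line that raised it.
```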

Service keeps failing and restarting

If your service continually fails and restarts, you will not be able to view the logs. Redeploy the service as a job instead; this allows you to inspect the logs and get a better idea of what is happening.

Possible DNS Propagation Errors

There are currently two scenarios in which you might encounter an issue caused by DNS propagation.

  • 1. The Data Catalogue has been deployed but the DNS entries have not fully propagated. In this scenario you might see a banner when accessing the Data Catalogue.


  • 2. A dashboard or other publicly visible deployment is not yet accessible, again due to DNS propagation.


In these scenarios, simply wait while the DNS records propagate. It can take up to 10 minutes for DNS records to propagate fully.
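While waiting, you can poll for the new record yourself. Here is a small sketch using only the standard library (the hostname is a placeholder, and the attempt count and retry interval are arbitrary assumptions):

```python
import socket
import time

def wait_for_dns(hostname, attempts=5, delay=2.0):
    """Return True once the hostname resolves, or False after all attempts."""
    for _ in range(attempts):
        try:
            socket.gethostbyname(hostname)
            return True
        except socket.gaierror:
            time.sleep(delay)
    return False

# e.g. wait_for_dns("dashboard-YOUR_WORKSPACE_ID.example.com")
```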

Python Version

If you get strange errors when trying to run your Python code locally, check that you are using Python version 3.8.

For example, you may encounter a 'ModuleNotFoundError':

ModuleNotFoundError: No module named 'quixstreaming'

For information on how to set up your IDE for working with Quix, please check out this How To guide.
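A quick way to confirm which interpreter is actually running is a small guard like the following; the required version reflects the 3.8 requirement above:

```python
import sys

def check_python(required=(3, 8)):
    """Return True if the running interpreter matches the required major.minor."""
    return sys.version_info[:2] == required

if not check_python():
    print(f"Expected Python 3.8, got {sys.version.split()[0]}")
```

If the versions differ, point your IDE or virtual environment at a 3.8 interpreter before reinstalling packages such as quixstreaming.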

Jupyter Notebooks

If you are having trouble with Jupyter Notebooks or another consumer of Quix data, try using aggregation to reduce the number of records returned.

For more info on aggregation check out these docs.

We’ve also created a short video on it too.
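Aggregation here just means collapsing many raw records into fewer summary records before they reach the notebook. A library-free sketch of time-bucket averaging (the bucket size and sample data are arbitrary assumptions) looks like:

```python
def downsample(records, bucket_ms):
    """Average (timestamp, value) records into fixed-width time buckets."""
    buckets = {}
    for ts, value in records:
        key = ts - (ts % bucket_ms)   # start of this record's bucket
        buckets.setdefault(key, []).append(value)
    return [(key, sum(vals) / len(vals)) for key, vals in sorted(buckets.items())]

records = [(0, 1.0), (400, 3.0), (1100, 5.0)]  # timestamps in ms
print(downsample(records, bucket_ms=1000))
# Three raw records collapse into two aggregated ones.
```

Larger buckets return fewer records, at the cost of temporal resolution.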

Process Killed or Out of memory

If your deployment’s logs report "Killed" or "Out of memory" then you may need to increase the amount of memory assigned to the deployment.

You may experience this:

  • At build time, if you load large third-party packages into your code.

  • At runtime if you are storing large datasets in memory.
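For the runtime case, holding a whole dataset in memory can often be replaced by streaming it. This sketch contrasts the two approaches (the file handling is illustrative, not Quix-specific):

```python
def sum_all_at_once(path):
    """Loads every line into memory first -- can exhaust the container's RAM."""
    with open(path) as f:
        lines = f.readlines()  # whole file held in memory at once
    return sum(float(line) for line in lines)

def sum_streaming(path):
    """Processes one line at a time, keeping memory use flat."""
    total = 0.0
    with open(path) as f:
        for line in f:  # only one line in memory at a time
            total += float(line)
    return total
```

Both return the same result; only the peak memory use differs. Where streaming is not possible, increase the deployment's memory allocation instead.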