Error Code 5003 When Accessing Analytics Queries API

Hi,

I was trying to make a POST request to this endpoint via Postman:
https://api.bitmovin.com/v1/analytics/queries/count

Here is the request body:

{
    "start": "2022-07-18T17:00:00Z",
    "end": "2022-08-23T17:00:00Z",
    "licenseKey": "{{bitmovin_analytics_key}}",
    "limit": 200,
    "offset": 0,
    "filters": [
        {
            "name": "CUSTOM_DATA_3",
            "operator": "NE",
            "value": null
        }, {
            "name": "CUSTOM_USER_ID",
            "operator": "NE",
            "value": null
        }, {
            "name": "DOMAIN",
            "operator": "NE",
            "value": "localhost"
        }
    ],
    "groupBy": [
        "CUSTOM_DATA_3",
        "CUSTOM_USER_ID",
        "DAY",
        "BROWSER",
        "COUNTRY",
        "DOMAIN",
        "PLATFORM",
        "DEVICE_CLASS",
        "OPERATINGSYSTEM"
    ],
    "orderBy": [
        {
            "name": "CUSTOM_DATA_3",
            "order": "asc"
        }, {
            "name": "CUSTOM_USER_ID",
            "order": "asc"
        }, {
            "name": "DAY",
            "order": "asc"
        }
    ],
    "dimension": "IMPRESSION_ID"
}

But I keep getting this result:

{
    "requestId": "something-something",
    "status": "ERROR",
    "data": {
        "code": 5003,
        "message": "Error querying analytics!",
        "developerMessage": "Attempting to groupBy a very high cardinality column that cannot be aggregated. Please filter the query result."
    }
}

This POST request runs successfully if I only include up to 4 of those attributes in the groupBy.
The problem is that, if possible, I want to keep all of those attributes as they are.
Is there any solution?

Thanks in advance

Hi Phill, the error you’re getting indicates that the total number of elements the query would have to return is so high that it cannot be handled in a single query. Some limits are intentionally in place to protect performance and database stability. For example, when custom data is involved, the cardinality limit is 15,000 elements, as described in the Configuration Guide:

For customData field queries we allow a cardinality of maximum of 15,000 distinct values per customData field within the selected time-frame.

Similar limits apply to any other field that can have a high cardinality (number of distinct elements). So, to reduce the cardinality of your query result and get it to work, it would be advisable to add as many filters as possible and to reduce the groupBy elements to the minimum you need.
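
For illustration, a trimmed-down version of your query that filters on one specific customData value and groups by only a few fields could look like this (the CUSTOM_DATA_3 filter value is just a placeholder, use whatever is relevant for your analysis):

{
    "start": "2022-07-18T17:00:00Z",
    "end": "2022-08-23T17:00:00Z",
    "licenseKey": "{{bitmovin_analytics_key}}",
    "limit": 200,
    "offset": 0,
    "filters": [
        {
            "name": "CUSTOM_DATA_3",
            "operator": "EQ",
            "value": "some-specific-value"
        }, {
            "name": "CUSTOM_USER_ID",
            "operator": "NE",
            "value": null
        }, {
            "name": "DOMAIN",
            "operator": "NE",
            "value": "localhost"
        }
    ],
    "groupBy": [
        "CUSTOM_USER_ID",
        "DAY",
        "COUNTRY",
        "DEVICE_CLASS"
    ],
    "orderBy": [
        {
            "name": "DAY",
            "order": "asc"
        }
    ],
    "dimension": "IMPRESSION_ID"
}

Regards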

Hi Phill,
it also looks to me like you are trying to get a per-user breakdown of your data, which is something we don’t really support via the API, since with growing usage this would very quickly lead to an unsustainable number of API queries having to be sent.

For cases like this I would suggest using the Analytics Data Export feature to export all the data; you can then run your analysis on it any way you want.
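
For reference, an export task is created by POSTing to https://api.bitmovin.com/v1/analytics/exports. The sketch below is only an approximation: the field names, the "CSV"/"EVENTS" values and the output object are assumptions on my part, and the outputId has to reference a cloud storage output (e.g. S3) that you have configured beforehand, so please double-check the Analytics Export API reference for the exact schema:

{
    "name": "per-user-breakdown-export",
    "description": "Raw analytics data for offline analysis",
    "licenseKey": "{{bitmovin_analytics_key}}",
    "startTime": "2022-07-18T17:00:00Z",
    "endTime": "2022-08-23T17:00:00Z",
    "fileFormat": "CSV",
    "type": "EVENTS",
    "output": {
        "outputId": "{{your_output_id}}",
        "outputPath": "/analytics/exports/"
    }
}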

greetings Daniel
