This repository was archived by the owner on Mar 10, 2025. It is now read-only.

Problem with library: com.azure.cosmos.spark:azure-cosmos-spark_3-3_2-12:4.23.0 #486

@Spoccia

Description


Our team is developing Databricks notebooks using Spark and Scala. We are working on inserting data into collections in Cosmos DB.

Following multiple guides, we set the configuration fields:
"spark.cosmos.throughputControl.globalControl.container" = collection for throughput
"spark.cosmos.throughputControl.targetThroughputThreshold" = 0.2

to limit RU consumption in Cosmos DB, but after multiple executions it appears that the library is not honoring the limit we set.

In other words, the limit seems to be ignored: RU usage can grow up to 100%. We are opening this issue at Microsoft's suggestion.
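For reference, a minimal sketch of the throughput control configuration we would expect to need, based on the documented azure-cosmos-spark config keys. Note that the documentation also lists `spark.cosmos.throughputControl.enabled` and `spark.cosmos.throughputControl.name`, which are not mentioned above; if those are missing in our setup, that might be relevant. All endpoint, key, and database/container names below are placeholders, not our actual values:

```scala
// Hedged sketch, assuming placeholder account/database/container names.
// Per the azure-cosmos-spark docs, global throughput control needs the
// "enabled" flag and a control group "name" in addition to the two keys
// mentioned in this issue.
object ThroughputControlConfig {
  val cosmosWriteConfig: Map[String, String] = Map(
    "spark.cosmos.accountEndpoint" -> "https://<account>.documents.azure.com:443/",
    "spark.cosmos.accountKey"      -> "<account-key>",
    "spark.cosmos.database"        -> "<database>",
    "spark.cosmos.container"       -> "<target-container>",
    // Throughput control settings:
    "spark.cosmos.throughputControl.enabled"                   -> "true",
    "spark.cosmos.throughputControl.name"                      -> "WriteThroughputGroup",
    "spark.cosmos.throughputControl.targetThroughputThreshold" -> "0.2",
    "spark.cosmos.throughputControl.globalControl.database"    -> "<database>",
    "spark.cosmos.throughputControl.globalControl.container"   -> "ThroughputControl"
  )

  def main(args: Array[String]): Unit = {
    // In a Databricks notebook this map would be passed to the writer, e.g.:
    // df.write.format("cosmos.oltp").options(cosmosWriteConfig).mode("APPEND").save()
    println(cosmosWriteConfig("spark.cosmos.throughputControl.targetThroughputThreshold"))
  }
}
```

The documentation also states that the global-control container itself must be created with partition key `/groupId` and TTL enabled, which could be another point to verify.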

Thanks for your feedback
