Working with data management and synchronization can be challenging. Below we have listed a set of common challenges and their potential solutions to help you troubleshoot the issues you are most likely to encounter.


This document will be updated as we continuously identify new issues and work out adequate solutions.

Common issues with data synchronization

  1. Insert or update of an entity throws an exception because the target system is not idempotent (for example, re-inserting an entity that already exists fails).

  2. Deletes fail due to dependencies on the object you want to delete.

  3. Deletes fail because the object has already been deleted in the target system.

  4. Wrong order of inserts: an object has dependencies, such as a parent object, that must exist in the target system before it can be inserted.

  5. Batch inserts and updates: when a batch fails, it is difficult to locate which entities caused the failure.

  6. Reading data from a SQL database, or other systems that support incremental updates, without configuring partial rescans can lead to entities never being read from the source system.

  7. Data that contains erroneous values.

  8. Data that is in the wrong format.

  9. Microservices causing errors due to:

    1. Memory issues.

    2. Unstable or non-robust code.

    3. Lack of proper exception handling.

  10. Dead letter datasets that are configured but not monitored or handled.

  11. Duplicates in the target system.

  12. Pipes that are stuck and unresponsive.

  13. Expired certificates on the server.
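
Issue 5 above, locating the entities that make a batch fail, can be attacked by bisecting the failed batch and retrying each half. A minimal Python sketch, assuming a hypothetical `send_batch` function that raises an exception whenever the target system rejects a batch:

```python
def find_failing_entities(entities, send_batch):
    """Recursively bisect a failed batch to isolate the entities that
    cause it to fail. `send_batch` is assumed to raise an exception
    when the target system rejects the batch."""
    try:
        send_batch(entities)
        return []  # whole batch succeeded, nothing to report
    except Exception:
        if len(entities) == 1:
            return entities  # a single entity still fails: found a culprit
        mid = len(entities) // 2
        return (find_failing_entities(entities[:mid], send_batch)
                + find_failing_entities(entities[mid:], send_batch))
```

Note that the retries re-send entities that were part of an earlier failed batch, so this approach is only safe when the target system tolerates repeated writes (see issue 1).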


Common solutions to some of the operational issues above are:

  1. The result/status/feedback from the target system for inserts, updates and deletes should be stored in a dataset in Sesam. Entity failures need to be separated from system failures.

  2. Enable partial rescans on all connect pipes that read data using incremental updates.

  3. Data in dead letter datasets should be collected and sent to a monitoring system or other error-handling systems.

  4. Data being sent from Sesam should be validated.

  5. Error messages from deletes that fail because the object has already been deleted should be ignored.

  6. Insert pipes should have safeguards against sending duplicates to the target system.
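
Solutions 1, 5 and 6 can be combined in a single write routine. The snippet below is an illustrative Python sketch, not Sesam configuration; the `target` client, its `NotFoundError`, and the `feedback_log`/`sent_ids` structures are all assumptions made for the example:

```python
class NotFoundError(Exception):
    """Raised by the hypothetical target client when an object is missing."""

def sync_entity(entity, target, feedback_log, sent_ids):
    """Write one entity to the target system, recording feedback for
    every operation (solution 1), ignoring deletes of already-deleted
    objects (solution 5), and skipping duplicate inserts (solution 6)."""
    eid = entity["_id"]
    if entity.get("_deleted"):
        try:
            target.delete(eid)
            feedback_log.append({"_id": eid, "status": "deleted"})
        except NotFoundError:
            # Already gone in the target system: treat as success.
            feedback_log.append({"_id": eid, "status": "already-deleted"})
        sent_ids.discard(eid)
        return
    if eid in sent_ids:
        # Safeguard against sending the same insert twice.
        feedback_log.append({"_id": eid, "status": "skipped-duplicate"})
        return
    target.upsert(entity)
    sent_ids.add(eid)
    feedback_log.append({"_id": eid, "status": "ok"})
```

In a real setup the `feedback_log` would be written back to a dataset in Sesam so that entity failures can be inspected separately from system failures.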
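
Solution 4, validating data before it is sent from Sesam, can be as simple as checking required fields and types before an entity reaches the write path. A hypothetical sketch (the field names and type map are made up for illustration):

```python
def validate_entity(entity, required_fields):
    """Return a list of validation errors for one entity; an empty
    list means the entity is safe to send to the target system."""
    errors = []
    for field, expected_type in required_fields.items():
        if field not in entity:
            errors.append(f"missing field: {field}")
        elif not isinstance(entity[field], expected_type):
            errors.append(f"wrong type for {field}: "
                          f"expected {expected_type.__name__}, "
                          f"got {type(entity[field]).__name__}")
    return errors
```

Entities that fail validation can then be routed to a dead letter dataset instead of the target system, which ties in with solution 3.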