Not a bug

UNIFYConnect - Duplicate Account - Stops Sync

Dimitra Atkins 1 year ago in UNIFYBroker/Microsoft Active Directory updated by Matthew Davis (Technical Product Manager) 1 year ago 7

When an account is created manually in AD, Broker's sync stops and no other changes flow.

Why would this occur: 

  • A user may be needed quickly, so someone intervenes manually.

This is causing issues for multiple customers who perform this activity, unknowingly breaking the sync and causing a system outage.

Can this be configured so that other changes are not halted?

Thanks

Answer

Not a bug

If multiple entries were created, then the solution is performing as expected based on the configuration. Broker expects that the defined unique key stays unique according to the data source output. If this key stops being unique, processing stops to prevent unintended changes to the data. 
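As a rough illustration (not Broker's actual code, and the function and data shapes here are assumptions for this sketch), the halt behaviour amounts to a join lookup that refuses to choose between multiple matching targets:

```python
def resolve_join(source_key, targets):
    """Return the id of the single target whose join key matches source_key.

    targets: dict mapping target id -> join key value (hypothetical shape).
    Returns None when nothing matches; raises when the match is ambiguous,
    rather than silently picking one target.
    """
    matches = [tid for tid, key in targets.items() if key == source_key]
    if len(matches) > 1:
        raise ValueError(
            f"Source key {source_key!r} cannot be joined to ambiguous "
            f"join targets: {matches}. Cannot proceed with join."
        )
    return matches[0] if matches else None
```

Raising instead of guessing is the safer default: a wrong join could push changes to the wrong account, which is far harder to undo than a paused sync.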

If supporting duplicates is needed, the solution configuration may need to be modified to support this scenario. Otherwise, some education exercises with the customer may be necessary to assist in this area to avoid the problem reoccurring. 

We've got a backlog item to review better resiliency around joins and handling duplicates; however, it is a large item involving a significant number of edge cases, so it is scheduled for investigation before our next major product version release.

Under review

Hi Dimi,

A manual account being created in AD shouldn't by itself impact the sync of UNIFYBroker. For the sync to stop, something else must have happened (for example, the creation of duplicate accounts). 

Do you have any information on any errors that were being thrown, and the state of the system at the time? Have you been able to reliably replicate the problem in a lower environment for us to work through the behaviour? 

Request to sync changes on link Employee > Active Directory User (e3358877-2cd7-41e2-b103-9243cf9f6a4f) in direction outgoing failed with message Source entity 'b7b827c0-0118-449c-b621-b4ce8cfbc9b3' cannot be joined to ambiguous join targets: [d5328418-a2d9-4f97-aaa6-0152a3ebffff, 2e3847df-b3f3-415b-aa91-750b7ff5a74a]. Cannot proceed with join. [Count:4268]. Duration: 00:00:01.7543079

Thanks for providing the error. This error would show when the join criteria defined in the solution is unable to find a unique entity to join to. The join criteria is an implementation detail, not something the product controls. This error would normally only occur if multiple accounts with the same join identifier were created, not because a single manual account creation occurred.
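As a preventative step, duplicate join identifiers can be detected in the source data before they break a sync. A minimal sketch, assuming account records as dicts and the PersonNumber field mentioned later in this thread (both hypothetical shapes, not a Broker API):

```python
from collections import Counter

def find_duplicate_join_keys(entities, key_field="PersonNumber"):
    """Return join-key values that appear on more than one entity.

    entities: iterable of dicts, one per account (hypothetical shape).
    Entities missing the key field are ignored.
    """
    counts = Counter(e[key_field] for e in entities if e.get(key_field))
    return sorted(key for key, count in counts.items() if count > 1)
```

Running a check like this against an export of the source data after any manual account creation would surface clashes before the next sync attempts the join.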

Are you able to share where you think the bug in the product lies?

Multiple accounts are created, see images. When this occurs, the sync stops functioning. 

Thanks Dimi. The screenshots you've provided show two different PersonNumber values, which, depending on the solution configuration, may mean it is intentional that both exist. 

Are you able to replicate the problem in a lower environment using the same steps the customer did? That might be easier to investigate. Has the SME been able to provide any insights into what they believe the root cause of the problem is?

Yes, we spoke to the customer; the duplicates weren't intentional. The customer has deleted one of the entries. 

No, we haven't tried to replicate the issue in a lower environment. We only have a T & M support contract. 
