Identity Broker Forum

Welcome to the community forum for Identity Broker.

Browse the knowledge base, ask questions directly to the product group, or leverage the community to get answers. Leave ideas for new features and vote for the features or bug fixes you want most.

+1
Won't fix

How can we re-trigger an AD User provision?

Andrei Nicolas 1 year ago in UNIFYBroker Service updated 1 year ago 4

Hi All

Is there a way to re-trigger an outbound provision going from the Locker to AD?

- We have done a baseline sync
- We have checked the criteria required for a provision to AD

Is this a bug, or is there an existing solution?

Many thanks

Andrei


+1
Planned

Allow consultants to add any SCIM attribute to the SCIM gateway configuration

Allow consultants to add any SCIM attribute (core or extension) to the SCIM gateway configuration.

+1
Completed

Scheduled execution of Test Connection on agents

Adrian Corston 4 years ago in UNIFYBroker Service updated by Matthew Davis (Technical Product Manager) 3 years ago 1

An automated periodic execution of the Test Connection functionality on an agent, for UNIFYMonitor to pick up and report on, would give UNIFY early warning that a low-level service problem exists.

[Bob's suggestion]

Answer

Hi Adrian,

This could be completed using the Scheduled Jobs feature of the UNIFYBroker logging engine. This gives access to the $components.AgentEngine component, which has a method void Test(Guid agentId) that could be used to execute the tests. Alternatively, you could call the REST API from a scheduled job to execute the task.
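As a minimal sketch only (the agent ID below is a placeholder, and the exact scheduled-job scripting surface should be confirmed for your UNIFYBroker version), a scheduled job script along those lines might look like:

# Hypothetical scheduled job script: run Test Connection for a specific agent.
# Replace the placeholder GUID with the ID of the agent to test.
$agentId = [Guid]"00000000-0000-0000-0000-000000000000"

# $components.AgentEngine exposes Test(Guid agentId), per the answer above.
$components.AgentEngine.Test($agentId)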

0
Under review

Register Contributions where the contributing attribute is from a joining connector?

David Poyner 3 months ago in UNIFYBroker Service updated 2 months ago 2

Is it possible to call Register-Contribution in an adapter PowerShell schema where the source field is not from the base connector but from a joining connector?

I have been working on fixing the Register-Contribution calls in a customer environment; however, some of the imported values that are eventually used for time offset flag calculations come from a non-base connector via the "Join on" transformation.

When I test using an imported change from the base connector, the changes are scheduled for the correct time. When I test using an imported change from a joining connector, my test fails (the imported change flows to the adapter but does not appear to register the future change). This leads me to the conclusion that an imported change from a joining connector may not register future changes.

Is this correct? If so, is there a solution, given that the termination/end dates required for some of the calculations are not currently available on the base connector?

0
Answered

PowerShell group connector returning null for dn attribute

Hayden Gray 3 months ago in UNIFYBroker Service updated by Matthew Davis (Technical Product Manager) 3 months ago 2

Version v5.3.2 Revision #0

I have a PowerShell connector that queries a database to build groups, including their memberships. When importing, it returns the following error. The connector does not have an associated adapter.

I have another connector that uses the same script and works just fine, which would indicate a data issue; however, there are no obvious fields that are null.

Connector Processor Connector processing failed.
Connector Processing page 1 for connector Test Group Errors failed with reason Value cannot be null.
Parameter name: dn. Duration: 00:00:18.4144781.
Error details:
System.ArgumentNullException: Value cannot be null.
Parameter name: dn
at Unify.Framework.IO.DistinguishedName.op_Implicit(DistinguishedName dn)
at Unify.Product.IdentityBroker.Repository.EntityDistinguishedNameValueDataUtility`1.ConvertValueToString(DistinguishedNameValue value)
at Unify.Product.IdentityBroker.Repository.StringBasedValueDataUtilityBase`2.SetEntityValue(__EntityValueInsertRow dataValue, TValue value)
at Unify.Product.IdentityBroker.Repository.EntitySingleValueDataUtilityBase`2.CreateEntityValue(TEntityKey key, IValue value, IEntityCollectionKeyUtility`1 collectionKeyUtility, EntityDataSet set, __EntityInsertRow row, EntityDataContext sourceContext)
at Unify.Product.IdentityBroker.Repository.KnownEntityContextBase`4.ConvertEntityValueToDataValue(KeyValuePair`2 entityValueAndKey, __EntityInsertRow row, EntityDataSet entityDataSet, EntityDataContext sourceContext)
at Unify.Product.IdentityBroker.Repository.KnownEntityContextBase`4.<>c__DisplayClass33_0.b__0(KeyValuePair`2 entityValueAndKey)
at System.Linq.Enumerable.WhereSelectEnumerableIterator`2.MoveNext()
at System.Linq.Enumerable.d__17`2.MoveNext()
at Unify.Framework.Visitor.Visit[T](IEnumerable`1 visitCollection, Action`2 visitor)
at Unify.Product.IdentityBroker.Repository.KnownEntityContextBase`4.InsertItems(ISet`1 addedItems, EntityDataContext sourceContext, SqlConnection connection)
at Unify.Framework.Data.LinqContextConversionBase`4.SubmitChanges()
at Unify.Product.IdentityBroker.SaveChangedEntitiesTransformationUnit.Transform(IDictionaryTwoPassDifferenceReport`4 input)
at Unify.Product.IdentityBroker.ConnectorEntityChangeProcessor.ProcessEntities(IEnumerable`1 connectorEntities, IEnumerable`1 repositoryEntities, IEntityChangesReportGenerator`2 reportGenerator)
at Unify.Product.IdentityBroker.RepositoryChangeDetectionWorkerBase.PerformChangeDetectionOnConnectorEntityPage(IEnumerable`1 connectorEntities, Int32& index, Int32 entitiesProcessedSoFar, IEntityChangesReportGenerator`2 reportGenerator, IHashSet`1 seenKeys)
at Unify.Product.IdentityBroker.RepositoryChangeDetectionWorkerBase.<>c__DisplayClass11_1.b__0(IEnumerable`1 page)
at Unify.Framework.Visitor.ThreadsafeVisitorEvaluator`1.ThreadsafeItemEvaluator.Evaluate()



Change detection engine Change detection engine import all items failed.
Change detection engine import all items for connector Test Group Errors failed with reason An error occurred while evaluating a task on a worker thread. See the inner exception details for information.. Duration: 00:00:49.7519745
Error details:
Unify.Framework.EvaluatorVisitorException: An error occurred while evaluating a task on a worker thread. See the inner exception details for information. ---> System.ArgumentNullException: Value cannot be null.
Parameter name: dn
at Unify.Framework.IO.DistinguishedName.op_Implicit(DistinguishedName dn)
at Unify.Product.IdentityBroker.Repository.EntityDistinguishedNameValueDataUtility`1.ConvertValueToString(DistinguishedNameValue value)
at Unify.Product.IdentityBroker.Repository.StringBasedValueDataUtilityBase`2.SetEntityValue(__EntityValueInsertRow dataValue, TValue value)
at Unify.Product.IdentityBroker.Repository.EntitySingleValueDataUtilityBase`2.CreateEntityValue(TEntityKey key, IValue value, IEntityCollectionKeyUtility`1 collectionKeyUtility, EntityDataSet set, __EntityInsertRow row, EntityDataContext sourceContext)
at Unify.Product.IdentityBroker.Repository.KnownEntityContextBase`4.ConvertEntityValueToDataValue(KeyValuePair`2 entityValueAndKey, __EntityInsertRow row, EntityDataSet entityDataSet, EntityDataContext sourceContext)
at Unify.Product.IdentityBroker.Repository.KnownEntityContextBase`4.<>c__DisplayClass33_0.b__0(KeyValuePair`2 entityValueAndKey)
at System.Linq.Enumerable.WhereSelectEnumerableIterator`2.MoveNext()
at System.Linq.Enumerable.d__17`2.MoveNext()
at Unify.Framework.Visitor.Visit[T](IEnumerable`1 visitCollection, Action`2 visitor)
at Unify.Product.IdentityBroker.Repository.KnownEntityContextBase`4.InsertItems(ISet`1 addedItems, EntityDataContext sourceContext, SqlConnection connection)
at Unify.Framework.Data.LinqContextConversionBase`4.SubmitChanges()
at Unify.Product.IdentityBroker.SaveChangedEntitiesTransformationUnit.Transform(IDictionaryTwoPassDifferenceReport`4 input)
at Unify.Product.IdentityBroker.ConnectorEntityChangeProcessor.ProcessEntities(IEnumerable`1 connectorEntities, IEnumerable`1 repositoryEntities, IEntityChangesReportGenerator`2 reportGenerator)
at Unify.Product.IdentityBroker.RepositoryChangeDetectionWorkerBase.PerformChangeDetectionOnConnectorEntityPage(IEnumerable`1 connectorEntities, Int32& index, Int32 entitiesProcessedSoFar, IEntityChangesReportGenerator`2 reportGenerator, IHashSet`1 seenKeys)
at Unify.Product.IdentityBroker.RepositoryChangeDetectionWorkerBase.<>c__DisplayClass11_1.b__0(IEnumerable`1 page)
at Unify.Framework.Visitor.ThreadsafeVisitorEvaluator`1.ThreadsafeItemEvaluator.Evaluate()
--- End of inner exception stack trace ---
at Unify.Framework.Visitor.ThreadsafeVisitorEvaluator`1.CheckForException()
at Unify.Framework.Visitor.ThreadsafeVisitorEvaluator`1.WaitForCompletedThreads()
at Unify.Framework.Visitor.ThreadsafeVisitorEvaluator`1.Visit()
at Unify.Framework.Visitor.VisitEvaluateOnThreadPool[T](IEnumerable`1 visitCollection, Action`2 visitor, Int32 maxThreads)
at Unify.Product.IdentityBroker.RepositoryChangeDetectionWorkerBase.PerformChangeDetection(IEnumerable`1 connectorEntities)
at Unify.Product.IdentityBroker.ChangeDetectionImportAllJob.ImportAllChangeProcess()
at Unify.Product.IdentityBroker.ChangeDetectionImportAllJob.RunBase()
at Unify.Framework.DefinedScopeJobAuditTrailJobDecorator.Run()
at Unify.Product.IdentityBroker.ConnectorJobExecutor.<>c__DisplayClass30_0.b__0()
at Unify.Framework.AsynchronousJobExecutor.PerformJobCallback(Object state)

Answer

Thanks for the update, Hayden. I was just about to respond - it seems a 'not-quite-null' value was being parsed into a DN field, so when Broker tried to store it in the entity context it couldn't obtain a valid string value to store. Some types in Broker, including the DN type, treat an empty value differently from a null value - if anything other than null is seen, a conversion is attempted (and in this case, fails).
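As an illustrative sketch only (the $row and GroupDn names below are placeholders for whatever the connector script actually uses), a guard in the PowerShell import script that coerces empty or whitespace-only values to null would avoid the failed DN conversion:

# Hypothetical guard: store null rather than an empty string so Broker does not
# attempt to convert a 'not-quite-null' value into a DN.
if ([string]::IsNullOrWhiteSpace($row.GroupDn)) {
    $dnValue = $null
} else {
    $dnValue = $row.GroupDn
}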

0
Under review

Reflect change entities to adapter errors about duplicate entries when there are no duplicate key values in the connector.

Hayden Gray 3 months ago in UNIFYBroker Service updated by Beau Harrison (Senior Product Software Engineer) 3 months ago 1

I've been having issues with particular PowerShell connectors/adapters in UNIFYBroker where reflecting change entities to the adapter complains about duplicate entries when there are no duplicate key values in the connector.

The schema setup between the connectors and adapters is an ID key in the connector that is then used within the adapter as the DN, so it is a very simple DN template. E.g.:

Name         Type    Key   Read-only  Required
AccountName  String  True  True       True

Distinguished Name Template: CN=[AccountName]
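With this template each adapter entity's DN is derived directly from the key, so a duplicate DN should only be possible if the key value itself repeats. As an illustrative check only (assuming the source rows are available in a variable such as $rows, a placeholder name), duplicates in the connector data can be listed with:

# Hypothetical duplicate-key check over the connector source data.
# $rows is a placeholder for however the source rows are obtained (e.g. a CSV or database export).
$rows | Group-Object -Property AccountName | Where-Object { $_.Count -gt 1 } | Select-Object Name, Count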

The issues are generally fixed by clearing and repopulating the whole adapter, which is not a sustainable solution since the problem recurs on a weekly basis, sometimes more often.

These errors also don't seem to happen after any obvious failures on the connector side, which is what I had previously attributed these issues to. All these connectors have deletion thresholds set up of at least 50%.

It's as though Broker gets itself tied up, even though the schedules in the environment have been reduced to the point where only one operation is interacting with Broker at a time.

SQL maintenance is also performed frequently and the SQL instance has plenty of resources allocated.

Version details: v5.3.2 Revision #0

Any help would be appreciated as this has been a long ongoing issue that I've seen across multiple environments.

Adapter
Adapter eb42757f-2f23-4228-928e-993942b0c050 page errored on page reflection. Duration: 00:00:21.5551444. Error: Unify.Framework.UnifyDataException: Duplicate DNs detected on adapter eb42757f-2f23-4228-928e-993942b0c050. Reflection failed. Duplicate DNs: CN=<obfuscated name>,OU=sIAMGroups,DC=IdentityBroker, CN=<obfuscated name 2>,OU=sIAMGroups,DC=IdentityBroker.
at Unify.Product.IdentityBroker.DuplicateDnDetector.DetectDuplicateDns(IDictionaryTwoPassDifferenceReport`4 report)
at Unify.Product.IdentityBroker.Adapter.ReflectChangesInner()
at Unify.Product.IdentityBroker.Adapter.ReflectChanges()
at Unify.Product.IdentityBroker.AdapterAuditingDecorator.ReflectChanges()
at Unify.Product.IdentityBroker.AdapterNotifierDecorator.ReflectChanges()
at Unify.Product.IdentityBroker.ReflectAdapterOnChangeDueJob.b__9_0(IOperationalAdapter adapter).
Error details:
Unify.Framework.UnifyDataException: Duplicate DNs detected on adapter eb42757f-2f23-4228-928e-993942b0c050. Reflection failed. Duplicate DNs: CN=<obfuscated name>,OU=sIAMGroups,DC=IdentityBroker, CN=<obfuscated name 2>,OU=sIAMGroups,DC=IdentityBroker.
at Unify.Product.IdentityBroker.DuplicateDnDetector.DetectDuplicateDns(IDictionaryTwoPassDifferenceReport`4 report)
at Unify.Product.IdentityBroker.Adapter.ReflectChangesInner()
at Unify.Product.IdentityBroker.Adapter.ReflectChanges()
at Unify.Product.IdentityBroker.AdapterAuditingDecorator.ReflectChanges()
at Unify.Product.IdentityBroker.AdapterNotifierDecorator.ReflectChanges()
at Unify.Product.IdentityBroker.ReflectAdapterOnChangeDueJob.b__9_0(IOperationalAdapter adapter)


Adapter Request to reflect change entities of the adapter.
Request to reflect change entities of the eMinerva Student: Groups (eb42757f-2f23-4228-928e-993942b0c050) adapter errored with message: Duplicate DNs detected on adapter eb42757f-2f23-4228-928e-993942b0c050. Reflection failed. Duplicate DNs: CN=<obfuscated name>,OU=sIAMGroups,DC=IdentityBroker, CN=<obfuscated name 2>,OU=sIAMGroups,DC=IdentityBroker.. Duration: 00:00:59.9516712
Error details:
Unify.Framework.UnifyDataException: Duplicate DNs detected on adapter eb42757f-2f23-4228-928e-993942b0c050. Reflection failed. Duplicate DNs: CN=<obfuscated name>,OU=sIAMGroups,DC=IdentityBroker, CN=<obfuscated name 2>,OU=sIAMGroups,DC=IdentityBroker.
at Unify.Product.IdentityBroker.DuplicateDnDetector.DetectDuplicateDns(IDictionaryTwoPassDifferenceReport`4 report)
at Unify.Product.IdentityBroker.Adapter.ReflectChangesInner()
at Unify.Product.IdentityBroker.Adapter.ReflectChanges()
at Unify.Product.IdentityBroker.AdapterAuditingDecorator.ReflectChanges()
at Unify.Product.IdentityBroker.AdapterNotifierDecorator.ReflectChanges()
at Unify.Product.IdentityBroker.ReflectAdapterOnChangeDueJob.b__9_0(IOperationalAdapter adapter)

0
Under review

Jobs Stuck Processing

Liam Schulz 10 months ago in UNIFYBroker Service updated by Matthew Davis (Technical Product Manager) 10 months ago 1

Hi,

We have observed that jobs such as connector imports and link synchronizations will occasionally get stuck in a processing loop and not complete. This blocks operations, as Broker cannot import or synchronize new data. To clear the process, a restart has to be performed. Attempting to cancel the job has no effect.

This happens intermittently and there doesn't appear to be a consistent way of reproducing it. I understand this makes the issue difficult to troubleshoot, so are there other possibilities for a solution we could explore? For example, could a timeout be introduced so that the job is killed if it runs for too long without completing?

Let me know your thoughts and feedback.

Thanks,
Liam

0
Under review

Link Synchronization not triggering

Hayden Gray 11 months ago in UNIFYBroker Service updated by Matthew Davis (Technical Product Manager) 11 months ago 1

In a UNIFYConnect test environment, when attempting to perform a baseline synchronisation or delta changes synchronisation on the link, the buttons seemingly do nothing. The link is between a CSV connector/adapter and a locker with about 10k entities. Nothing else is currently running: all other link schedules are disabled, no connectors are importing, and no change reflection operations are running.

When I click a button to sync, the page refreshes as if it has executed, but nothing else happens. Nothing appears under the "Recent Jobs" section of the link page, and only two messages are written to the log:

23/Jan/2024 23:29:28
Information
Link Request to manually queue a baseline synchronization job on link started.
Request to manually queue a baseline synchronization job on link Managed User > AD User started.
23/Jan/2024 23:29:28
Information
Link Request to manually queue a baseline synchronization job on link completed.
Request to manually queue a baseline synchronization job on link 'Managed User > AD User' completed. Duration: 00:00:00.0310830

Is there a way I can see what is stopping the sync operation from executing? Let me know if you need more information.

Thank you

0
Under review

Gateway was unable to be started due to One or more errors occurred.

Liam Schulz 1 year ago in UNIFYBroker Service updated by Matthew Davis (Technical Product Manager) 1 year ago 3

Hi,

We have seen across multiple Broker instances that the following error occurs for LDAP gateways:
"The gateway <gateway name> (guid) was unable to be started due to One or more errors occurred."

Unfortunately there doesn't seem to be much more information than what is provided in the log. Further examination of the log file with CMTrace doesn't reveal any more information.

In one particular affected customer's case, I checked the Azure Provisioning service to see if there was any significant event that may have caused this, but could not find anything there either.

The workaround is to Recycle the gateway, but this currently relies on manually checking whether the error has occurred, and it appears to be happening frequently. We would like to address the root cause if possible.

Are there additional logging levels that could be applied to find out what is causing this?

Thanks,
Liam

0
Not a bug

Time Offset Flag not re-evaluated when current time passes source field timestamp

My customer is failing UAT of the solution configuration because Time Offset Flags are not automatically updating when the source field timestamp is passed. There have not been any Clear Entity Changes run in this environment for many months, and entity data fields have been updated recently as part of the customer's UAT processes.

Example: entity ID 70cb5e8e-8a8d-48f9-a123-911a836574f4 in partition 838b79fc-a31e-4b70-bcc8-e94550b3ff57 in ACCC TEST.  PostEnd, based on EndUTC, is still "No" but it should be "Yes".

Could you please check entity ID e71b989c-1a01-4542-9334-8e69c12abb6c on that same partition, which is due to see PostEnd change from "No" to "Yes" around 15 hours from now (8/22/2023 2:00:00 PM UTC)? Please confirm that it is all fine (i.e., a future change exists in the database); then, after the timestamp has passed, I'll check and verify whether the PostEnd value has updated automatically as it should.

Answer

Changes are registered for the times that the contributing transformations dictate they need to be registered. In this case, a transformation may have determined, based on the current (or a previous) configuration, that a change should be created for that time because a date time offset flag or similar needed to be recalculated at that point. It may have come from one of the time offset fields that already has a value and is known to recalculate at that time.

The original part of the ticket is as you suspect: clearing pending changes will remove future-dated changes, and they won't be recreated through a generate changes process (which is something we've improved for UNIFYBroker 6.0).

The feature (Register-Contribution) was added to handle a scenario where a PowerShell transformation is adding or modifying a field which is used in a transformation that can calculate future-dated changes (such as the Time Offset Flag transformation). With a normal chain of transformations (PowerShell aside), each adapter tracks the chain of where its source data came from. This is done so we can calculate whether or not a change is needed for an entity: for example, if field X from relational connector A is used through a chain of transformations to calculate a datetime offset, we need to know on import whether field X has a value which would trigger a change in the future, so we can register that change in the database and have it recalculated at the right time rather than too early or too late. This calculation process is called Change Detection in the Broker engine.

Traditionally, the PowerShell transformation had no way of letting the Broker engine know how entity fields were being used, so it could not participate in the change detection process. This also meant it broke downstream change detection: if a PowerShell transformation outputs a field value which is then used in a Time Offset Flag transformation, the engine had no way of knowing the source field of that value and so could not register future-dated changes.

The Register-Contribution call allows you to register this linkage so the engine knows how to handle those scenarios. As an example, you may have an EndDate schema field coming from the parent or relational connector. You take this EndDate field and use a PowerShell transformation to convert it to UTC time, placing the result in an EndTimestampUTC field. That EndTimestampUTC field is then used to calculate the PostEnd field through a Time Offset Flag. You would use the Register-Contribution call here to essentially tell the adapter that "the EndDate field passes through this script and results in the EndTimestampUTC field, so a change on the EndDate field can be used to calculate changes on the EndTimestampUTC field". The adapter can then use that linkage to know that a change which sets EndDate to a future date can be used to register a future-dated change based on the configuration of the Time Offset Flag transformation.

The configuration would look similar to the screenshots in the linked ticket (Images 6511, 6512 and 6513, not reproduced here).

TL;DR: 

The PowerShell transformation does not participate in the change detection process by default. This can be enabled by manually describing the fields which contribute in some way to other fields, either created by the transformation, or pre-existing.

To do this, call the Register-Contribution method in the Schema Script for each instance of one field contributing to another.

# 'fieldA' contributes to the resulting value in 'fieldB'
Register-Contribution("fieldB", "fieldA"); 
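Applied to the example above (the EndDate, EndTimestampUTC and PostEnd fields described earlier, following the same generic call form), a sketch would be:

# EndDate (from the parent/relational connector) contributes to EndTimestampUTC,
# which the downstream Time Offset Flag transformation uses to calculate PostEnd
# and to schedule future-dated changes.
Register-Contribution("EndTimestampUTC", "EndDate");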

Manually registering field contribution isn't required in most cases and for all fields, and can normally be omitted from the schema script. The typical situations where contribution registration would be required involve a PowerShell transformation preceding a Time Offset Flag or Business Day Offset transformations, where the contribution chain of the the involved Timestamp or Date fields is required to correctly schedule future-dated changes.