Identity Broker Forum

Welcome to the community forum for Identity Broker.

Browse the knowledge base, ask questions directly to the product group, or leverage the community to get answers. Leave ideas for new features and vote for the features or bug fixes you want most.

+1
Under review

Test harness for Adapter and Link PowerShell Transformations

Bob Bradley 4 years ago in UNIFYBroker/Plus updated by Matthew Davis (Technical Product Manager) 2 years ago 1

In order to support the unit-testing requirements for transitioning PowerShell solutions on Broker+ to the UNIFYConnect hosted platform, a test harness is required for all PowerShell transformations.
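A minimal sketch of what such a harness could look like, assuming the transformation logic is factored into a function that can be dot-sourced into a Pester test (the script path and the Invoke-Transformation function name here are hypothetical):

    Describe 'Person PowerShell transformation' {
        BeforeAll {
            # Hypothetical path: dot-source the transformation logic under test
            . "$PSScriptRoot\PersonTransformation.ps1"
        }

        It 'maps the source fields to the expected target value' {
            $entity = [pscustomobject]@{ GivenName = 'Ada'; Surname = 'Lovelace' }
            $result = Invoke-Transformation -Entity $entity   # hypothetical function name
            $result.DisplayName | Should -Be 'Ada Lovelace'
        }
    }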

+1
Completed

Port UNIFYBroker to Azure O/S Platform

Bob Bradley 4 years ago updated by Matthew Davis (Technical Product Manager) 4 years ago 4

It is becoming an impediment to future UNIFY* opportunities, particularly in the hosted solution space, that UNIFYBroker runs only on the Windows Server O/S. Porting it to run natively on Azure would significantly reduce the current hosting impediments while retaining the natural partitioning between sites that comes from hosting the service within a VM, which would be of significant benefit to all parties from Sales to Implementation. It would also make the idea of having Broker configurable by third parties more of a possibility.

Answer

Capability currently provided through the UNIFYConnect service offering. Can be provided for demos or PoCs as necessary.

Further improvements will be provided in a future release of the product (version 6.0).

+1
Under review

Attempting to retrieve the CollectionKeyId for caption "FieldName" failed.

Carol Wapshere 6 years ago in UNIFYBroker/Aurion updated by Matthew Davis (Technical Product Manager) 2 years ago 3

I have added a new string field to the Aurion Person connector "ExtraField1". We already had "ExtraField2" (which was working).

The config already had a mapping:

<attribute name="Extra_Field_2" target="ExtraField2">

I have added underneath that:

<attribute name="Extra_Field_1" target="ExtraField1">

When I try to run the Import All now it runs for quite a while (this report takes a long time to generate), then fails with the error:

Attempting to retrieve the CollectionKeyId for caption ExtraField2 failed. No collection key found for that caption.

What has gone wrong?

I will send full error and config files by email once someone picks this up.

+1
Fixed

MIM Adapter Error if no IDB Adapters Enabled

Tested Against: Identity Broker v5.3

Currently if you have no adapters enabled in IDB, and you attempt to create an MA in MIM using the MIM Adapter ECMA2, you get the following error:

The extensible extension returned an unsupported error.
  
The stack trace is:
 "System.InvalidOperationException: Sequence contains no elements
   at System.Linq.Enumerable.Aggregate[TSource](IEnumerable`1 source, Func`3 func)
   at Unify.Product.IdentityBroker.LdapConnectionProxy.get_Schema()
   at Unify.Product.IdentityBroker.UnifyLdapConnectorTypeProxy.GetSchema(KeyedCollection`2 configParameters)
Forefront Identity Manager 4.4.1302.0"

It would be good if the error could either be reported in a more logical way (i.e. indicate that there are no adapters enabled, and therefore no OUs to load), or if the creation process could simply be allowed to continue, so that the user realises there are no adapters enabled in a subsequent step.


The error also occurs if you have adapters that are enabled with a valid schema but are inhibited due to a condition with the base connector.
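For context, the exception in the trace is the standard .NET behaviour of Enumerable.Aggregate when it is given an empty sequence, which is what happens when there are no adapter schemas to merge. A minimal PowerShell demonstration of that behaviour (not product code):

    # Stand-in for "no enabled adapters": an empty sequence
    $schemas = [int[]]@()
    # Aggregate over an empty sequence throws InvalidOperationException:
    # "Sequence contains no elements", matching the stack trace above
    [System.Linq.Enumerable]::Aggregate($schemas, [System.Func[int, int, int]]{ param($a, $b) $a + $b })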

Answer

Fixed; this will be included in the next release.

+1
Completed

Identity Broker dashboard enhancements

Andrew Silcock 8 years ago updated by anonymous 7 years ago 3

In doing development I found myself continually jumping between IDB Connector and Adapter pages to look at high-level statistics such as polling object counts and pending changes on a few adapters; this can result in having half a dozen tabs open for this purpose.

As an enhancement it would be nice if the IDB Dashboard displayed some more high-level statistics, such as last run time/status, object counts and pending changes (for adapters), to give a more complete view of the system state.

Answer
anonymous 7 years ago

This is definitely something that will be considered if/when we do the UI rewrite.

+1
Answered

Entity in IdB connector and adapter but does not exist in target directory

Carol Wapshere 8 years ago in PowerShell connector updated by anonymous 8 years ago 3

IdB 5, PowerShell connector, target system is Red Hat LDAP.


There are three objects which exist as entities in the IdB connector and adapter but do not exist in LDAP. FIM is trying to update them and we're getting "Object does not exist" errors back from LDAP.


Connector Full Imports have been run. I turned on the verbose logging I'd added to the script, which lists the DN of every object found by the Import script, and these objects are not listed. I can't see any errors in the IdB log, and the Full Import appears to have completed successfully.


So the question is, if they were not imported in a connector full import, shouldn't the entities have been removed from IdB?

Answer
anonymous 8 years ago

Looking at the logs shows that there were exported entities during the full import. The import logic is designed to not delete entities that are added whilst an import is occurring, as it has no way of knowing whether the end system is omitting the entry because it was deleted immediately or because it’s just not available yet for the import (e.g. snapshot or read copy/write copy style systems).
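A hedged sketch of that rule (not the product's code; the parameter and property names are illustrative): an entity is only removed when it existed before the import started and was not returned by the import.

    function Get-EntitiesToRemove {
        param(
            [hashtable] $CachedEntities,   # key -> entity, as cached before the import
            [string[]]  $ImportedKeys,     # keys returned by the connector full import
            [datetime]  $ImportStarted
        )
        foreach ($key in $CachedEntities.Keys) {
            $entity            = $CachedEntities[$key]
            $seenInImport      = $ImportedKeys -contains $key
            $addedDuringImport = $entity.Created -ge $ImportStarted   # illustrative property
            # Keep entities added while the import was running; only prune
            # pre-existing entities that the import did not return
            if (-not $seenInImport -and -not $addedDuringImport) {
                $key
            }
        }
    }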

+1
Completed

Add support for integration with external Workflow/Ticketing systems

Adam Bradley 8 years ago updated by anonymous 7 years ago 4

Add support for integration with external Workflow/Ticketing systems

0
Under review

Link Baseline sync still able to be run from Links page when link is disabled.

Kelly Green yesterday at 2:53 a.m. in UNIFYBroker/Plus updated by Matthew Davis (Technical Product Manager) 12 hours ago 1

I have noticed that under normal operation, a baseline synchronization task on a link cannot be executed while the link is disabled. In the link UI, the option to run a baseline sync only becomes visible when the link is enabled. However, I have found that on the Links page (which lists the links in the solution), selecting a disabled link and running a baseline sync through the Actions button at the top of the page still executes the baseline sync on the disabled link. I am not sure if this is expected behaviour or a bug.

Screenshot #1: Running a baseline sync task on a disabled link through the Actions button (Image 6600).

Screenshot #2: The baseline sync still executed even though the link is disabled (Image 6601).

0
Under review

Clarification of the Register-Contribution function

Liam Schulz 3 weeks ago updated by Matthew Davis (Technical Product Manager) 3 weeks ago 1

Hi,


Just seeking some clarification about how the "Register-Contribution" function works in PowerShell Schema transformations and what scenarios it applies to.


For example, if I have two fields, Field1 and Field2, and apply the function like "Register-Contribution Field1 Field2", does this mean that a change to Field1 will trigger reprocessing of any transformations for Field2?

Also, would I be correct in assuming that this can be used to help process Time Offset Flag transformations where there may not necessarily be a change to retrigger evaluation of the flag?


Thanks,
Liam

0
Under review

Jobs Stuck Processing

Liam Schulz 2 months ago in UNIFYBroker Service updated by Matthew Davis (Technical Product Manager) 2 months ago 1

Hi,

We have observed that jobs such as Connector imports and Link synchronizations will occasionally get stuck in a processing loop and not complete. This blocks operations, as Broker cannot import or synchronize new data. To clear the process a restart has to be performed; attempting to cancel the job does not have any impact.

This happens intermittently, and there doesn't appear to be a consistent way of reproducing the issue. I understand this makes it difficult to troubleshoot, so are there other possibilities for a solution we could explore? For example, could a timeout be introduced so that the job is killed if it runs for too long without completing?

Let me know your thoughts and feedback.

Thanks,
Liam
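
A generic illustration of the timeout idea suggested above (not existing UNIFYBroker functionality): run the long-running work as a background job, wait up to a limit, and terminate it if it has not completed in time.

    # Stand-in for the long-running connector import or link synchronisation
    $job = Start-Job -ScriptBlock { Start-Sleep -Seconds 7200 }

    # Wait-Job returns the job only if it reaches a finished state within the timeout
    if (-not (Wait-Job -Job $job -Timeout 3600)) {
        Stop-Job   -Job $job
        Remove-Job -Job $job -Force
        Write-Warning 'Job exceeded the one-hour limit and was terminated.'
    }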