Identity Broker Forum
Conflicting web service definitions between deployments
Lifehouse and our test Riskman environment have conflicting web service namespaces, which means the connector has to be recompiled against each one depending on what the client has. I've asked Riskman (company) what to do about this but have not received an answer.
Date: 5/05/2014 10:30 AM
From: Patrick
To: Simon Welsh (simonw@riskman.net.au)
Subject: Riskman.Net web service question
Good morning Simon,
Hope things are well on your end.
From what I understand, the deployment of our RiskMan integration product at Lifehouse went very well and it is now up and running in production, so thank you again for your help in making that happen.
We did run into one small issue relating to the namespaces of the web service, though: the namespace of our test instance (https://online.riskman.net.au/RiskManACTHealthUnifyTest/WebServices/GenericWebService.asmx) is tempuri.org, while Lifehouse’s instance uses riskman.net.au. As a result we had to recompile our software to contain the new namespace.
Is there a way I can know ahead of time what the namespace will be? Is it determined by configuration, version, hosted vs on-premise, or something else?
Thanks in advance,
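For context on why the recompile was needed: in a generated ASMX proxy the SOAP namespace is baked in as an attribute at code-generation time. A minimal sketch of what that looks like (the exact riskman.net.au namespace value shown below is assumed, not confirmed):

// Sketch of a wsdl.exe / "Add Web Reference" generated proxy for
// GenericWebService.asmx. The namespace is fixed when the proxy is generated,
// so a deployment that publishes a different namespace forces a regenerate
// and recompile of the proxy.
[System.Web.Services.WebServiceBinding(
    Name = "GenericWebServiceSoap",
    Namespace = "http://tempuri.org/")]   // namespace seen on the test instance
// For the Lifehouse instance the generated attribute would carry a
// riskman.net.au based namespace instead (exact value assumed, not confirmed).
public partial class GenericWebService
    : System.Web.Services.Protocols.SoapHttpClientProtocol
{
    // generated SOAP methods elided
}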
Duplicate Key: Add key to log message
Currently, when importing on a connector, Identity Broker will generate a duplicate key error when it encounters multiple records that have the same key. This is problematic in itself, but is exacerbated because IdB apparently drops the whole page (~1000 records) when this occurs.
The error presented is something like this:
System.ArgumentException: An item with the same key has already been added.
Given that the impact of a single duplicate record is quite high (1000 records not imported!), it's important to find these duplicate records in the end source and either eradicate them or select a more appropriate key value. To this end, I request an improvement for IdB to display the duplicate key in the error message when the error is thrown, e.g.:
System.ArgumentException: An item with the same key (12345) has already been added.
Or perhaps even go so far as to include the attribute(s) used in the key:
System.ArgumentException: An item with the same key (entityID:12345) has already been added.
or (in the case of multi-attribute keys):
System.ArgumentException: An item with the same key - (entityID,12345):(locationID,123) - has already been added.
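For illustration, a minimal sketch (not Identity Broker's actual code) of how the offending key value could be surfaced at the point the duplicate is detected:

using System;
using System.Collections.Generic;

static void AddToPage<TKey, TEntity>(IDictionary<TKey, TEntity> page, TKey key, TEntity entity)
{
    if (page.ContainsKey(key))
    {
        // Include the duplicate key value so the offending record can be
        // located in the source system without re-running the import blind.
        throw new ArgumentException(
            string.Format("An item with the same key ({0}) has already been added.", key));
    }
    page.Add(key, entity);
}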
Identity Broker wishlist
I have not played with IdB nearly enough, nor read the manual enough times, so please be gentle with me if any of these features already exist.
"Search" entities in connector or adapter - it would be great to have an option to enter search criteria first rather than have to list all then sort.
It is probably not so bad in many sites, but DET as an example, takes an age to load 109,000 entities, which is a pain when we only want to look at one.
It would also be really nice to have an option to do a limited import based on similar criteria (so as well as full import or delta import to have an option to import all sn=smith or something. That is of most use during set up and troubleshooting so maybe an "admin" feature.
I guess what would achieve the same (and be more operationally useful) would be a filter option to include or exclude specific entities - based on attributes and built up like an SQL query or an LDAP query.
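As a rough sketch of the kind of filter I mean (purely hypothetical, not an existing IdB feature), something that could be evaluated per entity when browsing or doing a limited import:

using System;
using System.Collections.Generic;

// Hypothetical filter predicate, built from attribute criteria much like an
// LDAP filter (&(sn=smith)(locationID=123)) or a SQL WHERE clause.
static bool MatchesFilter(IDictionary<string, string> attributes) =>
    attributes.TryGetValue("sn", out var sn) && sn == "smith" &&
    attributes.TryGetValue("locationID", out var loc) && loc == "123";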
Configuration via a web front end rather than by editing XML files - yippee!
Scheduling.
The polling intervals being set in ticks is horrible, so something in seconds or minutes would be far nicer (see the conversion sketch after this list).
Better still would be some form of scheduling in the product that offers more than "full import every 30 minutes, deltas every 5 minutes", etc.
Something along the lines of the scheduling that can be set up in EB would be great, but even just the option to not run on certain days or between certain times would help.
I guess a hook into EB would work too.
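To illustrate the ticks complaint: .NET ticks are 100-nanosecond units, so the values that end up in the configuration are hard to read or sanity-check by eye.

using System;

// 1 tick = 100 nanoseconds, so:
long fiveMinutes   = TimeSpan.FromMinutes(5).Ticks;   //  3,000,000,000 ticks
long thirtyMinutes = TimeSpan.FromMinutes(30).Ticks;  // 18,000,000,000 ticks
// A "poll every 5 minutes" setting therefore has to be entered as 3000000000.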
GUIDs
I was talking to Nick Mathas about a problem in the Novell world, where clearing the connector also clears the GUIDs, which stuffs up the Novell IDM association value (unique key). He and I think this needs to be addressed in the NIM adapter or the NIM end of the system rather than the IdB engine, but would there be a possibility of a configuration item that allows the source system's unique key to be used as the IdB unique key in place of the IdB GUID?
i.e. if you do not select a unique key, you get an IdB-generated GUID, and clearing the connector and re-importing will generate new keys; but if you have selected that, for example, detnumber from a Chris connector should be the unique key, it will use that (relying on the source to guarantee uniqueness), and clearing the connector and re-importing will bring entities in exactly the same.
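To make the idea concrete, a hedged sketch (hypothetical, not an existing IdB option) of deriving the entity key from a nominated source attribute so that a clear-and-reimport reproduces the same keys:

using System;
using System.Collections.Generic;
using System.Security.Cryptography;
using System.Text;

static Guid ResolveEntityKey(IDictionary<string, string> sourceAttributes, string keyAttribute)
{
    if (!string.IsNullOrEmpty(keyAttribute) &&
        sourceAttributes.TryGetValue(keyAttribute, out var value))
    {
        // Deterministic: hashing e.g. detnumber yields the same GUID on every
        // import, relying on the source system to guarantee uniqueness.
        using (var md5 = MD5.Create())
        {
            return new Guid(md5.ComputeHash(Encoding.UTF8.GetBytes(value)));
        }
    }
    // Current behaviour: a fresh GUID, so clearing and re-importing changes keys.
    return Guid.NewGuid();
}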
That's all that springs to mind for now.
Allow Relational.Compare.String to exclude items not in priority
Optionally allow Relational.Compare.String to exclude items that are not in the priority list. This was raised as an issue on BCE-220.
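As a rough sketch of the requested option (illustrative only; this is not the actual Relational.Compare.String implementation):

using System.Collections.Generic;
using System.Linq;

static IEnumerable<string> ApplyPriority(IEnumerable<string> values, IList<string> priority, bool excludeUnlisted)
{
    var ranked = values.Select(v => new { Value = v, Rank = priority.IndexOf(v) });
    if (excludeUnlisted)
        ranked = ranked.Where(x => x.Rank >= 0);           // the proposed exclusion option
    return ranked
        .OrderBy(x => x.Rank < 0 ? int.MaxValue : x.Rank)  // unlisted items sort last
        .Select(x => x.Value);
}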
chris21 accounts left open at AHG
Hi Doug,
Hoping you can direct this to the right person.
One of the administrators of Chris21 pointed something out to me yesterday. It seems that the Unify application is connecting to Chris21 and leaving behind open logins.
It appears to be the scheduled process that picks up any changes in Chris21: one login per Connector every 2 hours. So every 2 hours there are another 6 open logins created.
I'm not sure what the potential implications are - I'm trying to find out what I can from this end - but could I please ask you to check with your people as to why these logins are left open? As you can imagine, the number of open logins builds up very rapidly and needs to be cleared manually.
Regards
Deanna March
Senior Applications Developer
Corporate Office
Automotive Holdings Group Limited
21 Old Aberdeen Place, West Perth WA 6005
P: +61 8 9422 7758
F: +61 8 9422 7686
M: 0457 524 306
E: dmarch@ahg.com.au
W: www.ahg.com.au
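For what it's worth, the usual pattern to avoid this kind of accumulation (a hedged sketch with hypothetical names, not the actual connector code) is to log the session off at the end of every scheduled poll, even when the poll fails:

public interface IChris21Session : System.IDisposable
{
    void ReadChanges();
}

public interface IChris21SessionFactory
{
    IChris21Session Login();
}

public static class ScheduledImport
{
    public static void Run(IChris21SessionFactory sessionFactory)
    {
        // Dispose() (which should log the session off) runs even if
        // ReadChanges() throws, so no login is left open per connector per poll.
        using (IChris21Session session = sessionFactory.Login())
        {
            session.ReadChanges();
        }
    }
}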
AdapterEngine.extensibility.config.xml
ConnectorEngine.extensibility.config.xml
LoggedOnScreen.png
UnifyLog20110202.csv
The field DateOfBirth was value 1900-01-01T00:00:00Z of type TimestampValue. Type Entity was expected..
20130416,06:58:31,Adapter request to save entity to adapter space failed.,Adapter,Warning,
"Adapter request to save entity 1dbf37db-5ed3-49fa-9161-ea6c9c5b1b7b to adapter space 365e6a23-2e27-485f-a6e5-52ccd3347634 failed with reason The field DateOfBirth was value 1900-01-01T00:00:00Z of type TimestampValue. Type Entity was expected.. Duration: 00:00:00.9999360
Error details: Unify.Framework.GroupedNameValueCollectionInvalidTypeException: The field DateOfBirth was value 1900-01-01T00:00:00Z of type TimestampValue. Type Entity was expected. ---> System.InvalidCastException: Specified cast is not valid.
   at Unify.Framework.EntityBase`3.GetValue[TValue](TKey key)
   --- End of inner exception stack trace ---
   at Unify.Framework.EntityBase`3.GetValue[TValue](TKey key)
   at Unify.Framework.EntityToConnectorEntityBridge.GetValue[T](GroupedNameValueCollectionKey key)
   at Unify.Connectors.LifeUserConnector.SaveEntities(IEnumerable`1 entities, Action`4 preSaveAction, Action`1 responseAction, IDictionary`2 matchingEntities)
   at Unify.Connectors.LifeUserConnector.SaveEntities(IEnumerable`1 entities)
   at Unify.Framework.ConnectorToWritingConnectorBridge.SaveEntities(IEnumerable`1 entities)
   at Unify.Framework.EventNotifierWritingConnectorDecorator.SaveEntities(IEnumerable`1 entities)
   at Unify.Framework.Adapter.SaveEntities(IEnumerable`1 entities, Boolean reflect)
   at Unify.Framework.Adapter.SaveEntity(IAdapterEntity entity, Boolean reflect)
   at Unify.Framework.CompositeAdapter.SaveEntity(IAdapterEntity entity)
   at Unify.Framework.AdapterNotifierDecorator.SaveEntity(IAdapterEntity entityToSave)
   at Unify.Framework.LDIFAdapter.ExportAdapterEntity(IAdapterEntity adapterEntity, Guid adapterId)
   at Unify.Framework.LDIFAdapterServiceHostDecorator.ExportAdapterEntity(IAdapterEntity adapterEntity, Guid adapterId)
   at SyncInvokeExportAdapterEntity(Object , Object[] , Object[] )
   at System.ServiceModel.Dispatcher.SyncMethodInvoker.Invoke(Object instance, Object[] inputs, Object[]& outputs)
   at System.ServiceModel.Dispatcher.DispatchOperationRuntime.InvokeBegin(MessageRpc& rpc)
   at System.ServiceModel.Dispatcher.ImmutableDispatchRuntime.ProcessMessage5(MessageRpc& rpc)
   at System.ServiceModel.Dispatcher.ImmutableDispatchRuntime.ProcessMessage4(MessageRpc& rpc)
   at System.ServiceModel.Dispatcher.MessageRpc.Process(Boolean isOperationContextSet)",Normal
SAP test harness cannot update infotype with subtype
Monash requires write-back of UserID, Email and MonashPersonID as below:
- Subtype 9008 is for the FIM SAP interface (user ID and email address). Field mapping:

  Field Name        SAP Field    Data Type   Length
  User ID           ZZFIMINT     Characters  30
  Email             USRID_LONG   Characters  241

- Subtype 9009 is for the Monash Person ID. Field mapping:

  Field Name        SAP Field    Data Type   Length
  Monash Person ID  USRID_LONG   Characters  241
When trying to test the write-back permission using the SAP harness, the infotype update function does not allow a subtype to be specified.
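To make the gap concrete, a hypothetical sketch of the harness call shape (names are illustrative, not the harness's actual API; the assumption, given the USRID_LONG field, is that these subtypes sit on the Communications infotype 0105):

using System.Collections.Generic;

public interface ISapInfotypeWriter
{
    // Current shape (assumed): no way to target a subtype.
    void UpdateInfotype(string personnelNumber, string infotype,
                        IDictionary<string, string> fieldValues);

    // Requested shape: subtype is explicit, so a write to infotype 0105 can
    // target subtype 9008 (ZZFIMINT / USRID_LONG) or 9009 (USRID_LONG).
    void UpdateInfotype(string personnelNumber, string infotype, string subtype,
                        IDictionary<string, string> fieldValues);
}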
On schema pages, allow collapsing of non-key fields
To reduce the amount of scrolling on the adapter schema pages, allow non-key fields to be collapsed.
Alternative approach to dealing with export timeouts to IdB
As an alternative to having to change batch size/timeout settings on export, it might be worth adopting an approach similar to the one Microsoft took with the MIM (Service) MA - i.e. changing the default option to asynchronous rather than synchronous exports.
Under the current default configuration, the MIM MA gets a success result back from the MIM Service once an exported change has been successfully queued (inserted as single or batch Request objects). A similar approach might be worth considering for IdB so that we can decouple long-running connector export times from the MIM export itself.
I am categorising this request under O365 because that is where I am seeing the most need for this feature right now - however this would be a generic option.
This will be investigated as part of a roadmapped item on a more granular and expanded set of export results, which could possibly include an export status of async/pending.
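A rough sketch of the decoupling being suggested (hypothetical names; not an existing IdB API): acknowledge the export as soon as it is queued, and let a background worker complete the write to the target.

using System.Collections.Concurrent;

public enum ExportStatus { Completed, Pending }

public sealed record ExportRequest(string EntityId, string Payload);

public sealed class AsyncExportQueue
{
    private readonly BlockingCollection<ExportRequest> queue =
        new BlockingCollection<ExportRequest>();

    // Synchronous model: the caller waits for the connector write to finish.
    // Asynchronous alternative: enqueue and report a pending status immediately,
    // decoupling the MIM export window from long-running connector exports.
    public ExportStatus Submit(ExportRequest request)
    {
        queue.Add(request);   // a background worker drains this queue
        return ExportStatus.Pending;
    }
}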