Wednesday, January 22, 2014

Function of SharePointResourceUrl property in BDC model

Lately we had an issue with SharePoint BCS usage. In my analysis I noticed that the value of ‘SharePointResourceUrl’ in some models was not pointing to our SharePoint farm. But as the role of the ‘SharePointResourceUrl’ parameter in BDC models is poorly documented, I was not certain this was really the cause of the problem. An Internet search on ‘SharePointResourceUrl’ also returns only a few (2) hits, none of which explain its purpose.
This week I received, through an intermediary, an answer from Microsoft Support on the functionality of this BDC model parameter. And although we had already found the cause of our initial problem (it was SSL + web dispatcher related, between the external system and the SharePoint farm), I still consider it worthwhile to record the described functionality.
Response from Microsoft support:
The SharePointResourceUrl is used to determine where the STS (Security Token Service) is located, i.e.:
SharePointResourceUrl + "/_vti_bin/sts/spsecuritytokenservice.svc?wsdl"
I've only encountered problems with SharePointResourceUrl on client solutions with SharePoint BCS when users sync an External List offline to Outlook or SharePoint Workspace. In these situations SharePointResourceUrl is required in order for the client to locate the STS service.
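For illustration, the parameter is set in a BDC model roughly as follows; the farm URL and surrounding names below are placeholders, not taken from the original issue:

```xml
<LobSystemInstance Name="ExampleInstance">
  <Properties>
    <!-- Hypothetical farm URL; the client appends
         /_vti_bin/sts/spsecuritytokenservice.svc?wsdl to locate the STS -->
    <Property Name="SharePointResourceUrl" Type="System.String">https://sharepoint.contoso.com</Property>
  </Properties>
</LobSystemInstance>
```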

Wednesday, January 15, 2014

Trust the entire SSL certificate chain to avoid SharePoint-farm internal SSL trust issues

SharePoint BCS has the central role of consuming data from external systems. One of the supported consumption approaches is via SOAP webservices. In case the external system is SSL-protected, SharePoint BCS must trust the SSL certificate of the external system. This is achieved by importing the SSL certificate in Central Admin, Manage Trust.
Recently we faced a situation where, although we had set SharePoint to trust the SSL certificate, SSL issues were still reported (ULS, EventLog): An operation failed because the following certificate has validation errors....
Mind you, although the SSL issues are logged as critical, SharePoint BCS is tolerant and still sets up the connection to the external system for data exchange. But it is of course an undesirable situation, certainly for a production environment, when system logs (ULS, event logs) pile up with critical errors; even when the platform tolerates them.
Upon investigating the logged SSL error, I noticed something strange. It was not the SSL certificate of the consumed external system that was qualified as non-trusted. Instead it appeared to be the SSL certificate that the SharePoint farm uses internally for the service communication between the SharePoint web application process and the SharePoint BCS service application.
With the insight that the problem was internal to the SharePoint farm, the cause was easy to locate. In the SharePoint farm, only the certificate at the lowest level had been imported into Manage Trust. Thanks to this post, SharePoint Operations learned that actually the entire certificate chain, up to the root certificate, must be added to Manage Trust. With that fixed, the critical although tolerated errors no longer pollute the logs on the production system.
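As a sketch, the entire chain can be registered via PowerShell; the certificate paths and authority names below are placeholders:

```powershell
# Register every level of the chain (root and intermediate(s)) as trusted,
# after exporting each certificate to a .cer file.
$root = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2("C:\certs\RootCA.cer")
New-SPTrustedRootAuthority -Name "Contoso Root CA" -Certificate $root

$inter = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2("C:\certs\IntermediateCA.cer")
New-SPTrustedRootAuthority -Name "Contoso Intermediate CA" -Certificate $inter
```

The same registrations can of course also be done manually via Central Admin, Security, Manage Trust.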

Sunday, January 12, 2014

Explanation + resolution of BCS "Cannot find any matching endpoint configuration"


SharePoint BCS operates with external systems through the connection information in BDC models, administered in its metadata store. BDC models can 1. be imported via the Central Admin UI, Business Connectivity Services application; 2. be provisioned via features; 3. or be created manually via SharePoint Designer.
When BDC administers a new BDC model, the BDC administration process reads the WcfMexDocumentUrl value from the LobSystem node in the model, and uses it to retrieve at runtime all the service endpoints from the WcfMexDocument (= service metadata). The service endpoints received in the WcfMexDocument are administered in the internal WcfConnectionManager class.
When the BDC service application is invoked by a BDC client to interoperate with the external system, BDC uses the WcfEndpointAddress value from the LobSystemInstance node in the model as the service endpoint URL. But it does not directly invoke the (WCF) service on that endpoint address. It first uses this endpoint address to retrieve additional endpoint configuration data (e.g. which character is used as wildcard in the external system) from the WcfConnectionManager class: information that was derived when administering and processing the BDC model, and stored as key-value pairs, with the key equal to a service endpoint address included in the service metadata / WcfMexDocument.
Now, in case the WcfEndpointAddress value in the LobSystemInstance node is not present as one of the service endpoints in the WcfMexDocument, BDC is not able to locate the additional endpoint configuration, and all BDC client invocations of this External Content Type will fault with the error message “Cannot find any matching endpoint configuration”.


The situation sketched above originates from a mismatch between the BDC model and the WcfMexDocument. The fix is to make sure that the WcfEndpointAddress in the BDC model matches one of the service endpoint addresses in the WcfMexDocument. This means either adjusting the WcfEndpointAddress value in the BDC model to a correct and present value, and reimporting the model; or modifying the WcfMexDocumentUrl so that the service metadata includes the WcfEndpointAddress value as one of the service endpoints. Note that as BDC builds up its internal administration in WcfConnectionManager upon processing a BDC model, in the latter case the BDC model must also be re-imported, despite the model itself not having changed. Through the reimport, the situation in WcfConnectionManager is reset and corrected.
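To illustrate the required match, a minimal BDC model fragment; all names and URLs below are placeholders:

```xml
<LobSystem Name="ExampleLobSystem" Type="Wcf">
  <Properties>
    <!-- Metadata document from which BDC derives the known service endpoints -->
    <Property Name="WcfMexDocumentUrl" Type="System.String">https://erp.contoso.com/ExampleService.svc?wsdl</Property>
  </Properties>
  <LobSystemInstances>
    <LobSystemInstance Name="ExampleLobSystemInstance">
      <Properties>
        <!-- Must equal one of the endpoint addresses exposed in the MEX document above -->
        <Property Name="WcfEndpointAddress" Type="System.String">https://erp.contoso.com/ExampleService.svc</Property>
      </Properties>
    </LobSystemInstance>
  </LobSystemInstances>
</LobSystem>
```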

Saturday, January 4, 2014

Augment GWPAM AddIn project to consume JSON dataformat

GWPAM consumption of the efficient JSON format is only possible when the Gateway service supports OData V3
By default, the GWPAM-generated AddIns consume the Gateway REST OData service via the AtomPub dataformat, and not via the more data-efficient JSON format. The reason is that the .Net WCF Data Services Client library used is not capable of consuming the JSON dataformat. Because I prefer to consume REST services via the mobile-friendly JSON dataformat, I set out to augment a default generated GWPAM Outlook AddIn project to consume via OData JSON. As the service consumption is within .Net code, this is mostly a Microsoft .Net matter. However, as it turns out that an important prerequisite is imposed on the consumed REST service before JSON can be consumed in the Microsoft WCF client library at all, it also extends to the service side.

JSON consumption via WCF Data Services Client Library

Initially, the WCF Data Services Client library only supported consumption of REST services via the AtomPub dataformat. As of release 5.1 it is also possible to consume JSON, but with the limitation that the JSON format must be OData V3. The prerequisite for the consumed service is thus that it must be able to provide its data in the OData V3 JSON dataformat. For the older OData versions (V1, V2), it is not [yet] possible to consume JSON with the WCF Data Services Client library.

Steps to augment a GWPAM project to consume the JSON dataformat

First prepare your project to be able to handle the JSON consumption:
  1. Install the latest WCF Data Services Client library; at present this is 5.6.0. Installation is done via the NuGet Package Manager, and must be applied to each GWPAM project in which you want to consume via the JSON dataformat. The effect per project is that the reference ‘Microsoft.Data.Services.Client’, version 5.6.0, is added; and that the (via the GWPAM project template) already existing references ‘Microsoft.Data.EDM’, ‘Microsoft.Data.OData’ and ‘System.Spatial’ are upgraded to version 5.6.0.
  2. If present, remove the reference ‘System.Data.Services.Client’ from the GWPAM project(s). Note: Visual Studio 2010 by default installs with that older WCF Data Services Client library.
  3. Install the WCF Data Services 5.3.0 RTM Tools Installer in Visual Studio; also via NuGet.
Next, set up the consuming code to consume the REST service via OData JSON; that is, if the service supports it:
  1. Validate that the consumed service is able to return data in the required JSON dataformat, OData V3. To do so, query the $metadata of the service, and validate that it contains “m:MaxDataServiceVersion='3.0'”.
  2. Generate an EDM model for the service, via ‘Add Service Reference’ (instead of ‘Add SAP Service Reference’). Open the generated service proxy, and change the accessibility of ‘GeneratedEdmModel’ to public.
  3. Edit the earlier generated SAP data services client code to consume the service via the JSON dataformat:
    1. In the constructor, initialize the base DataServiceContext with ‘System.Data.Services.Common.DataServiceProtocolVersion.V3’
    2. Set the Format to link to the generated EDM Model: this.Format.LoadServiceModel = GeneratedEdmModel.GetInstance;
    3. Set the Format to use JSON: serviceContext.Format.UseJson()
    4. Copy the implementation + usage of the methods ‘ResolveTypeFromName’ and ‘ResolveNameFromType’ from the service proxy generated in step 2. Modify the code to correspond to the full name of the SAP service proxy.
    5. Query the consumed REST service for the JSON format; either via the querystring parameter ‘$format=JSON’, or via the HTTP request header ‘Accept: application/json’
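The code edits of the steps above can be sketched as follows; the class and proxy names are placeholders, the real names come from your own generated proxy:

```csharp
// Sketch of the adjusted generated data service context (names are hypothetical).
public partial class ZExampleServiceContext : System.Data.Services.Client.DataServiceContext
{
    public ZExampleServiceContext(System.Uri serviceRoot)
        : base(serviceRoot, System.Data.Services.Common.DataServiceProtocolVersion.V3)  // step 1
    {
        // Step 2: link the Format to the EDM model generated via 'Add Service Reference'
        this.Format.LoadServiceModel = GeneratedEdmModel.GetInstance;
        // Step 3: switch the wire format from AtomPub to JSON
        this.Format.UseJson();
        // Step 4: hook up the type/name resolvers copied from the generated service proxy
        this.ResolveName = new System.Func<System.Type, string>(this.ResolveNameFromType);
        this.ResolveType = new System.Func<string, System.Type>(this.ResolveTypeFromName);
    }
}
```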

That's it

And that’s it. Now the GWPAM-based Office AddIn is able to consume the REST service via the more efficient JSON dataformat. Note that for the consumption itself it does not even have to be a Gateway REST service; it is also possible to consume REST services from other webservice platforms. But the GWPAM project templates and generated code are focused on standard SAP NetWeaver Gateway services + data models; WFService as one evident example.