Saturday, December 10, 2011

Synchronisation issue with SharePoint FBA claims-based

In a SharePoint 2010 extranet I apply a custom membership provider for Forms-Based Authentication. The provider works like a charm: external users are authenticated when they log on with valid credentials, and are denied access otherwise.
However, after functioning smoothly for a while, we suddenly encountered the error below when trying to log on via FBA:
[FaultException`1: The context has expired and can no longer be used. (Exception from HRESULT: 0x80090317)]
Microsoft.IdentityModel.Protocols.WSTrust.WSTrustChannel.ReadResponse(Message response) +1161205
Microsoft.IdentityModel.Protocols.WSTrust.WSTrustChannel.Issue(RequestSecurityToken rst, RequestSecurityTokenResponse& rstr) +73
Microsoft.IdentityModel.Protocols.WSTrust.WSTrustChannel.Issue(RequestSecurityToken rst) +36
Microsoft.SharePoint.SPSecurityContext.SecurityTokenForContext(Uri context, Boolean bearerToken, SecurityToken onBehalfOf, SecurityToken actAs, SecurityToken delegateTo) +26060225
Microsoft.SharePoint.SPSecurityContext.SecurityTokenForFormsAuthentication(Uri context, String membershipProviderName, String roleProviderName, String username, String password) +26063596
Microsoft.SharePoint.IdentityModel.Pages.FormsSignInPage.GetSecurityToken(Login formsSignInControl) +188
Microsoft.SharePoint.IdentityModel.Pages.FormsSignInPage.AuthenticateEventHandler(Object sender, AuthenticateEventArgs formAuthenticateEvent) +123
System.Web.UI.WebControls.Login.AttemptLogin() +152
Logging on via Windows authentication worked without problem; only the FBA route failed. Since there had been no software or configuration changes for the custom provider, the cause had to lie at [SharePoint farm] infrastructure level. In the Application event log I noticed the following error: An exception occurred when trying to issue security token: The context has expired and can no longer be used. (Exception from HRESULT: 0x80090317).
This steered me in the direction of the generic Security Token handling in the farm, rather than the context of the extranet web application itself. As a first attempt I decided to restart the Security Token Service application. And voilà, that was already sufficient: problem resolved. That is, for a while... The problem structurally reappears after a couple of days of little or no activity in the SharePoint farm. It looks to me like some kind of clock/timer synchronization issue within the SharePoint farm, which can be kept at bay for a period by timely 'refreshing' the Security Token application pool.
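Pending a structural fix, one pragmatic stop-gap is to automate that periodic 'refresh'. Below is a hedged sketch (not a supported solution) that recycles the Security Token Service application pool via Microsoft.Web.Administration, for instance from a scheduled job on each web front-end; the pool name "SecurityTokenServiceApplicationPool" is the usual name in a SharePoint 2010 installation, but verify it in IIS before relying on it.

using Microsoft.Web.Administration;

public static class SecurityTokenServicePoolRecycler
{
    public static void Recycle()
    {
        using (ServerManager iisManager = new ServerManager())
        {
            // Assumed pool name; check IIS Manager on the web front-end to confirm it.
            ApplicationPool stsPool = iisManager.ApplicationPools["SecurityTokenServiceApplicationPool"];
            if (stsPool != null)
            {
                // Recycling re-initializes the Security Token Service, which in our case
                // made the 0x80090317 'context has expired' error disappear for a while.
                stsPool.Recycle();
            }
        }
    }
}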

Friday, December 9, 2011

Exposing business information of SAP workflow via Duet Enterprise

Often a SAP workflow includes specific business data. This data is needed at the decision moments within the workflow to make an informed decision. And thus also needed within the SharePoint context if the workflow decision step is exposed there via Duet Enterprise workflow handling.
As the default Duet Enterprise workflow handling is a generic set-up, it is evident that it cannot provide direct support for all imaginable and variant occurrences of specific business data. Instead, there are hooks in the workflow pipeline in which you can plug in custom extensions for exchanging workflow-specific/contextual data from the SAP backend to the SharePoint frontend. You need to build a custom workflow outbound handler to transfer the contextual SAP business data from the SAP workflow to SharePoint via the Duet Enterprise workflow feature. Here you have two options.

Expose business data in the format of static HTML or XML content

In this approach the additional workflow information is added to the SAP workitem details (SOSP_TT_WF_RUNTIME_INFO). The Duet Enterprise system dataflow is as follows:
  • SAP ERP – Outbound handler: extend or replace the workitem details that are propagated via the Duet Enterprise workflow capability with pre-formatted business content (XML or HTML)
  • SCL – No action
  • SharePoint – Taskform: if HTML, simple display; if XML, XSLT transformation preceding display
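For the XML variant, the SharePoint taskform thus has to apply an XSLT transformation before display. A minimal illustrative sketch of that transformation step is given below; the helper class, the XSLT file path and the assumption that the workitem details arrive as an XML string are illustrative only, not part of the Duet Enterprise API.

using System.IO;
using System.Xml;
using System.Xml.Xsl;

public static class WorkitemDetailsRenderer
{
    // Transform the pre-formatted XML workitem details into HTML for display in the taskform.
    public static string TransformToHtml(string workitemDetailsXml, string xsltPath)
    {
        XslCompiledTransform transform = new XslCompiledTransform();
        transform.Load(xsltPath);

        using (StringReader source = new StringReader(workitemDetailsXml))
        using (XmlReader reader = XmlReader.Create(source))
        using (StringWriter htmlWriter = new StringWriter())
        {
            transform.Transform(reader, null, htmlWriter);
            return htmlWriter.ToString();
        }
    }
}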

Expose business data in the format of XML payload

In this approach the additional workflow information is passed via the XPROP key-value structure delivered with the Duet Enterprise workflow handling at the SAP side. The Duet Enterprise system dataflow is as follows:
  • SAP ERP – Outbound handler: populate the Duet Enterprise key-value XPROP structure with the required business properties (name => value; e.g. ‘CustomerId’ => ‘02345’). The business property values are retrieved from the SAP workflow container.
  • SCL – Configuration: register EACH individual XPROP business property at the associated workflow template
  • SharePoint – Configuration: register EACH individual XPROP business property as external/extended business property at the task definition in the Duet Enterprise workflow subsite
  • SharePoint – Workflow: extend the ‘Approval task screen’ to render the additional business data properties [extend the standard taskform via SharePoint Designer or via InfoPath]

Decide between the 2 approaches

Both options are viable to expose specific SAP business data to the Microsoft SharePoint and Outlook context. Which is most appropriate depends on the characteristics of the data, and on what will be done with it within the Microsoft front-end. If it will only be displayed, it may be best to expose it as a single set of XML or HTML content, and render it in the UI either directly or via XSLT. If the data is also needed within the front-end for additional processing, it is probably better to expose the data properties individually.

Tuesday, November 22, 2011

Workflow in Duet Enterprise: concept + implications

One of the Duet Enterprise core capabilities is publishing SAP workflows into the realm of SharePoint sites. In this posting I outline the conceptual working, and what implications result from the Duet Enterprise workflow architecture. The information below is not entirely new, but builds upon blog material of Xiaosheng (Edward) Lu [SAP AG] and Kiki Shuxteau [Microsoft Corp].
This blog was earlier published on SAP Community Network Blogs

Prerequisites for exposing a SAP workflow via Duet Enterprise

  1. A SAP ERP workflow; it does not matter whether it is a standard workflow or a custom-developed one. SAP BPM workflows are currently not supported for Duet Enterprise exposition.
  2. The SAP workflow must contain one or more discrete user interaction tasks.
  3. Duet Enterprise supports the User Decision interaction step out-of-the-box. Interaction tasks with an Activity Dialog are also supported, but require some manual work: effectively you need to replace the SAP GUI screen handling and propagate the dialog result data as the outcome of the SAP workflow Activity Dialog task.


Conceptual working

The following steps are performed in the exposition of a SAP ERP workflow interaction task to SharePoint context:

  1. In the SAP ERP workflow a status change occurs that requires user interaction to act upon, via a SAP Workflow Interaction Task (Decision Step or Activity Dialog). When the SAP workflow reaches the Interaction Task, a Duet Enterprise-specific outbound handler is invoked from within the runtime context of SAP workflow execution. This outbound handler uploads an XML payload document, containing the workflow instance details, to the associated [by configuration] SharePoint workflow subsite. In the Duet Enterprise architecture, this XML payload is sent from the SAP ERP workflow context via the SAP Document Publisher, which resides in the Duet Enterprise SAP Add-on. The SAP Document Publisher service in turn invokes the SharePoint OBAWorkflowService, which is deployed in the SharePoint farm as part of the Duet Enterprise installation.
  2. On the receiving SharePoint document library [in the workflow subsite], a Duet Enterprise-specific SharePoint workflow is configured. This workflow is triggered on the addition of new content in the document library. The workflow contains one user task to act from the SharePoint user interface upon the specific surfaced SAP interaction task.
  3. The decision made by the SharePoint end-user in the SharePoint workflow task is propagated to the SAP ERP workflow by means of a taskflow BDC entity. Duet Enterprise-specific SAP web services are invoked for this via SharePoint BCS. These SAP web services next invoke a Duet Enterprise-specific inbound handler to receive and process the exchanged decision details, and propagate them as the outcome of the user interaction task in the SAP ERP workflow.

Implications

  • Duet Enterprise provides default Function Modules for both the inbound and outbound handlers. By design, these are limited to generic handling of SAP workflow. In case the SAP ERP workflow decision task includes context information that is workflow specific, it is required to build a custom outbound handler. Also, in case the result data container of the decision task contains context data besides or instead of a simple decision outcome, it is required to build a custom inbound handler to process the data received from the SharePoint task into the SAP workflow.
    • In practice, SAP workflows more often than not involve acting upon workflow-specific data. In all such situations it is thus required to overload the default function module with a custom implementation, to surface that specific data outside the boundaries of the SAP workflow.
  • The default Duet Enterprise outbound handler S_OSP_WF_PAT_DEFAULT_CH_OB handles the publishing of XML payload to SharePoint. Even in case of a custom outbound handler, it must therefore still be utilized in the Duet Enterprise workflow publication. More specifically, the S_OSP_WF_PAT_DEFAULT_CH_OB outbound handler must always be the last invoked outbound handler in the SAP workflow outbound handlers pipeline. The order in which outbound handlers are invoked is configured within SAP ERP, not in the SCL (aka Gateway).
  • The default Duet Enterprise inbound handler S_OSP_WF_PAT_DEFAULT_CH_IB handles the receiving of Duet Enterprise taskflow entity from SharePoint BCS. It must therefore always be present as the first invoked inbound handler in the SAP workflow inbound handlers pipeline.

Saturday, November 19, 2011

Alternative ways to programmatically read the contents of an External List with a filtered view

In order to export the displayed contents of a BCS External List to Excel (see previous post), I first have to programmatically retrieve the contents in custom code. The External List has a filter applied to its Finder method, set via the default SPView.
My first thought was therefore that simply calling GetItems on the External List with its default view should retrieve the same filtered external data as when rendering the BCS External List on a SharePoint page.
However, GetItems returns an empty collection. Upon debugging the called GetNotifications web service method, I discovered that the filter parameter always has the value ‘null’. It is unclear to me why, since it has been set in the DefaultView.
However, although the alternative approach of retrieving the data directly via the BCS API works, it left me rather dissatisfied. Conceptually I want to export the contents of the same (External) List that I have already provisioned on a SharePoint page; so why should I have to dive under the BCS hood to get the same contents? Also, that code is very much aware of / coupled to the BCS API, while the 'Export SPList to Excel' functionality in itself is general. Thus, with some ample time available, I decided to analyze the way in which the standard XsltListViewWebPart issues the external data retrieval, using the JetBrains dotPeek decompiler (as .NET Reflector is no longer free of license charge). It appears that this is done slightly differently from my original attempt.
Not only is the LOC count of this code far less, but the code is also general: applicable to both regular SharePoint lists and BCS External Lists.
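The decompiled code itself is not reproduced in this post. Purely as an illustration of the general, view-based idea, a sketch along the following lines retrieves the items through the list's default view for regular lists and External Lists alike; the site URL and list title are placeholder assumptions, and this is not claimed to be the exact XsltListViewWebPart implementation.

using Microsoft.SharePoint;

public static class ExternalListReader
{
    public static void ReadViaDefaultView()
    {
        // Placeholder site URL and list title.
        using (SPSite site = new SPSite("http://extranet.example.com"))
        using (SPWeb web = site.OpenWeb())
        {
            SPList externalList = web.Lists["Notifications"];
            SPView defaultView = externalList.DefaultView;

            // SPQuery(SPView) takes over the view's query, view fields and row limit,
            // so the retrieval honours the same filter as the rendered list view.
            SPQuery query = new SPQuery(defaultView);
            SPListItemCollection items = externalList.GetItems(query);

            foreach (SPListItem item in items)
            {
                // item["FieldName"] exposes the (external) columns defined in the view.
            }
        }
    }
}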

Friday, November 18, 2011

Excel 2010 Protected View hinders browser-opening of downloaded .xlsx file

A user requirement in one of our SharePoint 2010 projects is to export at any moment the displayed contents of an External List (with content originating from SAP ERP, retrieved via SharePoint BCS connecting to BAPI-based web services) to an offline file. The functional rationale is version administration for history and auditing purposes. The SharePoint platform supports this out-of-the-box for regular lists, via the Export to Excel functionality. However, not so for BCS External Lists. But you can realize it yourself via some custom code: first retrieve the External List contents, and next construct a .xlsx file via the Open XML SDK. The .xlsx file is generated server-side in memory, and sent to the browser as HttpResponse content. The end-user can then either open the file, or save it somewhere client-side.
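As an illustrative sketch of the described flow (not the actual project code), and assuming the External List contents have already been retrieved into rows of string values, the workbook can be built in memory with the Open XML SDK and written to the HttpResponse like this:

using System.IO;
using System.Web;
using DocumentFormat.OpenXml;
using DocumentFormat.OpenXml.Packaging;
using DocumentFormat.OpenXml.Spreadsheet;

public static class ExternalListExcelExport
{
    public static void WriteToResponse(HttpResponse response, string[][] exportRows)
    {
        using (MemoryStream stream = new MemoryStream())
        {
            using (SpreadsheetDocument document =
                SpreadsheetDocument.Create(stream, SpreadsheetDocumentType.Workbook))
            {
                // Minimal workbook: one worksheet named 'Export'.
                WorkbookPart workbookPart = document.AddWorkbookPart();
                workbookPart.Workbook = new Workbook();

                WorksheetPart worksheetPart = workbookPart.AddNewPart<WorksheetPart>();
                SheetData sheetData = new SheetData();
                worksheetPart.Worksheet = new Worksheet(sheetData);

                Sheets sheets = workbookPart.Workbook.AppendChild(new Sheets());
                sheets.Append(new Sheet
                {
                    Id = workbookPart.GetIdOfPart(worksheetPart),
                    SheetId = 1U,
                    Name = "Export"
                });

                // Write each retrieved list row as a row of inline string cells.
                foreach (string[] exportRow in exportRows)
                {
                    Row row = new Row();
                    foreach (string value in exportRow)
                    {
                        row.Append(new Cell
                        {
                            DataType = new EnumValue<CellValues>(CellValues.InlineString),
                            InlineString = new InlineString(new Text(value))
                        });
                    }
                    sheetData.Append(row);
                }

                workbookPart.Workbook.Save();
            }

            // Stream the in-memory .xlsx to the browser; the end-user can open or save it.
            response.Clear();
            response.ContentType =
                "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet";
            response.AddHeader("Content-Disposition", "attachment; filename=export.xlsx");
            response.BinaryWrite(stream.ToArray());
            response.End();
        }
    }
}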
A strange thing I noticed was that when saving the file, the saved file can then be opened successfully. But when choosing instead to directly open the file, Excel 2010 displays the error message “The file is corrupt and cannot be opened”.
This must be a client-side issue; the server-side is not aware of the context in which the client-side handles the received HttpResponse (Note: via Fiddler I even analyzed that the HttpResponse contents were identical).

The resolution is hinted at in the File Download window, by the trust warning about internet-downloaded files. The default Excel 2010 trust settings are to distrust all content downloaded from non-trusted locations. To validate this, I unchecked the default settings in Excel 2010 (via File \ Options \ Trust Center \ Trust Center Settings \ Protected View).
This helps: Excel 2010 now directly opens the downloaded file.

Thursday, November 10, 2011

Inconvenient configuration of Forms-Based Authentication in SharePoint 2010

For an extranet we aim to utilize Forms-Based Authentication to authenticate external users. In SharePoint 2010 this means that you have to apply the Claims-Based authentication model, and next set up the SharePoint web application configuration for the membership provider that will be used for FBA. This is where things can be a little confusing. In fact, you have to configure FBA membership in 3 different locations, and it is essential that all 3 are in sync:

1. In Central Admin

Configure Web Application \ Authentication Providers; here you specify the names of the membership and role providers used in FBA context within the web application

2. Security Token context

In the SharePoint 2010 service applications architecture, membership handling is delegated to the SecurityToken service application. This is a major difference with respect to the SharePoint 2007 architecture, in which the individual web application processes handle membership themselves. A direct consequence for configuration is that you need to specify in the web.config of the SecurityToken service application all the membership providers and role providers that are used in the SharePoint farm.
The SecurityToken service application directory is located at: 14hive\WebServices\SecurityToken

3. Webapplication context

Finally, despite membership handling in SharePoint 2010 being done via the independent SecurityToken service application, you may still need to add the membership provider to the web.config of the individual web application. This is required if you want to be able to use the standard PeoplePicker for selecting credentials via the FBA membership provider. Besides adding the membership provider node, you then also have to point the PeoplePicker configuration to that membership provider.
What if the 3 locations are not in sync? I evaluated the different inconsistent configurations:

1. Name entered in Central Admin does not match with name in SecureToken’s web.config

In this case, SharePoint Identity handling cannot get a handle to the configured MembershipProvider. When someone tries to log in via Forms-Based Authentication, SharePoint Identity handling will report a 'Cannot get Membership Provider' exception.
Note: to have the 'Cannot get Membership Provider' exception details displayed, you already need to change the default settings within the SecurityToken web.config, namely set the includeExceptionDetailInFaults attribute of serviceDebug to true. Without this, only a general error is displayed: The server was unable to process the request due to an internal error.

2. Missing MembershipProvider in Central Admin

Well, you cannot actually forget to fill them in if you selected ‘Enable Forms Based Authentication’. But you could, by accident, forget to select that option.
The result of this is more of a functional error: forms-based authentication is now missing as an option to log on to the web application.

3. Missing or different named MembershipProvider in web application’s web.config

In this case you can still log on to the web application, either via Windows Authentication or via FBA. As explained above, membership is handled by the SecurityToken service application, and not by the web application itself. This can be a bit confusing at first. The result of the configuration error is noticed within the application itself, when you try to search for credentials from the FBA membership provider via the PeoplePicker. Strangely enough the PeoplePicker is aware of the configured FBA membership, but apparently cannot include it in its search space.

4. Mismatch PeoplePicker (incorrect) versus named MembershipProvider in webapplication’s web.config

In this case, strangely enough, the PeoplePicker is still able to use the membership provider, that is, if it is consistently named with respect to the SecurityToken web.config (otherwise see issue 1).
At first I could not detect any PeoplePicker malfunction as a result of this configuration mismatch. I could still find all credentials in the membership provider, also via wildcarding.
Only when I sabotaged the wildcarding on purpose could I see the effect.
So it appears that for wildcard search the ‘%’ is already set somewhere as the default for all membership providers, and that you only have to override this in case your membership provider uses a different wildcard pattern.
Personally, I find it better to always be explicit, and thus also explicitly specify ‘%’ as the wildcard character for each membership provider that is valid for your application. But it is optional, and others might differ with me on this…

Friday, November 4, 2011

Programmatically open SPSite using Windows Credentials

In a Proof of Concept I employ an SPListMembershipProvider for forms-based access into (sub)sites. For the PoC, an SPList in the rootweb is utilized as User Administration. In the SharePoint 2010 architecture, Claims-Based authentication is handled by the SecureToken service application. In the local development image, the STS service application may run under the same application pool account as the SharePoint web application. But this is not recommended, and thus not to be expected within a real farm setup. As a result, it is not possible to directly access the list in the external SPSite from within the runtime STS context.
SPSecurity.RunWithElevatedPrivileges cannot help here; that can only be used within the same application pool context. Instead, the proper way is to open the SPSite in the external SPWebApplication via the credentials of an SPUser in that site, e.g. that of a service account. The problem is that the SharePoint API does not directly provide a way to open an SPSite with Windows credentials. You can open a site under the credentials of a SharePoint user, but you need the SPUserToken of that user for this. And guess what: you can only determine that token from within the context of the site. Talk about a chicken-and-egg situation.
However, I came up with a way to get out of this loop. It consists of a two-step approach: first, programmatically impersonate the service account, open the site, determine the SPUserToken of the site’s SystemAccount, and undo the impersonation; second, apply the SPUserToken to (re)open the site under the authorization of the site’s SystemAccount. Since Windows impersonation is a resource-intensive operation, cache the SPUserToken in memory so that the impersonation is only required once within the process lifetime.

internal static SPUserToken SystemTokenOfSite(Guid siteId)
{
    string account, pw, domain;
    RetrieveCredentialsFromSecureStore(<AppId>, out domain, out account, out pw);

    ImpersonationHelper _impersonator = new ImpersonationHelper(account, domain, pw);
    try {
        _impersonator.Impersonate();

        // Open the site while running as the service account, and capture the
        // SPUserToken of the site's SystemAccount for later (cached) reuse.
        using (SPSite initialAccessIntoSite = new SPSite(siteId))
        {
            return initialAccessIntoSite.SystemAccount.UserToken;
        }
    } finally {
        _impersonator.Undo();
    }
}

...
if (sysToken == null)
{
    sysToken = SecurityUtils.SystemTokenOfSite(memberSitesToId[websiteIdent]);
}

SPSite site = new SPSite(memberSitesToId[websiteIdent], sysToken);

Monday, October 24, 2011

Data architecture considerations and guidelines for retrieving SAP data via Duet Enterprise into SharePoint

In Duet Enterprise FP1 it is relatively easy to generate a Gateway Model, as long as your SAP data sources live up to the required constraints. The challenge and intellectual work transfers to the data architecture, for deciding on and defining the proper data service interfaces given the application context.
This blog was earlier published on SAP Community Network Blogs

Real-time retrieval versus Search indexing

Bringing SAP data into SharePoint-based front-ends can take different forms. You can unlock the SAP data real-time via an External List, to display and even edit the SAP data via the familiar SharePoint list UI metaphor. If the row-based list format doesn’t fit, you can develop a custom SharePoint webpart. You can also access the SAP data via the extensive SharePoint Search architecture and capabilities. With the improved Enterprise Search in SharePoint 2010 [or FAST, if also purchased by the customer organization], more and more new applications will be architected as search-driven. In the context of SAP data, think of searching from SharePoint for a certain Supplier administrated in PLM, a Customer in CRM, et cetera.
Duet Enterprise can facilitate all of the above scenarios, in combination with the strengths of the SAP and SharePoint landscapes. However, the different nature of these scenarios may well result in different approaches for the data integration architecture. In the case of real-time retrieval of SAP data to render in SharePoint, it is advisable to limit the amount of data retrieved per SharePoint - Gateway - SAP backend interoperability flow. Don’t put unnecessary SAP data on the wire that is not being used in the front-end: only include the fields of the SAP data that will be displayed in the UI and are relevant in this application context for the end-user. Search, however, has a different context: it is feasible to search on any field of the SAP entity. This requires that all that SAP field data is indexed at SharePoint Search crawl time. So here it is advisable not to limit the amount of SAP data retrieved, but instead to retrieve as much of the SAP data as expresses some functional value.
What if the same SAP data is to be unlocked both via a SharePoint External List and via SharePoint Enterprise Search? The data integration architectures of these are thus contradictory: reduce the retrieved SAP data versus give me everything. Well, nothing prohibits you from constructing multiple data integration pipelines, tuned for the different scenarios. For Search, its Gateway Model returns in the Query method all fields that contain functional value. And for the real-time External List, that Gateway Model returns in the Query method only those fields that will be rendered as list columns.
In Duet Enterprise 1.0, constructing the Gateway Model takes considerable time. This is an obstacle for constructing multiple Gateway Models with different data representation / signature, as it can easily double your Gateway mapping effort and time. Luckily in Duet Enterprise FP1 it is relatively easy to generate a Gateway Model, as long as your SAP landscape data sources live up to the required constraints.

Complex SAP data source structure

If the SAP data is conceptually a flat structure, it is sufficient to have a single Gateway Model to retrieve the data into SharePoint context. However, we all know that SAP (or, business) data is typically of a more complex structure: hierarchical with multiple child entities. When you want to retrieve such a structure into SharePoint, you basically have 2 options.
The first is to flatten the structure. This approach suffers from some disadvantages. In case of multiple occurrences of the same child type (e.g. Ordered Item), how many of them should you include in the flattened representation? All, or an arbitrarily limited number? Also, for the end-user a flattened structure can be conceptually wrong and counter-intuitive: Ordered Item information is at a different functional level than the Purchase Order information.
The second option is to maintain the hierarchical SAP structure within the Gateway Model architecture. For this you need to generate a Gateway Model for the parent SAP data entity, as well as for each type of its child data entities. At the SharePoint client side you associate the resulting External Content Types, to establish the parent-child relation. SharePoint BCS respects the association between the External Content Types when it operates on the data. In the case of SharePoint Search, the child associations are crawled in the context of the parent data, and either the parent or the child SAP data entity will appear in search results. The BCS Profile page of a parent data entity also renders the data of its child entities.
In case of real-time retrieval, you can apply the BCS Business Data webparts: add to the SharePoint page both a Business Data WebPart for the parent entity and a Business Data Related List WebPart for the child data entities. The latter will display the child SAP data entities of the parent SAP data entity that is selected in the Business Data List web part.
Mind you, the user experience of this setup is not always intuitive. In such a case, it can prove better to build your own custom presentation. An example of this is outlined in Working with complex SAP business entities in Duet Enterprise.

Sunday, October 16, 2011

Unlocking SAP data via Duet Enterprise Feature Pack 1 in a more agile approach

This blog was earlier published on SAP Community Network Blogs
A major bottleneck in applying Duet Enterprise 1.0 is the time it takes the SAP + Microsoft development team to unlock SAP data via Gateway to SharePoint. You have to define the service interface in ES Builder, create a proxy in transaction SPROXY, and next realize the GenIL model via transaction SE38. The latter requires a lot of handcrafted ABAP code to do the mapping from the Gateway runtime context and data representation to the SAP backend, and vice versa. Code that follows a pattern, and so a good candidate for code generation. This was a major pain point we experienced when initially applying Duet Enterprise 1.0 within the Ramp-Up in 2010, and it was reported back with emphasis to the Duet Enterprise product team.
In the coming Feature Pack 1 version, the Duet Enterprise product team has evidently got the message. FP1 comes with multiple generator tools to ease and speed up the realization of the internal Gateway Model (the new name for the GenIL model) for your application scenarios. Handcrafting is largely eliminated and replaced by fully automatic generation of first the mapping code to unlock the SAP data via NetWeaver Gateway 2.0, next the required SAP Gateway service proxy, plus the SharePoint BDC model. The latter can be handed over to the SharePoint side to import the External Content Type definition into Business Connectivity Services. Something that took us several days with version 1.0 to set up at both the SAP and the SharePoint side is now done in a matter of minutes. A much-appreciated side-effect of this is that it enables agile development: if the initial Model does not fit the requested application context, just change the mapping in the tooling and regenerate. With the handcrafting approach you would lose a lot of elapsed time here rearranging and testing your own mapping code.
Of course, not everything now suddenly comes for free. Before you start the Gateway Model generation, you first must still think about the data integration architecture to achieve your application scenario; an action that involves and requires consensus of both the SharePoint and SAP backend architects plus developers. The usage of the FP1 / Gateway 2.0 generation tools itself puts some constraints on the SAP backend data sources: BOR, RFC or Dynpro Screens. Basically it comes down to this: the backend data entities must provide at minimum both a ‘Query’ and a ‘ReadItem’ operation, and also ‘Create/Update/Delete’ operations for update scenarios via SharePoint.
What if the available SAP backend entities do not themselves satisfy the requested integration pattern and/or the generation constraints? Even then, it is far easier and more manageable to realize a custom wrapper RFC at the ERP level that does satisfy the pattern and constraints, than to apply the manual Gateway Model code crafting. Spoken from own experience…

Friday, October 7, 2011

Read content of uploaded file within ItemAdding method

In an application it is required to validate the file content before allowing the upload into a document library. SharePoint 2010 enables this via the synchronous ItemAdding method of SPItemEventReceiver. The problem, however, is that you cannot access the uploaded file through the SPItemEventProperties.ListItem object: at the time ItemAdding runs, the item is not yet created in, and thus not available via, the list. Via the blog post Getting file content from the ItemAdding method of SPItemEventReceiver I found a code snippet showing how to access the file. The last remaining issue was that I received an empty result upon reading from the file stream. While debugging I discovered that the stream position was set to the end of the file, which is logical since the file has already been read in order to save it to the content database. By resetting the position I can read the file contents, validate them, and cancel the upload in case of invalid content:
public override void ItemAdding(SPItemEventProperties properties)
{
  // _context is the HttpContext of the upload request, captured in the event receiver's constructor.
  string searchForFileName = Path.GetFileName(properties.BeforeUrl);
  HttpFileCollection collection = _context.Request.Files;
  for (int i = 0; i < collection.Count; i++)
  {
    HttpPostedFile postedFile = collection[i];
    if (searchForFileName.Equals(
      Path.GetFileName(postedFile.FileName), StringComparison.OrdinalIgnoreCase))
    {
      Stream fileStream = postedFile.InputStream;
      // The stream was already read to persist the file; reset the position before re-reading.
      fileStream.Position = 0;
      byte[] fileContents = new byte[postedFile.ContentLength];
      fileStream.Read(fileContents, 0, postedFile.ContentLength);

      // Validate the contents (IsValidContent is a hypothetical helper);
      // cancel the upload in case of invalid content.
      if (!IsValidContent(fileContents))
      {
        properties.Cancel = true;
        properties.ErrorMessage = "The file content is invalid.";
      }
      break;
    }
  }
}

Thursday, September 29, 2011

VNSG event 'SAP en Microsoft - het beste van 2 werelden'

The VNSG focus group User Experience organizes a meeting on October 13 on the subject of SAP - Microsoft integration/collaboration. At many companies both Microsoft and SAP are important suppliers of (business-critical) software, and there is a continuous search for ways to get the best out of both worlds. Where SAP is mostly known for its solid business process support, Microsoft is particularly strong in the User Experience.
The nature of the event is to bring SAP and Microsoft solution architects from end-user organizations into conversation with each other, and to learn from each other. The sessions have been selected accordingly: customer organizations that were willing to talk about their experiences, best practices and current status in the area of SAP / Microsoft integration.
The following customer organizations will give a presentation:
  • Achmea - Remco Jorna: Enterprise Architecture approach
  • Tata Steel - Hans Brouwer and Tamas Szirtes: An Enterprise Portal Journey...
  • Eneco - Kees Voeten: HR future-proof at Eneco
  • Ziut - Frank Voortman: Duet Enterprise as SAP/Microsoft integration layer in a contractor portal
Full agenda.

Event details

Date: October 13, 2011
Location: Microsoft, Schiphol-Rijk
Event duration: afternoon
Target audience: solution architects of end-customer organizations (profit and non-profit)

Organization:
Marcel Rabe, secretary of the VNSG focus group User Experience
Pim de Wit, core team member
William van Strien, core team member
Gijs Leurs, core team member

Registration for this event is also open to non-VNSG members: registration form.

Saturday, September 3, 2011

On-the-fly content-targeting via custom AudienceManager

In a SharePoint project, one of the requirements is to enable the end-user to set preferences, which are then immediately applied in the portal. One of the settable preferences is the (non-)interest in subjects/functions: if interested, webparts will be visible; if non-interest is specified, the webpart will be invisible. Well, this kinda sounds like SharePoint audiences. As this is out-of-the-box SharePoint functionality, it is applicable to both our own custom webparts as well as standard and third-party webparts. The only problem is the sub-requirement to have the preferences immediately applied in the portal. Standard SharePoint audiencing requires the Audience to be compiled before it takes effect, which occurs either on a scheduled basis or on request, but never immediately.
The solution for this is to utilize a custom PreferencesAudienceManager, with functionality to derive Audience membership on-the-fly on the basis of the selected settings.

System architecture



PreferencesAudienceManager code-extract


public class PreferencesAudienceManager : IRuntimeFilter2
{
  ...
  public bool CheckRuntimeRender(string IsIncludedFilter)
  {
    bool render = false;
    if (SPContext.Current != null && SPContext.Current.Web.CurrentUser != null)
    {
      // Priority to OOTB SharePoint Audiencing.
      render = AudienceManager.IsCurrentUserInAudienceOf(IsIncludedFilter, true);
      if (!render)
      {
        string[] globalAudienceIDs = null;
        string[] dlDistinguishedNames = null;
        string[] sharePointGroupNames = null;
        int num = AudienceManager.GetAudienceIDsFromText(
          IsIncludedFilter, out globalAudienceIDs, out dlDistinguishedNames, out sharePointGroupNames);
        if (num > 0 && globalAudienceIDs != null && globalAudienceIDs.Length > 0)
        {
          // On-the-fly audience membership: match the audience name against the
          // preferences the visitor has stored in the own personalization store.
          string userIdent = PersonalizationUtils.GetCurrentUserIdentifier;
          UserPreferences userPreferences = PreferencesRepository.ReadUserPreferences(userIdent);

          SPServiceContext context = SPServiceContext.GetContext(SPContext.Current.Site);
          AudienceManager am = new AudienceManager(context);
          foreach (string audienceId in globalAudienceIDs)
          {
            Guid g = new Guid(audienceId);
            Audience a = am.Audiences[g];

            string preferenceAudienceName = a.AudienceName;
            UserPreferences preferenceAudience =
              (UserPreferences)Enum.Parse(typeof(UserPreferences), preferenceAudienceName);
            if ((preferenceAudience & userPreferences) == preferenceAudience)
            {
              render = true;
              break;
            }
          }
        }
      }
    }
    return render;
  }
}
There is one caveat you need to be aware of: the ability to even use the Audiences functionality requires that the User Profile Service Application is installed in the farm. Microsoft has placed the Audience functionality within the context of User Profiles, even though, as shown here, you can apply Audiencing without any dependency on User Profile values.

Tuesday, August 23, 2011

ListTemplate cached in Sandboxed process

Sandboxed solutions are a great addition to the SharePoint 2010 capabilities whenever it is required to deploy your own customizations. However, be aware of the restrictions, and of some oddities. Today I encountered such unexpected behaviour. ListTemplates are one of the SharePoint artefacts that can be deployed via a Sandboxed solution. In the initial version of a custom ListTemplate, some errors and omissions were made with respect to the schema.xml and some forms. I corrected these, and deployed (= uploaded + activated) the Sandboxed solution, only to discover that my changes were not present; not within the Feature-provisioned ListInstance based on the ListTemplate, nor within a manually created FormsLibrary instance based on the custom ListTemplate. It took an AppPool recycle to have the latest deployed ListTemplate definition become effective in the site collection. Tip: always recycle the AppPool before re-deploying a Sandboxed solution.

Wednesday, August 17, 2011

HowTo omit the standard xml-namespaces in WCF MQ requests

Earlier I reported that we encountered an issue with communication via the IBM MQ WCF Channel over a clustered IBM WebSphere MQ queue. The problem was already known within IBM and the fixed software was ready on the shelves, and naturally we were very willing to be a first beta tester. With this IBM fix, the EndpointNotFound problem was indeed solved. Sadly, we next ran into another issue. This time it manifested itself in our own service implementation at the receiving side: upon receiving an MQ request, the service responded with a technical error. Due to the holiday season it took some elapsed time before the right people were available and able to investigate the cause. The problem analysis led to the suspicion that the presence of xml namespaces in the web request was unexpected, and that the service therefore refused the request:

<ServiceRequestName xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema"> ... </ServiceRequestName>
Both namespaces are included by default when serializing a System.ServiceModel.Channels.Message instance from inside the WCF client processing. You can however prevent this by implementing a subclass of System.ServiceModel.Channels.BodyWriter and overriding the OnWriteBodyContents method. Next, inject this BodyWriter into the WCF message:

    System.ServiceModel.Channels.Message message =
        Message.CreateMessage(
            MessageVersion.None,
            string.Empty,
            new XmlBodyWriter<TRequest>(request));

    var responseMessage = webmqclient.Request(message);

...

public class XmlBodyWriter<TBody> : BodyWriter
{
    private readonly TBody request;

    public XmlBodyWriter(TBody request) : base(true)
    {
        this.request = request;
    }

    protected override void OnWriteBodyContents(XmlDictionaryWriter writer)
    {
        WriteBodyContents(SerializeRequest(), writer);
    }

    private static void WriteBodyContents(StringBuilder output, XmlWriter writer)
    {
        using (var reader = new StringReader(output.ToString()))
        {
            using (XmlReader xmlReader = XmlReader.Create(reader))
            {
                writer.WriteNode(xmlReader, true);
            }
        }
    }

    private StringBuilder SerializeRequest()
    {
        var output = new StringBuilder();

        using (var xmlWriter = XmlWriter.Create(output,
            new XmlWriterSettings { OmitXmlDeclaration = true, Encoding = Encoding.UTF8 }))
        {

            var serializer = new XmlSerializer(typeof(TBody));

            // Service does not expect xml-namespacing; including the standard
            // xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
            // xmlns:xsd="http://www.w3.org/2001/XMLSchema"
            // Omit these from the serialized request
            XmlSerializerNamespaces ns = new XmlSerializerNamespaces();
            ns.Add("", "");
            serializer.Serialize(xmlWriter, request, ns);
        }

        return output;
    }
}
With this modification to the serialization processing of the WCF MQ request, the service at the receiving WebSphere MQ side now replies with a correct functional response.

Monday, August 15, 2011

Deleted SPWeb in RecycleBin can obstruct deactivation of Sandboxed Solution

The other day, upon deactivating a Sandboxed Solution in our test farm, SharePoint aborted on it with the message:
Cannot access web-scoped feature {GUID} because it has references to a site with id {GUID}.
System.ArgumentException: Value does not fall within the expected range.
at Microsoft.SharePoint.SPWebCollection.get_Item(Guid id)
at Microsoft.SharePoint.SPFeatureEnumeratorBase.GetCachedWeb(SPSite site, Guid webId, Guid featureId)
We had earlier encountered this problem in the development farm. Thorough investigation by SharePoint operations together with development led to the conclusion that the SharePoint content database had reached a corrupt state due to the deletion of an SPWeb. On that SPWeb, a Feature provisioned via the Sandboxed Solution had been activated. And now, after deletion of that SPWeb, apparently an internal lock in the SharePoint content database was present, obstructing disablement of that Feature. And since it is not recommended - and certainly unsupported - to manually alter a SharePoint content database, there seemed no other realistic approach to get out of this erroneous situation than recreating and reprovisioning the entire site collection.
For the development instance, this was an acceptable pragmatic solution. However, in our test environment a lot of content had already been created by end-users. Just throwing away the SPSite and replacing it with a brand new one is not viable. The only remaining solution seemed to be to afterwards restore the backed-up content into the new SPSite. Not impossible, but it would take time [to set up and execute the site content restore, and next to validate its correctness and completeness]. And moreover, it would result in a loss of trust by our end-users and customer in the robustness of SharePoint 2010 as an application platform. If it happens once [actually twice], what guarantee is there it will not happen again?
Luckily, I then had a smart thought while discussing the problem symptoms with a co-developer: if the problem appeared to be caused by the deletion of an SPWeb, would it then help to restore this SPWeb instance from the Recycle Bin? Worth a try. And guess what: it did!! After the earlier deleted SPWeb had been restored from the recycle bin, the earlier activated Sandboxed Solution could then successfully be deactivated.

Friday, August 12, 2011

Misleading SecureStore message with InfoPath Forms Services

In the current project we publish an InfoPath 2010 form template to a SharePoint 2010 site collection. In the form template, multiple controls are populated with data retrieved from a single SharePoint list via owssvr.dll and filtering views on that list. The SharePoint web application is set up as Claims-Based. A consequence is that from within the InfoPath Forms Services context it is not possible to authenticate via the owssvr.dll web service to the SharePoint list. The proper solution for this is to retrieve the data via Universal Data Connections, and let each UDCX authenticate itself to SharePoint: either by explicit authentication, that is, including the credentials in all UDCX files; but preferably by using the Secure Store, so that the credentials are maintained in a single location and are not readably included in the UDCX files.
So far for the theory. In practice, we encountered a problem with this setup: our form template failed to retrieve the filtered data. In the ULS the following message was logged per owssvr.dll/XmlQuery data connection upon opening the form template in the browser:
InfoPath Forms Services Maintenance 82lm Information Delegation was attempted for Secure Store application APPL_InfoPathService. (User: 0#.w|domain\useraccount, IP: , Request: http://appl.dev.hosting.corp/Pages/orderform.aspx)
This seems to indicate that the individual logged-in user has no permission to access/use the Secure Store Application ID. Inquiry with Operations refuted that suspicion: the SSS Application ID was configured for 'All Authenticated Users'. The actual cause appeared to be that the service account configured in the SSS Application ID no longer had permission/read access to the SharePoint site collection. Instead of the above message, I would have preferred a direct 401 or AccessDenied cause indication in the ULS...

Wednesday, August 3, 2011

Publication on 'Succesful disclosure of SAP data and processes in SharePoint'

In the May edition of the Dutch Software Development Network (SDN) magazine I published an article, co-authored with Marcel Kempers, on the why of, and the approach for, successfully disclosing SAP data and processes within a SharePoint-based solution. As of this month, the publication is also available online. Mind you, it is in Dutch...

Tuesday, July 26, 2011

Personalization approach / settings administrated outside content database

Personalization is an important functional aspect of my current SharePoint 2010 project. Personalization in itself is a term that can have multiple meanings and appearances. Examples:
  • Customize a page to your own preferences, via personal settings of webpart properties
  • Content targeting, on the basis of the visitor's profile
SharePoint enables page personalization via the Provider model, inherited from ASP.NET. An implication of the standard SharePoint personalization approach is that each authenticated user gets their own copy of each personalized page in the content database. The personalized settings are thus administrated in the content database, similar to shared webpart properties. A disadvantage of this approach is that it makes it very difficult to (web content) manage the page: a content manager can update the (template) page, but none of the content changes are automatically propagated to the individual personalized page copies in the content database.
In our application scenario, we have the additional functional requirement that it must be possible to extract management information reports on the personalized settings: what type of user (profile) typically selects personalized setting X, chooses to close webpart Y, etcetera. Although not impossible, it is rather impractical to extract this management information when the personalized data is stored in the content database. The schema of the SharePoint content database must be treated as an internal black box, and cannot be relied on. The schema also does not support ad-hoc querying for answering varying management-information requests.
We also have User Experience-related requirements: the users must be able to personalize without being aware of, or directly confronted with, SharePoint. It should be intuitive and natural, without an explicit [noticeable] personalization-settings mode. Examples of requested personalizations are tuning the webpart behavior via settings, and dynamically repositioning the webparts on the page.
These combined end-user and management requirements effectively rule out the utilization of standard SharePoint personalization. So what’s a viable alternative? The one we came up with consists of the following elements:
  1. [Standard] shared webpart properties for enabling the content managers to set the initial and default values for personalizable settings. The shared value is regularly stored in the SharePoint content database, and applied for each visitor who has not [yet] explicitly personalized the shared setting.
  2. Storage of personalization settings in an own SQL Server database
  3. The iGoogle-like repositioning behavior via jQuery and webpart zones tagged as droppable zones
  4. An abstract base PersonalizationWebPart that handles both the server-side aspects of personalization (retrieving + applying personalization settings, as well as saving them) and the client-side (webpart movement), and that exposes virtual methods for concrete subclasses to hook into; a sketch is given at the end of this post
  5. A custom AudienceProvider to derive on-the-fly whether a visitor is within a certain audience by checking the settings stored in the own SQL Server database [thus no need for User Profiles, nor for compiling Audiences]
An extra and major advantage of this personalization approach is that no copy of the personalized page is created anymore. If the content manager updates the page, by adding or removing a webpart, or modifying content on a publishing page, these changes are automatically and immediately effective for all visitors, whether they have personalized the page or not.
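To make element 4 a bit more tangible, below is a hedged sketch of such an abstract base webpart. The repository and utility names follow the code extract elsewhere on this blog; the SaveUserPreferences method and the exact hook signatures are assumptions for illustration, not the project's actual implementation.

using System;
using System.Web.UI.WebControls.WebParts;

public abstract class PersonalizationWebPart : WebPart
{
    protected UserPreferences Preferences { get; private set; }

    protected override void OnLoad(EventArgs e)
    {
        base.OnLoad(e);

        // Retrieve the visitor's settings from the own personalization database,
        // not from the SharePoint content database.
        string userIdent = PersonalizationUtils.GetCurrentUserIdentifier;
        Preferences = PreferencesRepository.ReadUserPreferences(userIdent);

        // Let the concrete webpart apply the settings to its own UI.
        ApplyPersonalization(Preferences);
    }

    // Hook for concrete subclasses: apply the loaded settings (e.g. hide sections,
    // preset filters) before rendering.
    protected abstract void ApplyPersonalization(UserPreferences preferences);

    // Called after a client-side change (e.g. a jQuery drag/drop callback) to persist
    // the modified settings; SaveUserPreferences is an assumed repository method.
    protected virtual void SavePersonalization(UserPreferences preferences)
    {
        string userIdent = PersonalizationUtils.GetCurrentUserIdentifier;
        PreferencesRepository.SaveUserPreferences(userIdent, preferences);
    }
}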

Wednesday, July 13, 2011

Make AudienceFilter webpart setting visible without User Profile Service Application started

In the current project we aim to apply SharePoint's audiencing mechanism for content targeting. As audience filters we'll use SharePoint group membership; we will not derive the audience from User Profile properties. In the current state of our SharePoint farm, the User Profile service application is not even available.
One of the advantages of SharePoint audiencing is that it is available out-of-the-box for every webpart: standard SharePoint and custom webparts alike. However, when we intended to apply an audience filter to a standard ContentEditorWebPart in our test environment, we were confronted with a missing Audiencing setting in the webpart toolpart. Searching the web points to the prerequisite of the User Profile service application being activated. However, this is neither possible in our farm infrastructure planning, nor needed for the manner in which our application will use audiencing, that is, on the basis of SPGroup memberships.
Via reverse engineering the SharePoint code [using .NET Reflector] I found out what directly determines the visibility of the AudienceFilter webpart property. The responsible AdvancedToolpart class checks the web.config for the config property "SharePoint/RuntimeFilter": if the property is not present, or the referenced assembly is not valid, none of the webparts in this SharePoint web application will display the AudienceFilter webpart property in their toolpart. I validated this by commenting out the property in web.config; the toolpart of, for instance, a ContentEditorWebPart then misses the AudienceFilter setting. After reinstating the web.config property and reopening the toolpart, the webpart property is visible again, and thus available for use with filters on the basis of SharePoint groups, or a custom AudienceProvider.

Friday, July 8, 2011

IBM WebSphere MQ WCF Channel has problem with clustered Queues

The Enterprise Architecture roadmap of the customer aims at a SAP-unless policy, and SharePoint for all web applications/front-ends. Currently there are still multiple applications in their IT landscape which do not obey this future direction. For some of them, e.g. J2EE-based service applications, the enterprise architecture integration guidelines prescribe that the client-service communication occurs via IBM WebSphere MQ (WMQ). From the .NET client perspective, WCF-based communication is nowadays preferred and proven technology, and must be applied for any new web applications that invoke other applications.
For the .NET client context there are in reality two viable WCF Channel options for putting requests as messages on an IBM WebSphere queue (WCF WMQ): one is part of Microsoft Host Integration Server, the other is provided by IBM.
For reasons explained elsewhere, the .NET and Integration architects decided to apply the IBM WebSphere MQ Channel for WCF. It is successfully applied in several .NET applications to put messages on WMQ. In my current application project we also need to invoke services of a ‘legacy’ system, callable over WMQ. However, when this application puts a message on the assigned queue via Request-Reply, we ran into the following exception:
System.ServiceModel.EndpointNotFoundException was unhandled by user code
Message=WCFCH0309E: An error occurred while attempting to open the channel for endpoint 'jms:/queue?destination=APPL.REQ.QUEUE@XXXX01&connectionFactory=connectQueueManager(XXXX01)clientChannel(WMQ.Channel.TST)clientConnection(dev-xx.xxx.com(7201))&initialContextFactory=com.ibm.mq.jms.Nojndi&persistence=1&replyDestination=APPL.RPL.QUEUE' The operation could not be completed. The endpoint may be down, unavailable, or unreachable, review the linked exception for further details.
Source=mscorlib
StackTrace:
Server stack trace:
at IBM.XMS.WCF.XmsChannelHelper.ThrowCommsException(OperationType op, Exception innerException, String endpointURI)
at IBM.XMS.WCF.XmsChannelHelper.CheckExceptionAndTimeout(OperationType op, Exception e, String timeout, String endpointURI)
at IBM.XMS.WCF.XmsRequestChannel.OnEndOpen(IAsyncResult result)
at IBM.XMS.WCF.XmsRequestChannel.OnOpen(TimeSpan timeout)
at System.ServiceModel.Channels.CommunicationObject.Open(TimeSpan timeout)
at System.ServiceModel.Channels.ServiceChannel.OnOpen(TimeSpan timeout)
at System.ServiceModel.Channels.CommunicationObject.Open(TimeSpan timeout)
at System.ServiceModel.Channels.ServiceChannel.CallOnceManager.CallOnce(TimeSpan timeout, CallOnceManager cascade)
at System.ServiceModel.Channels.ServiceChannel.EnsureOpened(TimeSpan timeout)
at System.ServiceModel.Channels.ServiceChannel.Call(String action, Boolean oneway, ProxyOperationRuntime operation, Object[] ins, Object[] outs, TimeSpan timeout)
at System.ServiceModel.Channels.ServiceChannel.Request(Message message, TimeSpan timeout)
Exception rethrown at [0]:
at System.Runtime.Remoting.Proxies.RealProxy.HandleReturnMessage(IMessage reqMsg, IMessage retMsg)
at System.Runtime.Remoting.Proxies.RealProxy.PrivateInvoke(MessageData& msgData, Int32 type)
at System.ServiceModel.Channels.IRequestChannel.Request(Message message)
at Application.<CustomCode>.Client.<InvocateAMethod>(MethodRequest_1_0 request)
at System.Web.UI.WebControls.Button.OnClick(EventArgs e)
at System.Web.UI.WebControls.Button.RaisePostBackEvent(String eventArgument)
at System.Web.UI.Page.RaisePostBackEvent(IPostBackEventHandler sourceControl, String eventArgument)
at System.Web.UI.Page.ProcessRequestMain(Boolean includeStagesBeforeAsyncPoint, Boolean includeStagesAfterAsyncPoint)
InnerException: IBM.XMS.IllegalStateException
Message=Failed to open MQ queue APPL.REQ.QUEUE. XMS attempted to perform an MQOPEN, but WebSphere MQ reported an error. Use the linked exception to determine the cause of this error. Check that the specified queue and queue manager are defined correctly.
Source=IBM.XMS.Client.WMQ
ErrorCode=XMSWMQ2008
It took me several hours and diverse try-outs, together with an MQ developer at the receiving WebSphere MQ side, to ultimately find out the following: the IBM WCF transport channel for WMQ (WebSphere MQ 7.0.1 installation) has a problem with clustered queues, and errors when attempting to open the queue. When putting messages on a non-clustered queue - the WMQ infrastructure of the other .NET applications - the channel opens successfully and message requests + replies are delivered. The problem is reproduced outside the context of our SharePoint application, via a simple test program, and thus appears to be structural. We are currently asking IBM to analyze it and come up with a structural solution. For now, the pragmatic workaround appears to be communicating via a non-clustered queue.

Tuesday, July 5, 2011

Using ServiceLocator in Sandbox requires pre-generated xml-serialization

The SharePoint Guidance [SPG] ServiceLocator provides you with a ready-to-use implementation of a Dependency Injection container. However, be aware of the following restriction when applying the SPG ServiceLocator in a constrained Sandbox context: it is not possible to compile on-the-fly generated xml-serialization classes, which are required for administrating the service interface and implementation within the servicelocator typemappings.
The execution of code statement:
typeMappings.RegisterTypeMapping<IServiceContract, ServiceImplementation>();
resulted in the following exception inside ServiceLocator code:
Microsoft.Practices.SharePoint.Common.Configuration.
ConfigurationException: Error on serializing configuration data ---> Microsoft.SharePoint.UserCode.SPUserCodeSolutionProxiedException: Error on serializing configuration data ---> Microsoft.SharePoint.UserCode.SPUserCodeSolutionProxiedException: Cannot execute a program. The command being executed was "C:\Windows\Microsoft.NET\Framework64\v2.0.50727\csc.exe" /noconfig /fullpaths @"C:\Users\SPadministrator\AppData\Local\Temp\OICE_925B8D58-F6D1-4BB3-9E01-FCBB9D816D0B.0\wkdrltlt.cmdline".

Explanation of this behavior:

The SPG ServiceLocator applies XML serialization to administer the typemappings for the service interface and service implementation class. The .NET XML serialization approach supports JIT compilation of the required xml-serialization classes. The benefit is that you do not need to generate and deploy these xml-serialization classes yourself; the XmlSerializer framework generates them on the spot. However, to be able to do this assembly generation, the .NET compiler is needed. And within the Sandbox constraints, you are not allowed to use filesystem resources.
By doing a little ‘digging’ in the SPG runtime pipeline, I discovered that it was not even my own custom classes that caused the runtime serialization error; it could actually be traced back to the following serialization:
new System.Xml.Serialization.XmlSerializer(typeof(
Microsoft.Practices.SharePoint.Common.ServiceLocation.ServiceLocationConfigData))
The solution is to make sure that the required xml-serialization classes are already available: either in the GAC for generic use, or deployed within the sandbox itself. In this particular case I opted to deploy Microsoft.Practices.SharePoint.Common.XmlSerialization.dll in the Sandbox.
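For completeness, a minimal sketch of the surrounding ServiceLocator usage, assuming the standard SPG ServiceLocator API; IServiceContract and ServiceImplementation are the placeholder types from above, and in sandboxed code the site collection-scoped locator variant may be required.

// Obtain the SPG ServiceLocator and its configuration interface.
IServiceLocator serviceLocator = SharePointServiceLocator.GetCurrent();
IServiceLocatorConfig typeMappings = serviceLocator.GetInstance<IServiceLocatorConfig>();

// Registering the mapping serializes the configuration data; in the sandbox this only
// succeeds when the pre-generated xml-serialization assembly is deployed as well.
typeMappings.RegisterTypeMapping<IServiceContract, ServiceImplementation>();

// Resolve the registered implementation elsewhere in the (sandboxed) code.
IServiceContract service = serviceLocator.GetInstance<IServiceContract>();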

Sunday, June 19, 2011

Working with complex SAP business entities in Duet Enterprise

Duet Enterprise natively only supports flattened SAP data entities. Typically, however, business data exhibits a complex structure. Via custom programming against the SharePoint BCS ObjectModel it is possible to integrate complex SAP business entities via Duet Enterprise in SharePoint.
This blog post was earlier published on SAP Community Network Blogs
A sales pitch of Duet Enterprise is that you can integrate SAP data and functionality in the SharePoint UI with minimal effort. This however is only true if your SAP data complies with the following constraints: 1) the SAP data entities must at least be made accessible via a Query and a ReadItem operation; 2) in case of a full update context, the Create, Update and Delete operations must also be available on the SAP data entity; 3) the SAP data entity must exhibit a flat data structure.
Constraints 1 and 2 are imposed by the usage of SharePoint Business Connectivity Services (BCS) for working with SAP external data from within the SharePoint context. As a result of the central BCS role in the Duet Enterprise architecture, these constraints are inevitable. Constraint 3 is required to visualize the SAP data via the SharePoint External List in the well-known list UI format. An aspect of the list UI metaphor is that it resembles a data table, with row-based data.
The problem however is that enterprise data is typically not flat. Enterprise data often has a hierarchical structure: a parent with multiple child data entities. E.g. an Order containing multiple Orderlines; an Expense form with multiple individual ExpenseDetails. As a result of constraint 3, such complex data cannot be visualized in the SharePoint UI via the out-of-the-box External List. Does this mean that complex SAP business entities cannot be integrated within the SharePoint UI context via Duet Enterprise? The answer to that is no. The Duet Enterprise architecture itself, with SharePoint BCS and the SAP Service Consumption Layer (SCL) in the middle, has neither restrictions nor problems with exchanging and operating on hierarchical SAP data. It is merely the External List concept that cannot visualize complex data (note: this limitation is generic, thus not only for SAP data structures, but also for non-SAP business data structures; such as Oracle, Microsoft Dynamics, SQL-based, WCF data objects, …). You can still apply Duet Enterprise as the SAP / SharePoint integration foundation in case of complex SAP data structures, but it requires you to build a custom UI in SharePoint instead of using the out-of-the-box External List. Nothing prevents you from constructing a custom SharePoint UI to display the hierarchical data, e.g. in a master-detail UI concept.
Well, not entirely true… A noticeable problem with building a custom UI on top of Duet Enterprise is that the custom UI must program against the BCS ObjectModel API. And the BCS OM currently only provides a [very] weakly typed programming model, instead of the strong typing we are used to in a regular .NET context (e.g. WCF data objects plus WCF interfaces). The explanation for this is that BCS is a generic concept, intended to interoperate with arbitrary external data repositories and various, a priori unknown, data structures. In its current state, the mapping at the .NET client side to the concrete external data structures is done via a general-purpose ‘BCS operation language’. The problem with that BCS language is that there is no compile-time checking or support to prevent you from making typo or structural mistakes.

An example interoperating complex SAP data into SharePoint

Via an example I will demonstrate, at overview level, what it takes to interoperate complex SAP data via Duet Enterprise into a SharePoint front-end.

Example

Employee Self-Service process for expense handling. The process consists of multiple steps: it starts when an employee submits an expense form to receive a refund; next, the employee’s manager reviews the form, and either approves, denies, or returns it for further explanation. In case of approval, the finance department refunds the expenses made. In case of return, the employee can augment the expense form with additional remarks and then resubmit, or decide to withdraw the returned expense form. In case of denial, the process stops directly. Each process execution ends with archival of the submitted form and other documentation.
Currently, this process is already implemented within the SAP environment. Employees enter a form via a WebDynpro form, which kicks off an SAP workflow process. The manager receives the review task of this workflow in the SAP Universal Work List. Payment is done via the HR business package.
Although the expense process is thus already implemented in the IT landscape, management feels the need to improve on it. Employees are complaining about the WebDynpro UI, which looks and behaves differently from the SharePoint based intranet in which they perform their primary work activities. The managers on the other hand are perfectly satisfied to perform the review task within the SAP GUI, since that is their familiar work environment for most other tasks as well. The financial handling is mostly performed automatically in the SAP backend, close to the other financial processes; so there is no reason to change that either. Management therefore decides to bring the employee’s involvement in the process into the context of the SharePoint based intranet, while leaving the rest of the process execution untouched (phase 1).

Step 1: derive the functional + process requirements

Since the process is already implemented, this step can largely reuse what is already available in terms of system and process specifications.

Step 2: derive the integration and software architecture

The project goal is to enable employees to perform their steps in the expense handling process directly within the SharePoint intranet, with the same familiar UI look & feel as the other parts of that intranet. The implication is that the WebDynpro front-end is exchanged for a SharePoint front-end. Via well-derived integration points this new front-end must hook into the existing expense process.

Step 3: define the interoperability interface of the SAP backend

With the integration and software architecture in place, including the conceptual specifications of the integration interfaces SharePoint front-end <– SAP process, the next step is to realize the process integration at the SAP side. Following the Duet Enterprise Development Steps, this first means mapping the conceptual interfaces onto SAP SCL Interfaces. That development activity is done in the SAP Enterprise Service Builder (ES Builder). There, you specify the data types, message types, and method operations.
Screenshots of the specification at SAP side of a Query operation signature, with a complex hierarchical SAP data entity

Step 4: Implement the SCL Interfaces

Actually, this is more than a single step. It involves mapping, connecting to the relevant parts of the SAP backend, routing, and potentially composition of multiple SAP data entities. Although all very interesting, this has no direct relation to or influence on the way complex SAP data can be handled in the SharePoint front-end. Therefore these steps are not discussed here. See the Duet Enterprise Development Guide for a proper explanation of the steps.

Step 5: Generate the interoperability interface for the SharePoint front-end

A result of step 3 is the WSDL specification of the SAP SCL interface. With this WSDL, and a runtime SAP Proxy implementation, it is possible to generate an External Content Type. The tool used for this is SharePoint Designer 2010.
Screenshots of the generation at SharePoint side of an External Content Type to SAP backend, with a complex hierarchical SAP data entity exchanged

Step 6: Program to the interoperability interface for the SharePoint front-end

In case the SAP data entity had a flat representation, we would be nearly finished: just generate an External List in SharePoint Designer or via the SharePoint UI, and connect it to the generated External Content Type. However, the expense data entity at the SAP side exhibits a complex structure: a form with multiple detail lines. This cannot reasonably be visualized in the table-oriented format of the External List (remember that the starting point of this project is to improve the user experience for the employee; so it must feel natural and intuitive). If the External List is not feasible, it is required to build a custom UI in the SharePoint context. The custom UI must interoperate with the BCS ObjectModel API to query, retrieve, create and update the SAP expense data entities. In its current state, the BCS API can only be operated via a weakly-typed programming model. The downside is that programming at that low level is both cumbersome and error prone. However, if done correctly and securely, it does the job, and enables you to interoperate complex SAP data entities via Duet Enterprise in a custom SharePoint / .NET context.
Example code using the BCS ObjectModel to create a new expense entity in the SAP backend:

// Resolve the farm-wide BDC service and the metadata catalog for the current service context.
BdcService service = SPFarm.Local.Services.GetValue<BdcService>(String.Empty);
IMetadataCatalog catalog = service.GetDatabaseBackedMetadataCatalog(SPServiceContext.Current);

// Look up the 'Expense' External Content Type and the LobSystem instance it runs against.
IEntity entity = catalog.GetEntity("TopForce.com", "Expense");
ILobSystemInstance lobSystemInstance = entity.GetLobSystem().GetLobSystemInstances()[0].Value;

// Fill the creator view's field value dictionary; nested fields are addressed via dot notation.
IView createView = entity.GetCreatorView("Create");
IFieldValueDictionary methodFields = createView.GetDefaultValues();
methodFields["EmployeeNumber"] = "1";
methodFields["Status"] = "open";
methodFields["EmployeeComment"] = "my comment";
methodFields["Date"] = DateTime.Now;

// The expense details ('Tickets') form a collection within the complex entity.
object tickets = methodFields.CreateCollectionInstance("Tickets", 2);
methodFields["Tickets"] = tickets;
methodFields["Tickets[0]"] = methodFields.CreateInstance("Tickets[0]");
methodFields["Tickets[0].Date"] = DateTime.Now;
methodFields["Tickets[0].Amount"] = "35.45";
methodFields["Tickets[0].Description"] = "Description 1";
methodFields["Tickets[1]"] = methodFields.CreateInstance("Tickets[1]");
methodFields["Tickets[1].Date"] = DateTime.Now;
...

// Invoke the Create operation in the SAP backend via Duet Enterprise / BCS.
Identity id = entity.Create(methodFields, lobSystemInstance);
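Conversely, reading an individual expense back from SAP goes via the SpecificFinder operation of the same External Content Type. The snippet below is a minimal sketch under the assumption that the ECT exposes a ReadItem (SpecificFinder) operation with a single identifier; the identifier value is illustrative, and entity and lobSystemInstance are the variables from the create example above.

// Read a single complex expense entity back from the SAP backend (assumed ReadItem operation).
Identity identity = new Identity("4711");   // hypothetical expense identifier
IEntityInstance expense = entity.FindSpecific(identity, lobSystemInstance);
// Field values, including the nested detail lines via dot notation, can then be read
// from the returned IEntityInstance, e.g. expense["Status"].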

Step 7: Build a custom SharePoint front-end UX

With the integration layer SharePoint/BCS - Duet Enterprise - SAP ready, it is now rather standard ASP.NET programming to build the custom UI. In the custom UI you can utilize the full toolbox of ASP.NET webcontrols, jQuery, and even Silverlight. Another viable option in SharePoint 2010 is InfoPath based forms.
Some custom UI forms for the expense handling: overview, details and submit new expense


Final note

Thus, although it requires more work at the SharePoint side, it is possible to apply Duet Enterprise for SAP / Microsoft interoperability even when dealing with complex SAP data entities. The extra work is restricted to the SharePoint front-end part; the other parts of the Duet Enterprise pipeline are not affected. Duet Enterprise in itself does not limit the usage of complex SAP data entities; this restriction lies merely within the current out-of-the-box capabilities of SharePoint BCS and the External List.

Saturday, June 11, 2011

SPWebConfigModification playing tricks in farm

At customer premises we apply, among others, the following deployment guidelines:
  1. In case of Farm-based deployment, deploy to the virtual bin (WebApplication), unless...
  2. Apply SharePoint support for all required web.config modifications, instead of manual action (which is error prone in the farm).
Both guidelines are valid from SharePoint operations AND development perspective, to ensure a consistent and controlled SharePoint farm situation.
The SharePoint platform provides multiple deployment capabilities to comply with the 2nd guideline. Via the SharePoint solution manifest you can specify modifications that typically affect the runtime operation of an assembly: SafeControls, CodeAccessSecurity. And via the SPWebConfigModification class you can modify the web.config in a controlled manner for other changes, e.g. custom application settings or navigation providers. Ever since SharePoint 2007 I have been a big fan of applying SPWebConfigModification - it gives you as developer full control to have the needed web.config modifications executed at application deployment and/or provisioning time. And to make it even better: the same set of changes is applied to all the individual web.config files of the SharePoint webapplication in the entire farm, across zones and across servers. Even when a server is added to the farm at a later time, the same set of web.config modifications is applied (rather, repeated) on that new server as well.
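As an illustration, a minimal sketch of the typical SPWebConfigModification usage, here adding a hypothetical custom appSettings entry; the URL, owner key, setting name and value are illustrative only:

// Look up the target web application (illustrative URL).
SPWebApplication webApp = SPWebApplication.Lookup(new Uri("http://extranet.example.com"));

// Describe the modification: ensure an appSettings child node with our key.
SPWebConfigModification mod = new SPWebConfigModification
{
    Path = "configuration/appSettings",
    Name = "add[@key='MyCustomSetting']",
    Owner = "MyCompany.MySolution",       // used to identify 'our' entries later on
    Sequence = 0,
    Type = SPWebConfigModification.SPWebConfigModificationType.EnsureChildNode,
    Value = "<add key='MyCustomSetting' value='42' />"
};

// Administrate the modification and push it to all web.config files of the web application.
webApp.WebConfigModifications.Add(mod);
webApp.Update();
webApp.WebService.ApplyWebConfigModifications();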
However, the SPWebConfigModification functionality does have its peculiarities. Lately we encountered one I was not yet aware of, which caused us some headaches and cost us quite some elapsed time.
The situation is as follows: a SharePoint assembly that uses EntLib 5.0 for accessing an external SQL database. Conforming to the guidelines, this custom assembly is deployed to the virtual bin; it is therefore required to set custom CAS policies for this assembly. One of the required permissions is the SqlClientPermission. That permission is not present in the default WSS_Minimal trust level, but it is within WSS_Medium. To base the total set of custom CAS policies on the Medium trust level, a 2-step approach is applied:
  1. First change the trust level in the web.config from WSS_Minimal to WSS_Medium; a task performed via the SPWebConfigModification class (a sketch of this modification follows after this list).
  2. Next, deploy the SharePoint solution with the CodeAccessSecurity element for the assembly in its manifest. The resulting custom CAS-policy file is then based on the Medium trust level, thus inheriting among others the SqlClientPermission SecurityClass setting. The SharePoint solution framework takes care of the required modifications in the web.config: linking to the generated custom policy file, and setting the trust level to WSS_Custom.
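The step-1 trust level change could then be expressed as a modification like the following; a sketch assuming the EnsureAttribute modification type, administrated and applied in the same way as the appSettings sketch above:

// Sets the 'level' attribute of the existing <trust> element to WSS_Medium.
SPWebConfigModification trustLevelMod = new SPWebConfigModification
{
    Path = "configuration/system.web/trust",
    Name = "level",
    Owner = "MyCompany.MySolution",       // illustrative owner key
    Sequence = 0,
    Type = SPWebConfigModification.SPWebConfigModificationType.EnsureAttribute,
    Value = "WSS_Medium"
};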
The setup worked perfectly, both locally and in the shared test farm.
That is, initially. From time to time our application appeared broken. Root cause analysis revealed that in those situations our web.config had been modified, but without any deployment activity on our own webapplication (???). However, another SharePoint webapplication in the same farm had been redeployed. Each time that application was redeployed, our web.config was modified, and the trust level reset from WSS_Custom to WSS_Medium.
It appears that this is standard behaviour of the SPWebConfigModification class: each time it is requested to apply the administrated SPWebConfigModification entries in the context of a single SPWebApplication, it effectively re-applies the administrated SPWebConfigModifications of ALL the SharePoint web applications in the farm. At a minimum, the result is that all the web.config files in the farm are touched and have their timestamp updated. But in our case, the administrated SPWebConfigModification for setting the trust level to WSS_Medium was re-applied, which broke our application!
So I learned 2 things here:
  1. SPWebConfigModification is not a 'decent' SharePoint citizen; I regard it as plainly incorrect that the intended application of SPWebConfigModification administrated entries on 1 SharePoint web application also affects the web.configs of all other SharePoint web applications.
  2. As a result of this behaviour, the considered 2-step approach for setting the correct medium level of among others SqlClientPermission cannot be maintained, due to the inherent risk that the administrated SPWebConfigModification can be re-applied at any moment, outside the control and knowledge of the administrators of our webapplication. So I modified this to set the CAS policies in a single step, directly steered via the Solution manifest file.

    Thus instead of the below CodeAccessSecurity specification, which is relative and relies on the presence of among others the SqlClientPermission SecurityClass in the base trust level (WSS_Medium):

    <PermissionSet class="NamedPermissionSet" version="1">
    <IPermission class="SecurityPermission" version="1" Flags="Execution" />
    <IPermission class="AspNetHostingPermission" version="1" Level="Medium" />
    <IPermission class="SqlClientPermission" version="1" Unrestricted="true" />
    <IPermission class="Microsoft.SharePoint.Security.SharePointPermission, Microsoft.SharePoint.Security, Version=14.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c" version="1" ObjectModel="True" />
    <IPermission class="Microsoft.Office.SecureStoreService.Server.Security.SecureStorePermission, Microsoft.Office.SecureStoreService.Server.Security, Version=14.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c" version="1" Unrestricted="true" />
    <IPermission class="System.Security.Permissions.ReflectionPermission, mscorlib, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" version="1" Unrestricted="true" />
    <IPermission class="System.Diagnostics.EventLogPermission, System, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" version="1" Unrestricted="true" />
    <IPermission class="WebPartPermission" version="1" Connections="True" />
    </PermissionSet>

    specify all the required permissions explicitly:

    <PermissionSet class="NamedPermissionSet" version="1">
    <IPermission class="Microsoft.SharePoint.Security.SharePointPermission, Microsoft.SharePoint.Security, Version=14.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c" version="1" ObjectModel="true" Unrestricted="true" />
    <IPermission class="SecurityPermission" version="1" Flags="Execution" Unrestricted="true" />
    <IPermission class="System.Web.AspNetHostingPermission, System, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" version="1" Level="Medium"/>
    <IPermission class="System.Data.SqlClient.SqlClientPermission, System.Data, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" version="1" Unrestricted="true"/>
    <IPermission class="Microsoft.Office.SecureStoreService.Server.Security.SecureStorePermission, Microsoft.Office.SecureStoreService.Server.Security, Version=14.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c" version="1" Unrestricted="true" />
    <IPermission class="System.Security.Permissions.ReflectionPermission, mscorlib, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" version="1" Unrestricted="true" />
    <IPermission class="System.Diagnostics.EventLogPermission, System, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" version="1" Unrestricted="true" />
    <IPermission class="Microsoft.SharePoint.Security.WebPartPermission, Microsoft.SharePoint.Security, Version=14.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c" version="1" Connections="True"/>
    </PermissionSet>