Sunday, December 29, 2013

Tip: Resolve GWPAM issue with misnamed embedded Resources reference

It is a common practice among .NET developers to give multipart names to the individual projects in a Visual Studio solution. The name parts identify different aspects of the Visual Studio project; a typical naming pattern is "<CompanyName>.<Application>.<Component>".
An example of this: "TNV.Purchase.OrderManagement".
I applied this naming approach also for a GWPAM project to build a Microsoft Outlook AddIn: TNV.TasksClustering.OutlookAddIn. However, after I installed the compiled AddIn, Outlook failed to load it. The error details were:
System.Reflection.TargetInvocationException: Exception has been thrown by the target of an invocation. ---> System.Resources.MissingManifestResourceException: Could not find any resources appropriate for the specified culture or the neutral culture. Make sure "TNV_TasksClustering_OutlookAddIn.Icons.resources" was correctly embedded or linked into assembly "TNV.TasksClustering.OutlookAddIn" at compile time
I inspected the GWPAM-generated code to analyze the cause of the problem. It appears that GWPAM replaces the dots ('.') in the Visual Studio project name with the underscore ('_') character in the code it generates. Thus 'TNV.TasksClustering.OutlookAddIn' becomes 'TNV_TasksClustering_OutlookAddIn' in the generated C# code. A side effect is a naming inconsistency with the embedded resources: by default, Visual Studio embeds .NET resource files in the compiled assembly under the unchanged project name, in this case thus still with the dots.
I have reported this issue to the Duet Enterprise / GWPAM product team, and expect it to be fixed in a forthcoming service pack. Until then, the pragmatic and simple workaround is to correct, in the generated C# code, the line that loads the embedded resource:
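A minimal sketch of that correction; the resource base name comes from the error message above, while the surrounding field and class names are illustrative rather than the exact generated code:

// As generated (fails): the resource base name uses underscores
// internal static System.Resources.ResourceManager ResourceManager =
//     new System.Resources.ResourceManager("TNV_TasksClustering_OutlookAddIn.Icons", typeof(Icons).Assembly);

// Corrected: use the dotted name, matching how Visual Studio actually embeds the .resources file
internal static System.Resources.ResourceManager ResourceManager =
    new System.Resources.ResourceManager("TNV.TasksClustering.OutlookAddIn.Icons", typeof(Icons).Assembly);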

Tuesday, December 24, 2013

SharePoint 2010 BCS SAML Assertion too coarse-grained

Results in false-positive notifications of message replay at service providers

Security Assertion Markup Language (SAML) is an XML standard for exchanging authentication and authorization data between security domains. SAML is an XML-based protocol that uses security tokens containing assertions to pass information about a principal (usually an end user) between a SAML authority, that is an identity provider, and a SAML consumer, that is a service provider. SAML assertions contain statements that service providers use to make access control decisions. SAML enables web-based authentication and authorization scenarios including cross-domain single sign-on (SSO).
Source: http://en.wikipedia.org/wiki/SAML_1.1
SharePoint 2010 supports SAML 1.1 to authenticate as service consumer to external systems. The service applications involved are the Secure Token Service as SharePoint-internal identity provider, and Business Connectivity Services as service consumer towards external systems. A required part of SAML support is that the identity provider must uniquely assert each identity token. This uniqueness is used by the service provider to verify the validity of a SAML assertion: if it has already been 'used', the identity assertion should be interpreted as 'stolen' and potentially misused in a message replay.
Message replays are applied in 2 categories of security attacks:
  • Service Availability: flood the service provider with the same request over and over in a Denial of Service attack.
  • Data Integrity: request same data updates multiple times (e.g. bank transfer).
To protect against message replay attacks, service providers need to verify the idempotency of each received client request. Each unique message should be handled only once, even when it is sent and received more than once to compensate for unreliable message delivery.

SAML assertion by SharePoint BCS

SharePoint BCS, in the role of service consumer, supports this by uniquely tagging each request it sends with a SAML AssertionID in the message header. SharePoint BCS derives this assertion ID from 2 values: the message issuer and the issue instant. The issuer value is fixed to "SharePoint"; the instant value is set dynamically to the timestamp at the moment of sending:
<saml:Assertion xmlns:saml="urn:oasis:names:tc:SAML:1.0:assertion" AssertionID="_2519fc29-9577-403c-a66d-7926b55760a2" IssueInstant="2013-11-08T08:39:07.274Z" Issuer="SharePoint" MajorVersion="1" MinorVersion="1">
As it turns out, this tagging is not guaranteed to be unique. When more than one message is sent by BCS within the same millisecond, the issue instant is equal for all those messages, and therefore so is the calculated SAML AssertionID. The receiving SAML consumer, i.e. the service provider, will then likely regard all these messages as identical, handle only the first, and refuse to handle the supposed duplicates.
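A conceptual sketch of the effect; this is not the actual BCS or Gateway implementation, it merely illustrates why an ID derived solely from issuer plus issue instant, combined with replay detection at the receiver, yields false positives:

using System;
using System.Collections.Generic;

class ReplayFalsePositiveDemo
{
    // Conceptual only: an ID derived from issuer + issue instant alone
    // collides for two requests sent within the same millisecond.
    static string DeriveAssertionId(string issuer, DateTime issueInstant)
    {
        return issuer + "_" + issueInstant.ToString("yyyy-MM-ddTHH:mm:ss.fffZ");
    }

    static void Main()
    {
        var seenAssertionIds = new HashSet<string>();  // replay cache at the service provider
        DateTime instant = DateTime.UtcNow;

        string request1 = DeriveAssertionId("SharePoint", instant);
        string request2 = DeriveAssertionId("SharePoint", instant);  // sent in the same millisecond

        Console.WriteLine(seenAssertionIds.Add(request1));  // True: handled
        Console.WriteLine(seenAssertionIds.Add(request2));  // False: rejected as replay (false positive)
    }
}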

Duet Enterprise: SAP Gateway as SAML consumer of SharePoint BCS

Duet Enterprise 1.0 applies SAML 1.1 to provide Single Sign-On from SharePoint 2010 into SAP NetWeaver Gateway 2.0. In the Duet Enterprise runtime client-service flow, SharePoint BCS is the service client, and SAP NetWeaver Gateway is the service provider. So Gateway must verify the idempotency of the requests received from SharePoint BCS, i.e. detect message replays. As a result of the weakness in the BCS SAML assertion approach, Gateway may report false positives of message replay. This is visible in the NetWeaver WS (WebServices) error log via the presence of the error: CX_WS_ST_CACHE_ERROR / Error when writing the ST to DB.
I raised this issue with the Duet Enterprise team for investigation. Their rightful response was that the cause does not lie within Duet Enterprise, but within SharePoint. So we contacted Microsoft. Microsoft Support acknowledged the cause, but sadly stated that "as it is not critical" no solution will be provided on short notice. The consequence is that you either have to accept the potential occurrence of false positives of message replay, or ask SAP BASIS to disable the message replay detection mechanism at the receiving SAP side. This boils down to first answering the question whether it is likely that multiple requests are made from SharePoint BCS to SAP within the same timestamp. A typical example is SharePoint search crawling, which issues a batch of requests into SAP. But it may also occur in a SharePoint webpart, when it uses a composition of SAP data services to render the webpart UI (e.g. customer + order headers).
See also: Explanation + resolution for CX_WS_ST_CACHE_ERROR, on the Duet Enterprise Space on SAP Community Network (SCN).

Sunday, November 17, 2013

Duet Enterprise / GWPAM interview at SAP TechEd Amsterdam

Recording of the interview I gave at SAP TechEd Amsterdam, talking about my experiences with SAP NetWeaver Gateway and Duet Enterprise, the SAP Customer Engagement Initiative programs, and the recently released NetWeaver Gateway Productivity Accelerator for Microsoft (GWPAM).

Friday, November 1, 2013

Tip: activate Gateway Metadata cache to resolve NullPointer / CX_SY_REF_IS_INITIAL error

For a customer we are installing the myHR suite from Cordis Solutions to deliver ESS/MSS functionality within a SharePoint application. The myHR suite is a SAP-certified Duet Enterprise product. As part of the project we first deploy the prerequisite supporting platform elements: SAP NetWeaver Gateway 2.0 and Duet Enterprise for SharePoint and SAP. Next we install the myHR suite. We do this across the customer's landscape of development/test, acceptance and production environments. Each environment installation ends with a technical validation of the environment: Gateway, Duet Enterprise, Cordis myHR.
During the technical validation of each environment, I encountered the issue that the first 2 to 3 invocations of each myHR Duet Enterprise service return the error CX_SY_REF_IS_INITIAL. After 3 invocations, each myHR service successfully returns the requested SAP data. The explanation of this behavior is a kind of 'warm-up' effect: the Duet Enterprise services installed in the SAP Gateway system are not compiled / generated until their first invocation, resulting in the CX_SY_REF_IS_INITIAL error on the Gateway system, and a NullPointer exception in the invoking SharePoint UI context. After a few invocation attempts, the service components are generated and available at runtime in the SAP system, and from then on the invocation successfully returns the SAP data.
Last week, upon technical validation of the acceptance environment, this symptom again manifested itself as I expected. However, to my surprise the warm-up effect did not take hold. All service invocations remained in an erroneous state. Via ABAP debugging, I traced the error location. The problem originated in the internal SAP Gateway method 'GET_META_MODELS' that instantiates the metadata model of the invoked service. I compared the runtime behavior on the faulty system against an environment that had successfully been 'warmed up'. The difference was that on the latter system, the method to create the metadata model was not even reached. Well, not anymore… It was not needed to create the metadata model, because it could be retrieved from the Gateway Metadata cache.
This turned out to be the reason why the warm-up effect did not apply: at SAP Gateway deployment, one had forgotten to activate the Gateway Metadata cache in this environment. After correcting that, the myHR services do start returning data. Well, that is... from the 3rd invocation onwards...

Monday, October 28, 2013

GWPAM - SAP data direct in Microsoft Office client-applications

The flagship of the Duet Enterprise / Gateway product team is Duet Enterprise for Microsoft SharePoint and SAP. Customers are very satisfied with the functionality and capabilities provided by this integration product, and with the demonstrated product stability. A frequently asked question is to provide this same level of exposing SAP data + processes for use in Microsoft applications beyond SharePoint. The product team has responded to this market demand: last week at SAP TechEd 2013 in Las Vegas, the SAP NetWeaver Gateway Productivity Accelerator for Microsoft was launched, referred to in short as GWPAM.
As a participant in the Duet Enterprise Customer Engagement Initiative (CEI) program, I was involved from the early development stage of GWPAM (under the internal codename BoxX). On request of the Duet Enterprise product team I performed so-called Takt testing, and reported my technical and functional thoughts + findings. Good to see that aspects of my feedback - predominantly influenced by my own technical background as a .NET architect/developer - have actually made it into the final product.
Like its big brother, GWPAM is in essence a product for the IT organization. It is an integration framework that internal IT departments and SAP + Microsoft partners (the ecosystem) can utilize to build their own scenarios in which SAP / Microsoft integration is an important architectural element. GWPAM provides a Microsoft Visual Studio Add-In that .NET developers can use to look up SAP Gateway OData services directly from their familiar integrated development environment, and to generate proxies for invoking those Gateway OData services with standard .NET code. A generic sketch of such a proxy invocation follows below.
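The service URL, entity set and entity type in this sketch are assumptions made up for the example; it shows the plain WCF Data Services client pattern rather than the actual GWPAM-generated proxy classes, which the tooling produces for you:

using System;
using System.Data.Services.Client;   // WCF Data Services client library
using System.Linq;
using System.Net;

// Hypothetical entity type; an OData proxy generator would normally emit this for you.
public class PurchaseOrder
{
    public string OrderId { get; set; }
    public string Description { get; set; }
}

class GatewayODataClientSketch
{
    static void Main()
    {
        // Assumed Gateway OData service root; replace with the real service document URL.
        var serviceRoot = new Uri("https://gateway.example.corp/sap/opu/odata/sap/ZPURCHASE_SRV/");
        var context = new DataServiceContext(serviceRoot);
        context.Credentials = CredentialCache.DefaultNetworkCredentials;  // SSO handling differs per setup

        // Query the (assumed) PurchaseOrders entity set; translated to an OData $top query.
        var orders = context.CreateQuery<PurchaseOrder>("PurchaseOrders").Take(10).ToList();
        foreach (var order in orders)
        {
            Console.WriteLine("{0}: {1}", order.OrderId, order.Description);
        }
    }
}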
The first foreseen scenarios are Microsoft Office Add-Ins, to expose and integrate SAP business data in the Microsoft Office clients that are used every day. For example: SAP CRM customer data in the form of Outlook contacts, invoice approval requests as Outlook tasks, functional management of SAP master data through Excel, BW report data rendered in PowerPoint, and submitting SAP CATS time tracking directly from your Outlook Calendar...
Like Duet Enterprise for SharePoint, GWPAM provides support for the typical and recurring plumbing aspects of SAP/Microsoft integration: connectivity, Single Sign-On, end-to-end monitoring, .NET development tooling, and integration with SAP Solution Manager. GWPAM offers a complete SAP / Microsoft integration package.
As with Duet Enterprise, the two suppliers have put their collective strength and market presence behind this new product offering. This is also a major distinction compared to the various proprietary connectors of smaller parties.
As SAP / Microsoft interoperability expert, I am enthusiastic about the addition of GWPAM to the SAP / Microsoft integration spectrum. GWPAM enables building a new category of functional scenarios for end customers: now also for organizations that do not have SharePoint in their application landscape, but do have Microsoft Office installed on the desktops, and want to utilize that familiar employee environment for intuitive operation of SAP data and business processes.

Saturday, September 28, 2013

Avoid corrupted site columns due to feature re-activation

One of our clients reported an issue that provisioned site columns get corrupted after uninstalling the provisioning feature. The feature uninstallation is a step within the repair of an erroneous situation in which a sandbox solution was by accident also deployed as a global farm-based solution. This incorrect deployment requires a fix, because one effect is that features now appear as duplicate entries in the list of site (collection) features: one installed via the sandbox solution (correct, administrated in the site collection's content database), and one installed via the farm solution (incorrect, administrated in the farm configuration database).
At first thought the simple fix is to deactivate the provisioning feature that originates from the farm deployment, remove the feature, then retract the global farm solution and remove it from the farm solution store; and next activate the feature deployed via the sandbox solution, to arrive at the correct deployment situation. However, in a test execution we experienced that this approach gives an error upon re-provisioning of the site columns: The local device name is already in use.
Error details in ULS log:
Unable to locate the xml-definition for FieldName with FieldId '<GUID>', exception: Microsoft.SharePoint.SPException: Catastrophic failure (Exception from HRESULT: 0x8000FFFF (E_UNEXPECTED)) ---> System.Runtime.InteropServices.COMException (0x8000FFFF): Catastrophic failure (Exception from HRESULT: 0x8000FFFF (E_UNEXPECTED)) at Microsoft.SharePoint.Library.SPRequestInternalClass.GetGlobalContentTypeXml(String bstrUrl, Int32 type, UInt32 lcid, Object varIdBytes) at Microsoft.SharePoint.Library.SPRequest.GetGlobalContentTypeXml(String bstrUrl, Int32 type, UInt32 lcid, Object varIdBytes)
To come to a solution, I started with a root-cause analysis. Why are the provisioned site columns not completely deleted from the site upon feature deactivation? The explanation is that the provisioning feature also performs a content type binding to the Pages library, and that in our test case a page was created based on that content type. This effectively results in the content type being kept 'in use' by the Pages library. On feature deactivation the site columns can still be removed at site collection level, but the content type cannot, due to its descendant bound to the Pages library.
The real problem however lies in the 'removed' site columns. They are deleted at site collection level, but due to the preserved content type (Pages library) their definition has remained in the site collection's content database, with the same ID as at site collection level (note this is different for a content type bound to a list, which gets a new ID derived from the ID of the source/parent content type at site collection level). SharePoint administrates per provisioned artifact whether its origin is a feature, and if so effectively couples the artifact to that feature. SharePoint does not allow these artifacts to be automatically deleted or modified by another feature. As a result the feature re-activation halts with an error when trying to (re)provision the site columns that are still present deep down in the content database, coupled to the Pages library.
With this SharePoint-internal insight, I was able to come up with a fault-proof approach to fix the 'duplicate features' issue. The trick is to initially leave the feature definition that originated from the erroneous farm solution in the configuration database, deploy the sandbox solution (which stores the feature definition in the site collection's content database), and activate the feature with the same feature id from that sandbox solution. Now the feature activation proceeds completely without errors, and restores the site columns. Ultimately the farm-based solution can then be retracted from the farm solution store.
Note: I came to this insight by inspecting at SQL level. We all know it is not allowed to perform changes at SharePoint content database level (or you lose your Microsoft support), but it is perfectly 'SharePoint-legal' to review and monitor at SharePoint content database level.
Used/useful SQL statements:
SELECT * FROM (SELECT *, CONVERT(varchar(512), CONVERT(varbinary(512), ContentTypeId), 2) AS [Key] FROM [ContentTypeUsage]) AS T WHERE [Key] LIKE '%<contenttypeid>%'

SELECT tp_Title FROM (SELECT AllLists.tp_Title, CONVERT(varchar(512), CONVERT(varbinary(512), ContentTypeUsage.ContentTypeId), 2) AS [Key] FROM [ContentTypeUsage] JOIN AllLists ON ContentTypeUsage.ListId = AllLists.tp_ID) AS T WHERE [Key] LIKE '%<contenttypeid>%'

Saturday, August 31, 2013

Applying SharePoint FAST for unlocking SAP data

An important new mantra is search-driven applications. In fact, "search" is the new way of navigating through your information. In many organizations an important part of the business data is stored in SAP business suites. A frequent need is to navigate through the business data stored in SAP via a user-friendly and intuitive application context. For many organizations (78% according to Microsoft numbers), SharePoint is the basis for the integrated employee environment. Starting with SharePoint 2010, the FAST Enterprise Search Platform (FAST ESP) is part of the SharePoint platform. All analyst firms assess FAST ESP as a leader in their scorecards for Enterprise Search technology. For organizations that have SAP and Microsoft SharePoint administrations in their infrastructure, the FAST search engine provides opportunities that one should not miss.

SharePoint Search

Search is one of the supporting pillars in SharePoint. And an extremely important one, for realizing the SharePoint proposition of an information hub plus collaboration workplace. It is essential that information you put into SharePoint is easy to find again. By yourself of course, but especially by your colleagues. However, in the context of a 'central information hub', more is needed. You must also be able to find and review, via the SharePoint workplace, data that is administrated outside SharePoint. Examples are business data stored in Line-of-Business systems [SAP, Oracle, Microsoft Dynamics], but also data stored on network shares.
With the acquisition of FAST ESP, the search power of Microsoft's SharePoint platform increased sharply. All analyst firms consider FAST, along with competitors Autonomy and Google Search Appliance, as 'best in class' for enterprise search technology. For example, Gartner positioned FAST as leader in the Magic Quadrant for Enterprise Search, just above Autonomy. In SharePoint 2010 context FAST is introduced as a standalone extension to the Enterprise Edition, parallel to SharePoint Enterprise Search. In SharePoint 2013, Microsoft has simplified the architecture: FAST and Enterprise Search are merged, and FAST is integrated into the standard Enterprise edition and license.

SharePoint FAST Search architecture

The logical SharePoint FAST search architecture provides two main responsibilities:
  1. Build the search index: in bulk and automated, index all data and information that you want to be able to search later. Depending on the environmental context, the data sources include SharePoint itself, administrative systems (SAP, Oracle, custom), file shares, ...
  2. Execute search queries against the accumulated index, and expose the search results to the user.
In the indexation step, SharePoint FAST must thus retrieve the data from each of the linked systems. FAST Search supports this via the connector framework. There are standard connectors for (web)service invocation and for database queries. And it is supported to custom-build a .NET connector for other ways of unlocking an external system, and then plug this connector into the search indexation pipeline. Examples are connecting to SAP via RFC, or 'quick-and-dirty' integration access into an internally built system.
In this context of searching (or better: finding) in SAP data, SharePoint FAST supports the indexation process via Business Connectivity Services, to connect to the SAP business system from the SharePoint environment and retrieve the business data. What still needs to be arranged is the runtime interoperability with the SAP landscape: authentication, authorization and monitoring. An option is to build these typical plumbing aspects into a custom .NET connector. But this is not an easy matter. And more significantly, it is something that end-user organizations nowadays no longer aim to do themselves, due to the development and maintenance costs involved.
An alternative is to apply Duet Enterprise for the plumbing aspects listed. Combined with SharePoint FAST, Duet Enterprise plays a role in 2 ways:
(1) First upon content indexing, for the connectivity to the SAP system to retrieve the data. The SAP data is then available within the SharePoint environment (stored in the FAST index files). Search query execution subsequently happens without a runtime link into SAP.
(2) Optionally you go from the SharePoint application back into SAP, if the use case requires that more detail is exposed per SAP entity selected from the search result. An example is a situation where it is absolutely necessary to show the actual status: for a product in the warehouse, how many orders have been placed?

Security trimmed: Applying the SAP permissions on the data

Duet Enterprise retrieves data under the SAP account of the individual SharePoint user. This ensures that also from the SharePoint application you can only view those SAP data entities to which you have rights according to the SAP authorization model. The retrieval of detail data is thus only allowed if you are allowed to see that data in the SAP system itself.
Due to the FAST architecture, matters are different for search query execution. As mentioned, the SAP data has then already been brought into the SharePoint context; no runtime link into the SAP system is necessary to execute the query. The consequence is that in this context Duet Enterprise is not applied by default.
In many cases this is fine (for instance in the customer example described below); in other cases it is absolutely mandatory to also respect the specific SAP permissions at the moment of query execution. The FAST search architecture supports this by enabling you to augment the indexed SAP data with the SAP authorizations as metadata.
To do this, you extend the scope of the FAST indexing process with the retrieval of the SAP permissions per data entity. This meta information is used to compile ACL lists per data entity. FAST query execution processes this ACL meta-information, and checks for each item in the search result whether it is allowed to be exposed to this SharePoint [SAP] user.
This approach of assembling the ACL information is a static snapshot of the SAP authorizations at the time of executing the FAST indexing process. In case the SAP authorizations are dynamic, this is not sufficient.
For such a situation it is required that, at the time of FAST query execution, the SAP authorizations that apply at that moment can be retrieved dynamically. The FAST framework offers an option to achieve this. It does require custom code, but that code is then plugged into the standard FAST processing pipeline.
SharePoint FAST combined with Duet Enterprise thus provides standard support and multiple options for implementing SAP security trimming. And in typical cases the standard support is sufficient.

Applied in customer situation

The above is not only theory; we actually applied it in practice. The context was opening up SAP Enterprise Learning functionality for operation by employees from their familiar SharePoint-based intranet. One of the use cases is that the employee searches the course catalog for a suitable training. This is a striking example of a search-driven application. You want a classified list of available courses, zoom in to relevant trainings through refinement, see per applied classification and refinement how many trainings are available, and of course you also always want the ability to freely search the full texts of the courses.
In the solution direction we make the SAP data available for FAST indexation via Duet Enterprise. Duet Enterprise here takes care of the connectivity, Single Sign-On, and the feed into SharePoint BCS. From there FAST takes over: indexation of the exposed SAP data is done via the standard FAST index pipeline, and searching and displaying the found results via the standard FAST query execution and display functionalities.
In this application context, specific user authorization per SAP course element does not apply: every employee is allowed to find and review all training data. As a result we could suffice with the standard application of FAST and Duet Enterprise, without the need for additional customization.

Conclusion

Microsoft SharePoint Enterprise Search and FAST are both very powerful tools to make SAP business data (and other Line-of-Business administrations) accessible. The rich feature set of FAST ESP thereby makes it possible to offer your employees an intuitive search-driven user experience on top of the SAP data.

Thursday, August 22, 2013

PowerShell to list all site collections in the farm with a feature activated from 'Farm' definition scope

In SharePoint 2010, a Feature can be installed from a farm solution or from a sandboxed solution. In case of a farm solution, the Feature is installed at farm level and, depending on the feature scope, is visible (unless hidden) for all web applications, site collections or webs to activate (via GUI, PowerShell and yes, even stsadm). In case of a sandboxed solution, only possible for scope = Site or Web, the feature is only visible within the site collection, and can only be activated in the site collections to which the solution is added and activated.
Today we encountered a situation in which a sandboxed solution was by accident also deployed on the farm. The result was that features within the solution were installed and visible twice in each site collection to which it was also added as a sandboxed solution. The remedy is to retract the accidental farm deployment. But we must take into account that features from the SharePoint solution may have been activated from the farm-deployed version.
PowerShell enables us to easily determine in which site collections in the farm the feature is activated from the farm-deployed solution:
$snapin = Get-PSSnapin | Where-Object {$_.Name -eq 'Microsoft.SharePoint.Powershell'}
if ($snapin -eq $null) {
    Add-PSSnapin "Microsoft.SharePoint.PowerShell"
}

$featureId = ".......-....-....-....-............";
$contentWebAppServices = (Get-SPFarm).services | ? {$_.typename -eq "Microsoft SharePoint Foundation Web Application"}
foreach($webApp in $contentWebAppServices.WebApplications) {
  foreach($site in $webapp.Sites) {
    Get-SPFeature -Site $site| where{$_.Id -eq $featureId}|%{
    if ($($site.QueryFeatures($featureId)).FeatureDefinitionScope -match "Farm") {
        Write-Host $_.DisplayName " is Activated from Farm deployment on site : " $site.url
      }
    }
  }
}

Sunday, August 18, 2013

Evaluating SharePoint Forum products

At one of the organizations I consult, there is a business demand for forum functionality in their external facing websites. The organization has selected SharePoint as target architecture for web applications, including public websites. The SharePoint platform itself contains the DiscussionBoard as a (kind of) forum functionality, but this is not qualified for usage on external facing websites. Among the criticisms is the look & feel, which is very 'SharePoint-like' and not what end users expect and are typically familiar with on public websites. The DiscussionBoard also lacks forum functionalities such as moderation, sticky posts, avatars, locking a post, rich text editing, tagging, rating, and vote-as-answer. And an important restriction for usability on public websites: a SharePoint DiscussionBoard in practice requires authenticated users (OK, you can allow anonymous access, but as posting topics and answers occurs via SharePoint forms, you would then have to give anonymous users access to the layouts folder; not a wise decision from a security perspective).
One option would be to custom develop forum functionality that satisfies the extended business requirements. But because we regard forums as commodity functionality, this is not something we want to develop and maintain ourselves. Instead I therefore did a market analysis and evaluation of available SharePoint forum products. It appears to be a very small market, with only 5 products found:
  • TOZIT SharePoint Discussion Forum
  • Bamboo Solutions Discussion Board Plus for SharePoint
  • KWizCom Discussion Board feature
  • LightningTools Storm Forums
  • LightningTools Social Squared
In each product evaluation I addressed the following aspects:
  1. Product positioning by supplier (wide-scale internet usage, scalability?)
  2. Installation of the product
  3. Effect of the installation on the SharePoint farm (assemblies, features, application pages, databases, …)
  4. Product documentation: installation manual, user / usage manual
  5. Functional Management: actions involved how-to provision a new forum
  6. Functional Usage: using a forum, in the roles of moderator and forum user
  7. User Experience and internet-ready
  8. Branding capabilities (CSS, clean HTML)
  9. Product support
For this customer's requirement set, Social Squared turned out to be the best fit. It has a rich forum feature set, on the same level as what you see in other established (non-SharePoint) internet forums. Another strong point is that the forum administration is done in its own SQL database(s), outside the SharePoint content database. This makes it easy to share a forum between multiple brands of the customer, each with their public presence in their own host-named site collection. And LightningTools is a well-known software product supplier, which gives the customer (IT) confidence that the product will be adequately and timely supported. As a demonstration of this: during my evaluation I already had extensive contact with their product support, who answered questions and issues that I encountered, and also gave me insight into the Social Squared roadmap.

Tuesday, July 2, 2013

How-to: securely enable anonymous access for a SharePoint external facing website

SharePoint can be used to provision public facing websites, accessible to the anonymous audience. Naturally it is then required that the contents (publishing pages, images, stylesheets) of the SharePoint website are anonymously accessible. In SharePoint 2007 the only option was to completely open up the SharePoint site collection, including all of its contents. Starting with SharePoint 2010 you have more fine-grained options. You can still open up the entire SharePoint context, which is the easy approach. Or you can explicitly determine which content in the SharePoint site collection is anonymously accessible.

Enable Anonymous Access on site-collection level

As said, this is the easy, and as a result also the most commonly applied, approach. Basically you have to set one switch (OK, it takes a bit more; you can find the steps outlined here), and you're good to go.
What you must realize is that with this approach, basically all your content is anonymously accessible. Not only the publishing pages, but all SharePoint lists and libraries in the site collection are open for data retrieval from outside your domain. The data can be retrieved via the "old" Lists.asmx service (e.g. using CodePlex SPServices.js within a jQuery context, or by invoking Lists.asmx from a .NET console application), or via ListData.svc; a minimal sketch of such an anonymous retrieval follows below. Whether this is a concrete problem depends: if the lists and libraries do not contain any sensitive data, what's the harm? But if they do, then you do not want that data to be retrievable outside your control.
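The sketch uses a placeholder site URL and list name; note that no credentials are supplied anywhere:

using System;
using System.IO;
using System.Net;

class AnonymousListDataRetrieval
{
    static void Main()
    {
        // Placeholder site and list; any anonymously accessible list can be read this way.
        string url = "http://www.example-publicsite.com/_vti_bin/ListData.svc/Announcements";

        HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
        request.Accept = "application/json";   // ask for JSON instead of the default ATOM feed

        using (var response = (HttpWebResponse)request.GetResponse())
        using (var reader = new StreamReader(response.GetResponseStream()))
        {
            // Dumps the raw list data of the anonymously opened site.
            Console.WriteLine(reader.ReadToEnd());
        }
    }
}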
Noteworthy: This is the applied approach when you host your public facing website in SharePoint Online / Office 365.

Enable Anonymous Access explicitly only for the required SharePoint artefacts

As can be seen from the picture, you also have the option to enable anonymous access at 'Lists and Libraries' level. If you select that option, by default all your SharePoint content is still prohibited for anonymous access. You have to explicitly enable per List / Library whether its contents must be anonymously accessible. In practice this means that for public facing websites, you must enable Anonymous Access on the Pages library in each Publishing Site in the site collection. But also on the Style Library, so that .css and .js resources are accessible. If you display images stored in a PictureLibrary on website pages, then you must also enable that library. The same goes for documents. In case you need to display on a page data from a List that is deliberately not enabled, you still can. But you will have to wrap the data retrieval in an elevated-privileges context, which requires custom code (a sketch follows below).
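A minimal sketch of such an elevated retrieval, assuming a hypothetical 'NewsItems' list that is deliberately not enabled for anonymous access:

using System.Collections.Generic;
using Microsoft.SharePoint;

public static class NewsItemReader
{
    // Server-side code (e.g. in a webpart): read items from a list that is not
    // anonymously accessible, by running the retrieval under elevated privileges.
    public static List<string> GetNewsItemTitles(string siteUrl)
    {
        var titles = new List<string>();

        SPSecurity.RunWithElevatedPrivileges(delegate()
        {
            // New SPSite/SPWeb objects created inside the delegate run with elevated rights.
            using (SPSite site = new SPSite(siteUrl))
            using (SPWeb web = site.OpenWeb())
            {
                SPList list = web.Lists["NewsItems"];   // hypothetical list name
                foreach (SPListItem item in list.GetItems(new SPQuery()))
                {
                    titles.Add(item.Title);
                }
            }
        });

        return titles;
    }
}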
So this approach implies much more work than simply enabling anonymous access on 'Total website'. And you also have an extra maintenance responsibility: when the website is extended with another Publishing Site, you have to [remember to] explicitly enable the Pages library in that site for Anonymous Access as well.
Sounds like a drag? Well, the [very] good side is that you are in explicit control of which content you allow for anonymous access. And the default mode is Not Accessible, which is a security best practice.

Lock _vti_bin usage

There is a third option: go the easy route of enabling Anonymous Access for the total website, and avoid uncontrolled data retrieval via the SharePoint web services by disabling the usage of '_vti_bin' at DNS level. This approach however has some drawbacks. You cannot use SharePoint Designer anymore to open the site. And you can no longer use the SharePoint REST services in anonymous context. In the current era of rich HTML5 apps, connecting the front-end to your data and logic via REST services is becoming the de-facto reference architecture. So this poses a real limitation, which in my book should not be taken too lightly.

Monday, June 24, 2013

Inconvenient Import-SPWeb with Sandboxed Solutions

In our current project we deliver an initial SharePoint site collection as a white-label site. The aim is to utilize this initial site as a template for concrete label websites. The challenge is how to duplicate the white-label site collection into a concrete label site. Backup-SPSite plus Restore-SPSite is an option, but requires that you restore each copy into its own content database. The reason is that Restore-SPSite preserves the SiteId of the original / backed-up site, and this conflicts with the white-label site (and any other restored copy) in the same content database.
A second option is to duplicate the site collection structure via the combination of Export-SPWeb plus Import-SPWeb. The approach is to export the root web plus all its recursive sub webs, and import them into a freshly created empty site collection.
When doing this, the import job halted with an error.
Inspecting the import log revealed that it concerned an SPList based on a ListTemplate provisioned via a feature in a sandboxed solution. This results in a sequencing issue: the list is to be created in the copy site before the sandboxed solution has been added and activated within that copy site.
The solution is to upload and activate the sandboxed solution within the fresh copy site collection before running Import-SPWeb.
Note that by default you cannot download solutions from the solution gallery of the white-label site collection. But you can fix that by adding a view to the solution gallery that includes 'Name' as a view column.

Friday, June 21, 2013

SharePoint - best all-rounder for the employee workplace

This posting is a translation from earlier Dutch publication on The Next Thought blog.
In this era in which our (work) life is more and more influenced by the capability to access enterprise knowledge and functionalities, optimal support in the execution of our work is of essential importance. We need to focus on the aspects that really matter and not be bothered with the administrative burdens that business processes and IT can bring. We all know it. Handling the daily and the ad hoc tasks often takes too much time. Our work productivity is killed by the diversity in how to access the various sources of information, and by the diversity in how to perform and complete our tasks.
This is a challenge that portal technology promised to resolve: one shell around your processes and applications that improves access and uniformity in all its facets. For a decade we have heard that portal technology makes this possible, but reality appears more stubborn.
Granted, it is also a difficult problem to solve. On the one hand, once a portal technology is ultimately chosen, the organization must conform to the utilization of that technology and how it is deployed to the employees. And that [governance] is no small thing.

Sufficient functionality?

Another issue is whether the selected portal technology is rich enough to support the various aspects of the integrated employee workplace: information sharing, locating information and people, work collaboration, process and task management, integration with other systems, information dashboards, and more. A multitude of functional requirements, supplemented with the Enterprise Architecture requirement that it must be simple to align with all the different systems already included in the enterprise architecture.
In the last decade SharePoint has established a prominent position in the market of portal technology. Gartner consistently grades SharePoint as [the] leader in the field of Horizontal Portals.
And rightly so. The integration of SharePoint with Microsoft Office products, plus the broad functionality and integration capabilities that SharePoint provides, makes it a very good all-rounder. A platform that provides virtually every need of the information worker.
As of release 2010, the SharePoint platform, complemented by its ecosystem, includes adequate support for all recurring aspects of the integrated employee workplace. The key word here is 'sufficient': SharePoint certainly does not provide the most advanced and complete solution for every single aspect, but for most scenarios the platform offers sufficient support.
An example is the field of Business Process Management: SharePoint workflow is not a world-class BPM suite when we compare it with SAP or Tibco BPM; nor does it intend to be. But through SharePoint workflow, possibly supplemented with Nintex Workflow or K2, it is possible to achieve a "lightweight BPM" solution. For the average desired application this is often a much more cost-effective solution, one that can be deployed immediately from the SharePoint platform already available within [a] company.
The same applies to the 'social' components of SharePoint. The platform itself out-of-the-box delivers the minimally expected 'social' support: microblogging / status updates, follow, like, tagging, etc. Whenever the 'social' needs of an organization go beyond these, there are additional products in the SharePoint ecosystem (e.g. NewsGator) to go a step further on the 'social' front.
And that applies to all SharePoint features (document handling, archiving, BI dashboards, enterprise search, collaboration): the product itself includes directly deployable enterprise functionality. And as a portal platform it is functionally expandable through standard products available on the market.

Monday, June 17, 2013

Architecture of a cloud-enabled SAP/SharePoint document viewer

At the end of 2012, IntelliDocx LLC approached us to design the global architecture for their new DocSet.ECM for SAP and SharePoint product suite, and to build DocSet.ECM for Plant Maintenance (PM) as its first member. The DocSet.ECM functionality enables end users to search, navigate and access documents linked to SAP entities, directly within SharePoint-based workplaces. The SAP entities can come from any of the known SAP Business Suites: PM, Sales, HCM, SRM. IntelliDocx had an ambitious deadline, as they aimed to release the PM solution to the public and market analysts at the European SharePoint Conference in Copenhagen (February 2013).
The base functional requirement for the product suite is that it renders, in SharePoint, a view on documents linked in SAP. But to make it a winning solution, there are more requirements. The products must be very intuitive and user-friendly, enabling arbitrary end users to utilize them without requiring inside SAP knowledge. Product deployment must be fit for both on-premise and cloud: more and more organizations nowadays have decided on, or are looking into, SharePoint Online utilization instead of maintaining their own on-premise SharePoint farm. And of course, the product should support both SharePoint 2010 and 2013, or at least be prepared for 2013 ahead of time.

System Architecture

The diagram below illustrates the architecture we proposed to address the requirement set. (And which we indeed applied to successfully deliver the DocSet.ECM for PM product to market, within the tight timeframe [News release]).

Essentials

DocSet Viewer
  • Deployed via Sandboxed Solution
  • UserControl instead of aspx page
    • allows adding the DocSet viewer to regular SharePoint site and content pages / intuitive integration within the customer's SharePoint-based portals
  • Server-side only renders the html
    • No server postback handling
    • No interactive dependency on the SharePoint server (may be cloud-hosted)
  • Client-side handles communication with the DocSet webservice, and dynamic UI-rendering
    • By-pass SharePoint server
    • Use jQuery:
      • Invoke DocSet JSON webservice
      • Render tree-view of hierarchical document folders
      • Drag-and-drop of documents (for Move + Add)
  • Configuration:
    • Endpoint of DocSet webservice (may be cross-domain)
DocSet webservice
  • WCF JSON webservice (a minimal contract sketch follows below this list)
  • Enable cross-domain invocation via JSONP
    • CORS is not yet widely supported, in particular not within the enterprise domain
  • Utilize ERP-Link RFC Connector for SAP / Microsoft.NET interoperability with DocSet.ECM RFCs
    • The architecture allows usage of other SAP / Microsoft connectors, e.g. Duet Enterprise, but we decided to initially use the ERP-Link connector as IntelliDocx told us a lot of their customers already use the Gimmal ERP-Link Suite.
  • Configuration:
    • Connection settings to DocSet RFC
    • Address of the SharePoint server hosting the DocSet viewer, to allow cross-site requests from this possibly remote / other-domain server
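The sketch below is illustrative only: the operation name and data shape are assumptions for the example, not the actual DocSet webservice interface.

using System.Collections.Generic;
using System.ServiceModel;
using System.ServiceModel.Web;

// Illustrative WCF JSON service contract.
[ServiceContract]
public interface IDocSetService
{
    [OperationContract]
    [WebGet(ResponseFormat = WebMessageFormat.Json,
            UriTemplate = "documents?sapEntity={sapEntity}")]
    List<DocumentLink> GetLinkedDocuments(string sapEntity);
}

public class DocumentLink
{
    public string Title { get; set; }
    public string Url { get; set; }
}

In WCF 4, cross-domain JSONP access can then be allowed by setting crossDomainScriptAccessEnabled="true" on the webHttpBinding in the service configuration.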

DocSetViewer App - SharePoint 2010 + 2013

The DocSet.ECM for SAP and SharePoint product suite is fit to deploy and run within both SharePoint 2010 and 2013 without any code change; hosting is supported on-premise and within the Office 365 cloud.
  
Related info: IntelliDocx

Sunday, May 26, 2013

Tip: Use Productivity Tool to setup code structure for Open XML generation

A customer requirement is the ability to manage business data offline within a downloaded Excel sheet, and upload it for processing into the business administration. A strong demand is that the input and actions in the Excel sheet must be both prescriptive and restrictive for the user. Microsoft Excel supports this via capabilities such as input cell validation, cell formats, protected sheets and so on. We provided our customer with an example Excel 2010 sheet as functional specification. After we reached agreement on the Excel sheet behaviour, the next step is to realize the runtime Excel sheet generation, and bind it to the user-selected data retrieved from the business administration.
The Open XML SDK can be used for server-based Excel sheet generation. A problem however is that usage of the Open XML API / language is not very well documented. Setting up the generation of a simple Excel sheet is not a problem [as there are sufficient code examples online]. But when it comes to including capabilities like data validation, the situation changes. The Open XML generation is very fragile, and you easily end up with an incorrect Open XML structure. Due to the badly documented Open XML API, fixing the programmatic generation is a frustrating and time-consuming task. A minimal hand-written example of a simple sheet is sketched below; the fragility starts once data validation and protection are added on top of it.
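The sketch produces a bare workbook with one empty worksheet via the Open XML SDK; the file name and sheet name are arbitrary choices for the example:

using DocumentFormat.OpenXml;
using DocumentFormat.OpenXml.Packaging;
using DocumentFormat.OpenXml.Spreadsheet;

class MinimalWorkbook
{
    static void Main()
    {
        using (SpreadsheetDocument document =
            SpreadsheetDocument.Create("Generated.xlsx", SpreadsheetDocumentType.Workbook))
        {
            WorkbookPart workbookPart = document.AddWorkbookPart();
            workbookPart.Workbook = new Workbook();

            WorksheetPart worksheetPart = workbookPart.AddNewPart<WorksheetPart>();
            worksheetPart.Worksheet = new Worksheet(new SheetData());

            // Register the worksheet in the workbook.
            Sheets sheets = workbookPart.Workbook.AppendChild(new Sheets());
            sheets.Append(new Sheet
            {
                Id = workbookPart.GetIdOfPart(worksheetPart),
                SheetId = 1U,
                Name = "Data"
            });

            workbookPart.Workbook.Save();
        }
    }
}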
In case of a more complex Excel sheet, a better approach is to utilize the Open XML SDK Productivity Tool. Just open the Excel sheet in this tool, and then export the generated Open XML code for the imported sheet. Mind you, typically you will want to refactor the generated code to improve its maintainability. Also, in our example we bind it to the data retrieved from the business administration.

Saturday, March 30, 2013

No silver bullet for multilingual SharePoint websites

In 2007 I wrote an advice on how to deliver a multilingual user experience in a SharePoint 2007 based public facing website. Although the SharePoint Variations concept promised to fulfill this need, our experiences turned out differently. As a result my advice was that we were better off just making a localized copy of the website per language.
This month I repeated the analysis on multilingual solution directions for SharePoint websites; now for SharePoint 2010 (and looking forward to SharePoint 2013). Sadly, the findings have not really improved compared to the 2007 support.

Aspects of multilingual in SharePoint sites

The information architecture of SharePoint based websites differs considerably from plain ASP.NET or PHP websites. SharePoint artifacts are typically administrated within the SharePoint content database, instead of as physical files on the webserver. The SharePoint platform is also very rich in out-of-the-box capabilities and components that are used to construct a website. And for public facing websites it enables content managers to logically extend the websites with new content and composition pages.
In a SharePoint website the text displayed can come from any of the following places:
  1. Contained within physical .aspx and .ascx files in the layouts folder; typically via the ASP.NET resources mechanism
  2. Contained within code of SharePoint controls: standard and custom
  3. SharePoint content database
On a deeper level, the following multilingual aspects can be distinguished in SharePoint public facing websites:
  1. Multilingual of the SharePoint standard UI (MUI: Multilingual User Interface)
    • menus
    • list and libraries column names
  2. Multilingual of custom developed SharePoint components
  3. Multilingual of the SharePoint content
    • Publishing pages content
    • Static html in Masterpages, PageLayouts and SitePages
    • List and libraries items
    • Localized images
  4. Multilingual of SharePoint navigation: Global, Current, Mega-menu
  5. Multilingual of SharePoint search-discovery items: website title, friendly-urls

Solution directions

Manual website-copy per language

This is the most basic and also most flexible approach. But it also implies that all multilingual aspects must be handled manually and repeated per language website copy: at the moment of the initial website provisioning, and later for each change to its information architecture and content. The latter also holds a risk: the site content manager forgets to propagate a change on the source site to one or more of the language copy-sites.

Site-copy per language via SharePoint Variations

Variations is an out-of-the-box capability of the SharePoint platform since the 2007 version. The concept is that you declare one site in your site collection as the Variation root, and create + declare sites for the other languages as so-called Label sites. Each change to site structure and publishing pages in the Variation root is propagated as draft to all Label sites. Per Label site, the content manager must next translate the propagated page(s), and publish them. Visitors do not see the content as propagated from the root site until it is published.
The most important limitation of Variations is that it does not cover all the multilingual aspects. It only applies to publishing page content. SharePoint lists and libraries are not in scope, and still need to be handled manually.
Another major limitation is that Variations cannot be applied as an afterthought to an already existing SharePoint site. It must be applied at the start of provisioning the information architecture of the source SharePoint site.
In SharePoint 2007, the Variations functionality suffered from multiple issues that in general gave it a bad press, and there are very few organizations that actually (dare to) apply it. In SharePoint 2010 the robustness of the Variations behaviour should be somewhat improved, but there are still only few known practitioners in the real world. Also noteworthy is that the STSADM VariationsFixUpTool is reported by Microsoft itself to still be needed for fixing corruption in the Variations Relationships list.

Multilingual in-place in the SharePoint site

The idea here is that every localizable artefact (text + images) in the SharePoint site is inherently multilingual itself, and is rendered server-side in the language of choice of the website visitor. A major benefit of such an approach is that you avoid the need to keep individual language site-copies consistent with each other. The most important limitation however is that it alone is not sufficient. You cannot apply this concept to out-of-the-box SharePoint menus, webparts and application pages, nor without modification in SharePoint Web Content Management (Publishing Pages).

Multilingual via SharePoint Language Store

Basically a similar concept to 'Resources', and also somewhat comparable to the 'in-place' approach. The essence is that all text parts are administrated in a Language Store that includes per text part its translations into the multiple languages. The Language Store can be downloaded from CodePlex, however only for SharePoint 2007; it has not been upgraded to SharePoint 2010. This approach has the same limitations as the 'in-place' approach: it does not work for SharePoint out-of-the-box UI controls, menus and application pages, nor does it combine with SharePoint publishing.

Multilingual in SharePoint site via off-the-shelf products

Well, this in itself is clear proof that multilingual is a complex matter in a SharePoint context… The only 2 products found are from smaller and lesser-known companies (Oceanik and IceFire). A short evaluation of both propositions already demonstrates multiple shortcomings and functional-robustness issues; examples: mixed-language content on a page, and translation support not applied to standard SharePoint aspects (e.g. Reusable Content).

Conclusions

  1. SharePoint multilingual is a complex matter.
  2. Multilingual always implies extra work for the SharePoint content manager → automatic text translation is as yet too immature to rely on for a public facing website
  3. There is no 100% automatic solution for supporting all the SharePoint multilingual aspects

Saturday, March 9, 2013

DataForm and DataView WebParts cannot display data from another site when placed on a SitePage

The DataFormWebPart is a powerful webpart to query and render data from SharePoint content sources on pages in the website. In our scenario we intend to utilize it to conditionally display reusable content on pages. For the reusable aspect we use the standard SharePoint 'ReusableContent' capability. For the conditional part, we aimed to utilize the DataFormWebPart, and via ParameterBinding + Query direct it to retrieve a specific 'ReusableContent' list item based on a querystring parameter value.
The 'ReusableContent' list is located in the root web of the site collection. A mature website typically consists of a hierarchy of subsites within the site collection. To display 'ReusableContent' in a subsite through the DFWP, it is necessary to set its WebUrl parameter to '{sitecollectionroot}'.
To check the correct behaviour (reusable + conditional) I used SharePoint Designer to create a page in the website, added the preconfigured DFWP to it, and validated the effect in the browser. It turned out that it works (reusable + conditional) when the page is created in the root web, but for subsites the DFWP gives an error when opening the page in the browser (ULS: List Not Found). Strangely, in SPD the DFWP on a page in a subsite can connect to the list in the root web, and you do not experience any problem. I was rather puzzled by this, the more so because I followed the exact directions of posts that instructed this is the way to enable the DFWP on pages in subsites. Just when I was about to submit a support call to Microsoft, I decided as a last attempt to create a Publishing Page (thus in the browser; not possible via SPD), and add the preconfigured DFWP. And guess what: then it works!
As our scenario concerns external facing and thus publishing websites, this is sufficient. Just be aware that you cannot use the DFWP on a SitePage to connect to an SPList in another site.

Thursday, March 7, 2013

Duet Enterprise 2.0 Authentication flow

One of the changes in Duet Enterprise 2.0 is the Single Sign-On handling. Duet Enterprise 1.0 applies SAML to seamlessly authenticate and authorize the SharePoint-authenticated user within SAP NetWeaver Gateway. For Duet Enterprise 2.0 this is no longer possible, as a result of the internal usage of OData for the SAP - SharePoint data exchanges. The Duet Enterprise 2.0 SSO approach is X.509 based. On the Microsoft site you can find information that outlines the essential details of the authentication flow:

Saturday, February 23, 2013

Duet Enterprise 2.0 Online / Extends reach of SAP data into Microsoft Cloud

SharePoint Online

Most significant in SharePoint 2013 is that it is 'built for the cloud [1] [2]'. The motivation for Microsoft to focus on cloud-enablement is market demand: a considerable subset of its SharePoint customer base struggles with hosting and operating SharePoint themselves on premise. And because of this struggle, a lot of SharePoint-using organizations are reluctant to upgrade their SharePoint deployment. Via the cloud / on-demand offering, Microsoft aims to reach 2 goals: relieve organizations from the burden of SharePoint operations, and enable those same organizations to stay on par with the feature progress in the SharePoint and Office platform. Microsoft plans an upgrade release cycle of every 90 days for SharePoint 2013 Online.

Duet Enterprise 2.0 Online

The SharePoint 2013 cloud offering also has positive implications for the Duet Enterprise deployment options. In addition to the existing on-premise installation option, Duet Enterprise 2.0 offers a cloud / on-demand alternative: extend the reach of your SAP data into the Microsoft cloud. In such an infrastructure architecture, the SAP landscape is on premise, but Microsoft SharePoint 2013 is serviced in the cloud.
The Duet Enterprise hybrid infrastructure setup imposes the following requirements on the local and remote + shared environments:
  • Each customer organization must [still] deploy the Duet Enterprise 2.0 SAP Add-On in own on premise SAP landscape; installed on the SAP NetWeaver Gateway 2.0 system
  • SharePoint 2013 Online must include the Duet Enterprise 2.0 SharePoint Add-On
  • The connectivity between on premise SAP Add-On and cloud hosted SharePoint Add-On requires a ‘light’ local SharePoint 2013 node. This node functions as intermediate layer ('broker') from internal SAP Gateway to SharePoint online and vice versa.
  • Mutual trust-relationship between the on premise SAP Add-On and the on premise or cloud hosted SharePoint Add-On is based on X.509 certificates; no longer on SAML. The X.509 certificates are provisioned by the Duet Enterprise SSO Generator on the ('light') SharePoint 2013 node on premise.

Availability

Duet Enterprise 2.0 is already Generally Available for on-premise deployments, but not yet for the cloud. It necessarily follows upon the public availability of SharePoint 2013 Online with the inclusion of the Duet Enterprise 2.0 SharePoint Add-On. Microsoft has not yet announced the release date, but a realistic expectation is 'somewhere' in Q1/Q2 2013. Then more details will also be communicated on how to configure and operate Duet Enterprise 2.0 in a potentially multi-tenant SharePoint Online context.

Sunday, January 20, 2013

My first thoughts on SharePoint 2013

With SharePoint 2013 reaching its General Availability (the expectation is somewhere in 2013-Q1), it is time to give it some first thoughts. I will not / cannot go deep into it [yet], as my knowledge is strictly based on hearsay. Next month I will visit the European SharePoint Conference in Copenhagen, and hope to take my understanding another level up.
From what I’ve read and heard so far about SharePoint 2013, I consider these the most significant differentiators wrt SharePoint 2010:
  1. For end-users: (Enterprise) Social-enablement
  2. For operations: Cloud-enablement
  3. For developers: New App model, favouring client-side development above server-side
For business: well, all of the above.
Although 'social' was already introduced in SharePoint 2010, it was just not good enough and could not live up to the competition (products like Facebook, Twitter, Yammer (…), NewsGator, SocialText). The social capabilities have now been re-designed and re-developed, and promise to be more on par, with the distinguishing emphasis on applying social within the enterprise. So you can like a sales quote from your colleague, micro-blog on a customer case, and follow colleagues that are involved in the same type of work activities as yourself (or as you desire to be…).
With the cloud, organizations can be relieved from SharePoint operations, an often cumbersome exercise to perform yourself as a business organization. It may also ease migration to SharePoint 2013, as it relieves the organization from buying, installing and maintaining its own infrastructure (hardware + software).
With the new App model, it will be easier to custom-build extensions to the SharePoint platform, relying less on in-depth knowledge of SharePoint internals. And due to the hosting model of SharePoint Apps, they will not be able to corrupt and destabilize the SharePoint server platform itself.

Matrix overview of what’s new in SharePoint 2013

Of course the 3 aspects discussed above form only a small portion of the total set of what is new or improved in SharePoint 2013 compared to SharePoint 2010. In the matrix below I included several more (though I'm sure it is still not the complete picture), again per functional, developer and operations perspective.