Channel: SharePoint Trenches

Error when accessing the Nintex Live Management page on a fresh Nintex Workflow 2013 installation [Tip]

Today I hit a strange issue with a fresh installation of Nintex Workflow 2013 with Nintex Live.
The Nintex Live solution was deployed by the installer. However, when I tried to access the Nintex Live Management page in Central Administration I received the error below.

The resource object with key 'LiveAdmin_Page_Management_Title' was not found.

If you have done everything correctly so far, the solution is very simple.
Open the SharePoint Management Shell as Administrator and run the Install-LiveService command, test whether the page is now available, and if needed perform an IISRESET on your CA server(s).
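
In short, the fix looks like this, run from an elevated SharePoint 2013 Management Shell:

Install-LiveService
# If the page is still not available, reset IIS on the Central Administration server(s)
iisreset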

Display and Fill related item fields in workflow task with Nintex for Office 365

A couple of weeks ago I posted an article called "Move away from the Workflow Tasks with Nintex for Office 365". The goal of this post was to show some key techniques that you can use if you want to automate a process without using tasks and do all the work in the form so users can see what they are approving.
However, if you want or need to stick with the tasks, there is a way to display and fill fields from the related item in the task form.
To do this you will need to edit the task template from the workflow designer. This feature was introduced with the July 2015 release of Nintex Forms for Office 365.
To illustrate how this works I am going to use the familiar simple Vacation Request form. See how it looks in the form designer below.

Vacation Request

You can see a field with the label "Manager Comment". This field is visible only when the form is not in "New Mode" and it should contain some information from the manager who will approve the request.
In Nintex Workflow for Office 365 there is no Request Data action, but we can include the field that requires information on task completion in the task form.
If you open an Assign a Task (or Start a Task Process) action configuration page you will see "Edit Task Form" in the action ribbon. Clicking it takes you to the familiar Nintex Forms designer interface.


In the picture above I am editing the default Nintex task content type. Once you open the form designer, the initial form includes both task columns and related item columns. Initially all related item columns are disabled, but you can change that for any column you want and make it editable in the task, and you can also connect a control from the task form to a related item column. With some rearrangement, you can see how the Vacation Request approval task looks below.


There is something you should consider if you have many tasks with modified forms. This is something I hit with a customer and later reproduced. This is not yet confirmed by Nintex.
When you edit a task form, the task will be created with a Nintex content type (ending with a GUID), even if you have specified your own custom content type. This is not an issue, because the Nintex content type will have the same columns as the original. However, you will have to edit all Task Action forms in the workflow regardless of what content type they are using. If, for example, two tasks have the same initial custom content type, they will be created under the same Nintex content type after you edit both forms.
This is still not a big deal. However, every time you edit a task form two hidden variables are created, and you can easily end up hitting the variable count limitation in Workflow Manager, which is 50 in SharePoint Online. See the error below when you try to publish a workflow that has more than 50 variables.

Error publishing workflow. Workflow XAML failed validation due to the following errors: Activity 'DynamicActivity' has 52 arguments, which exceeds the maximum number of arguments per activity (50). HTTP headers received from the server - ActivityId: 9e1fc3bc-5a7c-4821-9605-d595acea851d. NodeId: . Scope: . Client ActivityId : 29ca419d-d068-2000-213e-aae9dfcc2677. The remote server returned an error: (400) Bad Request.



This is not an issue with Nintex; this is just the way things work in the background.
In my humble opinion Microsoft should rethink this limitation in SharePoint Online!

I hope that this was helpful!

Copy List Views in SharePoint and SharePoint Online with PowerShell

In the last couple of weeks I have been working with a customer that mainly uses SharePoint Server as a DMS (Document Management System). I had to move a large number of documents from one library to another due to corruption in some of the files, caused by excessive use of unique permissions (~ 32 000).
In this post I will not talk about why you should limit the usage of unique permissions, especially in big libraries; it's a long story.
As part of the work I had to copy many list views to the new library. I am not a fan of the "Save as Template" approach, so my solution was to use a PowerShell script and copy the views programmatically.
The script did its job and I thought it would be nice to have something like this for SharePoint Online.
I was unable to find any ready script that can do this, so I wrote one.
Both scripts do basically the same thing: get the source and destination webs, get the source and destination lists, get the source and destination view collections, and create a new view in the destination using properties from the source.
It was a bit tricky to load what I needed without making the script extremely slow with CSOM, since we cannot use the simple lambda expression syntax in PowerShell. My solution was to use a function written by the SharePoint automation superstar Gary Lapointe. You can check it out in his article for ItUnity, where he explains how to work around the lambda expression limitation in PowerShell. I highly recommend reading the entire series dedicated to using PowerShell against SharePoint Online.
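
To give an idea of the approach, below is a minimal sketch of the on-premises (server object model) version; the URLs and list titles are placeholders and this is only the core idea, not the full published script:

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$sourceWeb = Get-SPWeb "http://sp/sites/source"
$destWeb   = Get-SPWeb "http://sp/sites/destination"
$sourceList = $sourceWeb.Lists["Documents"]
$destList   = $destWeb.Lists["Documents New"]

foreach ($view in $sourceList.Views) {
    # Recreate each source view in the destination with the same fields, query and settings
    $viewFields = New-Object System.Collections.Specialized.StringCollection
    $view.ViewFields | ForEach-Object { [void]$viewFields.Add($_) }
    $newView = $destList.Views.Add($view.Title, $viewFields, $view.Query, `
        $view.RowLimit, $view.Paged, $view.DefaultView)
    $newView.Update()
}

$sourceWeb.Dispose()
$destWeb.Dispose()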
You can download both scripts (on premises and online) below. Please test, rate and use the Q&A section in the Gallery!


 Download On Premises script from: TechNet Gallery

Download SharePoint Online script from: TechNet Gallery

Get Application Pool Identity credentials [Tip]

In this short tip I am going to post a PowerShell one-liner from my list of extremely useful one-liners. It can get the credentials of the IIS application pool identities.
I use it mainly in two scenarios:
  1. Imagine that you are doing remote work for a customer where you have only temporary access and credentials. Many times the account that is provided is a Farm Admin and has local admin permissions on the SharePoint boxes, but it does not have permission to use PowerShell against SharePoint (no Shell Admin). You can use this short PowerShell one-liner to get the Farm account (STS runs under it); many times it is left in the local admin group, so you can log in with it and do what you need to do. Not a best practice, but it is a massive time saver.
  2. Imagine that you are working on an issue where you need to restart the User Profile Synchronization Service instance and you need the Farm account password. You can get it with this one-liner.

In order to use it you will need local admin permissions. I can confirm that it works on IIS 7.5, 8.0 and 8.5.

Get-WmiObject -Namespace "root\MicrosoftIISV2" -Class "IIsApplicationPoolSetting" | Select WAMUserName, WAMUserPass

Get IIS Application pool credentials

Get a quick report of the SharePoint Databases with PowerShell [Tip]

Here comes another useful PowerShell one-liner I often use.
It will give you a quick overview of the SharePoint databases with properties like: Name, Server (Alias), TypeName, Web application name, Web application URL, Site collection count and Size.
The size is actually the amount of disk space required for an uncompressed backup. It might look like a script, but it is actually one long, simple one-liner. You can see it below; I have used grave-accent (`) escape characters to fit it better in the blog. You can see it on one line here. Instead of piping to Format-Table you can generate a CSV by piping to Export-CSV and later work with it in Excel.

Get-SPDatabase | Select-Object Name, @{Expression={$_.Server};Label="Server"}, TypeName, `
    @{Expression={$_.WebApplication.Name};Label="WebApplication"}, `
    @{Expression={$_.WebApplication.Url};Label="WebApplicationUrl"}, `
    @{Expression={($_.WebApplication.Sites | Measure).Count};Label="SC Count"}, `
    @{Expression={$_.DiskSizeRequired/1024/1024};Label="Size in MB"} | Format-Table -AutoSize


Get SharePoint database report

Pause a Nintex workflow for less than 5 minutes in SharePoint Server and Office 365

Last week I spent almost 2 days fixing a complex Nintex/SharePoint 2013 issue with one of our customers. The customer was not very big in terms of headcount, but they were using Nintex Workflow to automate all sorts of processes (one of the biggest I have worked with).
There were Workflow timer jobs getting stuck or failing, workflows exiting with errors for no obvious reason, and more. The issue was resolved with a couple of fixes, and at least the workflows were executed when they should be and ended as they were designed to.
Some of the issues and improvement points I flagged were: not properly scaled Nintex deployment, incorrect service topology, outdated product versions and poor workflow design.
Now, the fourth point (poor workflow design) was partially dictated by the inadequate scale of the deployment. They were using a lot of Pause and Commit pending changes actions. Many of the workflows were designed to have a two minute pause after the first couple of actions.
As you may know, the Pause action pauses the workflow instance for the defined period of time, but the workflow will not resume immediately; it will be resumed when the "Workflow" timer job is executed. The default schedule of this job is every 5 minutes. This means that you cannot pause a workflow for less than 5 minutes, or pause it for exactly the time you have set. You can change the schedule of the Workflow timer job to work around the first limitation, but this can put additional load on your system.
This is why I demonstrated an alternative to the Pause action that does not pause the workflow instance, but just waits a certain amount of time before continuing the execution. I have not seen this approach in other sources, which is why I decided to share and explain it in this post.
There is another alternative to pause a workflow for less than 5 minutes. It is described in this article.
As you can see, this alternative requires the "NTX PowerShell Action". This is great, but the action is open source, it is deployed with a farm solution, and although it was developed and published by a Nintex employee, this add-on is not backed and supported by Nintex. The PowerShell action is fantastic, but in my opinion it is not worth deploying it just to use it as a Pause alternative. Also, you cannot use it in Office 365 (SharePoint Online).
The PowerShell example works by executing PowerShell code that just waits a certain amount of time and then continues the execution. Obviously, to pause a workflow we need to do some sort of waiting. There is no out of the box action that just waits; as we know, the Pause action does not do anything but idles the instance execution at a certain point and waits for the timer job to resume it after the time has elapsed. With the PowerShell example we use the PowerShell (.NET) framework to wait without doing anything for a certain time. The same thing can be achieved by executing a T-SQL statement, and in Nintex Workflow, both on-premises and in Office 365, we have the "Execute SQL" action.
If we want to put a wait in our SQL query for two minutes we can use the code below:

WAITFOR DELAY '00:02'

I am not a SQL guy and was surprised to find out that this statement works outside of the SQL Management tools.
Below is the designer look of my demo workflow in SharePoint 2013. This should also work for SharePoint/Nintex 2010.

WorkflowDesigner

As you can see, it is pretty simple, just a PoC. Below is the configuration of the "Execute SQL" action.
Execute SQL Action Configuration

I am using a connection string that uses "SQL" as the server; this is an alias to the SQL instance that hosts my SharePoint. I am using SSPI security with Windows credentials that are actually my farm account, saved as a global constant.
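
For reference, a connection string along those lines might look like the one below (the server alias and database are placeholders):

Data Source=SQL;Initial Catalog=master;Integrated Security=SSPI;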
Below are the details from the execution. You can see that the Execute SQL action took exactly 2 minutes to complete.

Workflow Execution Detail
Unfortunately, you cannot use this approach on-premises for pauses longer than 5 minutes without doing a loop and, in this loop, executing multiple delays that are each less than 5 minutes. If you set a delay of more than 5 minutes, the workflow will fail with the error "Error performing database operation. Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding." even if you set the connection timeout in the connection string to be more than the 5 minute delay. I will do some more tests/research and might report this as a bug.

As described in the MSDN documentation of the WAITFOR statement, it should work against Azure SQL Database. In Nintex Workflow for Office 365 we also have the Execute SQL action. I actually tested this and noticed two things: the connection timeout you specify in the connection string will be set to 365 if the number is bigger than that, and if you set a delay longer than 4 minutes you will get some unexpected HTTP errors during the execution; the workflow manager will do a couple of retries and then it will fail. I think both are issues with Workflow Manager in SharePoint Online.
This is not so important because the Pause action in SharePoint 2013 workflows (Workflow Manager) does not depend on SharePoint timer jobs, and you will not get the same pause issues as in the on-premises 2010 framework, but it is still an option. See an example configuration of the action below.

Execute SQL Office 365


My final words are that this might be extremely useful if you need to put some short pauses (not more than 5 min.) in your on-premises workflows. I hope you find this useful!

Disable SharePoint Event Firing in PowerShell process

Last week I worked with a customer that is using SharePoint 2010 as part of their enterprise DMS solution. Only a small part of the users access the SharePoint sites directly; most of them access and add documents using a 3rd party Outlook integration product and in-house legacy LOB system integrations with SharePoint. If you have dealt with such DMS solutions (which is not uncommon in some industries) you will know that over time you might end up with complex folder structures created based on the document metadata. You might also end up with large numbers of empty folders.
The empty folders were an issue for my customer, and they reached out to me with a request to write a PowerShell cleanup script that will run on a schedule and delete the empty folders.
The catch is that there is an ItemDeleting event receiver that prevents the deletion of any item, including folders. You can see a similar event receiver in action on my dev machine. It also stops the operation when I call Recycle() on an item in PowerShell.


If you are a developer, you most probably know how to disable event firing inside an event receiver in order to prevent the firing of other event receivers. This is done by setting the value of the SPItemEventReceiver.EventFiringEnabled property. We can do the same thing in PowerShell with the code below and prevent any events from being fired.

$assembly = [Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint");
$type = $assembly.GetType("Microsoft.SharePoint.SPEventManager");
$prop = $type.GetProperty([string]"EventFiringDisabled",`
[System.Reflection.BindingFlags] `
([System.Reflection.BindingFlags]::NonPublic -bor [System.Reflection.BindingFlags]::Static)); 
#SET EVENT FIRING DISABLED.
$prop.SetValue($null, $true, $null); 
 
<#
DO WHAT YOU NEED TO DO
#>


#SET EVENT FIRING ENABLED.
$prop.SetValue($null, $false, $null); 

This code will disable event firing in the current PowerShell thread, and I am able to delete/recycle any item without executing the event handler. I have tested this with SharePoint 2010 and 2013; I haven't tested it on SharePoint 2016, but I assume that it will work there too.
This is a very powerful technique; use it carefully and on your own responsibility. I hope that this was helpful!

Get All Items in a Large (5000+) List with CSOM in PowerShell

Last week I had to write a script that needed to take all items in a large SharePoint Online list.
By large I mean above 5000 items. This means that the list is above the list view threshold in SharePoint Online, which is 5000, and we cannot change that. The way to get all items in SharePoint Online is to use a CAML query. However, if it is just an empty query without any filtering it will fail; if you use an unindexed column for filtering or ordering, the query will fail; and if you filter/order by an indexed column and the query returns more than 5000 items, it will fail again. The error in this and other scenarios is similar to the one below.

Exception calling "ExecuteQuery" with "0" argument(s): "The attempted operation is prohibited because it exceeds the list view threshold enforced by the administrator."

The way to work around this is pagination of the view. This means that we set a row limit on the result that the query can return, which should be less than or equal to 5000. Once we get the first 5000 items we can do another query for the next 5000, starting from the position where the first result (page) ends. This is the same as what we do in the UI when scrolling forward in the list view. Below is an example PowerShell snippet that will take all items from a list using a 5000 item page size, ordering the items by ID.
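
## Assumes $ctx is an already authenticated Microsoft.SharePoint.Client.ClientContext
## and $DocLibName holds the title of the target list/library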

$list = $ctx.Web.Lists.GetByTitle($DocLibName)
$ctx.Load($list)
$ctx.ExecuteQuery()
## View XML
$qCommand = @"
<View Scope="RecursiveAll">
    <Query>
        <OrderBy><FieldRef Name='ID' Ascending='TRUE'/></OrderBy>
    </Query>
    <RowLimit Paged="TRUE">5000</RowLimit>
</View>
"@

## Page Position
$position = $null

## All Items
$allItems = @()
Do{
    $camlQuery = New-Object Microsoft.SharePoint.Client.CamlQuery
    $camlQuery.ListItemCollectionPosition = $position
    $camlQuery.ViewXml = $qCommand
## Executing the query
    $currentCollection = $list.GetItems($camlQuery)
    $ctx.Load($currentCollection)
    $ctx.ExecuteQuery()

## Getting the position of the previous page
    $position = $currentCollection.ListItemCollectionPosition

# Adding current collection to the allItems collection
    $allItems += $currentCollection
}
# the position of the last page will be Null
Until($position -eq $null) 

A few words about the query: I am using RecursiveAll because I used it against a library and I wanted to get all items in all folders; the size of the page is 5000, just on the edge of the threshold; and I am ordering the result by ID because this column is always indexed.
I am using a Do-Until loop to get all pages, setting the position to the position of the last item collection that was retrieved.
This is a really powerful and quick way to work around the annoying 5000 item list view threshold. I hope you find it useful!

Useful file handling commands in the SharePoint PnP PowerShell module

Last week I got a request from one of our customers to help them move some of the files from one SharePoint library to another in a different web. It sounds like an easy and quick task, but the catch was that the library had ~ 12000 documents and 1500 folders, and the customer also wanted to keep the Created, Created By, Modified and Modified By column values. The number of items that had to be moved was ~ 3500. Needless to say, with such a number of items the Explorer view does not work, the new OneDrive client does not support sync from SharePoint libraries yet, and I still had to figure out how to effectively copy the metadata. The way to accomplish this is with PowerShell or with a 3rd party migration tool. Since the customer had only these requirements, and not migration of the version history for example, my weapon of choice was PowerShell.
In this post I want to share a couple of SharePoint PnP PowerShell cmdlets that greatly helped me in writing my migration script. Bluesource is also a contributor to the PnP project, thanks to my colleague Pieter Veenstra.
The first thing I want to share is a new feature that came with the June 2016 release: the option to map a SharePoint site as a PSDrive. This is done at the beginning, using Connect-SPOnline with the CreateDrive parameter. You will then get a PSDrive called SPO and a PSProvider also called SPO. See how it looks below.
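
For example, the connection might look like this (the site URL is a placeholder):

Connect-SPOnline -Url "https://tenant.sharepoint.com/sites/dms" -CreateDrive
cd SPO:
Get-ChildItem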

As mentioned above, you will connect to the site and you will be looking at the root web. The sub webs will be shown as folders. You can do many standard things in this PSDrive, like listing items, copying, moving and more. One thing I was unable to do is list the items in lists with 5000+ items; it seems that the view threshold limitation kicks in. The other thing is copying items from the SPO drive to the local file system; this is because you cannot copy items from one drive to another if the PSProviders are different. Also, it would have been nice if this SPO drive were persistent and you could access it in Windows Explorer, but this is not available and I am not sure if it is possible.
I did not use this in my script, but I think that it is nice to have, and you can learn more about this and other improvements in the June 2016 Community Call.

Get-SPOFile - This is a very useful command that will help you download a file by supplying the server relative URL to it. This was an easy task after I retrieved all items and their FileLeafRef and FileDirRef fields using the technique from my previous post.

Ensure-SPOFolder - With this command you can get a folder by giving a web relative URL to the folder. If the folder does not exist it will be created, and if the folder is nested in other non-existing folders, they will be created as well.

Add-SPOFile - With this command you can upload a file by supplying the web relative URL of the folder; the file will be uploaded with the same name. If the folder does not exist, this command will create it before uploading the file. Really nice command!

I hope that this was helpful!

Display related item repeating section in Nintex Workflow task form

Last week I worked with a customer that had repeating sections in a Nintex Forms 2013 item form and a Nintex Workflow 2013 workflow associated with the list. The customer had the requirement to be able to properly display the repeating section data in the workflow task forms. This requirement does not seem straightforward to accomplish, but in this post I am going to demonstrate that it is actually very simple, and since I haven't found this in other sources I am sharing my solution and other useful links.
The issue with the repeating section is that it lives as a "section" only in the form. You can connect the entire repeating section to a field of type "Multiple lines of text" and you will see that our repeating section value is actually saved as XML.

Nintex Repeating Section

The item actually looks like this:

Nintex Repeating Section  Form

The first nice thing, which is not directly related to the title of this post, is to make the XML data look better in a list view. To accomplish this I am going to use the CSR (client-side rendering) approach demonstrated in the post "Displaying Repeating Section as table in List View - the CSR approach". Adapting and applying the script to my list view gives me the result below, which is way better than the XML.

Repeating Section CSR


The way to make sense out of the repeating section in a workflow is by querying the XML from the field. I will not go in depth since there are many resources on the subject. One thing that can help you in this task is the article "Nintex Forms/Workflow - Parsing Repeating Section Data" by Vadim Tabakman.
Now to the reason for writing this post. If you have tasks in your workflow, it would not be unusual for you or your customers/users to want to see the related item properties right in the task form instead of clicking on links. If you leave the form as it is, the best you can get is to view the repeating section as XML. You can edit the task forms with Nintex Forms; for most of the task templates you will get a good starting point and all item property controls will be created. However, check out how these controls look in three common tasks. From left to right: Flexi Task, Request review and Request data.

Nintex Task Forms

As you can see, the data from the repeating section is displayed as XML, even in the "Request data" template where I have used a "List item" control to display the related item.
This was also the case with the customer; they had many "Request data" tasks and all of them were using a "List item" control to display the related item.
The solution to this is very simple: just create a new repeating section in the task form and recreate all child controls by replicating the data type and the Name of the controls. Then connect the repeating section control to the related item field that contains the XML from the related item. Check out how a Flexi task looks if you recreate the repeating section as described.

Flexi Task Form


The data from the repeating section in the related item is represented as a repeating section in the task form as well. Just make sure that the names of the controls are the same as in the original item and make the repeating section read-only in the task form, and you will be completely fine.
I tested the same approach in SharePoint 2016 and Office 365. However, something interesting is happening with the XML, as you can see in the screenshot below (the field is called Rep).

Task Form Office 365

The important thing is that the repeating section is visualized as expected. If I find out what is happening with the XML, I might blog about it.
I hope that this was helpful! 

Migrate SharePoint 2010 Term Store to SharePoint Online with PowerShell

Last week I worked with a customer on migrating one SharePoint 2010 site to a new SharePoint Online.
I can qualify the site as a knowledge base designed for optimal discoverability of the documents that are uploaded. To achieve good discoverability you need good metadata describing the resources. Many times the metadata that is used is actually managed metadata that needs to be migrated/recreated in SharePoint Online.
If you have 10 or 20 terms it will not be an issue to recreate them, but if you have 400 for example it will not be very practical to manually recreate all terms.
There are many PowerShell scripts out there to export/import terms, but the success rate and the complexity vary. This is why I would like to share how I did it; it worked out pretty well for me.
For this purpose we are going to use the custom cmdlets provided for free by Gary Lapointe.
For demonstration purposes I will export one term set group with one term set that has a limited number of terms. You can check it out below; it also has some parent/child terms.

SharePoint 2010 Term Store

In order to export the term set group you will need to deploy the WSP that will add the custom SharePoint Server 2010 commands. By doing so you will add the additional 2010 commands directly to the Microsoft.SharePoint.PowerShell snap-in.
To export a taxonomy object as XML we are going to use Export-SPTerms. You will need to supply a taxonomy object as an input parameter; this will be a taxonomy session if you want to export everything. For more examples, see the cmdlet help. You can see how the Legal term set group looks as XML below.

Input XML

As you can see, all essential information that is needed is exported, even some that will be an issue if you are importing the terms to a different environment or SharePoint Online. This is the Owner, or any attribute that represents an on-prem identity that you might have. The import command will also try to set these properties with the same values, and it will fail because the identity as it was exported cannot be found. The way to work around this is just to set a different value for Owner that is a valid online identity. Now it is up to you to decide whether you want to make this tradeoff and migrate the objects with a different Owner than the source. Below are two lines (split across three to fit better) that take the content of the exported XML and set a new Owner for each XML node where the Owner attribute is not empty; later the same XML object can be used as input for the import command.

[xml]$termXML = Get-Content "C:\Legal.xml"
($termXML.SelectNodes("//*")) | Where {$_.Owner -ne $null} | `
ForEach-Object {$_.SetAttribute("Owner", "i:0#.f|membership|admin@MOD******.onmicrosoft.com")}

To import the taxonomy objects in SPO you will need to download and install the SharePoint Online Custom Cmdlets.
This will actually install a new module called Lapointe.SharePointOnline.PowerShell.
The command that we are going to use for the import is Import-SPOTaxonomy. For the InputFile parameter we are going to use the variable from the lines above, after we have set all identity attributes. If you are importing an object that is not a top level term store you should specify ParentTermStore (you can get it with Get-SPOTermStore); if not, you should switch on the "Tenant" parameter. Before all that, you should connect to a site in your target tenant using Connect-SPOSite. Below are the lines to import the Legal term set group.

Connect-SPOSite -Url "https://mod******.sharepoint.com"
Import-SPOTaxonomy -InputFile $termXML -ParentTermStore (Get-SPOTermStore)

And this is it. Our Legal term set group is recreated and available in the entire tenant. One nice thing is that the GUIDs will be copied as well.

SharePoint Online Term Store

I hope that this was helpful, and big thanks to Gary Lapointe for writing these great tools! The same approach should work for SharePoint 2013, but I have not tested it.

Set Managed Metadata field value with PowerShell and CSOM

In the previous post I demonstrated an easy way to migrate managed metadata term store objects to SharePoint Online with PowerShell.
Now that you have migrated the terms, you might need to migrate some documents and set the metadata fields in SharePoint Online. In the same project I had to migrate around 600 documents to SPO including the metadata, which had 6 managed metadata fields, 4 of them multi-valued.
In this post I will share a PowerShell snippet that builds a TaxonomyFieldValueCollection and uses it as the value for a field of type Managed Metadata.
I am showing this method because I got mixed results when I used a simple string as the value. It is hard for me to explain why simply updating with a taxonomy string did not work in all cases.
For example, if the document was created in Office Web Apps I was unable to set the fields using a simple string. You can try using the string method and then cross-check whether everything is set, because if you feed only a metadata string (multi-valued) or just a GUID (single-valued) you might not get any error, but the field will be left blank.
The challenge for me in the "TaxonomyFieldValueCollection" approach was to create a TaxonomyField object instance, because I had to use the generic client context method CastTo, and PowerShell does not work well with generic methods. This is why I decided it is worth sharing this example. You can see the code below.
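
Below is a minimal sketch of the approach (not the exact snippet from this post). It assumes the CSOM assemblies, including Microsoft.SharePoint.Client.Taxonomy.dll, are already loaded, that $ctx, $list and $item are already initialized, and the field internal name, term labels and GUIDs are placeholders:

$field = $list.Fields.GetByInternalNameOrTitle("MMField")
$ctx.Load($field)
$ctx.ExecuteQuery()

# PowerShell cannot call the generic ClientRuntimeContext.CastTo<T>() directly,
# so close the generic method with reflection and invoke it
$castTo = $ctx.GetType().GetMethod("CastTo").MakeGenericMethod([Microsoft.SharePoint.Client.Taxonomy.TaxonomyField])
$taxField = $castTo.Invoke($ctx, @($field))

# Two terms in the "<int>;#<label>|<guid>" format, separated by ;# (-1 = term not yet in the Taxonomy Hidden List)
$termsString = "-1;#Contracts|11111111-1111-1111-1111-111111111111;#-1;#Policies|22222222-2222-2222-2222-222222222222"
$termValues = New-Object Microsoft.SharePoint.Client.Taxonomy.TaxonomyFieldValueCollection($ctx, $termsString, $taxField)

$taxField.SetFieldValueByValueCollection($item, $termValues)
$item.Update()
$ctx.ExecuteQuery()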

Now a couple of words about the string that is used. In the example above I am setting a multi-valued MM field with a collection of two terms. The string is in the format "<int>;#<label>|<guid>" with a ;# delimiter between the terms. The integer is the item id of the term in the Taxonomy Hidden List; if you are using the term for the first time or you do not know this id, you can use the default value "-1".
The label part speaks for itself, this is the label of the term, and the most important part is the GUID of the term. If something is wrong with the format of the string you will see the error message below.

"The given value for a taxonomy field was not formatted in the required <int>;#<label>|<guid> format."


This method works every time for all items. I hope that this was helpful!

Build slider bar graph date time search refiner with custom intervals

A couple of weeks ago I worked with a client that had this requirement for their search center in SharePoint Online. They had a repository with different research documents and these documents had a Publishing Date date/time field with values up to 30 years ago.
The client wanted to build a result page for these documents and have a slider bar refiner with custom intervals going up to 10 years ago.
If we have a numeric based managed property we can specify a custom refiner interval like the one below.
Unfortunately, the Custom option is missing for the date and time data type. We have predefined intervals that go up to one year ago, and "Defined in search schema", which I am not sure what is supposed to mean, but below is the error you will get if you select this option.

For this Display Template you must specify custom intervals for the values that will be shown. Please change the refinement settings to use custom intervals.

It really does not tell us much if you don't have an option to specify a custom interval in the UI.
Luckily, if you export the Refinement web part you can see more refiner settings. All selected refiners are represented as JSON, and below are the settings of our Publication Date refiner (formatted).



There are two settings that grab the attention and they are highlighted in the picture above. They are "useDefaultDateIntervals", which obviously controls whether the default intervals that cover only one year should be used, and "intervals", which should represent the custom intervals. After some research on the web I found that the intervals value should be an array of integers representing the intervals in days. I came up with these intervals for my client: Ten Years Ago, Five Years Ago, Three Years Ago, One Year Ago, Six Months Ago, Three Months Ago, One Month Ago, 7 Days Ago and Today. This is set with the following intervals value:

[-3650,-1825,-1095,-365,-180,-90,-30,-7,0]

The first step is to update the values for "useDefaultDateIntervals" and "intervals". Set "useDefaultDateIntervals" to false and for "intervals" use your interval array, as in the picture below.
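
In the exported refiner JSON, the two updated properties end up looking roughly like this (the rest of the refiner configuration stays as exported):

"useDefaultDateIntervals" : false,
"intervals" : [-3650,-1825,-1095,-365,-180,-90,-30,-7,0]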


Then you will need to import the web part and use it on your page. The result is below.


We have our custom intervals and they are working as we expect (at least for me). However, we can see one big issue: the intervals are not labeled appropriately. This should be fixed in the refiner display template.
As it is neither good practice nor practical in this case to edit the out of the box display template, I created a new display template based on the out of the box "Slider with bar graph".
In the new template I have specified values for the Label and the NextIntervalLabel of all "filter boundaries". In this example we are going to have 10 boundaries. NextIntervalLabel is used when you move the mouse over the bar, and Label is used for the boundary label in the slider. You can see the entire template below.

On line 104 we can see how to get all boundaries and their values for Label and NextIntervalLabel.
After deploying and setting the new display template we can see that the labels are much more accurate.


There is a small detail that should be updated: the start and end labels of the bar graph.
Unfortunately, my solution to that is to change the text by selecting the elements by class name, and this is not the most elegant solution if you have more than one slider bar refiner; in that case you will need to change the index number to get the correct elements. You can see the code below.

With this final touch, this is how our custom slider bar graph refiner looks.



It looks really cool and useful. If you check the refiner settings in the UI now, you will see that "Defined in search schema" is selected. I find this misleading since I have done nothing special in the search schema.
I hope that this was helpful!

Run PowerShell script on Windows 10 PC through the MDM Channel in Intune

In the last couple of weeks I've been working on an internal project that includes software distribution of Windows apps on MDM enrolled Windows 10 PCs using a cloud-only Intune deployment. Yes, that's right, no SharePoint in this post, but a real world EMS story.
The easiest way to publish classic Windows apps to a Windows 10 PC that is MDM enrolled is to publish a "Windows Installer through MDM (*.msi)" installer.
The important thing with this installer type is that the installation should go through without any user interaction required, especially when you use "Required Install".
The issue I hit is related to Dell software that is essential for remote work, and almost everyone in the company is using it. However, for some reason the software publisher (Dell Software) is considered not trusted in Windows 10. There is this great thing in Windows (since Win 7) called SmartScreen that pops up a question asking if we trust the software publisher before running an executable from an untrusted publisher. If we install the software manually we click Yes and everything is fine; the signing certificate is added to the Trusted Publishers certificate store.
However, when we are deploying the package over Intune this issue will cause the installation to fail with exit code 1603.
The way to fix this is to extract the signing certificate and install it in the Trusted Publishers store on the target computer, or to turn off SmartScreen with a policy, but the second option is not a good security practice. The issue is that we cannot deploy a certificate to the Trusted Publishers store using an Intune configuration policy.
My solution is to use a PowerShell script that will be deployed and executed over the MDM channel.
The issue is that Intune does not support direct script deployment. There are some articles on the net that demonstrate how to package a batch script in a self-extracting executable using IExpress.
However, we need to wrap the PowerShell script in an MSI package suitable for MDM deployment.
I think that this is a very useful technique and I will try to put all the pieces together in this post, so you can deploy and run any PowerShell script on Windows 10 MDM PCs.
The easiest (and free) way to do this is to create a self-extracting exe with IExpress, wrap the exe in an MSI and publish it to Intune.
In order to reliably wrap the PS script in an exe, I used a script that I found in the TechNet Gallery called Create-EXEFrom.ps1. It does a really good job wrapping the PS script, and you can also add additional files to the package; in my case I need the Dell Software certificate that should be installed on the target machine. Below is an example line for wrapping the MyApp.ps1 script (this is the name used in all sample code) including the certificate we need.

.\Create-EXEFrom.ps1 -PSScriptPath "C:\MyApp.ps1" -SupplementalFilePaths "C:\Certificate.cer"


The exe will be created in the same folder and will be called "MyApp.exe".
The tricky part is that our exe and MSI package should also be executed without any interaction required, including bypassing SmartScreen. This can be done by properly signing both packages with a valid code signing certificate. In my case I used the certificate that we normally use in bluesource for signing mobile apps. In order to sign the packages you can use signtool.
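
A typical signtool invocation looks something like the lines below; the certificate path, password and timestamp URL are placeholders:

signtool sign /f "C:\CodeSigningCert.pfx" /p <password> /t http://timestamp.digicert.com "C:\MyApp.exe"
signtool sign /f "C:\CodeSigningCert.pfx" /p <password> /t http://timestamp.digicert.com "C:\MyApp.msi"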
If the signing is successful you should see something like the screenshot below in your file properties, and you should be able to run the exe without SmartScreen alerts.
Publisher Certificate

Now that we have a signed exe, we should wrap it in an MSI package.
The easiest free way for me was to use the WiX Toolset. I put together a really simple WiX project with only one custom action that will execute the exe. Below you can see the sample XML with the cmd commands I used to compile the WiX project.
If you are new to WiX, you should have your WiX bin folder in the PATH variable to make the cmd script work as it is. In my case it is "C:\Program Files (x86)\WiX Toolset v3.10\bin" (I know it is not the newest version).
Note that the package will be installed under the SYSTEM account and you should consider that in your scripts. You can find out how to test the MSI package in the following article.
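
A minimal WiX 3.x sketch along those lines (GUIDs, names and version are placeholders; this is not the exact project from this post) and the cmd commands to compile it could look like this:

<?xml version="1.0" encoding="UTF-8"?>
<Wix xmlns="http://schemas.microsoft.com/wix/2006/wi">
  <Product Id="*" Name="MyApp" Language="1033" Version="1.0.0.0"
           Manufacturer="bluesource" UpgradeCode="PUT-UPGRADE-GUID-HERE">
    <Package InstallerVersion="200" Compressed="yes" InstallScope="perMachine" />
    <MediaTemplate EmbedCab="yes" />
    <Directory Id="TARGETDIR" Name="SourceDir">
      <Directory Id="ProgramFilesFolder">
        <Directory Id="INSTALLFOLDER" Name="MyApp">
          <Component Id="MyAppExe" Guid="PUT-COMPONENT-GUID-HERE">
            <File Id="MyAppExeFile" Source="MyApp.exe" KeyPath="yes" />
          </Component>
        </Directory>
      </Directory>
    </Directory>
    <Feature Id="ProductFeature" Title="MyApp" Level="1">
      <ComponentRef Id="MyAppExe" />
    </Feature>
    <!-- Run the wrapped exe during install; Execute="deferred" with Impersonate="no" runs it as SYSTEM -->
    <CustomAction Id="RunMyApp" FileKey="MyAppExeFile" ExeCommand=""
                  Execute="deferred" Impersonate="no" Return="check" />
    <InstallExecuteSequence>
      <Custom Action="RunMyApp" After="InstallFiles">NOT Installed</Custom>
    </InstallExecuteSequence>
  </Product>
</Wix>

candle.exe MyApp.wxs
light.exe MyApp.wixobj -out MyApp.msi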
The next step is to publish the MSI as the "Windows Installer through MDM (*.msi)" installer type, as shown below.
Intune Publish MSI

The last thing that's left is to deploy the newly published app, and maybe running some tests won't be a bad idea :).

I hope that this non-SharePoint post, written by a SharePoint guy, will be helpful!

Trust failed error when browsing the Central Administration

In this quick post I am going to share an issue that I recently hit with one SharePoint Server deployment. While browsing the Central Administration I got the error below when clicking on the Manage service applications page.


The key thing with this error is to know the background story, something I was missing.
The story is that this farm was migrated from one domain to another.
Everything was working fine and the new farm was in production when we started to get this error.
There was one small detail that we were not aware of: there was a domain trust between the new and the old domain during the migration. This is why everything was working fine until the network link between the old and the new domain was cut.
With this small detail the error below started to make sense. You will see this error in different .NET apps if the app is trying to do something with an identity from a trusted domain, but no domain controller from the trusted domain can be reached.

The trust relationship between the primary domain and the trusted domain failed.

By looking at the "Delegated Administrators" I concluded that there were accounts from the old domain that had permissions over some of the service applications. I was even unable to get the service applications using Get-SPServiceApplication in PowerShell. It seems that there is some identity checking when we access the service application management page, and it is failing because the trusted domain cannot be reached. The same exception can be reproduced if you try to translate a username from the trusted domain to a SID. The lines below are a good PowerShell test to check whether there is an issue with the trusted domain.

$userName = "DOMAIN\User"
$objUser = New-Object System.Security.Principal.NTAccount($userName)
$strSID = $objUser.Translate([System.Security.Principal.SecurityIdentifier])
$strSID.Value

Here are some of the things that might not work if you are in this situation:

- You will not be able to access the service management page
- You will not be able to enumerate the service applications in powershell
- In my case, Search and the UPA were the SAs with administrators from the trusted domain, and you will not be able to restart the service instances
- The UPA might stop working completely
- If you clear the configuration cache, the Timer Service will fall into a loop of rebuild attempts and crashes, and no timer jobs will be executed.

As you can see this situation might not be a good place to be :).
The solution is to restore the connection to the trusted domain, and I am talking about physical availability of a DC from the trusted domain, or just to remove the trust from the current domain. Sometimes the second solution might be the only possible one, or maybe this relationship was simply not removed by the domain admins when the connection was cut.
If you remove the domain trust the error will be fixed, and the translate method in PowerShell will fail with "Some or all identity references could not be translated.", which it seems is handled better. Then you will be able to do some proper cleanup, if you wish.
I hope that this post was helpful! 

Conditionally Show/Hide and Require SharePoint fields with JavaScript

In my last project I worked on a DMS system based on SharePoint Server 2016 and had a very common requirement for the document forms. The requirement was to have a Document Status field, plus Approval Reason and Rejected Reason fields. It is logical to want to hide Approval Reason when the document status is Rejected, or Rejected Reason when the document is Approved. The client also wanted to make the fields required when the corresponding document status is selected.
All customizations were going to be deployed the classic way, with a farm solution. I also had Nintex Forms, but I was unable to use it in this case due to other technical constraints.
This is why I decided to solve this requirement using jQuery and a JavaScript script that are deployed with the WSP solution and included in the master page.
The end result was pretty good, which is why I decided to share the script and a way to safely deploy it in SharePoint Online without the need to edit the master page or the list forms. The script is really simple and can be used/deployed as it is, or with minor changes, by a person with moderate JavaScript experience.
A few notes on the SPO environment that was used. There are 3 custom site columns with the following Title, InternalName and Type:
- Title: Document Status, InternalName: DocumentStatus, Choice, radio buttons
- Title: Approval Reason, InternalName: ApprovalReason, Multi-line text, Not required by definition
- Title: Rejected Reason, InternalName: RejectedReason, Multi-line text, Not required by definition

You can see the script below:
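
A rough sketch of such a script is shown below. This is not the exact script from the post; it assumes classic list forms, jQuery already loaded on the page, plain-text multi-line fields and the display names listed above:

function findFieldRow(displayName) {
    // Each field in a classic form sits in its own <tr>; the label lives in a <nobr> element
    return $("tr").filter(function () {
        return $(this).find("nobr:first").text().replace("*", "").trim() === displayName;
    });
}

function getDocumentStatusValue() {
    // Choice field rendered as radio buttons (New/Edit); fall back to the plain text in the Display form
    var row = findFieldRow("Document Status");
    var checked = row.find("input:checked").next("label").text().trim();
    return checked !== "" ? checked : row.find("td.ms-formbody").text().trim();
}

function showError(row, message) {
    row.find("span.customError").remove();
    row.find("td.ms-formbody").append("<span class='customError' style='color:red;display:block;'>" + message + "</span>");
}

function showOrHideFields() {
    var status = getDocumentStatusValue();
    findFieldRow("Approval Reason").toggle(status === "Approved");
    findFieldRow("Rejected Reason").toggle(status === "Rejected");
}

// Called by SharePoint before the form is saved; returning false blocks the save
function PreSaveAction() {
    var status = getDocumentStatusValue();
    if (status === "Approved" && $.trim(findFieldRow("Approval Reason").find("textarea").val()) === "") {
        showError(findFieldRow("Approval Reason"), "Approval Reason is required when the document is Approved.");
        return false;
    }
    if (status === "Rejected" && $.trim(findFieldRow("Rejected Reason").find("textarea").val()) === "") {
        showError(findFieldRow("Rejected Reason"), "Rejected Reason is required when the document is Rejected.");
        return false;
    }
    return true;
}

$(document).ready(showOrHideFields);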

There are two main functions: showOrHideFields, which is started when the page is loaded and shows or hides the Approval Reason or Rejected Reason field depending on the value of the Document Status field in the New, View and Edit forms; and a second function with the specific name PreSaveAction, which is launched when the Check In or Save button is clicked and will not allow the form to be saved if the Approval Reason or Rejected Reason field is empty, popping out an "error" message below the field, again depending on the Document Status field value.
Below you can see how the form looks in New, Edit and View mode.

New, Edit and View forms

In order to deploy the script without modifying the forms or the master page, we are going to use the SharePoint PnP PowerShell module for SharePoint Online. In order to "inject" the JavaScript links we are going to use the Add-PnPJavaScriptLink command. In the example below I am uploading the JS file, adding a ScriptLink to it, and adding a ScriptLink to the Google hosted jQuery library.



Add-PnPFile -Path "C:\Users\Ivan\Documents\ShowHide.js" -Folder "Style Library/scripts/" `
    -Checkout -Publish -CheckInComment ""

Add-PnPJavaScriptLink -Name JQuery `
    -Url "https://ajax.googleapis.com/ajax/libs/jquery/3.1.1/jquery.min.js"

Add-PnPJavaScriptLink -Name ShowHide `
    -Url "https://mod44444.sharepoint.com/Style%20Library/scripts/ShowHide.js"


Once the commands are executed, the scripts will be loaded in all classic pages in the web (the default scope for this command).
Unfortunately this really useful way of injecting JavaScript will not work with the modern pages and the new Library and List experience. More info on this huge gap can be found in this post.

I hope that this was helpful!

Troubleshoot PowerShell Add-Type Load Error [Tip]

In the last couple of days I have been working with a client that has a DMS solution based on SharePoint Server 2010. We inherited the solution, so it has its specifics. One of those little things (that make life exciting) is that they have fields with custom data types.
I had to write a PowerShell script that edits some document metadata. I had to update standard SharePoint native data type fields, but after updating the item in PowerShell I lost the values of the custom data type fields. I realized that I need the custom data types loaded in the PowerShell session. So I started to import some DLLs, in the order that I thought made sense, as we do not have the source code of the solutions. This was fine until I received the error below:

Add-Type : Unable to load one or more of the requested types. Retrieve the LoaderExceptions property for more information.

This error is completely generic and the only useful thing is that it tells us where we can find more useful information.
This simple error handling script turned out to be a life saver for me, as it showed exactly what the load error is and which dependent assembly I need to load first :)
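
A minimal sketch of that error handling pattern (the DLL path is a placeholder) looks like this:

try {
    Add-Type -Path "C:\CustomSolution\CustomFieldTypes.dll" -ErrorAction Stop
}
catch {
    Write-Host $_.Exception.Message -ForegroundColor Yellow
    # LoaderExceptions holds the real reason, usually a dependent assembly that has to be loaded first
    $_.Exception.LoaderExceptions | ForEach-Object { Write-Host $_.Message -ForegroundColor Cyan }
}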


The nice blue text is our LoaderExceptions property of the exception. I hope that you find it useful!
LoaderExceptions Powershell

Nintex Workflow UDA Usage report script

Yesterday I worked with a client that has many Nintex workflows published with heavy usage of UDAs (User Defined Actions). I wanted to get a detailed report on the UDA usage. Unfortunately, I am not aware of any out of the box Nintex tool that can do that. The Analyze button can give you some information, but you need to click on the workflow to find out where it is located, you need to be in the scope where the UDA is published, you can get information for only one UDA at a time, and the information is not really "exportable".
This is why I created a PowerShell script that will give you information on the UDA usage across the farm at all levels. It will give you useful information like UDA Name, Workflow Name, Defined At, List, Web, Site, WebApplication, WorkflowType, Author, UDA Version Used and Workflow Id.
There are two "modes" of the script: the default will give you just the GUIDs of the list, web, site and the web apps. If you want to get the name of the list and the URLs, you need to use the second mode, which requires more time to complete but will give you nice looking URLs instead of GUIDs. If you want to get the URLs, just use the switch parameter GetUrls. The result can be saved in CSV format or it can be output in PowerShell. If you give a value for CSVPath, the Grid View will open at the end to visualize the data. The main source of information is the Nintex Configuration database, and you can use the script with SQL authentication if you have an account with enough permissions and your SQL supports it.
I have tested the script with SharePoint 2016, 2013 and 2010, and the oldest Nintex Workflow version I tested was 2.3.7.0.
You can see the code and the output examples below. I hope you find it useful!

Output with URLs retrieved:
Nintex Workflow UDA Usage report with URLs

Quick output with GUIDs:
Nintex Workflow UDA Usage report with GUIDs

Build TreeView with XML data in PowerShell [Tip]

In one of the recent scripts I worked on, I had to visualize XML data in a tree view manner.
I achieved this by using the TreeView Windows Forms control and I think that the result is good and can be used as an example if you have to do something similar.
I tweaked the function and made it a standalone "XML Browser" script that visualizes the XML document; if you double click on an element, its Outer XML text is copied to the clipboard.

PowerShell XML Browser

In order to visualize the XML you need to supply the path to it and the starting element. In the screenshot above I have a web.config file loaded, and the starting element I want to visualize is "configuration". You can find the code and the example below. I hope you find it helpful!
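
Below is a minimal sketch of the idea, not the full "XML Browser" script; the file path and starting element are placeholders:

Add-Type -AssemblyName System.Windows.Forms

function Add-XmlNode {
    param($XmlNode, $TreeNodes)
    # Add the element as a tree node and keep its OuterXml for the double-click handler
    $treeNode = $TreeNodes.Add($XmlNode.Name)
    $treeNode.Tag = $XmlNode.OuterXml
    foreach ($child in $XmlNode.ChildNodes) {
        if ($child.NodeType -eq 'Element') { Add-XmlNode -XmlNode $child -TreeNodes $treeNode.Nodes }
    }
}

[xml]$doc = Get-Content "C:\inetpub\wwwroot\wss\VirtualDirectories\80\web.config"
$startElement = $doc.SelectSingleNode("//configuration")

$form = New-Object System.Windows.Forms.Form
$form.Text = "PowerShell XML Browser"
$tree = New-Object System.Windows.Forms.TreeView
$tree.Dock = [System.Windows.Forms.DockStyle]::Fill
# Double-clicking a node copies its Outer XML text to the clipboard
$tree.add_NodeMouseDoubleClick({ param($sender, $e) [System.Windows.Forms.Clipboard]::SetText($e.Node.Tag) })
$form.Controls.Add($tree)

Add-XmlNode -XmlNode $startElement -TreeNodes $tree.Nodes
[void]$form.ShowDialog()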


Fantastic 40 is back in SPFx form

Well, actually not, but I hope that the title of this post will make you want to check it out.
I would like to share a great project called "SPFx Fantastic 40 Web Parts". Except for the name, this project has almost nothing to do with the old "Fantastic 40 Application Templates for SharePoint (WSS & MOSS)", because it is a collection of 40 SPFx client side web parts. It is an open source project, so you can use it as you find useful. It was created by Olivier Carpentier, but the code is provided "as is" without support from Microsoft.
I am so excited because the web parts look great and they might be a solution to common needs that the out of the box modern web parts don't fulfill.
It is also a great learning resource and demonstration of what can be done with SPFx.
If it is not an issue for you to consume the web part assets from CDNs you do not control, you can download and deploy the *.sppkg as it is and it will work. I also tried most of the web parts on SharePoint Server 2016 with classic pages, and they work and look great.
The web parts are grouped into seven categories, and below you can find some samples.

Menu & Carousels & News Management: News Slider

The News Slider Web Part renders a simple news slider carousel to your page. You can manage your active news, manage the layout and easily render a cool slider to enhance your pages.


Social Tools: Tweets Feed

The Tweets Feed Web Part is a SharePoint client side web part built with the SharePoint Framework (SPFx). This web part generates a Twitter Feed on the page, based on the specified account. This Web Part uses the Twitter API to integrate the timeline.



Maps, Charts & Graphs: Bar Chart

The Bar Chart Web Part is a SharePoint client side web part built with the SharePoint Framework (SPFx). This web part inserts a bar chart in your pages, and you can manage the bar chart settings such as items, color, title, legends, etc. This web part uses the chart.js lib.


Images Galleries & Tools: Grid Gallery

The Grid Gallery Web Part renders a picture slideshow with a grid of thumbnails. This web part implements unitegallery.js (a popular jQuery script) as a client side web part for SharePoint.


Video & Audio: Media Player

The Media Player Web Part is a SharePoint client side web part built with the SharePoint Framework (SPFx). With it, you can insert into your pages an HTML5 compatible video or audio file, a YouTube video or a Vimeo video. This web part uses the Plyr.js lib.


Text Tools: Tabs

The Tabs Web Part is a SharePoint client side web part built with the SharePoint Framework (SPFx). This web part helps you create tabs (you can manage, add, delete, edit or move a tab dynamically), and the web part editor can easily modify the tab content thanks to an HTML editor (WYSIWYG). This tab control is responsive, so the layout will adapt its rendering to the screen size.



Tools: Stock Info

The Stock Info Web Part is a SharePoint client side web part built with the SharePoint Framework (SPFx). This web part generates a stock graph picture for a specified stock. This web part uses the Yahoo Financial service.

