Category Archives: K2

Unable to create new/edit existing Oracle Service Instance after changing K2 installation path

Recently I bumped into a problem which was super obvious in retrospect, yet took me some time to untangle. A K2 environment was upgraded from 4.6.11 to 4.7, and the K2 installation path was changed in the process (drive letter). After the upgrade completed without warnings or errors, we did some more testing and found that one of the forms using an Oracle Service Instance based SmartObject started to throw an error similar to this one:

Could not load file or assembly – SourceCode.SmartObjects.Services.Oracle.dll

Essentially, it was very clear from the error message that the Oracle service instance kept looking for the related assembly in the old installation location (wrong drive letter). We switched to the SmartObjects Services Tool only to see that there we were unable to edit or create a new service instance of this service type. At this point I looked at old cases mentioning a similar error message, and a surprisingly large amount of them proposed workarounds and things not quite related to the root cause. We spent some time addressing a missing prerequisite for this service type – 64-bit Oracle Data Access Components (ODAC) version 2.121.2.0 or higher, which is mentioned as such in the 4.7 user guide – checking some related settings, and so on.

But then I paid attention to the fact that the environment had two service types for Oracle: one of them was working, while the other one was not. I next dropped the assembly mentioned in the error message into the old installation location and restarted the K2 service – that fixed the first Oracle service instance, but broke the other one – it started to say that the assembly SourceCode.SmartObjects.Services.Oracle.dll had already been loaded from another location. This brought my focus back to the real problem – somehow one of the Oracle service types was not updated by K2 Setup Manager to use the new installation path. Probably it was “custom” in some way and was skipped by the installer because of that. Anyhow, my next step was finding where this path is defined. As soon as I confirmed that I could not see/edit the Service Type definition XML from the SmartObjects Services Tool, I switched to the K2 database to check it there.

A necessary word of warning: back up your K2 database before attempting any direct manipulations in it, and make sure you understand what you are doing before you start 🙂

Service type definitions live in the [SmartBroker].[ServiceType] table, so I located the “problematic” service type to check on its XML, which is stored in the ServiceTypeXML column. Here is a sample query to quickly search for a service type definition based on its Display Name:
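
Something along these lines should do (a minimal sketch only – the DisplayName column name and the exact data type of ServiceTypeXML are assumptions, so double-check them against your K2 version):

-- Find the "problematic" service type by display name and return its definition XML
SELECT  [DisplayName],
        [ServiceTypeXML]
FROM    [SmartBroker].[ServiceType]
WHERE   [DisplayName] LIKE '%Oracle%';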

That will return the XML column value, which you can click to view as formatted XML; here is an example of how it looks:

Service Type XML

As you can see, the service type definition contains an assembly path parameter in its XML. So now it is only a question of updating it with the correct value. Here is a sample script to do that:
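
A rough sketch of such an update (it assumes ServiceTypeXML is stored as the xml data type and that a plain path replacement is sufficient – the old/new paths and the WHERE filter below are placeholders to adjust for your environment):

-- Swap the old installation path for the new one inside the service type definition XML
UPDATE  [SmartBroker].[ServiceType]
SET     [ServiceTypeXML] = CAST(
            REPLACE(
                CAST([ServiceTypeXML] AS nvarchar(max)),
                'D:\Program Files (x86)\K2 blackpearl',   -- old path (placeholder)
                'C:\Program Files (x86)\K2 blackpearl'    -- new path (placeholder)
            ) AS xml)
WHERE   [DisplayName] LIKE '%Oracle%';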

That will iron out the problem with the misbehaving service type. I don’t think this can be a very frequent problem, as normally the installer updates all the assembly path definitions with the new path. But, especially if you have some custom service type, you may want to scan your service type definitions for any vestiges of the old installation path. Here is a sample script which will display all service type definitions that contain a reference to the old drive letter (my example uses “D:\%” as the search criterion):
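
A sketch of that scan (same column-name assumptions as above; the XML is simply searched as plain text):

-- List service type definitions whose XML still references the old drive letter
SELECT  [DisplayName]
FROM    [SmartBroker].[ServiceType]
WHERE   CAST([ServiceTypeXML] AS nvarchar(max)) LIKE '%D:\%';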

I hope this blog post helps someone who bumps into a similar error in K2; if not, then maybe you can make use of the SQL script samples, which filter based on values within XML columns.


K2 – How to identify process name by process instance ID

Somehow I keep forgetting this thing frequently enough that it was worth expending some effort to write it down 🙂

At times, when you are troubleshooting something in K2, you need to identify a process having only its process instance ID, and frequently knowledge of the solution and workflow is the missing part (the developer is away on vacation or, in the worst-case scenario, nobody even knows if there was a developer in the first place 🙂 ). As a sample scenario, you can think of troubleshooting a failed process escalation or a process instance which is stuck in the Running state.

Let’s look at this in more detail. For a failed escalation you will definitely have an error in the K2 host server log and an entry in the K2 Server.Async table – that will give you a ProcInstID value, and your next steps are: A) find out which process this instance belongs to and B) find the status of this instance. Finding (B), at least if your process is in an error state, is easy, as it is supposed to be listed in the Error Profiles View, where you can retry the error and also see the Process Instance ID and process name.

But in case your instance is not listed in the Error Profiles View, or let’s say you are going step by step before jumping into Error Profiles, you still have two options to get the process name from the process instance ID:

(1) Using Workflow Reporting SmartObjects. You can use the Process Instance SmartObject (Workflow Reports > Workflow General > Process Instance) to get a list of process instances – you just feed the ProcInstID to it to get back the ProcessSetID:

Process Instance SmO Get List

The Process Set ID can in turn be fed to the Process Overview SmartObject (Workflow Reports > Workflow General > Process Overview), which will give you the Process Name:

Process Overview SmO Get List

(2) Querying the K2 database (in case you are already in SSMS and too lazy to switch over to the K2 Server/Tester Tool 🙂 ). Here is a SQL query you need to run:

SQL Query – Get Process Name by Process Instance ID
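
A rough equivalent of that query (a sketch only – it assumes the usual [Server].[ProcInst] → [Server].[Proc] → [Server].[ProcSet] relationships of the consolidated K2 database, so verify table and column names against your K2 version):

-- Resolve the process name (and status) from a process instance ID
DECLARE @ProcInstID int = 12345;   -- the ProcInstID you are investigating

SELECT  pi.[ID]     AS ProcInstID,
        ps.[Name]   AS ProcessName,
        pi.[Status] AS InstanceStatus
FROM    [Server].[ProcInst] AS pi
        INNER JOIN [Server].[Proc]    AS p  ON p.[ID]  = pi.[ProcID]
        INNER JOIN [Server].[ProcSet] AS ps ON ps.[ID] = p.[ProcSetID]
WHERE   pi.[ID] = @ProcInstID;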


GA of K2 5.2 – time to try new version

Today, 17.10.2018, K2 5.2 went into GA stage, meaning news about the release was sent to all clients and partners, and starting from now we can download this new and shiny version from the K2 portal. So it is a perfect time to do a little review. Without further ado, let me start.

You can download the 5.2 installer from the K2 portal. And provided you have a test VM with a current version of K2, the update manager will get you to the new version within 30 minutes or so. Once the installer completes extracting the files you are presented with a splash screen:

The splash screen provides you with essential information (.NET 4.6.1 requirement, where to run and so on) and allows you to kick off the installation process (conservative people like me can still locate Setup.exe and run it from the installation folder).

If an existing installation is detected, K2 Update Manager gets you upgraded in just a few steps:

If you run with multiple security labels you will immediately notice the improved label selection UI, which no longer looks like something from the past and is fully aligned with the modern K2 UI design:

Additionally you will notice an increased number of available OAuth resource types:

My favorite under-the-hood improvement, which is a really huge thing, is the completely rebuilt identity cache and sync architecture, which was brought into the on-prem product from its cloud version (“battle-tested in the cloud”, if I employ Microsoft-speak). At this stage all the internal infrastructure of the new Sync Engine is already here in 5.2 RTM, yet it is disabled – stay tuned for official news on when this feature will go live for all customers. At the initial stage K2 will work with selected customers to assist them in enabling and transitioning to the new Sync Engine. But like I said, the underlying infrastructure for the new Sync Engine is already present in the 5.2 release. If you are familiar with the back-end/underlying tables, you can tell that the number of Identity tables has increased:

And the Identity.Identity table has been expanded too:

Long story short, with all these changes and the new sync engine enabled, your identity cache sync speed will be greatly improved and, for example, even a URM Get Users SmO call against an Azure AD label can be served from cache without querying AAD.

There are more improvements and new features, and I will try to cover them in greater detail a bit later.

Additional resources / next steps:

Download K2 5.2

5.2 Release Notes

5.2 User Guide

5.2 Installation and Configuration Guide aka ICG

K2 5.2 Developer Reference aka DevRef

K2 Five (5.2) Fix Packs


Unable to run ConnectionStringEditor.exe – “Did you run the tool from the K2HostServer directory?”

Sometimes you may get quite strange errors with simple solutions and silly reasons. Here is an example – you get a complaint that K2 ConnectionStringEditor.exe cannot be started with the following error message:

And that may really confuse you, especially when you are sure that it is a clean, new and shiny, correctly performed installation 🙂 You start wondering what config is required and why on earth it has disappeared, when there is seemingly nobody around who tends to delete configs during log clean-up. But what you need here, before going into panic mode, is a second look at a larger screenshot… like this one:

So the error is actually caused by an attempt to run the tool from search results instead of locating it in its default location (“C:\Program Files (x86)\K2 blackpearl\Host Server\Bin\ConnectionStringEditor.exe”).

Another quite typical error with this utility is an attempt to run it without elevation on a server which has UAC enabled (which I assume should be any production Windows server, but you may still see it disabled in some environments).


K2 Configuration Analysis – Could not create certificate: Keyset does not exist

I was doing a K2 5.1 installation recently and bumped into the following error reported by the K2 installer:

The following was recorded in the installer trace:

>> Certificates.FindCertificate: Find cert start
>> Certificates.FindCertificate: Find certificate: D9F8C4EF74BD33B6C9FCA421E68222E77B1A3315 in LocalMachine – My
>> Certificates.FindCertificate: Found 1 certificates
>> Certificates.ExportCertToString: Exporting…
>> EnsureCertificate.Execute: Logged Error: Could not create certificate: System.Security.Cryptography.CryptographicException: Keyset does not exist

at System.Security.Cryptography.CryptographicException.ThrowCryptographicException(Int32 hr)
at System.Security.Cryptography.X509Certificates.X509Utils._ExportCertificatesToBlob(SafeCertStoreHandle safeCertStoreHandle, X509ContentType contentType, IntPtr password)
at System.Security.Cryptography.X509Certificates.X509Certificate.ExportHelper(X509ContentType contentType, Object password)
at SourceCode.Install.Security.Certificates.ExportCertToString(X509Certificate2 cert, String password)
at SourceCode.Install.Package.Actions.Security.EnsureCertificate.Execute(Target target)
>> EnsureCertificate.Execute: End EnsureCertificate, success: False
>> Target.Execute: !Completed: Server – EnsureCertificate, Success: False

It was really strange, as everything was working fine and seemingly all required certificates were in place. I tried to check the “C:\ProgramData\Microsoft\Crypto\RSA” folder permissions, but they looked OK, and updating/editing them didn’t resolve the K2 installer error for me. Though, as I later found out, I was looking in the right direction. I then rebooted the server a few times and eventually lost RDP connectivity to it, with the following error message displayed by the MSTSC client:

At this point it was taken over by a Windows server admin, who resolved the MSTSC error after stumbling on a blog post at the Ask the Performance Team Blog and correcting the RSA folder permissions. As I understand, the final solution was… …drum roll… ensuring/fixing correct permissions on the “C:\ProgramData\Microsoft\Crypto\RSA” folder. Specifically, the NT AUTHORITY\NETWORK SERVICE account didn’t have rights on the “C:\ProgramData\Microsoft\Crypto\RSA\MachineKeys” folder. Once permissions were adjusted, it fixed both the MSTSC connectivity and the K2 installer error mentioned above.

My takeaways from this case and the aforementioned blog post are the following:

– For the RDP problem there was a correlating server-side event with ID 36870: A fatal error occurred when attempting to access the TLS server credential private key. The error code returned from the cryptographic module is 0x8009030D. The internal error state is 10001.

– In general, if I see this error again I will run Procmon while reproducing the error/running the installer, with the expectation that it will show an “Access Denied” error when trying to access one of the MachineKeys, e.g.:

“C:\ProgramData\Microsoft\Crypto\RSA\MachineKeys\f686aace6942fb7f7ceb231212eef4a4”

Once the access denied error is spotted, it should be possible to correct permissions with certainty that the problem is indeed caused by them. As we had the certificate thumbprint mentioned in the K2 installer trace, I assume there should be a corresponding access denied error related to accessing this specific machine key, which can be caught with the help of the Procmon utility.


Simple walkthrough: Using K2 Database Consolidation Tool

The purpose of this blog post is to outline the K2 database consolidation process using the K2 Database Consolidation Tool.

When might you need it? For older K2 deployments, where the initial installer used to create 14 separate databases instead of the one “K2” database we expect to see with current K2 versions. Such environments, even after upgrades to newer versions, continue to have these 14 databases; only starting from K2 4.7 is database consolidation enforced, and you cannot upgrade until you consolidate your databases into one. So you can still see non-consolidated K2 databases in environments which run any version of K2 up to and including 4.6.11.

To consolidate these 14 K2 databases into one you need to obtain the appropriate version of the K2 Database Consolidation Tool from K2 support. Below you can see the basic steps you need to perform while consolidating K2 databases using this tool.

1) First we need to check the collation of your existing K2 databases; this is necessary because the consolidation tool won’t handle conversions from one locale to another and consolidation will fail. You can run this script to see the collation of your non-consolidated K2 DBs:
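
Something like this will do (it simply lists the collations of all databases whose names start with “K2” – adjust the name filter to match your environment):

-- Show the collation of each non-consolidated K2 database
SELECT  name,
        collation_name
FROM    sys.databases
WHERE   name LIKE 'K2%'
ORDER BY name;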

As you can see on the screenshot below, the output of this script shows that my non-consolidated databases have the Ukrainian_100_CI_AS collation:

2) Make sure that your target SQL Server instance has the same collation as your databases, either via the GUI:

or script:
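
For the script route, a quick check of the instance collation can look like this:

-- Show the collation of the SQL Server instance itself
SELECT SERVERPROPERTY('Collation') AS InstanceCollation;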

and copy your non-consolidated databases to the target server which will be hosting the consolidated database (unless it is the same server which was hosting them initially).

3) Obtain the K2 Database Consolidation Tool from K2 support, extract it on the SQL server which hosts your K2 databases and launch SourceCode.Database.Consolidator.exe; once you start it you will be presented with the following UI:

4) It will detect your non-consolidated K2 DBs (<No Instance> in the Instance drop-down means that you are connecting to the default, not a named, SQL Server instance), and here you need to select your target DB – just select <New Database>, specify “Create Database Name” (I’m using the default name used by the K2 installer, which is K2) and click Create:

5) Once you click Create, the K2 database will be created with the same collation as your SQL Server instance (your target DB will contain all the required tables and structure but no data) and the Start button becomes available so that you can start the consolidation process:

6) Before clicking Start, make sure your K2 service is stopped. Even though we have just created our target “K2” database, we still get a warning that all data in the target DB will be truncated, and we have to click Yes to start the consolidation process:

Once you confirm, you will have to wait for a while until consolidation completes (at the bottom of the tool window, in its “status line”, you will see the current operations being performed during the database consolidation process). The time necessary to complete this process depends heavily on your server performance and the volume of data in your source databases.

In some scenarios (e.g. the source and destination collations have different locale IDs, or you moved the source databases to another SQL server without re-creating their master key) the consolidation process may fail, leaving your non-consolidated databases in a read-only state:

In such a scenario you need to review the consolidation log to identify and address the errors; once done, switch your source databases back to RW mode (as explained here), delete your target database and start again from step (3). When consolidation completes successfully, the source non-consolidated databases also stay in read-only mode.

If consolidation completes without errors you will get a message confirming this and also informing you that a ReconfigureServer.ps1 script has been created:

You can also click on the Log Directory link, which will open the consolidation log file location – as usual, you can open the log and make sure that neither ‘Logged Warning’ nor ‘Logged Error’ can be found anywhere in it beyond the Legend section at the beginning.

7) From the directory which contains the K2 Database Consolidation Tool you will need to take the ReconfigureServer.ps1 script and copy it over to your K2 server. This script fires off K2 blackpearl Setup Manager while instructing it to connect to your new consolidated DB:

Here is the script code, which you can copy/paste:

Once you run this script on the K2 server it will start K2 Setup Manager, where you need to go through all pages of the “Configure K2 blackpearl” process:

On the database configuration step of the wizard you will see that, thanks to the PS script, we are already targeting our new consolidated DB:

Once the reconfiguration process completes (without errors or warnings) you can start testing how your K2 environment behaves after the consolidation process.


SQL script to attach detached non-consolidated K2 DBs

I keep playing with SQL and non-consolidated K2 DBs. In a previous post I covered bringing “these 14” back online; now I realized that another case where SSMS requires way too many clicks is attaching “these 14” back (let’s say after you rebuild your SQL instance system DBs to change the instance collation).

A quick Google search allowed me to find a relevant question on dba.stackexchange.com, from which I took a script which generates CREATE DATABASE … FOR ATTACH statements for all existing user databases. Next, having my 14 non-consolidated K2 DBs, I generated the following script to attach them back in bulk:
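
The generator part looks roughly like this (a sketch along the lines of the dba.stackexchange answer – run it while the databases are still attached and save the output, which is one CREATE DATABASE … FOR ATTACH statement per user database):

-- Generate CREATE DATABASE ... FOR ATTACH statements for every user database,
-- listing each database's data and log files from sys.master_files
SELECT
    'CREATE DATABASE [' + d.name + '] ON ' +
    STUFF((SELECT ', (FILENAME = ''' + mf.physical_name + ''')'
           FROM sys.master_files AS mf
           WHERE mf.database_id = d.database_id
           ORDER BY mf.[type], mf.file_id
           FOR XML PATH(''), TYPE).value('.', 'nvarchar(max)'), 1, 2, '') +
    ' FOR ATTACH;' AS AttachStatement
FROM sys.databases AS d
WHERE d.database_id > 4;   -- skip system databases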

You can either use this CREATE DATABASE … FOR ATTACH generator script while your K2 databases are still attached, or, if they are not, just replace the paths in the generated script and execute the modified version to attach them quickly.


SQL Script to switch all currently RO databases to RW mode

I was doing some testing of the K2 database consolidation process, which required me to re-run the consolidation more than once to retry it. Unfortunately, the K2 Database Consolidation Tool leaves all databases in read-only mode if something fails during the consolidation process. If you remember, K2 used to have 14 separate databases before the consolidated DB was introduced (see picture below).

Typing 14 statements manually to bring all these databases back to read-write mode is a bit time consuming, so I came up with the following script:
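
A sketch of that script (it just builds an ALTER DATABASE … SET READ_WRITE statement for every database currently flagged as read-only):

-- Build ALTER DATABASE statements for every database that is currently read-only
SELECT  'ALTER DATABASE [' + name + '] SET READ_WRITE WITH NO_WAIT;' AS Statement
FROM    sys.databases
WHERE   is_read_only = 1;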

Essentially it selects all databases currently in the RO state and outputs a bunch of statements to bring all of them back to the RW state:

Just copy-paste this script’s output into a new query window in SSMS and press F5 🙂

It may be useful for you once in a while (and if not for this specific use case, then as an example of generating repetitive statements which embed SELECT results).


K2 5.0 unable to read CORS settings from SmO when using the Workflow REST API in JavaScript

If you are trying to use the K2 Workflow REST API in JavaScript (as described in the product documentation) you may see the issue described below (and you may want to upgrade to 5.1 to resolve it 😉 ).

You have CORS settings configured correctly for the domain which hosts your JavaScript, i.e. you have settings similar to these:

Workflow REST API Settings

The screenshot above assumes that your JS resides within the js.denallix.com domain; upon attempting to execute the JS code you will be getting errors.

Using Chrome you will get the following error:

Failed to load https://k2.denallix.com/Api/Workflow/preview/workflows/?_=1523444398270: Response to preflight request doesn’t pass access control check: No ‘Access-Control-Allow-Origin’ header is present on the requested resource. Origin ‘https://js.denallix.com’ is therefore not allowed access. The response had HTTP status code 400.

IE will also give you an error, but a less clear one:

SCRIPT7002: XMLHttpRequest: Network Error 0x80070005, Access is denied.

Here is the screenshot of the error message from the Chrome browser:

And here is what you can see in Fiddler:

In case you want to reproduce this, you may use the sample code (from the K2 documentation) which returns a list of workflows either owned by or startable for the user credentials supplied.

So you would expect the CORS settings configured for the Workflow REST API to ensure it works fine, but it does not. What’s wrong here?

If you enable Workflow API logging you can see the following in this log:

w3wp.exe Warning: 0 : Failed to retrieve CORS settings.
System.InvalidOperationException: Failed to determine user principal name
at SourceCode.Forms.AppFramework.ConnectionClass.HandleIdentityImpersonation(Boolean asAppPool, Action action)
at SourceCode.Forms.AppFramework.ConnectionClass.TryCredentialToken(BaseAPIConnection connection, String credentialToken, Boolean asAppPool)
at SourceCode.Forms.AppFramework.ConnectionClass.GetPoolConnection(Boolean asAppPool, Boolean& tokenApplied, String& credentialToken)
at SourceCode.Forms.AppFramework.ConnectionClass.Connect(BaseAPI baseAPI, Boolean asAppPool)
at SourceCode.Web.Api.Common.DataContexts.K2DataContext.EnsureConnectionIsOpen[T](T api)
at SourceCode.Web.Api.Common.DataContexts.K2DataContext.GetCorsSettings()

This means that when you send the request, the Workflow API tries to retrieve the CORS settings from the SmartObject. When it does that, it makes a connection to the host server. For some reason the connection is failing with the error “Failed to determine user principal name”.

Because of this exception the CORS settings are not retrieved, the list of allowed origins is empty on the web API side, and this leads to the error mentioned above (The response had HTTP status code 400). In K2 5.0 something in the stack is not parsing/decoding the authentication credentials (in the state they arrive in from the AJAX call) correctly, and thus the identity isn’t recognized, causing a failure in the connection.

If you are still on K2 Five (5.0) (knowing how many people run older versions, I’m not very comfortable with this wording 🙂 ), your workaround for this issue is to remove the authorization header from the AJAX call and let the browser prompt you for username and password. Here is sample HTML code for that (essentially we just remove var username and var password in $.ajaxSetup):

But in case you are running the new and shiny 5.1, or are ready to upgrade your 5.0 environment to this version, it will work just fine there without the need to employ any workarounds.

And here is the link to the official KB covering the same issue: “Failed to load….No Access Control Allow Origin.”

Reading list: K2 Authentication and Authorization

This is a list of links to K2 documentation covering K2 authentication and authorization topics, in case you have some time to read something for fun 🙂

Authentication

Authentication and Authorization in K2

Claims-based Authentication in K2

Outbound Authorization and OAuth in K2

About K2Trust

Troubleshooting Claims-based Authentication Issues

Identity and Data Security in K2 Cloud for SharePoint

SharePoint Hybrid, Multiple Identity Providers & K2

AAD Multi-Factor Authentication Considerations

Enabling AAD Multi-Factor Authentication Requires Changes in K2 4.7

Authentication Modes

Authentication (in Management)

Integrating with Salesforce

Azure Active Directory Management (Read/Write to AAD)

Claims and OAuth Configuration for SharePoint 2013

Standard SmartForms Authentication

Multi-Authentication Providers

Consolidation to Multi-Auth

IIS Authentication

Authorization

Authorization Framework Overview

Outbound Authorization and OAuth in K2

REST Broker

Resources for Working with the REST Service Broker

REST Swagger File Reference Format

REST Broker and Swagger Descriptor Overview (video)

Endpoints REST Service Type

OData Broker

Using the OData Service Broker (including Serialization and Deserialization)

Endpoints OData Service Type

Workflow and SmartObject APIs

APIs (in Management)

Configuring the Workflow REST API

Configuring the SmartObject OData API

How to Use the K2 Workflow REST Feed with Microsoft Flow to Redirect a K2 Task

How to Use the K2 Workflow REST Feed with Microsoft Flow to Start a Workflow

How to: Use the K2 OData Feed with Microsoft Excel

How to: Use the K2 OData Feed with Microsoft Power BI