How to enable TDE (SQL 2017/Azure SQL Database)

TDE is a SQL Server feature which encrypts your data at rest, i.e. your database files. When TDE is enabled, encryption of the database file is performed at the page level: pages in an encrypted database are encrypted before they are written to disk and decrypted when read into memory. TDE does not increase the size of the encrypted database. Here is the TDE architecture diagram from the MSFT documentation:

Transparent Data Encryption Architecture

This blog post explains how to enable Transparent Data Encryption (TDE) for SQL databases (on-premises and Azure).

Scenario 1. On-premises SQL Server 2017 (this will also work for SQL Server in an Azure VM). You can use the following SQL script to enable TDE:
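
A minimal sketch of such a script (the certificate name, its subject and the password are placeholders to adjust):

```sql
USE master;
GO
-- Create the database master key in master, unless it already exists
IF NOT EXISTS (SELECT * FROM sys.symmetric_keys WHERE name = '##MS_DatabaseMasterKey##')
    CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<StrongPasswordHere>';
GO
-- Create the server certificate that will protect the database encryption key
IF NOT EXISTS (SELECT * FROM sys.certificates WHERE name = 'TDECert')
    CREATE CERTIFICATE TDECert WITH SUBJECT = 'TDE certificate';
GO
USE [K2];
GO
-- Create the database encryption key, unless the database already has one
IF NOT EXISTS (SELECT * FROM sys.dm_database_encryption_keys WHERE database_id = DB_ID('K2'))
    CREATE DATABASE ENCRYPTION KEY
    WITH ALGORITHM = AES_256
    ENCRYPTION BY SERVER CERTIFICATE TDECert;
GO
-- Enable TDE for the database
ALTER DATABASE [K2] SET ENCRYPTION ON;
GO
```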

Be sure to replace ‘K2’ with your target database name and adjust the password value. The script uses IF clauses to avoid creating things which already exist (these are missing in the sample script you can find in the MSFT documentation). Once TDE is enabled, you can confirm this in the database properties using the SSMS GUI:

Scenario 2. Azure SQL Database. The script mentioned above won’t work here. The easiest/default approach to enable TDE for Azure SQL Database is to do so from the Azure Portal:

This approach is called service-managed transparent data encryption, and by default the database encryption key is protected by a built-in server certificate. All newly created SQL databases are encrypted by default using service-managed transparent data encryption.

The other approach is called Bring Your Own Key (BYOK) and requires the use of Azure Key Vault.

TDE can also be managed with PowerShell, Transact-SQL and the REST API. PowerShell contains a number of cmdlets for that (for example, the Get-/Set-AzureRmSqlDatabaseTransparentDataEncryption and Get-AzureRmSqlDatabaseTransparentDataEncryptionActivity cmdlets from the AzureRM module of that era):


And using T-SQL, you can use the ALTER DATABASE (Azure SQL Database) SET ENCRYPTION ON/OFF command, which encrypts or decrypts a database (see the example after the list below), and two dynamic management views:

  • sys.dm_database_encryption_keys, which returns information about the encryption state of a database and its associated database encryption keys
  • sys.dm_pdw_nodes_database_encryption_keys, which returns information about the encryption state of each data warehouse node and its associated database encryption keys
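
For example, turning encryption on looks like this (the database name is illustrative):

```sql
-- Encrypt an Azure SQL database (replace the database name with yours)
ALTER DATABASE [MyDatabase] SET ENCRYPTION ON;
```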

Once TDE has been enabled, there are also options to check whether it is enabled or not using T-SQL:
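
A sketch of such a check, joining sys.databases with the DMV mentioned above (encryption_state = 3 means the database is encrypted):

```sql
-- List databases with their encryption flag and key details
SELECT db.name,
       db.is_encrypted,
       dek.encryption_state,
       dek.key_algorithm,
       dek.key_length
FROM sys.databases AS db
LEFT JOIN sys.dm_database_encryption_keys AS dek
    ON db.database_id = dek.database_id;
```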

For further information refer to official MSFT documentation:

Transparent Data Encryption (TDE)

The SQL Server Security Blog on TDE with FAQ


Un mercenario al que pagas bien no te deja en la estacada

I haven’t been writing any language learning related blog posts for quite some time now. It is not because there is nothing to write about – on the contrary, I have a lot of ideas, big and small, in the language learning department – but I’m too busy with technology/work and other things.

Anyhow, I’m very actively learning Spanish at the moment, keeping French and Afrikaans on hold and postponing the desire to learn other languages 🙂 I’m about to receive (unless I failed my exam) my DELE B1 certificate. Subjectively, I can say that my writing still requires a lot of work, and my speaking lacks control of the tense system, though I can say a lot using a limited set of tenses and making a lot of mistakes 🙂

My learning strategy includes loads of input from day 0 (listening, reading). I’m currently reading “La carta esférica” by Arturo Pérez-Reverte, and in this book I stumbled upon the following idiomatic expression – “dejar a alguien en la estacada”. Here is the passage from the book:

Además, siempre preferí contratar a asalariados eficientes antes que a voluntarios entusiastas… Un mercenario al que pagas bien no te deja en la estacada.

Pérez-Reverte, Arturo. La carta esférica (Spanish Edition) (Kindle Location 3386). Penguin Random House Grupo Editorial España. Kindle Edition.

So I decided to read up a bit on the phrase, and below you may find what I learnt. Disclaimer: most of this post talks about etymologies of phrases/words, and these are frequently contested; I have to warn you that I didn’t do a rigorous scientific check/verification, and you are more than welcome to double-check these theories. 🙂

I quickly found the English equivalent of this expression – “to leave someone in the lurch” – and while the meaning was clear, both expressions required some extra dictionary checks to understand where they came from. The basic modern meaning of both expressions is to abandon someone in a difficult situation.

Let’s start with the Spanish one – “dejar a alguien en la estacada”. If you are in the mood for a definition in Spanish, here you are – “La expresión ‘Dejar a alguien en la estacada’ es comúnmente utilizada para señalar cuando a una persona se la ha dejado abandonada a su suerte en una situación que podría ser peligrosa, apurada o de difícil solución, no brindándole la ayuda o auxilio que precisa” (source); roughly, the expression is used when a person has been abandoned to their fate in a dangerous, pressing or hard-to-solve situation, without being given the help they need. But what is this “estacada” where our troubled person is left? It actually comes from medieval jousting tournaments – a martial game based on the military use of the lance by heavy cavalry.

Tournament between Henry II and Lorges, 16th century

The tournament field for this competition was fenced with “estacas” – wooden posts which formed a sort of palisade – and the land within this fence was called the “estacada” (the tournament arena, of sorts). After a contest was over, only the knight who had fallen from his horse (often heavily wounded) was left on that ground, and the victorious knight would leave the arena without helping or paying attention to the one who stayed on the field. From this originates the phrase “dejar a alguien en la estacada”, which in modern language is used to mean “leaving someone in a difficult or dangerous situation”.

What about the English version? As you can see, the Spanish idiom has a rather military origin, while its English equivalent, despite having a similar meaning in modern usage, has completely different origins. It also revolves around the place where you leave the troubled person – the “lurch”. And honestly, I had to look it up, as I had no idea what it could be. Dictionaries list a number of meanings, of which, knowing the sense of the phrase, you may guess that the one we have in “to leave someone in the lurch” is this:

“a decisive defeat in which an opponent wins a game by more than double the defeated player’s score, especially in cribbage”

I also found an interesting blog post which offers more interesting and fitting options for the origin of “lurch”, such as:

1. Lurch is a noun that originated from “lich” – the Old English word for corpse. Lych-gates are the roofed churchyard entrances that adjoin many old English churches and are the appointed place for coffins to be left while waiting for the clergyman to arrive to conduct the funeral service. Hence ‘left in the lych/lurch’ is supposed to mean “left in a quite difficult situation”…

Lychgate at the Church of St. James the Less, Philadelphia

2. The second theory states that jilted brides would be ‘left in the lurch’ when the errant bridegroom failed to appear for the wedding.

Those two seem apt/interesting, yet they are only listed as suggested explanations, with no evidence to support them.

And while most dictionaries link the lurch with losing/a bad situation in cribbage, the blog post mentioned above suggests that the word/phrase “originates from the French board game of lourche or lurch, which was similar to backgammon and was last played in the 17th century (the rules having now been lost). Players suffered a lurch if they were left in a hopeless position from which they couldn’t win the game.” But again, looking at the illustration they have there, the game board looks similar to the one for cribbage.

And after looking at both the Spanish and English idioms, which convey one idea yet have different origins, I realized that both the cribbage board and the jousting tournament field have something in common…

Modern 120-hole cribbage board

Giovanni Ferri, Saracen joust in Piazza Navona on the 25th of February 1634 (seventeenth century)

Don’t you think?

A Russian version, anyone? If you are interested in a Russian equivalent of “dejar a alguien en la estacada” / “leave someone in the lurch”, I think it would be “бросить на произвол судьбы”, a phrase whose literal translation goes “to leave someone to the arbitrariness of fate”… As you can see, yet again a completely different phrase conveys the same idea. The Russian phrase is centered around “fate”, which is blind not in the sense of the unbiased Themis (known to Russian speakers as “Фемида” [Femida], aka Justitia, aka Lady Justice), but rather blind in its cruel arbitrariness. So to leave someone to the arbitrariness of fate is to leave a vulnerable person in a really difficult situation.


How to: Connect to Azure SQL from Visual Studio

This specific day has already been described by some as GDPRmageddon, based on the amount of email people are receiving from every company they have ever dealt with at one point or another about policy updates and so on. I’m not going to talk about that today; instead I decided to write this tiny post on how to connect to Azure SQL from Visual Studio.

Actually, the short answer is: just fire up Visual Studio and select View > SQL Server Object Explorer – the rest is a “follow the wizard” thing, and it is all documented by MSFT, of course. But I’m quite well aware of the widespread allergy to official documentation (no matter how good it is), and I also wanted to try this myself for the very first time recently, and this is how this post came about.

QUESTION: How to connect to Azure SQL from Visual Studio?

ANSWER:

1) First things first. Assuming you don’t have Visual Studio installed on your machine, you should download and install it. There is a free edition named Visual Studio Community which can be downloaded here: https://www.visualstudio.com/vs/community

2) Once the tiny web installer is downloaded, run it and click Continue to kick off the installation process:

The installer will start fetching data from the internet before installing components on your machine:

3) Once the data is downloaded, you can opt for the default installation, which requires 597 MB of disk space:

4) Once installation is complete, Visual Studio will suggest you sign up or sign in to use additional services, but you can skip that by clicking the “Not now, maybe later” link:

5) Of course, you must select the Dark theme, otherwise nothing will work, and click the Start Visual Studio button 🙂

Other themes work too, of course, but you would look suspicious in a developer crowd 🙂

6) Once Visual Studio opens, you are supposed to go to View > SQL Server Object Explorer, and… And if you followed the previous steps exactly and selected the default Visual Studio installation configuration, SQL Server Object Explorer won’t be available in the View menu:

7) To address this, re-run the Visual Studio installer, open the Individual components tab and check the SQL Server Data Tools component – it will automatically select a bunch of dependencies, upping the disk space requirement from 597 MB to 2.52 GB:

8) Once the installation is complete, re-run Visual Studio and click SQL Server Object Explorer in the View menu:

9) Once it opens, right-click the SQL Server node in the SQL Server Object Explorer tree and select the “Add SQL Server…” option:

10) It will start the Connect wizard, where you can specify your Azure SQL server name and credentials (yes, even for an Azure SQL database you connect through a “SQL server”, which is a logical entity you must have in order to connect to your Azure SQL database(s), and which should not be confused with SQL Server in an Azure VM 🙂):


11) Both SSMS and Visual Studio are smart enough to detect when your server name and credentials are good but you don’t have a firewall rule allowing the connection; if you provide your Azure credentials, these applications can create the required firewall rule for you:

To provide credentials, you are supposed to click “Add an account…” under Azure account, as shown above. It will open the standard Microsoft credentials prompt dialog:

After you have provided valid credentials, you should be able to click OK in the previous window to create the firewall rule:

If you navigate to the Azure portal, you will see your public IP added to the Azure SQL server firewall rules there.
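
The same rule can also be created with T-SQL by running sp_set_firewall_rule against the master database of your logical server (a sketch – the rule name and IP addresses are placeholders):

```sql
-- Run in the master database of the Azure SQL logical server
EXECUTE sp_set_firewall_rule
    @name = N'HomeOfficeIP',
    @start_ip_address = '203.0.113.42',
    @end_ip_address = '203.0.113.42';
```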

12) And with valid credentials and a firewall rule in place, you will be able to connect, browse through your databases and objects, and execute SQL queries more or less the same way as in SSMS:
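
For instance, a trivial first query to confirm you are connected to the right database (purely illustrative):

```sql
-- Confirm the server version and the database you are connected to
SELECT @@VERSION AS server_version, DB_NAME() AS current_database;
```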

I hope you are now clear on how to connect to Azure SQL from Visual Studio 🙂 But in case you still have any questions, just leave them in the comments.


Exam Prep Resources for Microsoft Azure 70-473 Design and Implement Cloud Data Platform Solutions

I’m currently preparing for the 70-473 Design and Implement Cloud Data Platform Solutions exam, so I’ve decided to compile a list of resources which may be useful in preparing for it. I’m going to append additional materials as I keep working on my preparation, and I hope it may be useful to other test takers.

As with any MSFT exam, your starting point has to be the MSFT exam description page, which contains a rundown of all exam topics as well as links to additional resources; here it is – Exam 70-473 Designing and Implementing Cloud Data Platform Solutions. Keep in mind that though this exam was released in December 2015, it is updated quarterly, so once in a while you need to check the exam page to see whether any new topics were added. At the moment, the last update to this exam was made in June 2017, and the changes are explained in the exam 70-473 change document.

Paid resources:

70-473 Cloud Data Platform Solutions course by SoftwareArchitect.ca – this is an affordable ($25) online course which I bought and used during my preparation – a good overview of all the concepts at a fair price, and when I searched, it was the only 70-473-specific course from an online training vendor which I was able to find. The author goes through all the “skills measured” topics as they are stated in the exam description. What I dislike about this course is the number of typos and some little issues like mismatches between the numbering and naming of videos in the course navigation pane and inside the videos themselves; one video is even inserted/listed twice. So I would describe it as a lack-of-QA/editing problem. My other complaint would be the lack of hands-on demos – there are some in the course, but I wanted more. 🙂 Only after completing the course did I find that it is also available on Udemy, where it was priced at $9.99 with a discount when I checked – so check both locations and compare prices if you want to try it.

Free resources and video recordings:

Certification Exam Overview: 70-473: Designing and Implementing Cloud Data Platform Solutions MVA course

Cert Exam Prep: Exam 70-473: Cloud Data Platform Solutions – exam overview video by MCT James Herring

The second link is a YouTube video; it looks like both of these links cover more or less the same material delivered by the same person, yet the YouTube session has newer slides, it seems, and they are not absolutely identical – so watch both of them.

Channel 9 – Keeping Sensitive Data Secure with Always Encrypted

YouTube – Secure your data in Azure SQL Database and SQL Data Warehouse

MSFT documentation:

Resolving Transact-SQL differences during migration to SQL Database

This article covers things which work in SQL queries run against on-prem SQL Server but won’t work when run against Azure SQL DB. For example, one thing you will probably discover very quickly is that the USE statement is not supported:
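
A quick illustration (the database name is hypothetical):

```sql
-- Works against an on-prem SQL Server, but on Azure SQL Database this fails
-- with "USE statement is not supported to switch between databases"
USE SalesDb;
GO
-- On Azure SQL DB you open a new connection directly to the target database instead
```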

Azure SQL Database – Controlling and granting database access

The article explains unrestricted administrative accounts, server-level administrative roles and non-administrator users, plus “access paths”.

Sizes for Windows virtual machines in Azure

General purpose virtual machine sizes

High performance compute VM sizes

You may expect questions about VM sizing based on given requirements, so you need to remember which series have premium storage and which do not, along with some other things which you can learn from the articles above.

Securing your SQL Database

Always Encrypted (Database Engine)

Always Encrypted Wizard

This article explains two very important things you should be aware of: key storage options and Always Encrypted terms.

SQL Database dynamic data masking

Azure Blog – Microsoft Azure SQL Database provides unparalleled data security in the cloud with Always Encrypted

Azure SQL has loads of security features, and you are supposed to know them all 🙂 – at least when to use each of them, along with their requirements and limitations.

Azure Cosmos DB: SQL API getting started tutorial

Get started with Azure Table storage and the Azure Cosmos DB Table API using .NET

ADO.NET Overview


Simple walkthrough: Using K2 Database Consolidation Tool

The purpose of this blog post is to outline the K2 database consolidation process using the K2 Database Consolidation Tool.

When may you need it? For older K2 deployments, where the initial installer used to create 14 separate databases instead of the one “K2” database we expect to see in current K2 versions. Even after upgrades to newer versions, such environments carry on with these 14 databases; only starting from K2 4.7 is database consolidation enforced, meaning you cannot upgrade until you consolidate your databases into one. So you can still see non-consolidated K2 databases in environments which run any version of K2 up to and including 4.6.11.

To consolidate these 14 K2 databases into one, you need to obtain the appropriate version of the K2 Database Consolidation Tool from K2 support. Below are the basic steps to follow when performing K2 database consolidation using this tool.

1) First we need to check the collation of your existing K2 databases. This is necessary because the consolidation tool won’t handle conversion from one locale to another, and consolidation will fail. You can run this script to see the collation of your non-consolidated K2 DBs:
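
A sketch of such a script, assuming the default database names with the K2 prefix:

```sql
-- List the collations of the non-consolidated K2 databases
SELECT name, collation_name
FROM sys.databases
WHERE name LIKE 'K2%'
ORDER BY name;
```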

As you can see in the screenshot below, the output of this script shows that my non-consolidated databases have the Ukrainian_100_CI_AS collation:

2) Make sure that your target SQL Server instance has the same collation as your databases, either via the GUI:

or via script:
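
For example (a minimal check):

```sql
-- Return the server-level collation of the instance
SELECT SERVERPROPERTY('Collation') AS server_collation;
```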

and copy your non-consolidated databases to the target server which will host the consolidated database (unless it is the same server which hosted them initially).

3) Obtain the K2 Database Consolidation Tool from K2 support, extract it on the SQL server which hosts your K2 databases and launch SourceCode.Database.Consolidator.exe. Once you start it, you will be presented with the following UI:

4) It will detect your non-consolidated K2 DBs (<No Instance> in the Instance drop-down means that you are connecting to the default, not a named, SQL Server instance), and here you need to select your target DB – just select <New Database>, specify the “Create Database Name” (I’m using the default name used by the K2 installer, which is K2) and click Create:

5) Once you click Create, the K2 database will be created with the same collation as your SQL Server instance (your target DB will contain all the required tables and structure, but no data), and the Start button becomes available so that you can start the consolidation process:

6) Before clicking Start, make sure your K2 service is stopped. Even though we have just created our target “K2” database, we still get a warning that all data in the target DB will be truncated, and we have to click Yes to start the consolidation process:

Once you click Yes, you will have to wait for a while until consolidation completes (at the bottom of the tool window, in its “status line”, you will see the current operations being performed during the consolidation process). The time necessary to complete the process depends heavily on your server performance and the volume of data in your source databases.

In some scenarios (e.g. the source and destination collations have different locale IDs, or you moved the source databases to another SQL server without re-creating their master key), the consolidation process may fail, leaving your non-consolidated databases in read-only state:

In such a scenario, you need to review the consolidation log to identify and address the errors; once done, switch your source databases back to RW mode (as explained here), delete your target database and start again from step (3). Note that even when consolidation completes successfully, the source non-consolidated databases also stay in read-only mode.

If consolidation completes without errors, you will get a message confirming this and also informing you that the ReconfigureServer.ps1 script has been created:

You can also click the Log Directory link, which opens the consolidation log file location – as usual, open the log and make sure that neither ‘Logged Warning’ nor ‘Logged Error’ can be found anywhere in it beyond the Legend section at the beginning.

7) From the directory which contains the K2 Database Consolidation Tool, take the ReconfigureServer.ps1 script and copy it over to your K2 server. This script fires off the K2 blackpearl Setup Manager while instructing it to connect to your new consolidated DB:

Here is the script’s code, which you can copy/paste:

Once you run this script on the K2 server, it will start the K2 Setup Manager, where you need to go through all the pages of the “Configure K2 blackpearl” process:

On the database configuration step of the wizard, you will see that, thanks to the PS script, we are already targeting our new consolidated DB:

Once the reconfiguration process completes (without errors or warnings), you can start testing how your K2 environment behaves after the consolidation.


When code and operations collide :)

I’ve just seen a CBT Nuggets video on YouTube entitled “How to Transition to DevOps”, and though I cancelled my subscription quite some time ago, it sparked my interest and made it very tempting to subscribe again (if only it weren’t for my financial and time budget constraints).

I really like expressive quotes and explanations which use analogy, and one from this video which I really liked can be found below. Along with some basic theory on what DevOps is and how to approach it, Shawn Powers shows a little demo of using a Chef recipe for configuration management, and then comes the following conclusion:

“…configuration automation is awesome example of how DevOps is kind of taking two different worlds the world of installing packages and uploading files and code which allows us to programmatically solve problems and put them together kind of like peanut butter and chocolate goes together to make a Reese’s Cup and it’s you know awesome it’s better than the sum of its parts…”

Nice. And I also need to try these Reese’s Peanut Butter Cups now, even if it violates a healthy diet a bit 🙂 I think they go well with coffee and IT training videos (if consumed in limited amounts).

I just looked at the DevOps courses available at CBT Nuggets at the moment, and though it seems there is no DevOps overview/general course available so far, they already have courses on specific tools (Puppet, Chef, Docker, Ansible).



SQL script to attach detached non-consolidated K2 DBs

I keep playing with SQL and non-consolidated K2 DBs. In a previous post I covered bringing “these 14” back online; now I’ve realized that another case where SSMS requires way too many clicks is attaching “these 14” back (let’s say after you rebuild your SQL instance’s system DBs to change the instance collation).

A quick google turned up a relevant question on dba.stackexchange.com, where I took a script which generates CREATE DATABASE … FOR ATTACH statements for all existing user databases. Next, having my 14 non-consolidated K2 DBs, I generated the following script to attach them back in bulk:
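
A sketch of the generator (it reads file paths from sys.master_files, so run it while the databases are still attached; the example output row below uses illustrative names and paths):

```sql
-- Generate one CREATE DATABASE ... FOR ATTACH statement per user database
SELECT 'CREATE DATABASE ' + QUOTENAME(d.name) + ' ON ' +
       STUFF((SELECT ', (FILENAME = ''' + mf.physical_name + ''')'
              FROM sys.master_files AS mf
              WHERE mf.database_id = d.database_id
              ORDER BY mf.file_id
              FOR XML PATH(''), TYPE).value('.', 'nvarchar(max)'), 1, 2, '') +
       ' FOR ATTACH;' AS attach_statement
FROM sys.databases AS d
WHERE d.database_id > 4; -- skip system databases

-- Example output row (database name and file paths are illustrative):
-- CREATE DATABASE [K2Server] ON (FILENAME = 'D:\Data\K2Server.mdf'),
--     (FILENAME = 'D:\Data\K2Server_log.ldf') FOR ATTACH;
```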

You can either run this generator script while your K2 databases are still attached, or, if they are not, just fix the paths in the generated statements and execute the modified script to attach them quickly.


SQL Script to switch all currently RO databases to RW mode

I was doing some testing of the K2 database consolidation process, which required me to re-run the consolidation more than once to retry it. Unfortunately, the K2 Database Consolidation Tool leaves all the databases in read-only mode if something fails during the consolidation process. If you remember, K2 used to have 14 separate databases before the consolidated DB was introduced (see the picture below).

Typing 14 statements manually to bring all these databases back to read-write mode is a bit time consuming, so I came up with the following script:
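
A sketch of the idea – build the ALTER DATABASE statements from sys.databases rather than typing them:

```sql
-- Generate ALTER DATABASE ... SET READ_WRITE for every read-only database
SELECT 'ALTER DATABASE ' + QUOTENAME(name) + ' SET READ_WRITE WITH ROLLBACK IMMEDIATE;'
       AS rw_statement
FROM sys.databases
WHERE is_read_only = 1;
```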

Essentially, it selects all databases currently in the RO state and outputs a bunch of statements to bring all of them to the RW state:

Just copy-paste this script’s output into a new query window in SSMS and press F5 🙂

It may be useful to you once in a while (and if not for this specific use case, then as an example of generating repetitive statements which embed SELECT results).


K2 5.0 unable to read CORS settings from SmO when using the Workflow REST API in JavaScript

If you are trying to use the K2 Workflow REST API from JavaScript (as described in the product documentation), you may run into the issue described below (and you may want to upgrade to 5.1 to resolve it 😉).

You have CORS settings configured correctly for the domain which hosts your JavaScript, i.e. you have settings similar to these:

Workflow REST API Settings

The screenshot above assumes that your JS resides within the js.denallix.com domain; upon attempting to execute the JS code, you will get errors.

In Chrome, you will get the following error:

Failed to load https://k2.denallix.com/Api/Workflow/preview/workflows/?_=1523444398270: Response to preflight request doesn’t pass access control check: No ‘Access-Control-Allow-Origin’ header is present on the requested resource. Origin ‘https://js.denallix.com’ is therefore not allowed access. The response had HTTP status code 400.

IE will also give you an error, but a less clear one:

SCRIPT7002: XMLHttpRequest: Network Error 0x80070005, Access is denied.

Here is a screenshot of the error message from the Chrome browser:

And here is what you can see in Fiddler:

In case you want to reproduce this, you may use the sample code from the K2 documentation which returns a list of workflows either owned by or startable by the user whose credentials are supplied.

So you would expect the CORS settings configured for the Workflow REST API to ensure it works fine, yet it does not. What’s wrong here?

If you enable Workflow API logging, you can see the following in the log:

w3wp.exe Warning: 0 : Failed to retrieve CORS settings.
System.InvalidOperationException: Failed to determine user principal name
at SourceCode.Forms.AppFramework.ConnectionClass.HandleIdentityImpersonation(Boolean asAppPool, Action action)
at SourceCode.Forms.AppFramework.ConnectionClass.TryCredentialToken(BaseAPIConnection connection, String credentialToken, Boolean asAppPool)
at SourceCode.Forms.AppFramework.ConnectionClass.GetPoolConnection(Boolean asAppPool, Boolean& tokenApplied, String& credentialToken)
at SourceCode.Forms.AppFramework.ConnectionClass.Connect(BaseAPI baseAPI, Boolean asAppPool)
at SourceCode.Web.Api.Common.DataContexts.K2DataContext.EnsureConnectionIsOpen[T](T api)
at SourceCode.Web.Api.Common.DataContexts.K2DataContext.GetCorsSettings()

This means that when you send the request, the Workflow API tries to retrieve the CORS settings from the SmartObject. When it does that, it makes a connection to the host server, and for some reason that connection fails with the error “Failed to determine user principal name”.

Because of this exception, the CORS settings are not retrieved, the list of allowed origins is empty on the web API side, and this leads to the error mentioned above (The response had HTTP status code 400). In K2 5.0, something in the stack does not parse/decode the authentication credentials (coming from the AJAX call) correctly, and thus the identity isn’t recognized, causing the connection to fail.

If you are still on K2 Five (knowing how many people run older versions, I’m not very comfortable with this wording 🙂), your workaround for this issue is to remove the authorization header from the AJAX call and let the browser prompt you for a username and password – essentially, in the sample HTML code you just remove var username and var password from $.ajaxSetup:

But in case you are running the new and shiny 5.1, or are ready to upgrade your 5.0 environment to that version, it works just fine there without the need for any workarounds.

And here is the link to the official KB covering the same issue: “Failed to load….No Access Control Allow Origin.”


Reading list: K2 Authentication and Authorization

This is a list of links to K2 documentation covering K2 authentication and authorization topics – in case you have some time to read something for fun 🙂

Authentication

Authentication and Authorization in K2

Claims-based Authentication in K2

Outbound Authorization and OAuth in K2

About K2Trust

Troubleshooting Claims-based Authentication Issues

Identity and Data Security in K2 Cloud for SharePoint

SharePoint Hybrid, Multiple Identity Providers & K2

AAD Multi-Factor Authentication Considerations

Enabling AAD Multi-Factor Authentication Requires Changes in K2 4.7

Authentication Modes

Authentication (in Management)

Integrating with Salesforce

Azure Active Directory Management (Read/Write to AAD)

Claims and OAuth Configuration for SharePoint 2013

Standard SmartForms Authentication

Multi-Authentication Providers

Consolidation to Multi-Auth

IIS Authentication

Authorization

Authorization Framework Overview

Outbound Authorization and OAuth in K2

REST Broker

Resources for Working with the REST Service Broker

REST Swagger File Reference Format

REST Broker and Swagger Descriptor Overview (video)

Endpoints REST Service Type

OData Broker

Using the OData Service Broker (including Serialization and Deserialization)

Endpoints OData Service Type

Workflow and SmartObject APIs

APIs (in Management)

Configuring the Workflow REST API

Configuring the SmartObject OData API

How to Use the K2 Workflow REST Feed with Microsoft Flow to Redirect a K2 Task

How to Use the K2 Workflow REST Feed with Microsoft Flow to Start a Workflow

How to: Use the K2 OData Feed with Microsoft Excel

How to: Use the K2 OData Feed with Microsoft Power BI

Facebooktwittergoogle_plusredditpinterestlinkedinmail