Exam Prep Resources for Microsoft Azure Exam 70-473: Designing and Implementing Cloud Data Platform Solutions

I’m currently preparing for the 70-473 Designing and Implementing Cloud Data Platform Solutions exam, so I’ve decided to compile a list of resources that may be useful when preparing for it. I’ll keep appending additional materials as my preparation progresses, and I hope the list proves useful to other test takers.

As with any MSFT exam, your starting point has to be the MSFT exam description page, which contains a rundown of all exam topics as well as links to additional resources, so here it is – Exam 70-473: Designing and Implementing Cloud Data Platform Solutions. Keep in mind that although this exam was released in December 2015, it is updated quarterly, so check the exam page once in a while to see whether any new topics have been added. The last update to this exam was made in June 2017, and the changes are explained in the exam 70-473 change document.

Paid resources:

70-473 Cloud Data Platform Solutions course by SoftwareArchitect.ca – this is an affordable ($25) online course which I bought and used during my preparation – a good overview of all the concepts at a fair price, and when I searched it was the only 70-473-specific course from online training vendors I was able to find. The author goes through all the “skills measured” topics as they are stated in the exam description. What I dislike about this course is the number of typos and little issues like mismatches between the numbering and naming of videos in the course navigation pane and inside the videos themselves. One video is even inserted/listed twice. So I would describe it as a lack of QA/editing. My other complaint is the lack of hands-on demos; there are some in the course, but I wanted more. 🙂 Only after completing the course did I find that it is also available on Udemy, where it was priced at $9.99 with a discount when I checked – so check both locations and compare prices if you want to try it.

Free resources:

Certification Exam Overview: 70-473: Designing and Implementing Cloud Data Platform Solutions MVA course

Cert Exam Prep: Exam 70-473: Cloud Data Platform Solutions – exam overview video by MCT James Herring

The second link is a YouTube video; both links seem to cover more or less the same material, delivered by the same person, yet the YouTube session appears to have newer slides and the two are not absolutely identical – so watch both of them.

MSFT documentation:

Resolving Transact-SQL differences during migration to SQL Database

This article covers things that work in SQL queries run against on-prem SQL Server but won’t work when run against Azure SQL DB. For example, one thing you will probably discover very quickly is that the USE statement is not supported.
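To make the difference concrete, here is a minimal sketch (the database name is just a placeholder, and the exact error text is from memory):

```sql
-- On an on-prem SQL Server this simply switches the database context.
-- Against Azure SQL Database it fails with something like error 40508:
-- "USE statement is not supported to switch between databases."
USE MyDatabase;
GO

-- With Azure SQL DB you connect directly to the target database instead,
-- and can verify where you are with:
SELECT DB_NAME() AS CurrentDatabase;
```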

Sizes for Windows virtual machines in Azure

General purpose virtual machine sizes

High performance compute VM sizes

You may expect questions around VM sizing based on given requirements, so you need to remember which series support Premium Storage and which do not, along with some other details you can learn from the articles above.


Simple walkthrough: Using K2 Database Consolidation Tool

The purpose of this blog post is to outline the K2 database consolidation process using the K2 Database Consolidation Tool.

When might you need it? For older K2 deployments, whose initial installer created 14 separate databases instead of the single “K2” database we expect to see in current K2 versions. Even after upgrades to newer versions, such environments carry on with these 14 databases; only starting from K2 4.7 is database consolidation enforced, so that you cannot upgrade until you consolidate your databases into one. This means you can still see non-consolidated K2 databases in environments running any version of K2 up to and including 4.6.11.

To consolidate these 14 K2 databases into one, you need to obtain the appropriate version of the K2 Database Consolidation Tool from K2 support. Below are the basic steps to perform the consolidation using this tool.

1) First, check the collation of your existing K2 databases. This is necessary because the consolidation tool won’t handle conversion from one locale to another, and consolidation will fail. You can run a script along the following lines to see the collation of your non-consolidated K2 DBs:
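A minimal sketch, assuming your non-consolidated K2 databases follow the usual “K2…” naming convention (adjust the filter to match your environment):

```sql
-- List the collation of all K2 databases on this instance
SELECT name, collation_name
FROM sys.databases
WHERE name LIKE 'K2%'
ORDER BY name;
```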

As you can see in the screenshot below, the output of this script shows that my non-consolidated databases have the Ukrainian_100_CI_AS collation:

2) Make sure that your target SQL Server instance has the same collation as your databases, either via the GUI:

or script:
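For the script route, a one-liner like this is enough (standard T-SQL, nothing K2-specific):

```sql
-- Check the server-level collation of the target instance
SELECT SERVERPROPERTY('Collation') AS ServerCollation;
```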

Then copy your non-consolidated databases to the target server which will host the consolidated database (unless it is the same server that hosted them initially).

3) Obtain the K2 Database Consolidation Tool from K2 support, extract it on the SQL server which hosts your K2 databases, and launch SourceCode.Database.Consolidator.exe. Once you start it, you will be presented with the following UI:

4) It will detect your non-consolidated K2 DBs (<No Instance> in the Instance drop-down means you are connecting to the default, not a named, SQL Server instance). Here you need to select your target DB – just select <New Database>, specify “Create Database Name” (I’m using the default name used by the K2 installer, which is K2) and click Create:

5) Once you click Create, the K2 database will be created with the same collation as your SQL Server instance (the target DB will contain all the required tables and structure but no data), and the Start button becomes available so that you can begin the consolidation process:

6) Before clicking Start, make sure your K2 service is stopped. Even though we just created our target “K2” database, we still get a warning that all data in the target DB will be truncated, and we have to click Yes to start the consolidation process:

Once you click Yes, you will have to wait a while until consolidation completes (at the bottom of the tool window, in its status line, you will see the operations currently being performed). The time needed to complete this process depends heavily on your server performance and the volume of data in your source databases.

In some scenarios (e.g., the source and destination collations have different locale IDs, or you moved the source databases to another SQL server without re-creating their master key) the consolidation process may fail, leaving your non-consolidated databases in a read-only state:

In such a scenario you need to review the consolidation log to identify and address the errors. Once done, switch your source databases back to RW mode (as explained here), delete your target database and start again from step (3). Note that when consolidation completes successfully, the source non-consolidated databases also stay in read-only mode.

If consolidation completes without errors, you will get a message confirming this and informing you that the ReconfigureServer.ps1 script has been created:

You can also click the Log Directory link, which opens the consolidation log file location – as usual, you can open the log and make sure that neither ‘Logged Warning’ nor ‘Logged Error’ can be found anywhere in it beyond the Legend section at the beginning.

7) From the directory which contains the K2 Database Consolidation Tool, take the ReconfigureServer.ps1 script and copy it over to your K2 server. This script fires off the K2 blackpearl Setup Manager, instructing it to connect to your new consolidated DB:

Here is the script code (as generated by the tool), which you can copy/paste:

Once you run this script on the K2 server, it will start the K2 Setup Manager, where you need to go through all pages of the “Configure K2 blackpearl” process:

On the database configuration step of the wizard you will see that, thanks to the PS script, we are already targeting our new consolidated DB:

Once the reconfiguration process completes (without errors or warnings), you can start testing how your K2 environment behaves after the consolidation.


When code and operations collide :)

I’ve just seen a CBT Nuggets video on YouTube entitled “How to Transition to DevOps”, and though I cancelled my subscription quite some time ago, it sparked my interest and made it very tempting to subscribe again (if only it were not for my financial and time budget constraints).

I really like expressive quotes and explanations that use analogy, and one from this video that I really liked can be found below. Along with some basic theory on what DevOps is and how to approach it, Shawn Powers shows a little demo of using a Chef recipe for configuration management, followed by this conclusion:

“…configuration automation is awesome example of how DevOps is kind of taking two different worlds the world of installing packages and uploading files and code which allows us to programmatically solve problems and put them together kind of like peanut butter and chocolate goes together to make a Reese’s Cup and it’s you know awesome it’s better than the sum of its parts…”

Nice. And I also need to try these Reese’s Peanut Butter Cups now, even if it slightly violates a healthy diet 🙂 I think they go well with coffee and IT training videos (if consumed in limited amounts).

I just looked at the DevOps courses available at CBT Nuggets, and though there seems to be no general DevOps overview course so far, they already have courses on specific tools (Puppet, Chef, Docker, Ansible).

 


SQL script to attach detached non-consolidated K2 DBs

I keep playing with SQL and non-consolidated K2 DBs. In a previous post I covered bringing “these 14” back online; now I’ve realized that another case where SSMS requires way too many clicks is attaching “these 14” back (say, after you rebuild your SQL instance’s system DBs to change the instance collation).

A quick Google search turned up a relevant question on dba.stackexchange.com, from which I took a script that generates CREATE DATABASE … FOR ATTACH statements for all existing user databases. With my 14 non-consolidated K2 DBs attached, I used it to generate a script to attach them all back in bulk:
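Here is a sketch of the generator in the spirit of that dba.stackexchange.com answer (run it while the databases are still attached and save the output; the system-database filter is an assumption you may want to tighten):

```sql
-- Generate a CREATE DATABASE ... FOR ATTACH statement for every user database,
-- using the current physical file locations from sys.master_files
SELECT 'CREATE DATABASE [' + d.name + '] ON ' +
       STUFF((SELECT ', (FILENAME = ''' + mf.physical_name + ''')'
              FROM sys.master_files mf
              WHERE mf.database_id = d.database_id
              ORDER BY mf.[file_id]
              FOR XML PATH('')), 1, 2, '') +
       ' FOR ATTACH;' AS AttachStatement
FROM sys.databases d
WHERE d.database_id > 4; -- skip system databases
```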

You can either run this generator script while your K2 databases are still attached, or, if they are already detached, just replace the paths in the generated statements and execute the modified script to attach them quickly.


SQL Script to switch all currently RO databases to RW mode

I was doing some testing of the K2 database consolidation process, which required me to re-run the consolidation more than once. Unfortunately, the K2 Database Consolidation Tool leaves all databases in read-only mode if something fails during consolidation. If you remember, K2 used to have 14 separate databases before the consolidated DB was introduced (see picture below).

Typing 14 statements manually to bring all these databases back to read-write mode is a bit time consuming, so I came up with a script along these lines:
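A minimal sketch of the idea (the WITH ROLLBACK IMMEDIATE clause is my choice here, to kick out any open connections):

```sql
-- Emit an ALTER DATABASE statement for every database currently in read-only state
SELECT 'ALTER DATABASE [' + name + '] SET READ_WRITE WITH ROLLBACK IMMEDIATE;'
       AS RwStatement
FROM sys.databases
WHERE is_read_only = 1;
```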

Essentially, it selects all databases currently in the RO state and outputs a bunch of statements to bring all of them back to the RW state:
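The output looks something like this (the database names here are examples of the typical non-consolidated K2 set):

```sql
ALTER DATABASE [K2HostServer] SET READ_WRITE WITH ROLLBACK IMMEDIATE;
ALTER DATABASE [K2ServerLog] SET READ_WRITE WITH ROLLBACK IMMEDIATE;
ALTER DATABASE [K2Workspace] SET READ_WRITE WITH ROLLBACK IMMEDIATE;
-- ...and so on for each read-only database
```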

Just copy and paste the script output into a new query window in SSMS and press F5 🙂

It may be useful to you once in a while (and if not for this specific use case, then as an example of generating repetitive statements that embed SELECT results).


K2 5.0 unable to read CORS settings from SmO when using the Workflow REST API in JavaScript

If you are trying to use the K2 Workflow REST API from JavaScript (as described in the product documentation), you may see the issue described below (and you may want to upgrade to 5.1 to resolve it 😉).

You have CORS settings configured correctly for the domain which hosts your JavaScript, i.e. you have settings similar to these:

Workflow REST API Settings

The screenshot above assumes that your JS resides within the js.denallix.com domain; upon attempting to execute the JS code, you will get errors.

In Chrome you will get the following error:

Failed to load https://k2.denallix.com/Api/Workflow/preview/workflows/?_=1523444398270: Response to preflight request doesn’t pass access control check: No ‘Access-Control-Allow-Origin’ header is present on the requested resource. Origin ‘https://js.denallix.com’ is therefore not allowed access. The response had HTTP status code 400.

IE will also give you an error, albeit a less clear one:

SCRIPT7002: XMLHttpRequest: Network Error 0x80070005, Access is denied.

Here is a screenshot of the error message in the Chrome browser:

And here is what you can see in Fiddler:

If you want to reproduce this, you may use the sample code from the K2 documentation which returns a list of workflows either owned by or startable for the supplied user credentials.

So you would expect the CORS settings configured for the Workflow REST API to ensure that this works fine, but it does not. What’s wrong here?

If you enable Workflow API logging, you can see the following in the log:

w3wp.exe Warning: 0 : Failed to retrieve CORS settings.
System.InvalidOperationException: Failed to determine user principal name
at SourceCode.Forms.AppFramework.ConnectionClass.HandleIdentityImpersonation(Boolean asAppPool, Action action)
at SourceCode.Forms.AppFramework.ConnectionClass.TryCredentialToken(BaseAPIConnection connection, String credentialToken, Boolean asAppPool)
at SourceCode.Forms.AppFramework.ConnectionClass.GetPoolConnection(Boolean asAppPool, Boolean& tokenApplied, String& credentialToken)
at SourceCode.Forms.AppFramework.ConnectionClass.Connect(BaseAPI baseAPI, Boolean asAppPool)
at SourceCode.Web.Api.Common.DataContexts.K2DataContext.EnsureConnectionIsOpen[T](T api)
at SourceCode.Web.Api.Common.DataContexts.K2DataContext.GetCorsSettings()

This means that when you send the request, the Workflow API tries to retrieve the CORS settings from the SmartObject. When it does that, it makes a connection to the host server. For some reason the connection fails with the error “Failed to determine user principal name”.

Because of this exception the CORS settings are not retrieved, the list of allowed origins is empty on the web API side, and this leads to the error mentioned above (“The response had HTTP status code 400”). In K2 5.0, something in the stack is not parsing/decoding the authentication credentials coming from the AJAX call correctly, so the identity isn’t recognized, causing the connection to fail.

If you are still on K2 Five (knowing how many people run older versions, I’m not very comfortable with this wording 🙂), your workaround for this issue is to remove the authorization header from the AJAX call and let the browser prompt you for a username and password. Here is sample HTML code for that (essentially we just remove var username and var password from $.ajaxSetup):
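Below is a minimal sketch of the workaround rather than the exact sample from the K2 documentation – the URL comes from the error message above, and jQuery is assumed:

```html
<script src="https://code.jquery.com/jquery-3.3.1.min.js"></script>
<script>
    $.ajaxSetup({
        // No "headers: { Authorization: ... }" here - the var username /
        // var password lines from the documentation sample are removed,
        // so the browser itself prompts for credentials.
        xhrFields: { withCredentials: true }
    });

    $.get("https://k2.denallix.com/Api/Workflow/preview/workflows/")
        .done(function (data) { console.log("Workflows:", data); })
        .fail(function (xhr) { console.error("Request failed:", xhr.status); });
</script>
```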

But if you are running the new and shiny 5.1, or are ready to upgrade your 5.0 environment to that version, it will work just fine without the need for any workarounds.

And here is the link to the official KB covering the same issue: “Failed to load….No Access Control Allow Origin.”


Reading list: K2 Authentication and Authorization

This is a list of links to K2 documentation covering K2 authentication and authorization topics, in case you have some time to read something for fun 🙂

Authentication

Authentication and Authorization in K2

Claims-based Authentication in K2

Outbound Authorization and OAuth in K2

About K2Trust

Troubleshooting Claims-based Authentication Issues

Identity and Data Security in K2 Cloud for SharePoint

SharePoint Hybrid, Multiple Identity Providers & K2

AAD Multi-Factor Authentication Considerations

Enabling AAD Multi-Factor Authentication Requires Changes in K2 4.7

Authentication Modes

Authentication (in Management)

Integrating with Salesforce

Azure Active Directory Management (Read/Write to AAD)

Claims and OAuth Configuration for SharePoint 2013

Standard SmartForms Authentication

Multi-Authentication Providers

Consolidation to Multi-Auth

IIS Authentication

Authorization

Authorization Framework Overview

Outbound Authorization and OAuth in K2

REST Broker

Resources for Working with the REST Service Broker

REST Swagger File Reference Format

REST Broker and Swagger Descriptor Overview (video)

Endpoints REST Service Type

OData Broker

Using the OData Service Broker (including Serialization and Deserialization)

Endpoints OData Service Type

Workflow and SmartObject APIs

APIs (in Management)

Configuring the Workflow REST API

Configuring the SmartObject OData API

How to Use the K2 Workflow REST Feed with Microsoft Flow to Redirect a K2 Task

How to Use the K2 Workflow REST Feed with Microsoft Flow to Start a Workflow

How to: Use the K2 OData Feed with Microsoft Excel

How to: Use the K2 OData Feed with Microsoft Power BI


GAC folders

I keep forgetting the exact GAC folder locations, and whenever I try to Google this information it is always buried under layers of misunderstanding and confused discussion. This stems from the fact that there are different locations depending on the .NET version and the bitness of your assemblies. So I’m just jotting down all the locations here for quick reference:

.NET 2.0 GAC: C:\Windows\assembly

.NET 4.0 GAC: C:\Windows\Microsoft.NET\assembly

Each of these folders has the following sub-folders:

GAC_32 for 32-bit assemblies (defines word size)

GAC_64 for 64-bit assemblies (defines word size)

GAC_MSIL for assemblies that can be run in either 32-bit or 64-bit mode which are JIT compiled to the required word size as needed

On machines with an x86/32-bit version of Windows (which is quite rare now, especially for Windows Server) there is only the GAC_32 subfolder, while on x64 machines there are both GAC_32 and GAC_64 folders, as 32-bit code is supported via emulation (WOW64).
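If you need to find out where a particular assembly actually landed, a quick PowerShell sketch like this searches both GAC roots (the assembly name is just an example – substitute your own):

```powershell
# Search both GAC locations (paths listed above) for a given DLL
$assemblyName = "SourceCode.HostClientAPI"

$gacRoots = @(
    "C:\Windows\assembly",               # .NET 2.0 GAC
    "C:\Windows\Microsoft.NET\assembly"  # .NET 4.0 GAC
)

Get-ChildItem -Path $gacRoots -Recurse -Filter "$assemblyName.dll" -ErrorAction SilentlyContinue |
    Select-Object -ExpandProperty FullName
```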


“Glock: The Rise of America’s Gun” – book review and some thoughts on product design

It has been a while since I’ve written anything on my blog – I was a bit busy. Then I decided to write a tiny review of this book, but I fell under the spell of Steven Sinofsky‘s long-form write-ups, and as a result this tiny review turned into something too big that I was trying to finish for way too long. I ended up finishing it abruptly and posting it using the truly Bill Gates approach of “get it out there, fix it later”, as sticking to the “keep it secret till you make it perfect” Apple approach is way too difficult to adhere to. So if anything is wrong here, I’ll edit it later 🙂

I’ve recently finished listening to Audible’s audiobook “Glock: The Rise of America’s Gun” by Paul M. Barrett, and it was so good that I can’t help writing (a bit) about it here on my blog. I have quite a broad range of interests and no allergy to going deeper into any number of narrow topics, so my ever-growing to-read/to-listen list always contains very diverse books, with topics ranging from economics and linguistics to IT to history and who knows what else.

From that vast array of topics, two have special importance for me: philosophy and history. I keep noticing that way too many people dismiss the first as something you can read/listen to only after smoking some weed (I’m almost quoting one of my former school classmates here) and the second as something of little value and relevance to present-day life. It makes me a tiny bit sad to see those disciplines neglected and grossly underestimated. Seriously, the negligence and ignorance around each of these domains is by itself a topic not for a blog post but for an essay or even a whole book 😊 What could be more important than slowing down to think about “how do we think about things” and “what is worth thinking about in the first place”, along with being acutely aware of what has been done and tried before you? I can hardly name anything more important, I think…

Anyhow, getting back to the Glock book: it was one of those that just caught my attention somehow (back then I didn’t have any interest in guns beyond the general, vague, subconscious male predisposition to all things military), and it then waited for something like five years before I decided to listen to it. Essentially, like many other books on my Audible wish list, it landed there thanks to serendipity and maybe some clever Amazon recommendation algorithms. And as sometimes happens, the book turned out to be absolutely brilliant and had just been waiting for the right time to be listened to (in this case, that meant some experience with practical pistol shooting and the Glock pistol). Another example of the same random-perfect choice for me was “The Language Instinct – How the Mind Creates Language” by Steven Pinker – that book too sat on my wish list for about five years, added there instinctively. Although I was interested in linguistics when I put it on the list, while it sat there I managed to learn a lot about the topic and took some Coursera courses which introduced me to some problems of linguistics, and when I finally got around to the book it was just “wow” and “why haven’t I listened to this earlier” and “this is a book worth re-reading/re-listening to many times”…

Looping back from randomly selected books and the importance of history to the Glock book itself. It is one of those non-fiction books which introduce you to the history of a specific topic in great detail, and I strongly believe that when such books are written by an informed person with a keen interest in the subject, almost any topic can be really fascinating to dive into. In this case the book has it all: a history of engineering and enterprise, some political and cultural background, corporate rivalry and personal/character evolution – there are so many facets covered which let you understand a lot of things better (if you wish to) or merely enjoy the fascinating unfolding of a great story (and, as often happens, true stories turn out to be far more exciting and unpredictable than most works of fiction). I won’t write a coherent review of the book, but will rather list some of my takeaways from it.

On good product. This book is in itself an example of good product design, where even the name is designed (for the informed person) to spark your interest and make you buy the book. I mean, the title “The Rise of America’s Gun” combined with a black Glock pistol on a white background should make you wonder how an Austrian-made pistol from old Europe could become America’s gun – the gun of a country where gun culture is part of the nation’s psyche and where some other epic names used to reign supreme. Surely you know the gun that won the West? And it wasn’t an Austrian one, right? So this book is an artfully designed product about another good product which appeared out of nowhere (not exactly, of course) and won a market it could never have dreamed of, and won it in a big way. But to understand how, you need to know the history, which will tell you that everything was important: the right timing for entry to the market, a bit of luck, a huge amount of controversial (but free for the company) publicity, the importance of designing from scratch – a good story about a good product can teach you a lot about what matters for products, and this knowledge is transferable, meaning it can be relevant not only to pistol design and manufacturing but, let’s say, to modern-day software products or any other products. So I’ll just highlight some points from the book which show the importance of learning from history and how it is still relevant.

On engineering. Designing from scratch is something you should do to really innovate. This does not mean you throw away history/what has been done before you – on the contrary, you have to review it critically with a pair of fresh eyes and then design from scratch. Before starting development of his gun, Glock bought, tested and disassembled a number of popular guns available on the market and came to the conclusion that all of them were unnecessarily complex (too many parts).

What was really new in the gun’s design was the following:

  • The pistol was designed for complete production on CNC (computer-controlled) tools = lower production cost. This was possible because Glock didn’t have an existing production plant and was able to build one with this in mind.
  • The pistol frame was made out of light, resilient, injection-molded plastic, making it the first commercially successful firearm designed with such a material. Glock had begun learning about the material when he bought an injection-molding machine to make handles and sheaths for the military knives he produced in his garage, and he hired former employees of a bankrupt camera manufacturer who brought advanced injection-molding and plastic-design skills. This made the Glock pistol remarkably strong and resistant to corrosion, a major problem with traditional steel guns. And light, too. But the main reasoning behind this design was savings on raw material and labor, plus distinct ergonomic advantages over a gun cobbled together from blued steel and walnut. Earlier attempts to use a polymer frame (the American Remington Nylon 66 rifle and the German Heckler & Koch VP70 pistol) had no success due to design shortcomings.
  • Glock worked with shooters and wooden pistol models at the early design stages to decide on a grip-to-frame angle which allows the gun to be pointed “instinctively” – initially it was defined as 22 degrees. The angle was reduced a bit later, but to this day the unconventional grip-to-frame angle of the Glock makes it difficult for shooters to switch to any other pistol (the majority use a different angle).

All the established market players were in their product-market fit (PMF) stage – they were just too attached to their existing gun designs, and in the PMF stage your business is about extracting more money from the existing product – there is neither time nor motivation for building a different/new product. It is not only the “we’ve always done it like that” and “we cannot do it differently” mindset; it is also the “we don’t have the tools for that” syndrome.

Innovation through removing features. One thing crucial for this product was taking what was considered an essential feature and throwing it away, transforming the absence of that feature into a feature in its own right. I haven’t done any research on this, but I bet the external safety was once an innovative product feature and selling point for some other gun. We see this rather a lot in software products (especially as they move to the cloud) – we gradually lose some features we could fiddle with, but after a while embrace the increased simplicity and efficiency; the same happens with hardware products (think of mobile phones and the bold move of throwing away the hardware keyboard).

So Glock was able to sell the idea of removing the external safety (though technically the pistol has a sort of three-step internal one; from a usability POV there is just a trigger and no safety switch) – it was a major selling point, as it introduced simplicity of use.

Your strength is your weakness too. The book brilliantly illustrates the problem of the product-market fit stage – the old gun manufacturers were busy extracting money from existing product designs with no ability to change them. Unfortunately, even the zeal of product fans and a legendary brand image stop supporting you once there is a new, better product addressing clients’ needs.

On time of entry (to the market). Glock not only won the contract for the Austrian army; he was also in time (without any plans of doing so) to address the concerns of American law enforcement organizations, which were prepared to embrace the necessity of moving away from the West’s beloved revolvers to a different gun. There were reasons for those concerns, in particular the incident known as the 1986 FBI Miami shootout, which eventually led to the search for a new gun for the FBI (1987) and later for other law enforcement agencies. Long story short, that incident showed the inability of revolvers to compete with semiautomatic weapons in the hands of professionals: four minutes of shooting, eight FBI agents armed with revolvers and some shotguns versus two criminals, only one of them having a Ruger Mini-14 semi-automatic rifle, which was sufficient for suppressive fire.

On shaping client needs. Shape your clients’ needs (the Apple way) or at least talk to your clients early in the design stage. Nobody asked for a plastic pistol, and even once they got it, some were too attached to their revolvers and considered the Glock an ugly gun. That changed after it was adopted by professionals (publicity matters) and other shooters – then everybody discovered its usability and efficiency, the gun acquired its own cachet as the best gun, and instead of “ugly” people started calling its look “futuristic”… From ugly duckling to the pistol of the future.

On publicity. Publicity matters – sometimes even not very good publicity. Glock received a lot of free publicity on different occasions – a Congressional hearing related to it supposedly being a terrorist gun invisible to metal detectors, and some completely inaccurate descriptions in Hollywood action movies which cemented the gun’s presence in popular culture. Most of the publicity was free of charge and some of it was bad, but as Bill Gates used to say, “whatever they say about us is always better than not saying anything about us” (not 100% sure of the exact quote, but I believe it is something from MSFT’s early days).

Maybe someone still remembers that epic description of the non-existent Glock 7 in the Die Hard 2 movie:

Neither the gun model nor a single word of the description provided is true, but the main thing is that everybody is talking about your brand and you are not paying for product placement ads.

On brand storytelling and company message. The “Glock perfection” message and the personal inventor/businessman legend were formed by accident, then supported, developed and shaped by the company and its fans. At some point the story just started living on its own. So if you as a business don’t have one, you’d better work to have it early on and have it right – it may work for you later.

From humble beginnings to the arrogance of success. I believe Glock as a product-centered business is in its product-market fit (PMF) stage, but as is always the case with a tangible, software-less product, such periods are far longer than for any software or software-enabled/smart product. Still, we may expect that somebody will come up with a biometrics-based safety totally blocking the ability to fire the pistol for anyone but its legal owner, or something that decreases the complexity of a gun even more (we still have noise, moving parts and metal parts). Interestingly, for the Glock pistol, and probably for most modern pistols in general, almost every remaining issue to address can be sold to, and believed by, many people as a feature they want to have and keep. In retrospect, though, we can see that being big and cool-looking, and even surrounded by legends, did not save the revolver as a product – it was superseded by semi-automatic pistols, and Glock became just the early entrant to the market which now enjoys a perfect reputation and a seemingly never-ending PMF 😊

If we look at the personal evolution of Gaston Glock, we may also see that he changed quite a lot, from a timid engineer to a more flamboyant person with a different lifestyle and demands. But let personal things be personal.

On corporate intrigue and creative accounting. The book covers an unsuccessful assassination attempt, and I would say it adds to the overall story the twists and dynamics you normally expect to see in fiction movies rather than in history books… Though one might say you can expect that when there are guns and a lot of money going around… There is nothing funny when such things happen in real life, but nonetheless the way it happened reminded me, for some reason, of that fight scene from the 2004 Punisher movie…

And just to conclude, or to address people who tend to scroll down and read the final paragraph only: this is a fantastic book which can entertain (education and thinking are always optional nowadays) and contains some surprises and unexpected twists. For me it was really interesting to learn more about the Glock pistol and its business and development story. Just before I listened to this book I tried a Glock 17 at a shooting range, right after using the heavier, larger-caliber Tanfoglio Limited within the same training session, and I can tell you that now I know which features of the Glock explain my immediate improvement in results.

P.S. The Tanfoglio is a beautiful, high-quality pistol, pleasant to hold, but it is still an example of how a harder trigger pull and larger stopping power, even at the highest quality, do not give you the easy and consistent results you can get with a light-trigger (and lightweight) pistol which just makes it easy (maybe even dangerously easy) to shoot.

P.P.S. I may be wrong about trigger pull resistance though – geeks can read up on the specs. Update May 2018: I recently tried both the Glock 34 and the CZ Shadow again – and indeed the CZ has a super easy trigger compared to the Glock, where you need to apply more effort while pressing because of the built-in safety.

P.P.P.S. For those who find this post strangely incongruous with the normal topics of my blog, be sure to wait for the next one, about pottery (no, it won’t be treated as something you can shoot at 🙂). I really do have plans for that post – stay tuned.


Configure K2 and SharePoint Online integration

Some time ago I posted an article, “Configure K2 and SharePoint Online integration”, on the StarWind Software blog. It outlines how to get your SharePoint Online instance up and running and add the K2 for SharePoint app to your app catalog, so if you are interested in knowing more, read on at the StarWind Software blog.

One thing I was not 100% clear on while writing that article is the “Enable auto-activation on sites where the app is deployed” option, which is enabled by default (K2 for SharePoint app > Settings > Manage App Activations).

Based on the wording of the setting name, it was not clear to me whether the “Enable auto-activation on sites where the app is deployed” setting works for newly created SharePoint Online site collections, especially as I noticed that when I run activation manually there is a step “we need to create a token on your behalf” which, I assume, requires user input… So my question was whether auto-activation is possible for newly created SharePoint Online site collections where the K2 app has only been deployed.

It was also not very clear what drives/triggers automatic activation and how frequently it happens.

Luckily enough, there are colleagues who always help 🙂 Below are some extra details I learned only after writing that article.

The function of the auto-activation setting on the Manage App Activations page is to strictly control activation at the site collection level (permission-wise): when it is off, site collections can be activated only via the app catalog level.

When the setting is set to False, the warning below is presented when a user tries to activate at the site collection level:

This setting will not auto-activate any new site collections created in SharePoint Online. When it is set to True/Enabled, you can perform activation from the site collection level. So the wording “Enable auto-activation on sites where the app is deployed” is really a bit confusing, though the description above the setting is quite clear (but who reads notes and descriptions? 😉)

Essentially, this option allows site collection owners to activate the K2 app at the site collection level manually; when it is disabled, they won’t be able to do that from the site collection level, and activation will be possible only via the app catalog level from the K2 app management page.

True auto-activation does happen, however, for sub-sites of already activated site collections. This functionality works only with K2 Five; it was introduced in K2 4.7 but did not work as expected there. More information can be found in the following KB: http://help.k2.com/kb001755

Note that this sub-site auto-activation requires the Event Receiver sub-site of the K2 site to be exposed to the internet when you use SharePoint Online. When using SharePoint on-premises, there is no need to expose the K2 site to the internet.
