Sunday, June 14, 2020

Three ways to enable / disable a tool container in an Analytic App

On Friday I posted on Twitter that I found the way a tool container's behaviour is configured a bit counterintuitive.



When connecting a radio button or check box to a tool container with an action tool, the fixed behaviour is to disable the tool container!



Often when I create analytic apps I want there to be an optional section of the workflow that a user can choose to run or not. Examples would be saving the output file, creating a Tableau workbook, selecting a different API endpoint, etc.

So this blog goes through my initial approach and then two alternatives suggested by Nicole Johnson (@The_Real_NJ) and Chris Lewis (@4loveofAlteryx).

Dummy radio button method - my approach

To account for this behaviour, the way I usually set it up is to add a second radio button to the canvas. I'll call this a dummy radio button as it doesn't connect to anything; however, it does appear in the user interface of the analytic app.



The advantage of doing it this way is that selecting the 'Enable the container' radio button forces the other radio button to be deselected, so it doesn't disable the container. And voila, the user has an option to enable / disable (or should I say disable / enable!) a part of the workflow.

However, the reason for writing this blog post is that I wanted to share two additional ways to control the behaviour of a tool container via radio buttons / check boxes, both of which actually follow the logic I would have expected in the first place.

Conditional method - Nicole's approach

So the method Nicole suggested is to use a Condition tool.


The way this is configured on the canvas means you can have a single check box in the analytic app interface, where ticking the box enables the tool container and unticking it disables the tool container.



Then you need to set up the condition tool as follows:

Here you just set the condition to the incoming check box connection, [#1], which evaluates to true when the check box is ticked and false when it is unticked. Then, after the Condition tool, you add an Action tool to each of the T(rue) and F(alse) outputs.

It is here that I think the set-up makes sense to the user, as you have the option to set the behaviour of the action to either Enable or Disable the connected container.

Set up of the action tool after the true condition


Set up of the action tool after the false condition

Update formula method - Chris Lewis' approach

The method Chris suggested was to use the "Update Value with Formula" option.



To be honest, when exploring how to do this I didn't really know how to set this one up. But looking into the way Alteryx set up the example Action tool workflow (right-click on a tool and select 'Open Example'), it turns out this is how they do it!

How to open an example workflow


Within the Action Tool example is a tool container showing how Alteryx set up the configuration to enable/disable tool containers.

So looking at the configuration in the example, I discovered that you can update the value of the container's Disabled attribute from False (i.e. the container is enabled) to True (to disable the container) by using a simple IIF statement: IIF([#1], "False", "True").


Like the method suggested by Nicole, this approach also follows the logic I would expect for controlling the container.

The benefit of this approach is that it requires fewer tools on the canvas, which suits me as I typically move all my interface tools to the top of the workflow and use wireless connections. However, some may find the syntax of the IIF statement a bit confusing, as the [#1] evaluates to true when the radio button / check box is selected (turned on) and false when deselected (turned off); the little sketch below spells out that inversion. Using the Condition tool instead, with its two distinct actions, makes it clearer what happens when the box is selected / deselected.
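To make that inversion concrete, here is a tiny sketch of the logic in plain Python (not Alteryx code):

    # Minimal sketch (not Alteryx code) of the logic behind IIF([#1], "False", "True"):
    # the check box comes in as a boolean, and the formula returns the string that is
    # written into the tool container's Disabled setting.
    def disabled_value(check_box_ticked: bool) -> str:
        # Ticked   -> Disabled = "False" -> the container runs
        # Unticked -> Disabled = "True"  -> the container is skipped
        return "False" if check_box_ticked else "True"

    print(disabled_value(True))   # "False" (container enabled)
    print(disabled_value(False))  # "True"  (container disabled)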

By showing three methods for controlling this behaviour, hopefully you will find one that works for you, and remember to consider reviewability, so that other users can tell what the workflow does.

Friday, May 1, 2020

Plotting UK population data using Alteryx and Tableau

While I'm not a geographer and don't have any real GIS background, I was previously a transport planner and would make use of tools like MapInfo. So the ease and intuitiveness of spatial analysis in Alteryx has always pleased me.

For a recent proof of concept I needed to get UK population data to enrich a financial dataset I had, so after a quick Google I stumbled across this dataset - https://data.gov.uk/dataset/ca2daae8-8f36-4279-b15d-78b0463c61db/uk-gridded-population-2011-based-on-census-2011-and-land-cover-map-2015

Upon downloading the file I discovered it was in the ESRI .asc file format, which isn't a spatial file type that Alteryx can read natively. But by reading it in as a CSV file with no delimiter, I could see that it looked like a grid of numbers with some header information.


This helpful Wikipedia article explained what the values in the header represented, and I knew I could get on with Alteryx'ing my way through this.

Step 1: Create spatial points on a grid

My first step was to make sure I'd interpreted the values correctly, so I sketched out the boundary points on the grid:


Converting these to points in Alteryx (using the British National Grid coordinate system) and then creating a polygon showed me that I had a polygon covering the entire UK.


Initially I thought the next step would be to make a grid from that polygon. However, when I did this I ended up with more grid cells than I expected. I wasn't 100% sure why, but given that the polygon in the image above visually isn't a true rectangle, I assumed the reason was the projection. Therefore I needed to find another way of creating my grid...

Using two Generate Rows tools it was possible to create the 652 mid-points for my columns (x axis) and the 1,211 for my rows (y axis), remembering to step in by 500m so each point sits at the centre of its cell. Appending these together resulted in 789,572 records (652 x 1,211), which I could then convert to points.
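For anyone who wants to see the arithmetic outside Alteryx, here is a minimal Python sketch of that step; the lower-left corner values are placeholders to be taken from the .asc header:

    # Sketch of the Generate Rows + Append Fields step: build the 652 x 1,211 cell
    # mid-points. xllcorner / yllcorner are placeholders - take the real values from
    # your .asc header; ncols, nrows and the 1,000 m cell size come from the same header.
    from itertools import product

    ncols, nrows, cellsize = 652, 1211, 1000.0
    xllcorner, yllcorner = 0.0, 0.0        # replace with the header values for your file

    # step in by half a cell (500 m) so each point sits at the centre of its 1 km cell
    xs = [xllcorner + cellsize / 2 + col * cellsize for col in range(ncols)]
    ys = [yllcorner + cellsize / 2 + row * cellsize for row in range(nrows)]

    mid_points = list(product(xs, ys))      # 652 x 1,211 = 789,572 (x, y) pairs
    print(len(mid_points))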

Step 2: Parse out the .asc file

After removing the header rows, it was clear that the rest of the .asc file was a big grid of numbers which needed parsing out. Using the Multi-Row Formula tool and splitting the text to rows on the space delimiter, it was possible to create a table like the one below:

This table could then be joined onto the spatial data, based on matching the X and Y fields.
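Again purely for reference, a minimal Python sketch of the same parse (assuming the standard six-line ESRI ASCII grid header and with NODATA handling simplified) might look like this:

    # Sketch of the parsing step: turn the .asc number grid into (x, y, population)
    # records that can be joined to the spatial points. Assumes the standard six-line
    # ESRI ASCII grid header (ncols, nrows, xllcorner, yllcorner, cellsize, NODATA_value).
    def parse_asc(path: str):
        with open(path) as f:
            lines = f.read().split("\n")
        header = {}
        for line in lines[:6]:                                 # the six header rows
            key, value = line.split()
            header[key.lower()] = float(value)
        cellsize = header["cellsize"]
        records = []
        for row, line in enumerate(lines[6:6 + int(header["nrows"])]):
            # .asc grids are written top row first, so y counts down from the top edge
            y = header["yllcorner"] + (header["nrows"] - row - 0.5) * cellsize
            for col, value in enumerate(line.split()):
                x = header["xllcorner"] + (col + 0.5) * cellsize
                if float(value) != header["nodata_value"]:
                    records.append((x, y, float(value)))
        return records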

Step 3: Create 1km polygons

The grid created in the two steps above got me most of the way there, but the limitation was that each population data point was just a dot on the map. To enable spatial analysis later on, each point needed to be converted to a polygon representing the 1 sq km grid cell the population is distributed over.
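For reference, the equivalent outside Alteryx is simply to draw a 1 km square around each mid-point; a minimal sketch using shapely (not part of the Alteryx approach described here) is below:

    # Sketch: build the 1 km square cell around each population mid-point.
    # Uses shapely, which is not part of the Alteryx approach described in this post.
    from shapely.geometry import box

    def cell_polygon(x: float, y: float, cellsize: float = 1000.0):
        half = cellsize / 2
        # box(minx, miny, maxx, maxy) returns the square covering the grid cell
        return box(x - half, y - half, x + half, y + half)

    square = cell_polygon(337500.0, 412500.0)   # illustrative British National Grid coords
    print(square.area)                          # 1,000,000 sq m = 1 sq km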

The technique I used to do this in Alteryx was to create a 0.7 km non-overlapping trade area around each point (0.7 km being roughly half the diagonal of a 1 km square, so once the overlaps are removed the areas approximately tile into the 1 km cells), which results in a map looking like this:

Then all that was left to do was to export the file.


This could then either be used again in Alteryx, along with the Spatial Match tool, to calculate the population living within a store's catchment area, or presented in Tableau to understand changes in population density.


This results in a nice-looking map, which is much easier to interpret than the gridded numbers in the original text input.

All in, this workflow took about an hour to build, including the research on the ESRI .asc file format and the initial experimentation to understand which spatial tools to use.

Note: The data used in the graphics contains data supplied by Natural Environment Research Council.
©NERC (Centre for Ecology & Hydrology). Contains National Statistics data © Crown copyright and database right 2011.

Wednesday, February 26, 2020

How to find out how people are using Alteryx (without an Alteryx server)

I have previously spoken at the London User Group about the 'phone home' feature, which can be turned on to help people in a tech-enablement role understand how people are using Alteryx on the desktop by sending usage data back to an Alteryx Server.

Feedback from that session was that this is a very helpful feature, but not everyone has access to Server, so how else can you see how Alteryx is being used?

For this blog, I am going to cover two approaches:

1) Tool usage

This is a hidden example built into Designer: a macro which searches through the .yxmd/.yxmc files (these are just XML text files) on a user's machine to identify which tools are being used within each workflow.



To access the macro, in Designer go to [ ] and open up the example workflow. Once you are in the example workflow, it’s just a case of selecting the folder where the Alteryx workflow files are saved and running the workflow. The output of this macro will give you details on the tool category, tool name, and the workflow name.
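If you want to replicate the idea outside Designer, here is a minimal Python sketch of the same scan. It assumes each Node in the workflow XML carries a GuiSettings "Plugin" attribute naming the tool (the exact structure may differ between Designer versions), and the folder path is just an example:

    # Minimal sketch: scan a folder of Alteryx workflows/macros and count tool usage.
    # Assumes each <Node> element carries a GuiSettings "Plugin" attribute naming the
    # tool; the exact XML structure may vary between Designer versions.
    from pathlib import Path
    from collections import Counter
    import xml.etree.ElementTree as ET

    def count_tools(folder: str) -> Counter:
        counts = Counter()
        for wf in Path(folder).rglob("*.yxm[dc]"):    # .yxmd workflows and .yxmc macros
            try:
                root = ET.parse(wf).getroot()
            except ET.ParseError:
                continue                              # skip files that aren't valid XML
            for node in root.iter("Node"):
                gui = node.find("GuiSettings")
                if gui is not None and gui.get("Plugin"):
                    # e.g. "AlteryxBasePluginsGui.Formula.Formula" -> "Formula"
                    counts[gui.get("Plugin").split(".")[-1]] += 1
        return counts

    print(count_tools(r"C:\Users\me\Documents\Alteryx Workflows").most_common(10))

The hidden macro also maps each tool to its category; the sketch above only counts tool names.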

From here you can do some further analysis, such as filtering out the standard macros which come with the Alteryx install, getting a total count of the workflows a user has built, the tool categories a user focuses on, etc. Running this now on my machine, I have learnt that I have built over 2,000 workflows, that my favourite tool is the Formula tool with over 15,000 uses, and that I have built over 100 macros!



In addition to getting tool usage stats, I have also found this macro really useful for locating the workflow where a particular tool has been used. For example, a colleague wanted examples of when a particular tool could be used (in this case the Map Input tool). It would be pretty hard to lay your hands on a workflow using that particular tool straight away, but by running the macro you can filter down to that tool (it's great that filtering in the browse pane is now possible!), find the workflow name and load it up!

If your organisation has users save their workflows on a network drive, point the macro to that network drive and away you go. One limitation of the example workflow straight out of the box is that while you will get details about the tools being used, you won't know who built them. This can be solved though (if your users are following good governance), as the workflow has meta info about the author which, like the details of the tools, is embedded in the XML, so this just needs parsing out as well (see the short sketch below).
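As a hedged companion to the sketch above (the element names are what I would expect to find under Properties/MetaInfo in a .yxmd file, and may differ by Designer version), pulling out the author looks something like this:

    # Sketch: pull the author recorded in a workflow's Meta Info tab.
    # Assumes an <Author> element under <Properties><MetaInfo>; check your own
    # .yxmd files, as the exact element names may differ between versions.
    import xml.etree.ElementTree as ET

    def workflow_author(path: str) -> str:
        root = ET.parse(path).getroot()
        author = root.find("./Properties/MetaInfo/Author")
        return (author.text or "").strip() if author is not None else ""

    print(workflow_author(r"C:\Users\me\Documents\Example Workflow.yxmd"))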

Collecting this information can provide useful insights into the workflows being built by users. From a tech-enablement perspective it can help identify which tool categories each user relies on heavily, so you can provide advanced training on those tools, or identify areas where you could introduce other parts of the Alteryx palette.

2) A hidden macro within a workflow

The first example is great for getting a lot of information about the tools being used; however, to attribute tool usage to a specific user it relies on the user details being populated in the workflow's metadata configuration pane.



In addition, you get details of the individual workflows being produced, but no information about how frequently they are run. For example, if there is a macro which makes use of an API, the first approach will tell you that a user has a workflow called, say, 'External Provider API.yxmc', but there is no way to collect how frequently they may be calling the external provider's API. This is where the second approach has its benefits.

The second approach is to create a macro (I do it this way as it is then easy to embed into a workflow or another macro) which leverages the 'whoami' command in the Windows Command Prompt to identify the user, workflow constants to get the workflow name, and the DateTimeNow tool to get a timestamp.


The Run Command tool is used to capture the user details from their Windows login, so it's helpful if this is a unique identifier such as a staff ID or email address (set-up below).

 

The macro then writes the information collected at the time of the run out to an Excel file (as you can append new data to an existing file), but this could quite easily be a database or another file type.
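Outside Alteryx, the record the macro captures on each run boils down to something like the sketch below; the file path and field names are purely illustrative:

    # Sketch of the usage record the macro logs on each run: who ran it, which
    # workflow, and when. The path and field names are illustrative only.
    import csv
    import getpass
    from datetime import datetime
    from pathlib import Path

    def log_run(workflow_name: str, log_file: str = r"C:\Temp\alteryx_usage_log.csv") -> None:
        new_file = not Path(log_file).exists()
        with open(log_file, "a", newline="") as f:
            writer = csv.writer(f)
            if new_file:
                writer.writerow(["user", "workflow", "run_at"])    # header on first write
            writer.writerow([getpass.getuser(),                    # Windows login (cf. whoami)
                             workflow_name,
                             datetime.now().isoformat(timespec="seconds")])

    log_run("External Provider API.yxmc")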

Summary

So this blog sets out two ways in which you as a user, or your organisation, can use the tools available directly within Alteryx Designer to monitor Alteryx usage and understand how your team is making use of it.

The techniques here do not require Alteryx Server to capture this information; however, they do have limitations, such as needing users to save their workflows to a location that you can access and to embed the macro from the second approach into their workflows. The Server-based approach captures much more meta information and does not rely on these factors, which makes it a richer dataset for understanding usage.