Alteryx Advanced

This course takes experienced Alteryx users beyond the fundamentals.

ABOUT THIS COURSE

It uses more advanced tools and features of Alteryx to answer business questions.

You will be provided with data sources and hands-on exercises to cover the course content.

The material provides tips throughout as well as challenges to allow the students to put into practice the concepts taught in each module.

Audience

Anyone who currently works with or analyses data, including business analysts, data analysts and developers who have completed the Alteryx Designer course

Duration: 2 days

Versions: 2018.x and 10.x

Prerequisites: Alteryx Designer

COURSE CONTENT

Lesson

Learn how to combine data from multiple sheets of an Excel workbook using Dynamic Input, and how to write to multiple tabs in an Excel workbook. The Dynamic XLSX macro is then used to combine not only different sheets but multiple Excel workbooks.
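Alteryx configures all of this visually, but a minimal pandas sketch of the same logic may help make the idea concrete. The file, sheet and field names below are invented for illustration, and an Excel engine such as openpyxl is assumed to be installed:

    import pandas as pd

    # Read every sheet of a workbook into a dict of DataFrames
    # (sheet_name=None mimics Dynamic Input reading all sheets).
    sheets = pd.read_excel("sales_2018.xlsx", sheet_name=None)

    # Stack the sheets into one table, keeping the sheet name as a column.
    combined = pd.concat(
        (df.assign(Region=name) for name, df in sheets.items()),
        ignore_index=True,
    )

    # Write one tab per region, mimicking Output Data configured to
    # take the sheet name from a field.
    with pd.ExcelWriter("sales_by_region.xlsx") as writer:
        for region, df in combined.groupby("Region"):
            df.to_excel(writer, sheet_name=region, index=False)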

Lab

Follow Me to use the Dynamic Input and Output Data tools and the Dynamic XLSX macro in various workflows.

Outcome

You will understand how to read Excel data in different ways and, in particular, how to design dynamic, repeatable Alteryx workflows. You will also learn how to import an Alteryx package from the Alteryx Gallery and install it into Alteryx Designer as a new tool.

Lesson

Are there times when you need to compare new customer data against existing data to see whether there is a match? This lesson is all about Fuzzy Match in both Purge and Merge modes. Purge mode finds duplicates within a single source, while Merge mode compares data from two different sources to determine how closely the records match. We look at Address, Postcode and Surname matching using Alteryx-supplied or Custom matches, where you specify the match style and the match function (e.g. Double Metaphone w/Digits as a match style and Character: Best of Jaro & Levenshtein Distance as a match function), along with the match threshold and match weight.
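Fuzzy Match implements these algorithms internally; as a rough, simplified sketch of the Purge and Merge ideas, here is the same logic in Python, using the standard library's SequenceMatcher as a stand-in for the Jaro/Levenshtein match functions (the names and the 0.85 threshold are illustrative):

    from difflib import SequenceMatcher

    def similarity(a: str, b: str) -> float:
        """Return a 0-1 similarity score (stand-in for Jaro/Levenshtein)."""
        return SequenceMatcher(None, a.lower(), b.lower()).ratio()

    THRESHOLD = 0.85  # roughly analogous to the Fuzzy Match threshold setting

    # Purge mode: find likely duplicates within a single source.
    customers = ["Jon Smith", "John Smith", "Mary Jones", "Marie Jones"]
    for i, a in enumerate(customers):
        for b in customers[i + 1:]:
            score = similarity(a, b)
            if score >= THRESHOLD:
                print(f"Possible duplicate: {a!r} ~ {b!r} ({score:.2f})")

    # Merge mode: match new records against an existing master list.
    master = ["John Smith", "Mary Jones"]
    new = ["Jon Smyth", "Marie Jones", "Peter Wong"]
    for n in new:
        best = max(master, key=lambda m: similarity(n, m))
        score = similarity(n, best)
        status = "match" if score >= THRESHOLD else "no match"
        print(f"{n!r} -> {best!r} ({score:.2f}): {status}")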

Lab

Follow Me to use the Purge function on the master records, then on the customer records.
Using these records, you will do a Fuzzy Match Merge tutorial followed by three challenges.

Outcome

You will understand the options that can be used in the configuration properties of the Fuzzy Match tool and how to create a workflow which uses the Purge and Merge functions.

Lesson

Can you input your survey data into your BI tool without having to manipulate it?

The data usually has many questions with multiple responses, and often some responses are of numeric data type while others are of character type. Here we learn how to manipulate this data into a usable format for analysis.
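As an illustration of the reshaping involved, here is a minimal pandas sketch that pivots a hypothetical wide survey extract into the long format that BI tools prefer, much as the Transpose tool does (the question names and values are made up):

    import pandas as pd

    # A typical wide survey extract: one row per respondent, one column
    # per question, mixed numeric and text answers.
    survey = pd.DataFrame({
        "RespondentID": [1, 2, 3],
        "Q1_Age": [34, 51, 28],
        "Q2_Satisfaction": ["High", "Low", "Medium"],
        "Q3_Score": [8, 3, 6],
    })

    # Pivot to a long, analysis-friendly shape: one row per answer.
    long = survey.melt(
        id_vars="RespondentID",
        var_name="Question",
        value_name="Response",
    )
    print(long)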

Lab

Follow Me to prepare and output the survey data directly into an existing Tableau workbook which has multiple dashboards ready for analysis. The second part is to create multiple Tableau workbooks where the data is split by a specific chosen field.

There is a survey data challenge to answer specific questions about the survey responses.

Outcome

You will understand how to manipulate survey data into a workable format for analysis.

Lesson

Do you need to extract data from a website?

In this module, you will learn how to download data from websites. As an example we download Climate Data Online from the Bureau of Meteorology and parse it ready for use.
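By way of illustration only (the URL below is a placeholder, not a real Bureau of Meteorology endpoint), the equivalent download-and-parse logic in Python looks like this, with a small RegEx step of the kind used in the dividends challenge:

    import io
    import re

    import pandas as pd
    import requests

    # Placeholder URL: substitute the real Climate Data Online CSV link
    # copied from the Bureau of Meteorology site.
    url = "http://www.bom.gov.au/path/to/monthly_rainfall.csv"

    response = requests.get(url, timeout=30)
    response.raise_for_status()

    # Parse the CSV body straight from the response text, as the
    # Download tool followed by parsing tools would.
    data = pd.read_csv(io.StringIO(response.text))

    # RegEx parsing: pull a dollar amount such as "$1.25" out of a
    # block of page text (the sentence here is invented).
    sample = "Interim dividend of $1.25 per share, payable 30 June."
    match = re.search(r"\$(\d+\.\d{2})", sample)
    if match:
        print("Dividend:", float(match.group(1)))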

Lab

Follow Me to download BOM data, parse the data and output to Tableau.

The download-data challenges extract data from dividend webpages and parse it using RegEx.

Outcome

You will understand how to download data from URLs.

Lesson

How many records do you need to process?
Is this a very large number?
Does the final output include only summarised data?
Do you want to reduce the workflow run time?

If the answer to these questions is yes, then we can use the In-DB tools. These utilize the power of your database to do most of the processing, bringing only the resulting rows into memory to complete the workflow.
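To see why this matters, here is a small Python/SQLite sketch contrasting the two approaches; the table and values are invented, but the pattern is the same one the In-DB tools generate as SQL behind the scenes:

    import sqlite3

    # Build a small demo database (in practice this would be your
    # existing warehouse; table and column names are hypothetical).
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE sales (region TEXT, amount REAL)")
    con.executemany(
        "INSERT INTO sales VALUES (?, ?)",
        [("North", 120.0), ("North", 80.0), ("South", 200.0)],
    )

    # In-memory style: pull every row out, then summarise client-side.
    rows = con.execute("SELECT region, amount FROM sales").fetchall()
    totals = {}
    for region, amount in rows:
        totals[region] = totals.get(region, 0) + amount

    # In-DB style: let the database aggregate and return only the
    # summary rows.
    summary = con.execute(
        "SELECT region, SUM(amount) FROM sales GROUP BY region"
    ).fetchall()
    print(totals, summary)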

Lab

Follow Me to read database data into memory and process it there, then build the same workflow with the In-DB tools so that the processing happens inside the database and only the results are brought out into memory. This comparison shows the difference in run time between the two options.

Outcome

You will understand how to use the In-DB tools to process large amounts of data in the most efficient manner.

Lesson

Do you need to use the PSMA Geocoded National Address File (G-NAF) to obtain the latitude and longitude of your customer address data so that you can display it on a map? This data is maintained and released on a quarterly basis.

In this module, you will learn how to extract both Location and Address from the G-NAF files.

The data for all Australian addresses runs to tens of millions of records. That volume (millions of rows, yet well under the 2 billion record limit), combined with the fact that the data is not volatile, makes it an excellent candidate for a Calgary database for fast retrieval.
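Calgary is Alteryx's own indexed format, so there is no direct code equivalent; as a loose analogy, though, the same idea of loading reference data once into an indexed store for fast repeated lookups can be sketched with SQLite (table, column names and values are hypothetical):

    import sqlite3

    # In practice this database would be built once and persisted on
    # disk; :memory: keeps the sketch self-contained.
    con = sqlite3.connect(":memory:")
    con.execute(
        "CREATE TABLE address "
        "(address_id TEXT, postcode TEXT, lat REAL, lon REAL)"
    )
    con.execute(
        "INSERT INTO address VALUES (?, ?, ?, ?)",
        ("GANSW1", "2000", -33.87, 151.21),
    )

    # The index is what makes repeated lookups fast on tens of
    # millions of rows.
    con.execute("CREATE INDEX idx_postcode ON address (postcode)")

    # Rough equivalent of a Calgary Join: look up coordinates by key.
    cur = con.execute(
        "SELECT address_id, lat, lon FROM address WHERE postcode = ?",
        ("2000",),
    )
    print(cur.fetchall())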

Lab

In the Follow Me session we create the Location data for all Australian states. The second part is to create a Calgary database and compare the performance of reading from it against reading the same data from an Excel workbook.

We then create another workflow that joins all Australian addresses to business address data. This is followed by creating a Calgary database of all Australian addresses and demonstrating the difference between the Calgary Join tool and using Calgary Input with the Join tool.

There are two challenges to use both Location and Address data.

Outcome

You will understand how the G-NAF data can be joined to business data to provide the latitude and longitude, so that your business data can be shown on a map by location or even down to address level.

Lesson

JavaScript Object Notation (JSON) text can be both parsed and built by Alteryx tools. The field that you want to parse must contain valid JSON text in a single field. Here you learn how to output the data either as values in a single field or to data-type-specific fields.
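As a rough illustration of the two output styles, here is a pandas sketch that flattens a made-up JSON payload either into name/value pairs in a single column or into data-type-specific columns:

    import json

    import pandas as pd

    # A single field of valid JSON text, as the JSON Parse tool expects
    # (the payload itself is invented for illustration).
    raw = '{"customer": {"id": 42, "name": "Acme"}, "orders": [10, 12]}'

    parsed = json.loads(raw)

    # Option 1: flatten into name/value pairs in a single value column,
    # similar to the tool's single-string-field output.
    pairs = pd.json_normalize(parsed).melt(
        var_name="JSON_Name", value_name="JSON_Value"
    )
    print(pairs)

    # Option 2: keep data-type-specific columns, similar to the tool's
    # specific-fields output.
    typed = pd.json_normalize(parsed)
    print(typed.dtypes)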

Lab

Follow Me outputs JSON data to a single field where we parse it into multiple columns and rows using JSON Parse.

Outcome

You will understand the two different configurable options available with the JSON Parse tool.

Lesson

A Macro is a workflow which can be run as a single tool in another workflow. This is ideal for any repetitive process.
There are four types of macros:

  1. Standard – a series of tools that can be inserted into a workflow as a single tool.
  2. Batch – a process that can be run repeatedly, taking a special input called a control parameter to map fields to questions.
  3. Iterative – repeats until a certain condition is met or a set number of iterations is reached, e.g. increase drive time until a certain population is reached (see the sketch below).
  4. Location Optimizer – also iterative, but it determines the best score for each demand location.

In this module, you will learn how to create a macro using the interface tools and use the macro in a workflow.
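A macro has no direct code equivalent, but the iterative macro's control flow maps naturally onto a loop. This hypothetical Python sketch mirrors the drive-time example above; the population function is a made-up stand-in for a spatial query:

    def population_within(drive_minutes: float) -> int:
        """Hypothetical stand-in for a spatial query returning the
        population reachable within a given drive time."""
        return int(drive_minutes ** 2 * 100)

    def find_drive_time(target_population: int, step: float = 5.0,
                        max_iterations: int = 20) -> float:
        """Iterative-macro logic: repeat until a condition is met or
        an iteration limit is reached."""
        drive_minutes = step
        for _ in range(max_iterations):
            if population_within(drive_minutes) >= target_population:
                break
            drive_minutes += step
        return drive_minutes

    # Keep widening the drive time until 50,000 people are reached.
    print(find_drive_time(50_000))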

Lab

In the Follow Me you create a macro and then incorporate it into an Alteryx workflow to test that the macro works correctly.

The macro challenge extends the macro you have already created.

Outcome

You will understand what a macro can do, how to create it and how it gets used in an Alteryx workflow.

Lesson

With data recorded at regular time intervals we can predict future values using forecast models. This is used for forecasting ocean tides, monthly sales and the like, and is normally plotted using line charts.

With the Time Series Forecasting tools, we can get a better understanding of the data and determine the best approach for modelling, then create models, compare them to choose the best outcome and finally create the forecast using that model.
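For readers who want a code-level picture, here is a sketch of the same compare-and-choose workflow using the statsmodels library on synthetic data; the series, model orders and hold-out period are illustrative, not the course data:

    import numpy as np
    import pandas as pd
    from statsmodels.tsa.arima.model import ARIMA
    from statsmodels.tsa.holtwinters import ExponentialSmoothing

    # Synthetic monthly series with trend and yearly seasonality.
    idx = pd.date_range("2015-01-01", periods=48, freq="MS")
    y = pd.Series(
        100 + np.arange(48) + 10 * np.sin(np.arange(48) * 2 * np.pi / 12),
        index=idx,
    )

    # Hold back the last 12 months to check the forecast against actuals.
    train, test = y[:-12], y[-12:]

    ets = ExponentialSmoothing(train, trend="add", seasonal="add",
                               seasonal_periods=12).fit()
    arima = ARIMA(train, order=(1, 1, 1),
                  seasonal_order=(1, 0, 1, 12)).fit()

    # Compare the models on held-out accuracy, then forecast ahead
    # with whichever performs better.
    for name, model in [("ETS", ets), ("ARIMA", arima)]:
        forecast = model.forecast(12)
        mae = (forecast - test).abs().mean()
        print(f"{name} MAE: {mae:.2f}")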

Lab

The Follow Me session uses a Time Series Plot to understand the data better and help determine the best modelling method. From this investigation of the data you will create both an ETS and an ARIMA model, compare the models and choose the better one to use in forecasting. We also check the forecast against actuals before forecasting into the future.

Outcome

You will be able to understand your data, then create and compare models to choose the optimum one when developing forecasts with time-series data.

Lesson

In this session we learn the many tools that enable us to investigate our data, so that we can then build predictions, analyse them and score them.

The Data Investigative tools include:

  • Basic Data Profile
  • Association Analysis
  • Contingency Table
  • Distribution Analysis
  • Field Summary
  • Scatter Plot
  • Violin Plot

The Predictive tools used are:

  • Decision Tree
  • Lift Chart
  • Score
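As a partial illustration of the investigate-then-predict pattern (a basic profile, a Decision Tree and a Score step; the Lift Chart is omitted), here is a short Python sketch using pandas and scikit-learn on a standard sample dataset rather than the course data:

    from sklearn.datasets import load_iris
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier

    # A well-known sample dataset as a stand-in for the course data.
    X, y = load_iris(return_X_y=True, as_frame=True)

    # Investigation step: a basic data profile, the rough equivalent
    # of the Field Summary / Basic Data Profile tools.
    print(X.describe())

    # Predictive step: fit a Decision Tree, then score held-out records.
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.3, random_state=0
    )
    tree = DecisionTreeClassifier(max_depth=3, random_state=0)
    tree.fit(X_train, y_train)
    print("Hold-out accuracy:", tree.score(X_test, y_test))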

Lab

The Follow Me session investigates the data before predicting, analysing and scoring the data. The above tools are all covered in detail.

Outcome

You will have the knowledge to investigate your data before using the Predictive tools, so that you can decide how to achieve the best outcome for your data and your business going forward.

Tell Me, I’ll Forget;
Show Me, I’ll Remember;
Let Me Do It, I’ll Understand
