Vlookup In Power Query Using List Functions

If you’ve done lookups in Power Query to pull values from one table into another, you may have used a query merge to do this.

Mynda has written previously on how to do an Exact Match in Power Query and an Approximate Match in Power Query.

Here I’ll be showing you how to use List Functions in Power Query to do both Exact and Approximate matches.

This approach requires a little more M coding but needs fewer steps, and it's my preferred method for doing lookups. But hey, I would think that, I’m a coder 🙂

Using List Functions

The key to doing this is remembering that table columns are lists, and as such you can use List Functions to manipulate the data in them.

Watch the Video

Download Sample Excel Workbook


Exact Match

My sample data is a table of sales on different dates for different types of food and drink. This table is named data.

What I want to do is add a column showing the category for each item. These categories are stored in a separate lookup table named categories.

I’ve loaded both tables into PQ and set categories to load as connection only because I don’t need to create a new table from it.

So with both tables loaded into PQ, let’s look at what we need to do.

The Food & Drink query needs to get the Category from the Categories table. By looking up the value in the Food & Drink[Product] column in the Categories[Product] column we can get the row number in Categories[Product] that matches.

Using that row number as an index on the Categories[Category] column will return the Category.

Let’s do this step by step. Start by adding a Custom Column and calling it Category Index.

Using the List.PositionOf function, I look up the value in this table's [Product] column within the Categories[Product] column.
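In the Custom Column dialog, the formula is likely something like this (a sketch using the table and column names described in this post):

```m
// Custom Column: Category Index
// Find the position of this row's [Product] in the Categories lookup table
= List.PositionOf(Categories[Product], [Product])
```

List.PositionOf returns the 0-based position of the value in the list, or -1 if it isn't found.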

The new column shows the position of each Product in the Categories table.

Remember that Lists are indexed from 0, so Apple Juice is in Position 0 of the [Product] column in the Categories table.

Atlantic Salmon is in position 65, etc.

Now that I have this index number I can use it to lookup the category. Create another Custom Column and call it Category.

What I need to do here is enter a reference to the value in the Categories[Category] column. This is done by specifying the Category Index value inside curly braces after the Table[Column] reference.
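The formula for this second column would look something like this sketch:

```m
// Custom Column: Category
// Curly braces index into the column (a list); positions start at 0
= Categories[Category]{[Category Index]}
```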

Let’s do a little tidying up. We don’t need the Category Index column now, so delete it, and reorder the columns.

Before I finish though, I can make things a little neater. I like reducing my steps and code where possible.

You can see in the Added Custom step that the code creates the Category Index column, and the subsequent Added Custom1 step uses the values from that Category Index column.

You can take the code List.PositionOf( Categories[Product] , [Product] ) from the Added Custom step, and replace [Category Index] in the Added Custom1 step with it.

This condenses the code from two steps into one and you end up with the same result.
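The condensed single-step formula would look like this sketch, with the lookup nested directly inside the index:

```m
// Single step: look up the position and index into the Category column at once
= Categories[Category]{List.PositionOf(Categories[Product], [Product])}
```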

As the Added Custom step is no longer needed, delete it. Also delete the Removed Columns step as all that is doing is deleting the Category Index column. But as the query is no longer creating that column, this step is not needed either. I don’t actually need to create the Category Index column at all.

OK so the query is done, load to the sheet and that’s our finished table.

Approximate Match

To do an approximate match I’m going to use this table of sales figures for various sales people and add a new column showing the bonus rate they’ll get for those sales.

The idea being that you multiply the sales amount by the bonus rate to work out the bonus the sales person gets paid.

The bonus rates are stored in a separate table called excitingly, BonusRates.

Make sure the table is sorted in ascending order on the Threshold value. It’s important to note that the first row has $0 threshold and a rate of 0.

The reason for this will become clear as I explain how the lookup query works.

The bonus rate is determined by the sales amount. If you sell $10,000 or more, but less than $20,000 then the rate is 0.1

If you sell $20,000 or more but less than $30,000 then the rate is 0.15, etc

Load both tables in Power Query and set the Bonus Rates lookup table to connection only.

What I need to do here is of course look up the sales amounts in the BonusRates table. But this time I’ll use List.Select to create a list of all values in the BonusRates[Threshold] column less than or equal to the Sales Amount.

The number of elements in this list will be used as the index to look up the bonus rate.

I’ll use the first sales value of $17,606 as an example. There are 2 values in the Threshold column less than or equal to $17,606.

The List.Select function creates a list containing 0 and 10000. Then by counting the items in this list, I can use that number to return the 2nd item in the Rate column, which is 0.1, or 10%.

Let’s look at the code. Open the Power Query editor, add a Custom Column called BonusRates, and add this code:
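The screenshot with the code isn't reproduced here, but based on the explanation that follows, the formula is likely along these lines:

```m
// Custom Column: BonusRates
// Keep every threshold that this row's Sales value has reached
= let
    val = [Sales]
  in
    List.Select(BonusRates[Threshold], each _ <= val)
```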

I’ll explain what’s going on here.

The variable val contains the Sales value in the current row of this table. Remember that the code written here is run for every row in the Sales column.

List.Select creates a list containing all values in the BonusRates[Threshold] column that are less than or equal to the value in val.

The each _ is shorthand for the current item in the BonusRates[Threshold] column. It’s saying: compare val with each item in the BonusRates[Threshold] column.

The result of this code is a column of lists.

For the Sales value 9238 the BonusRates list contains just a 0, because the minimum sale amount to get a bonus is $10,000.

If there wasn’t a row in the lookup table with 0 threshold then the list for any sales value less than 10000 would be empty. When the query tried to lookup an index using an empty list it would generate an error. Having the 0 Threshold row means that the code will always create a non-empty list and avoid such errors.

Checking the list in row 4 for the Sales value 32455 shows that the list contains 4 items, because this sale amount crosses the $30,000 threshold.

With this new column of lists I can now count the items in each list and lookup the bonus rate for the Sales amounts.

Add another custom column, call it Bonus Rate, with this code

What we need to do is lookup the value from the Rate column in the BonusRates table, and the number of items in the list created in the previous step is the index to that value.

Remember that lists are indexed from 0, so if a list has 2 items then the 2nd item is at position 1. Therefore we have to subtract 1 from the count of items in the list.
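Putting that together, the Bonus Rate formula is likely something like this sketch:

```m
// Custom Column: Bonus Rate
// Count the thresholds reached, subtract 1 to convert to a 0-based index,
// then index into the Rate column of the lookup table
= BonusRates[Rate]{List.Count([BonusRates]) - 1}
```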

I can now calculate the bonus amount in Power Query, or load this table to Excel and do it there.


Grouped Running Totals In Power Query

In a previous post I looked at creating running totals in Power Query, here I’ll be creating grouped running totals.

I’ll start with a table of data showing values for cities within different countries.

Download the Workbook With Sample Query

The queries in this Excel file can be copied/pasted into the Power BI Desktop Advanced Editor and will work there too.


I want to create a running total for each country, so that the running total resets when we reach a new country in our list.

Grouped Running Total Custom Function

The M code to create the custom function is very similar to the code for the running totals custom function.

Read More: Power Query Custom Functions

You don’t need to worry about getting your hands dirty with M, you can just call the function as you would any other function in Power Query.

The function is named fxGroupedRunningTotal and takes two parameters named values and grouping, both of these are lists of equal length. The function returns a list which will have the same number of items as the input lists.

In this case the values parameter will be the list of values from our source table, and grouping will be the list of countries.

Here’s how the function works. It uses List.Generate to work through each item in the grouping list, following the rules described below. It uses two variables: GRT, which holds the running total, and i, which is a counter.

Each time a new value is created in GRT, it is added to the list that is the result of the function, GRTList.

1. Set the initial values. GRT takes the first value in the values list. The counter i is 0.

2. Keep looping while i is less than the number of items in the values list. Remember: lists are indexed from 0.

3. On each loop: if the current country in the list, specified by grouping{[i]}, is the same as the next country, grouping{[i] + 1}, then add values{[i] + 1} to GRT; else the next grouping (country) is different, so GRT just becomes values{[i] + 1}.

4. Add the value in GRT to GRTList.

The try .. otherwise in step 3 catches the error generated when [i] + 1 exceeds the number of items in the input lists.
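Putting the steps above together, the custom function might be sketched like this (the names come from the post; the exact structure is an assumption):

```m
// fxGroupedRunningTotal: running total that resets when the group changes
(values as list, grouping as list) as list =>
let
    GRTList = List.Generate(
        // 1. initial values: GRT starts at the first value, counter i at 0
        () => [GRT = values{0}, i = 0],
        // 2. keep looping while i is within the list
        each [i] < List.Count(values),
        // 3. same group: accumulate; new group: restart the total.
        //    try..otherwise catches the error when [i] + 1 runs past the end
        each try
            if grouping{[i]} = grouping{[i] + 1}
            then [GRT = [GRT] + values{[i] + 1}, i = [i] + 1]
            else [GRT = values{[i] + 1}, i = [i] + 1]
        otherwise [GRT = null, i = [i] + 1],
        // 4. add the value in GRT to the output list
        each [GRT]
    )
in
    GRTList
```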

Using the Custom Function in a Query

After loading the source table, the query creates two buffered lists for the values and countries. Buffering the lists allows the query to run much faster as the lists are held in memory and not re-evaluated every time the query needs to refer to an item in the lists.

If you want to group by something else in your data, then change Source[Country] to whatever column you want to use.

Then it’s just a case of calling the function to create the grouped running total, creating a table from the columns in the source table and the output from the function.
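The calling query might look something like this sketch (the table name Data and the column names Value and Country are assumptions, not taken from the workbook):

```m
let
    Source = Excel.CurrentWorkbook(){[Name = "Data"]}[Content],
    // Buffer the lists so they are held in memory and not re-evaluated
    BufferedValues = List.Buffer(Source[Value]),
    BufferedCountries = List.Buffer(Source[Country]),
    RunningTotals = fxGroupedRunningTotal(BufferedValues, BufferedCountries),
    // Stitch the new list onto the source table as an extra column
    Result = Table.FromColumns(
        Table.ToColumns(Source) & {RunningTotals},
        Table.ColumnNames(Source) & {"Running Total"}
    )
in
    Result
```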

The final table loaded into Excel looks like this

How Fast Is This?

As with the running total query, by using buffered lists the query runs in a flash. Even when it’s calculating for 100,000 rows, the query finishes in a couple of seconds.

In the example workbook you can download, I’ve created a table with 100,000 rows of data and a separate query to calculate the grouped running total for that. Try it out for yourself.

Pro Tip – Quickly Create 100,000 Rows of Data

If you need to enter a lot of data, like the 100,000 rows of dummy data for this post, you can do this using the Immediate Window in the VBA editor.

Press ALT+F11 to open the VBA editor. If the Immediate Window is not already visible, press CTRL+G to open it.

Valid VBA commands typed into the Immediate Window get executed after you press the Return/Enter key.

So I can put the value 101 into cells A1 to A10 on the active sheet with

and to fill 100,000 cells with random numbers
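The commands themselves were shown as screenshots; here is a sketch of typical Immediate Window one-liners that do this (each line is executed when you press Enter):

```vba
' Put the value 101 into cells A1:A10 on the active sheet
Range("A1:A10") = 101

' Fill 100,000 cells with random numbers (assigning a string beginning
' with "=" enters it as a formula)
Range("A1:A100000") = "=RAND()"
```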

Power Query Keyboard Shortcuts To Save Time

Below is a list of time saving keyboard shortcuts that you can use in Power Query, in both Excel and Power BI.

Watch the video to see me demonstrate how to use them, and download a PDF containing all shortcuts.

Watch the Video

Download Power Query Shortcuts PDF


More Keyboard Shortcuts

You may also like to get a copy of my eBook 239 Excel Keyboard Shortcuts.

All of these shortcuts work in both Excel and Power BI Desktop, except the shortcut to open the Power Query editor from Excel.

When the Power Query editor is open in both Excel and Power BI Desktop, pressing Alt highlights the keys required to access items on the Ribbon.


Open Power Query Editor (Excel Only)






Open Power Query Editor (after adding icon to 1st position on QAT in Excel)



Close Power Query Editor



Increase Font Size






Decrease Font Size





Reset Font Size (don’t use 0 on numeric keypad)





Move around table cells, move between columns, move between rows

Jump to first column (stay in current row)


Jump to last column (stay in current row)


Jump to first cell in first column




Jump to last cell in last column




Jump to first cell in current column (with any cell in column already selected)




Jump to last cell in current column (with any cell in column already selected)




Move up/down a page of data in the table

Pg Up

Pg Dn

Go To Column




then choose the column

Select Columns and Rows

Select current column




Select Adjacent Columns




to select a column.

Hold Shift then use ← or → to select other columns.

Select Non-Adjacent Columns




to select a column.

Hold Ctrl then use ← or → to move to other columns.



to select those other columns.

Select All Columns




Preview an Entire Row’s Data

Navigate to leftmost column.

Press ← to preview the row’s data.

Use ↑ and ↓ to preview different rows.

Modify Columns

Rename Column

Select the column then


Delete Column(s)

Select one or more columns then


Add Column By Example




Open Menus

Open Sort and Filter Column Menu

Select a column.

Alt + ↓

Use ↓ ↑ Tab to move around the menu.

Space to select/deselect items. Enter to confirm selections.

Change Column Type Menu

This key is usually found on your keyboard underneath the rightmost Shift key.

The menu displayed is context sensitive – what you see depends on what you have selected.

Table Options Menu




Navigate to top cell in first column using

Then ← ↑

Enter or Space to open the menu

Exit Menu or Step Back a Level in Multi-Level Menus


Power Query Best Practices For Your Data Model

Power Query is used to prepare each of the tables loaded into the data model. Hence, it’s fundamental that the tables, fields, and measures in the data model should be uncomplicated and user-friendly. In this tutorial, let’s talk about some Power Query best practices for our data model, some of its features, and why we should use the query editor. 

Power Query allows users to do very complex stuff. Therefore, it’s always important to follow a couple of best practice rules to keep everything properly organized.

People might usually import their data directly to their data model by using the Get data option.

I highly suggest you do not do that, and use the Query Editor first. This means we should always bring our data into the query editor to clean it first. The reason is that data is never perfect. It's better to check the data in the Query Editor before adding it to our data model.

One of the most important Power Query best practices that I’d recommend is understanding what a query is. 

A query is like a snapshot of our data in its worst form. It doesn’t physically transfer anything into our Power BI model either.

Since our data tables could be big, we want to query them and avoid creating any overload in our Power BI models. Once we get them in the query format, that’s when we do all the cleaning and transforming of those tables. Therefore, it’s crucial to understand the difference between working with a query and directly committing data to the data model.

It’s so important in terms of Power Query best practices for model development to organize our queries. This is because we’ll have a lot of queries when we develop more and more inside Power BI. Sometimes, a query could be like a staging table, and eventually might get appended or merged into another table. So, we might get a lot of queries and we need to be able to manage them. 

In this example, I organized them on the left-hand side using folders. We can also drag and drop our queries to put them in a certain order. The key thing when organizing them is to name them intuitively: not only the queries but also the folders they sit in.

The other Power Query best practice that we need to learn is to know what goes on inside the Advanced Editor and more specifically, with M code. 

This is an example of detailed M code for the dates query. It’s simply code that changes every time we make a transformation, laying out all the details of the transformations we’re doing.

For example, let’s remove a column here. 

Lastly, I highly suggest users have an understanding of how to structure or optimize tables for Power BI. This is really crucial because, once we get past the query stage, we’re going to commit the data to our data model and build a model around it. We’ve got to have the data model in mind as we work through this, because this is where we optimize our tables for the data model.

So, what is the most optimal shape for our tables to fit inside our data model? There’s no exact answer to that, because every data situation is unique.

So, those are my suggested Power Query best practices and some of the main things we’re going to cover in the other blog articles. Following these general tips can help you prepare a proper data model, which is considered the heart of a Power BI report solution.

Always keep in mind that it’s really essential to have an understanding of what’s going on inside the Query Editor. From there, we can apply what a good and optimized table looks like to our own data scenario and our own model.

All the best,


Running Sensitivity Analysis For Power BI Using DAX

Have you ever thought it would be nice to work out what the most optimal outcome is from your scenario analysis work that you’re doing inside of Power BI? In this tutorial, I show you exactly how you can discover this and also include sensitivity analysis techniques to your what-if parameter and scenario analysis work. You may watch the full video of this tutorial at the bottom of this blog.

So we go through how you can build and optimize your model through running scenarios, and then explore or run sensitivity on those scenarios.

Incorporating sensitivity analysis and relevant visualizations to your reports enables consumers to see what would happen if multiple scenarios occurred at once versus just a singular result based on a selection.

By utilizing this technique in Power BI, you’re giving the consumer a chance to see what the most optimal outcome is based on the scenarios that could occur in your data.

The key thing for this analysis is to set up our data model correctly. Inside our data model, we have our Lookup tables — Dates, Customers, Products, and Regions — that are related to our Sales table.

More importantly, we create these three scenario tables or scenario supporting tables. In the older version of Power BI, we had to do this manually. But now with the recent Power BI version, we can create this using the What-If parameter feature.

 In these scenario tables, we can shock the demand, cost, and price.

I call this multi-layering of scenarios or a multi-layered approach to scenarios because we can thread through these three variables or elements into our calculations, allowing us to run multiple scenarios.

This is how we’re going to run sensitivity analysis to then see which is the most optimized scenario in this current environment we have here in this example.

Once we have this forecast or this sort of scenario in our demand, pricing, and costs, we will then see its ultimate impact on our total profits or sales.

We then thread all of our elements through our formulas. In our Scenario Profits calculation, we use iterating functions. We can isolate any element that we’re looking at in a particular row of a table.

In this case, it is the Sales table we’re iterating through every single row. And then we can shock it with the change in demand, price, and cost.

So, if you think about it, these elements or scenario tables are not even connected into anything in our model since they’re supporting tables. And, we use this formula to integrate them into our model.
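As a sketch, a multi-layered scenario measure built with an iterating function could look like this. The column names and what-if parameter measure names below are assumptions for illustration, not taken from the report:

```dax
Scenario Profits =
SUMX (
    Sales,
    // shock quantity, price and cost with the what-if parameter values
    ( Sales[Quantity] * ( 1 + [Demand Change Value] ) )
        * (
            ( Sales[Unit Price] * ( 1 + [Price Change Value] ) )
                - ( Sales[Unit Cost] * ( 1 + [Cost Change Value] ) )
          )
)
```

Because SUMX iterates the Sales table row by row, each row is shocked by the three parameter values before being summed, which is what lets the unconnected scenario tables flow into the result.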

Instead of just showing the overall results, we show the sensitivity. We’re using that multi-layering of scenario approach inside of iterating functions in this particular formula to then create these sensitivities.

In this chart, we’ve brought the Change in Price in on the rows, and across the top in the columns we have the Demand Changes.

In this chart, we can see what the change in demand, as well as the change in price, would actually do to our results. And then within the matrix, we can use conditional formatting to color those in, which is another really awesome element for better visualization.

In this chart below, we can see the Change in Cost. So when costs decrease, for example, our demand increases.

We might as well add more elements to get a more comprehensive analysis. We can put in our Dates, Regions, etc.

We add our Dates slicer here, so we can change the time frame, which is seriously amazing. This will enable us to drill into a specific time frame, and then it will change the results we have in our charts.

With the power of the data model, we can also include any element in a dynamic way, so we can really drill into aspects of our data.

So we can utilise anything inside our model; we can filter our Customers, Products, Regions, etc.

We can still run these sensitivities in these very specific regions. Once you select a region, it will dynamically change the results as well.

And that’s how you can optimize these scenarios. It’s basically running sensitivity analysis easily and effectively.

As you can see, this is really powerful stuff. This is really high quality analytical work that is going to impress anyone if you put this in front of them.

This work historically was very difficult to achieve. In Power BI, just like magic, you can create these insights in a really intuitive, effective, and scalable way.

I hope you can see how quickly you can do this. There are not many complex formulas involved. It just requires a really good understanding of iterating functions, and that is the key to implementing this technique.

Good luck using techniques like this in your own analysis.


Examples For Query Building In Postgresql

Introduction to PostgreSQL Select

One of the most important purposes of any database is to store data so that it can be retrieved and fetched according to our requirements. Users mostly use the retrieved records for reporting and analysis, or sometimes to modify existing results. You use the SELECT clause in the PostgreSQL database to fetch the data. We can retrieve results from zero, one, or more tables using the select clause. In this article, we will learn how to use the select clause to build query statements, along with its syntax and examples, to understand query building in PostgreSQL better.


Syntax of PostgreSQL Select

Below is the syntax of PostgreSQL select:


SELECT [ALL | DISTINCT] columns_or_expressions
FROM tables
[WHERE conditional_restrictions]
[GROUP BY column_or_expression]
[HAVING conditional_restrictions]
[ORDER BY column_or_expression [ASC | DESC]]
[LIMIT count [OFFSET start]]
[FETCH FIRST count ROWS ONLY]
[FOR UPDATE | FOR SHARE];

The select clause’s syntax is very complex and involves many possible combinations to provide flexibility to the user. We will learn the syntax by going through each of the clauses used above with the select clause.

ALL: To retrieve all the records that the query will fetch after applying all the conditions, restrictions, and expressions.

DISTINCT: To retrieve only unique values of the column and expression from the retrieved results and further filter out the unique entries with respect to the column or expression mentioned in the distinct parameter.

columns_or_expressions: This is the list of the column names or expressions that you wish to retrieve using the select query.

FROM: This keyword helps specify the name of the table from which you wish to retrieve the records. Further, we can use joins of the type left join, right join, natural join, etc., to combine the results of two or more tables while retrieving the records.

WHERE: This clause helps specify the conditions, restrictions, and expressions to filter out the results while retrieving the records in the select query.

GROUP BY: You can group the result set based on a specific column or expression using the group by statement. People most frequently use this when retrieving manipulated columns with aggregate functions, such as the sum or product of certain columns.

HAVING: You can further filter the result by applying conditions and restrictions to the retrieved columns and expressions, including any aggregated values used in the retrieval process.

ORDER BY: You can arrange the result set in an orderly format based on specific columns and expressions by specifying them after the ORDER BY keyword. We can arrange the data in ascending or descending order by using ASC or DESC keyword.

LIMIT: You can limit the number of rows retrieved by using the limit keyword. For example, if the query would otherwise have returned 55 records, applying LIMIT 10 in the select query retrieves only the first 10 records.

OFFSET: This is the number of rows to skip before beginning to retrieve the records. For example, if you specify the offset as 5, the first 5 rows are skipped and retrieval begins from the 6th record.

FETCH: Like the limit keyword, this clause restricts the number of records retrieved to a specific number.

FOR: When FOR UPDATE is specified, the retrieved records are write-locked and cannot be updated or deleted by other transactions; when FOR SHARE is specified, other transactions may read the records but not update or delete them.

Except for FROM, all other clauses/keywords used in the above select clause syntax are optional in nature.

Examples of PostgreSQL Select

Following are examples of PostgreSQL select:

Let us create one example and insert a few records in the table to learn how to use a select clause to retrieve the records. Open your PostgreSQL command-line prompt and enter the following command to create a table named educba –

CREATE TABLE educba (id INTEGER PRIMARY KEY, technologies VARCHAR, workforce INTEGER, address VARCHAR);

Let us insert some values in the educba table using the following statement –

INSERT INTO educba VALUES (1,'java',20,'satara'),(2,'javascript',30,'mumbai'),(3,'java',20,'satara'),(4,'psql',30,'mumbai'),(5,'mysql',20,'satara'),(6,'maven',30,'mumbai'),(7,'hibernate',20,'satara'),(8,'spring',30,'mumbai'),(9,'angular',20,'satara'),(10,'html',30,'mumbai'),(11,'css',20,'satara'),(12,'reddis',30,'mumbai');

SELECT * FROM educba;

Here, * represents all the columns to be retrieved, and firing the above query results in the following output –

Now we will apply the conditions using the where clause and retrieve only records with a workforce of 20 persons. For this, we will have to mention the condition as workforce = 20 in the where clause, and our query statement will be as follows –

SELECT * FROM educba WHERE workforce=20;

Now, suppose we only want to retrieve the list of names of technologies with the workforce as 20; then the query statement will be as follows –

SELECT technologies FROM educba WHERE workforce=20;

Let us see how we can group the result based on workforce count and retrieve the technologies’ comma-separated string. For this, the query statement will be as follows:

SELECT string_agg(technologies,','), workforce FROM educba GROUP BY workforce;

Instead of retrieving the column heading as string_agg, we can give it an alias using the “as” keyword as follows:

SELECT string_agg(technologies,',') as "List Of Technologies", workforce FROM educba GROUP BY workforce;

Let us order the results alphabetically based on the technology’s name and limit the records to only 7 by using the limit clause. Our query statement will be as follows –

SELECT * FROM educba ORDER BY technologies ASC LIMIT 7;
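Several of the optional clauses can also be combined in one statement. As a sketch against the same educba table, this skips the first 2 matching technologies alphabetically and fetches the next 3:

```sql
SELECT technologies, workforce
FROM educba
WHERE workforce = 20
ORDER BY technologies ASC
OFFSET 2
FETCH FIRST 3 ROWS ONLY;
```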

Recommended Articles

We hope that this EDUCBA information on “PostgreSQL Select” was beneficial to you. You can view EDUCBA’s recommended articles for more information.
