11 Secret Power Query Functions You Need To Know
Let me tell you about the time I thought I was a Power Query pro.
I had cleaned up dozens of finance data dumps, normalized messy trial balances, and even built automated refresh flows that made my Excel files practically sing. I strutted around like I was the Beyoncé of ETL. And then boom, I stumbled into the Advanced Editor and saw a sea of code I didn’t recognize.
There were strange words like each, try, and let… things that never showed up in the friendly point-and-click interface. It was like peeking behind the curtain and realizing the wizard wasn’t just pushing buttons, he was writing spells.
That was the moment I realized: Power Query had a secret life. And if I wanted to level up I needed to speak its hidden language.
Why You Should Care About These “Hidden” Functions
Most finance pros use about 10% of Power Query’s full capabilities. We click “Remove Columns,” “Group By,” maybe even “Merge Queries” if we’re feeling brave. And that’s fine, for basic clean-up jobs.
But if you’re wrangling large datasets, automating month-end reports, or trying to scale up anything resembling a reporting process, that basic toolkit quickly becomes a bottleneck. The functions hiding outside the UI handle the heavier lifting — row-level logic, error handling, grouping, pivoting, merging across sources — and they make complex data shaping and cleanup dramatically easier.
That’s where these secret functions come in. They’re not in the UI. You won’t find them on a ribbon. But they exist, and they’re powerful as hell.
With the right mix of Power Query’s hidden gems, I’ve:
- Cut 2-day data prep jobs down to 2 hours
- Created reusable functions that transformed 12 different files in one go
- Built error-proof pipelines that could handle surprises without exploding
This guide is your flashlight in the Power Query cave. I’m going to walk you through the secret functions, how they work, and how I’ve used them in real finance workflows from month-end reporting to dirty CSV cleanup missions.
Row-Level Functions
If Power Query were a heist movie, these three would be the silent assassins. They don’t wear flashy suits. They don’t show up in the UI. But they do the hard, gritty work behind the scenes and they’re freaking lethal when you learn how to use them.
Let’s break down what each one does and how you can use them to make your queries smarter, faster, and less likely to implode during month-end.
each: The Row-Level Workhorse
You’ve probably seen this little gremlin before:
Table.AddColumn(Source, "New Column", each [Amount] * 1.1)
At first glance, it’s like, “Cool, it adds a column.” But what the hell is each?
Think of each as shorthand for ‘for every row, do this.’ It’s how Power Query lets you apply logic to each record without having to write a full-blown function.

Real Example: Add a 10% markup
= Table.AddColumn(Source, "Marked Up", each [Amount] * 1.1)
Boom. One line, no drama, column added.

Pitfall: each can get vague fast
If you start chaining multiple functions, each gets cryptic fast. When that happens, break the logic out into an explicit function with a named parameter — it’s much easier to read and maintain:
= Table.AddColumn(Source, "Gross Margin", (row) => row[Revenue] - row[COGS])
Same result, but now it’s readable. (Your future self will thank you.)
_: The Mystery Placeholder
This one’s even sneakier.
The underscore _ is the implicit parameter that each creates — it stands for “the current value.” You’ll mostly see it when working with lists, not tables: in functions like List.Transform, _ refers to the item currently being processed. In fact, each expr is just shorthand for the one-parameter function (_) => expr.
Real Example: Clean up extra spaces
= List.Transform(TextList, each Text.Trim(_))
Here, _ refers to each item in TextList. You’re trimming whitespace from every string in the list. Short, sweet, and powerful.
Bonus: that’s exactly why _ works inside each — each is what brings _ into scope in the first place. Power Query loves layers of weird.
try…otherwise: Your Query’s Safety Net
Let’s say your data source randomly tosses you bad rows. Maybe a null. Maybe a bad data type. Maybe it just hates you.
Instead of crashing your whole query, you can wrap risky logic in try…otherwise and give it a fallback plan.
The idea: if the risky expression errors, try…otherwise returns whatever fallback you specify (often just null) instead of failing. The query keeps running, and you get a predictable result even when the data misbehaves.
Real Example: Handle a flaky date conversion
= Table.AddColumn(Source, "Parsed Date", each try Date.FromText([RawDate]) otherwise null)
If Power Query can’t convert the value to a date, it just returns null. No errors, no screaming into your pillow at 10 p.m.

You can even debug with try
Want to see what went wrong? Use:
= Table.AddColumn(Source, "Debug Info", each try Number.FromText([Amount]))
This gives you a record with fields like [HasError], [Error], and [Value]. Super helpful when you’re building complex logic.
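Since the try record only carries a Value field on success (and only an Error field on failure), branch on HasError before drilling in. A quick sketch — the Source table and its [Amount] column are hypothetical:

```
let
    // Without "otherwise", try returns a record: [HasError, Value] or [HasError, Error]
    WithDebug = Table.AddColumn(Source, "Debug Info", each try Number.FromText([Amount])),
    // Surface either the parsed number or the error message (mixed types, fine for debugging)
    WithResult = Table.AddColumn(WithDebug, "Result", each
        if [Debug Info][HasError]
        then "Failed: " & [Debug Info][Error][Message]
        else [Debug Info][Value])
in
    WithResult
```

The Error field is itself a record with Reason, Message, and Detail, so you can log exactly what went wrong, row by row.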
Case Study: Cleaning Up Messy Date Formats
I once had a “Received Date” column where half the entries were text like “06/01/2023” and the other half were weird Excel serial numbers from 2003.
If I tried to convert the whole thing directly, boom, query death.
Here’s what I did:
= Table.AddColumn(Source, "Cleaned Date", each try Date.From([Received Date]) otherwise null)
That one line saved me from manually filtering out 100+ bad rows. It gave me clean, usable dates—without babysitting.
List and Table Functions
Here’s something most finance folks don’t realize: Power Query is basically a data ninja hiding in Excel’s basement. And while you’re over there clicking “Remove Duplicates” like it’s 2012, Power Query’s list and table functions are doing backflips and precision strikes in the background.
Power Query ships with a deep library of list functions — some operate on a single list, others compare or combine several at once — plus table-level counterparts for profiling and partitioning. Most of them never surface in the ribbon.
Let’s break down some criminally underrated ones you can—and should—start using right now.
List.IsDistinct: Duplicate Checks Without Drama
You ever get that one Excel file where some genius decided to paste 40,000 rows of GL data… with duplicate rows scattered like confetti?
This function checks whether all values in a list are unique. It examines the original list and returns true if every single value appears only once. Think of it as the sanity check your reconciliation didn’t know it needed.

Example:
= List.IsDistinct({1001, 1002, 1003, 1002}) // Returns false
Real-World Use:
I used this during a vendor payment audit. Pulled a list of invoice numbers by supplier, and this function instantly flagged a supplier that submitted the same invoice ID five times. Caught it before AP even blinked.
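Here’s roughly what that per-supplier check looks like — group by supplier, then test each group’s invoice IDs. The table and column names (Invoices, Supplier, InvoiceID) are made up for the sketch:

```
let
    // Within a Table.Group aggregation, _ is the group's subtable,
    // and [InvoiceID] pulls that column out as a list
    BySupplier = Table.Group(Invoices, {"Supplier"}, {
        {"AllDistinct", each List.IsDistinct([InvoiceID]), type logical}
    }),
    // Keep only the suppliers with repeated invoice IDs
    Flagged = Table.SelectRows(BySupplier, each not [AllDistinct])
in
    Flagged
```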
List.ContainsAny: Fast Matching Across Lists
When you want to check if any value in one list appears in another, this one’s gold.
List.ContainsAny(list, values) returns true if any item in values appears in list — a one-line way to cross-check one list against another.

Example:
= List.ContainsAny({"Red", "Blue"}, {"Green", "Blue", "Yellow"}) // Returns true
Use Case:
I used this to flag expense lines that matched a “banned vendor list.” Instead of merging tables or writing long IF statements, it took one clean line of M code. Red flags up, no false alarms.
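A sketch of that one-line flag — the Expenses table and the banned list here are hypothetical:

```
let
    // Hypothetical banned vendor list, normalized to upper case
    BannedVendors = {"ACME LTD", "SHADY CO"},
    // Flag any expense row whose vendor appears on the banned list
    Flagged = Table.AddColumn(Expenses, "IsBanned",
        each List.ContainsAny(BannedVendors, {Text.Upper([Vendor])}),
        type logical)
in
    Flagged
```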
List.MatchesAll: All-or-Nothing Matching
Want to make sure every value in a list passes a condition?
This one is like quality control. Super useful when validating import files.
List.MatchesAll takes a list and a condition — a logical function applied to each element — and returns true only if every value passes. That lets you run custom validation over a whole column in one shot.
Example:
= List.MatchesAll({1, 2, 3}, each _ < 5) // Returns true
Use Case:
I used this for checking that every cost center in a journal entry matched valid codes from our master list. One bad value? The whole entry gets flagged. Saved us from booking to an old, deactivated CC.
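That validation boils down to one expression — pull the cost-center column as a list and test every value against the master list (the names here are invented):

```
let
    // Hypothetical master list of valid cost center codes
    ValidCCs = {"1001", "1002", "1003"},
    // True only if every cost center on the journal entry is a valid code
    EntryIsValid = List.MatchesAll(JournalEntry[CostCenter],
        each List.Contains(ValidCCs, _))
in
    EntryIsValid
```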
Table.Profile: Data Profiling Without Power BI
You can actually generate a summary table of column stats (like distinct count, nulls, min/max) right in Power Query. It’s like doing a data health check in one line.
The profile returns Min, Max, Average, StandardDeviation, Count, NullCount, and DistinctCount for each column — enough to spot mistyped fields and suspicious values before you build anything on top of them. Especially handy when you’re pulling from several different data sources.
Example:
= Table.Profile(Source)

Use Case:
I pulled in 6 months of sales data dumped from an ERP. This function instantly showed me which fields were 90% null, which had unexpected text values, and which columns were secretly booby-trapped. No clicks, just results.
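You can also query the profile itself. For example, to surface the mostly-null columns automatically (the 50% threshold is arbitrary):

```
let
    Profile = Table.Profile(Source),
    // Keep only columns where more than half the rows are null
    MostlyNull = Table.SelectRows(Profile, each [NullCount] > [Count] / 2)
in
    MostlyNull
```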
Table.ApproximateRowCount: When You Need a Sanity Check
Sometimes you just need to know roughly how many rows are coming through before you start doing anything heavy. This function gives a faster (but approximate) row count than the standard Table.RowCount. Heads-up: only some sources support it — if the connector can’t produce an estimate, fall back to Table.RowCount.
Example:
= Table.ApproximateRowCount(Source)
Use this early in your queries to decide if you need to buffer or sample the data. It’s great for massive tables where performance matters.
Table.Partition: Split Your Table Like a Boss
Want to break a giant table into logical partitions based on a column value (like region or month)? This is your move.
Example:
= Table.Partition(Source, "Region", 2, each if _ = "East" then 0 else 1)
It returns a list of tables, one per bucket. The hash function runs against each row’s "Region" value, and the result (modulo the number of groups) decides which table the row lands in. You can loop through them, apply logic, and reassemble.
Use Case:
I split a 200k-line sales report by region and applied separate filters to each. It helped build a refreshable P&L by region—clean and scalable.
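A sketch of that region split using Table.Partition’s documented signature — column name, number of buckets, and a hash function. The Sales table and the bucket logic are hypothetical:

```
let
    // East rows hash to bucket 0, everything else to bucket 1
    Parts = Table.Partition(Sales, "Region", 2, each if _ = "East" then 0 else 1),
    // Apply region-specific logic to one partition...
    EastTagged = Table.AddColumn(Parts{0}, "Book", each "East"),
    // ...then stitch the partitions back together
    Recombined = Table.Combine({EastTagged, Parts{1}})
in
    Recombined
```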
Case Study: Cleaning 12 GL Files With Different Vendors
I had 12 files from 12 vendors, all with “slightly different” formatting (which is code for “pure chaos”).
- I used Table.Profile to scan each one for nulls and text fields that shouldn’t exist.
- I used List.ContainsAny to make sure vendor codes matched our approved list.
- Then used Table.Partition to apply transformations by vendor format.
That combo saved me from 4 hours of manually checking each file and let me build a universal, flexible query that I could drop new files into every month.
By using these steps, you can combine data from multiple sources and achieve a clean, unified end result.
Custom Functions In Power Query
If you’ve ever copied the same transformation steps across five different queries, stop. Just… stop.
Power Query has a better way. It’s called custom functions, and once you learn how to use them, you’ll never go back to duplicating steps like it’s 2007.
Custom functions are written in M, the formula language behind Power Query. You typically park each one in its own query (sometimes called a function query) so it’s reusable across all your transformations.
Wait, What’s a Custom Function?
Think of it like this:
You know that one coworker who always formats vendor names just right? What if you could trap them in a reusable box and call on them whenever you needed?
A custom function is just a reusable recipe. It takes inputs, applies logic, and spits out clean data. You can plug it into any query, anywhere, and boom—your logic scales across files, tabs, or even months.
Step-by-Step: Build a Custom Function to Clean Vendor Names
Building a custom function boils down to a few steps: define the logic, handle the ugly edge cases, and wire it into your queries. Let’s walk through each one.
Let’s say we’ve got vendor names like:
“acme inc”, “Acme Inc.”, “ACME INC.”, null
You want to:
- Trim whitespace
- Capitalize correctly
- Handle nulls without blowing up your query
Here’s how to build the function.
Step 1: Create a New Blank Query
In Power Query Editor, go to:
Home > New Source > Blank Query
Then click Advanced Editor
Paste this:
let
    CleanVendorName = (Vendor as nullable text) as text =>
        let
            SafeVendor = if Vendor = null then "" else Vendor,
            Trimmed = Text.Trim(SafeVendor),
            Capitalized = Text.Proper(Trimmed)
        in
            Capitalized
in
    CleanVendorName
Rename the query something like: fnCleanVendorName
Now you’ve got a reusable function that:
- Handles nulls
- Trims spaces
- Converts to proper case (e.g., “acme inc” → “Acme Inc”)

Step 2: Use It in Another Query
Let’s say you’ve got a “Vendors” table. Just go to Add Column > Invoke Custom Function, and select fnCleanVendorName.
Or, do it directly with M code:
= Table.AddColumn(Source, "Cleaned Vendor", each fnCleanVendorName([Vendor]))
That’s it. One function. Reusable. Maintainable. Future-proof.
Case Study: Month-End Normalization Across 12 Reports
At a previous job, we had twelve different regional P&L files. Each had slightly different names for the same cost centers:
“Cost Ctr 1001”, “CC1001”, “1001 – Ops”
Instead of cleaning them manually in each file, I:
- Built a custom function fnNormalizeCostCenter with Text.Replace, Text.Trim, and some try…otherwise logic
- Dropped it into each region’s import query
- Boom. Uniform cost centers, no Excel drama
Next month? New files came in. I didn’t lift a finger. The function did its job.
Bonus Tips
- Custom functions can take multiple parameters—great for things like conditional logic or multi-column processing
- You can store them in a separate query group, so you keep your workbook clean
- Combine with parameters to make fully dynamic ETL steps (we’ll get to that in a future section)
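For example, a parser with an optional second parameter — the function name, the culture parameter, and the en-US default are illustrative, not from the article:

```
let
    fnParseCurrency = (Raw as nullable text, optional Culture as nullable text) as nullable number =>
        let
            // ?? coalesces a null culture to "en-US"; try...otherwise absorbs bad input
            Result = try Number.FromText(Raw, Culture ?? "en-US") otherwise null
        in
            Result
in
    fnParseCurrency
```

Callers that don’t care about locale can just write fnParseCurrency([Amount]); regional queries can pass their own culture code.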
For more information and practical tips, you can explore articles dedicated to Power Query custom functions, which provide up-to-date examples and technical guidance.
Build Your Own Power Query Toolkit
Okay, so now you’ve got a taste of custom functions. Great. But what if I told you there’s an even better move?
You don’t just build one-off functions. You build a library, your personal finance automation arsenal inside Power Query.
Once your functions live in one place, any query can reference them — which is what turns a pile of one-off fixes into a toolkit that’s dynamic, consistent, and maintainable.
This section is all about how to organize, store, and reuse custom functions across queries and files so your cleanup logic becomes scalable and actually maintainable. No more repeating yourself. No more spaghetti code.
What Is a Function Library?
A function library is just a fancy way of saying:
“I saved all my useful Power Query functions in one place and can call them whenever I want.”
It’s like having your own internal app store except instead of games, it’s got things like:
- Vendor name cleaners
- Cost center normalizers
- Custom date logic
- Sales tax calculators
- Comment field validators

Step-by-Step: How to Build Your Function Library
Let’s go full ninja with a modular structure you can drop into any finance workflow.
Step 1: Open Power BI and launch Power Query Editor.
Step 2: Create a new blank query for your function.
Step 3: Paste your M code into the query.
Note: You can use the formula bar in Power Query Editor to view or edit the M code for your functions. If the formula bar is not visible, enable it from the View tab.
Step 4: Name your function and save it.
Step 5: Create a Query Group Called Functions
In Power Query:
- Right-click in the Queries pane → New Group
- Name it Functions
Inside this group, you’ll add your reusable building blocks:
- fnCleanVendorName
- fnNormalizeCostCenter
- fnFixDate
- fnParseCurrency
- Anything you keep doing more than once
Step 6: Make Your Functions Flexible
Good functions take parameters. Great functions handle edge cases.
Example: fnFixDate
let
    fnFixDate = (RawDate as any) as nullable date =>
        let
            CleanDate = try Date.From(RawDate) otherwise null
        in
            CleanDate
in
    fnFixDate
This one saved my butt when dealing with Excel dumps where half the dates were numbers, half were strings, and a few were just chaos incarnate.
Step 7: Call Functions Dynamically in Other Queries
Let’s say you’ve imported a messy GL file. Here’s how you layer your library like a boss:
let
    FixedDates = Table.AddColumn(Source, "Fixed Date", each fnFixDate([TxnDate])),
    CleanVendors = Table.AddColumn(FixedDates, "Clean Vendor", each fnCleanVendorName([Vendor]))
in
    CleanVendors
Stack them. Reuse them. Brag about them to your team.
Case Study: Finance Close Automation
When I rebuilt our month-end reporting pipeline, I made a function library that included:
- fnFormatAccountNames (strips extra spaces, standardizes capitalization)
- fnApplyMappingTable (takes a chart of accounts and maps categories)
- fnFixZeroes (replaces blank/null/zero noise with real values)
Then, I dropped those functions into each region’s P&L query. Same logic. Same structure. Zero mess.
The best part? When the CFO wanted to change how accounts were grouped, I updated one function and it cascaded everywhere.
Folding & Performance-Tuning Secrets
So, you’ve got clean data. You’ve got reusable functions. But then… your query takes five minutes to load. You grab coffee, come back, and it’s still spinning. And you’re wondering if Power Query just gave up and walked off the job.
Chances are, your transformations broke query folding, or you’re pulling entire datasets into memory without buffering them.
Let’s fix that.
What Is Query Folding?
Query folding is when Power Query pushes your transformations back to the source system (like SQL Server, Snowflake, etc.), letting that system do the heavy lifting.
When folding works, it’s magic. Filtering, joins, aggregations — all done at the source, way faster than Excel or Power BI ever could.
When folding breaks, Power Query pulls the entire dataset in and applies transformations locally, which is like trying to clean your whole garage using a toothbrush.
How to Check If Folding Is Working
Right-click any step in your query → View Native Query.
If it’s grayed out? Folding is broken.
Table.Buffer: Your Emergency Brake
When Power Query starts trying to be too smart, like re-evaluating a step 50 times, Table.Buffer lets you lock in a result at a given point and prevent reprocessing.
Example:
let
    BufferedData = Table.Buffer(Source)
in
    BufferedData
Now Power Query grabs that data once and stops peeking upstream every time you add a new step.
When to use it:
- After importing from a slow source (like a network share or a flaky API)
- Before complex steps like joins or row-by-row logic
- When you’re noticing repetitive evaluation of earlier steps
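A typical pattern — buffer the smaller lookup table before a join so it isn’t re-evaluated on every pass. Table and column names here are hypothetical:

```
let
    // Pin the small lookup table in memory once
    Lookup = Table.Buffer(VendorMaster),
    // The join now reads the buffered copy instead of re-querying the source
    Joined = Table.NestedJoin(Transactions, {"VendorID"}, Lookup, {"VendorID"},
        "Vendor Details", JoinKind.LeftOuter)
in
    Joined
```

One caution: Table.Buffer also stops folding at that point, so only buffer after your foldable filtering is done.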
Table.StopFolding: Use With Caution
This function does exactly what it says: kills query folding on purpose. You almost never need this unless you’re intentionally forcing a transformation to happen locally for a very specific reason.
Usually, it’s used in Power BI when:
- A step must run after a non-foldable step
- You want to prevent source queries from executing long-running SQL
Example:
= Table.StopFolding(Source)
Think of this like hitting the “manual override” button. Use with caution—or just stick to Table.Buffer unless you really know what you’re doing.
Case Study: SQL Server Lag and Month-End Pain
I had a reporting flow that pulled from a SQL Server with half a million GL rows. Everything worked great—until someone added a custom column using if [Account] = “XYZ” logic halfway down the query.
Folding broke. Load times tripled.
Fix: I moved the logic earlier in the query (closer to the source), wrapped my staging step in Table.Buffer, and folding came back online. Load time dropped from 3 minutes to under 20 seconds.
Moral of the story: folding matters, and so does when you do your transformations.
Best Practices to Keep Things Fast
- Always filter early — the sooner you reduce rows, the better
- Avoid custom column logic on massive datasets unless you’ve buffered first
- Use Table.Buffer when combining large datasets
- Check folding status regularly — one innocent Add Column step can kill it
- When using Excel files as sources, clean them up or load them into a local table first
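Here’s what “filter early” looks like against a SQL source — the server, database, and column names are all made up, but the point stands: a Table.SelectRows placed right after the navigation step can fold into the SQL WHERE clause instead of running locally:

```
let
    // Hypothetical connection and table navigation
    Source = Sql.Database("server", "finance"),
    GL = Source{[Schema = "dbo", Item = "GL"]}[Data],
    // Filter and trim immediately so both steps fold back to SQL Server
    ThisYear = Table.SelectRows(GL, each [FiscalYear] = 2024),
    Trimmed = Table.SelectColumns(ThisYear, {"Account", "Amount", "TxnDate"})
in
    Trimmed
```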
Hidden UI-Free Tricks: What the Ribbon Doesn’t Show You
Let’s be honest — most people using Power Query never touch the Advanced Editor. And I get it. It looks scary. It smells like code. But under the hood? That’s where the magic lives.
All of these require writing M by hand — but once you’re comfortable with the syntax, you can build transformations that go far beyond anything the interface offers.
There are a bunch of transformations and techniques you cannot do from the UI — at least not directly. These are the tricks that separate the weekend warriors from the automation assassins.
Let’s get into the stuff the interface keeps secret.
Trick #1: Parameterized Functions With Multiple Inputs
You can’t build functions with optional or multiple parameters from the GUI. But in the Advanced Editor? Totally doable.
Example: A flexible cost center normalizer
let
    NormalizeCC = (CC as nullable text, Region as text) as text =>
        let
            // Coalesce null to "" so Text.Trim doesn't throw on missing cost centers
            CleanCC = Text.Upper(Text.Trim(CC ?? "")),
            Result = if Region = "US" then Text.Start(CleanCC, 4) else CleanCC
        in
            Result
in
    NormalizeCC
You just created a dynamic function that cleans cost centers differently depending on region. Try doing that with a right-click.
Trick #2: Conditional Columns With Complex Logic
The UI gives you a basic IF…ELSE builder, which is fine if you’re doing grade school logic. But anything more complex? You’re going to need to write it yourself.
Example: Multi-condition status flag
= Table.AddColumn(Source, "Status", each
    if [Amount] = null then "Missing"
    else if [Amount] = 0 then "Zero"
    else if [Amount] < 0 then "Credit"
    else "Debit")
This is readable, flexible, and faster than clicking “Add Conditional Column” 40 times.
Trick #3: Custom Error Handling & Debugging With try Blocks
The UI can’t give you this level of control. This is where try…otherwise shines.
Example: Capture failed transformations and show the error
= Table.AddColumn(Source, "Debug", each try Number.FromText([Input]) otherwise "Error: " & [Input])
You’re not just avoiding a crash—you’re logging the problem inline.
Perfect for dirty vendor files, half-complete journal uploads, or weird exports from legacy systems.
Trick #4: Inline Record Construction
Sometimes you need to build a small lookup table or config on the fly. You don’t need to import a file — you can just hard-code a record or table inline.
Example: Create a mini mapping table
let
    AccountMap = #table(
        {"Old", "New"},
        {
            {"1000", "Revenue"},
            {"2000", "COGS"},
            {"3000", "Operating Expense"}
        }
    )
in
    AccountMap
This is a great way to build internal mapping logic, date ranges, or quick tests without needing a separate Excel sheet.
Trick #5: Recursive Functions
Yeah… we’re going there.
Let’s say you want to flatten a nested structure, or loop through some specified position until a condition is met. You can write a recursive function in M.
⚠️ Advanced Example:
let
    RecursiveSum = (list as list, acc as number) =>
        if List.IsEmpty(list)
        then acc
        else @RecursiveSum(List.Skip(list, 1), acc + List.First(list))
in
    RecursiveSum({1, 2, 3, 4}, 0)
This returns 10 — the sum of the list — calculated via recursion. Is it overkill for most use cases? Sure. But knowing you can do it? That’s powerful.
Case Study: The UI Just Couldn’t Help Me
At one point, I was building a cleanup flow for an AP system where:
- Vendor names had different formats
- The “Net Amount” field was sometimes negative (refunds), sometimes zero (voids), sometimes null (???)
- We needed to classify them and return a comment explaining what happened
The GUI gave up. I built a single AddColumn step in the Advanced Editor using if, try, and multiple conditions. Worked like a charm. No UI in the world could’ve built that logic tree.
Putting It All Together With A Full Workflow
Let’s stop talking theory. I’m going to walk you through a real, ugly, “why-is-this-still-a-thing” scenario from my month-end war chest — and show you how I used Power Query’s secret weapons to clean it up and automate it.
Scenario: Month-End Sales Reconciliation Across 6 Regions
The setup:
Every region sends me their monthly sales report in Microsoft Excel. And of course, no two files are the same:
- Different column orders
- Wildly inconsistent naming (e.g., “Customer ID” vs “Cust#”)
- Dates formatted like cryptic riddles
- Some files have 12 tabs, others only 1
- One guy sends a CSV with zero headers. A CSV. In 2025.
Before Power Query, this mess took me 2 full days of copy-pasting, cross-checking, and praying to the VLOOKUP gods.
Now? It takes about 30 minutes — and I mostly just hit “Refresh.”
Here’s the breakdown.
Step 1: Normalize the Source Files
Each region’s report goes in a shared folder.
I set up a folder connection in Power Query to pull them all in dynamically:
= Folder.Files("C:\SalesReports\Monthly")
Then I filter out temporary or backup files, and drill into the actual data tabs.
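That filtering-and-drilling step looks something like this — the lock-file and extension rules are examples, so adjust them for your folder:

```
let
    Files = Folder.Files("C:\SalesReports\Monthly"),
    // Drop Excel lock files (~$...) and anything that isn't a workbook
    Keep = Table.SelectRows(Files, each
        not Text.StartsWith([Name], "~$") and Text.EndsWith([Name], ".xlsx")),
    // Drill into each workbook's sheets/tables, treating row 1 as headers
    WithData = Table.AddColumn(Keep, "Workbook", each Excel.Workbook([Content], true))
in
    WithData
```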
Step 2: Clean and Standardize Column Names
I use a custom function, fnStandardizeHeaders, that:
- Replaces spaces with underscores
- Standardizes known headers (e.g., maps “Cust#”, “Customer_ID”, and “ClientID” to just “CustomerID”)
- Trims whitespace
= Table.TransformColumnNames(Source, each fnStandardizeHeaders(_))
Step 3: Clean Up Dates With try…otherwise
Every region has its own way of formatting transaction dates — some use MM/DD/YYYY, others send Excel serial numbers. To handle this, I use:
= Table.AddColumn(Source, "CleanDate", each try Date.From([TxnDate]) otherwise null)
Now I have a reliable CleanDate column no matter what chaos they send me.
Step 4: Classify Transactions With Conditional Logic
Some regions include adjustments or test data. I use a multi-condition if statement to flag anything that looks suspicious:
= Table.AddColumn(Source, "TxnType", each
    if [Amount] = null then "Missing"
    else if [Amount] < 0 then "Return"
    else if Text.Contains([Comment], "test") then "Test"
    else "Sale")
No way to do this from the UI cleanly. Advanced Editor all the way.
Step 5: Apply Reusable Cleanup Functions
I drop in the rest of my function library:
- fnCleanCustomerName
- fnFixAmountField
- fnValidateSalesRep
Each gets added with AddColumn steps that standardize the fields and handle nulls/errors automatically.
This lets me scale the logic across every file, every time, with zero manual steps.
Step 6: Use Table.Buffer for Stability
After all the cleanup, I lock the intermediate table with:
= Table.Buffer(CleanedTable)
This keeps Power Query from re-running expensive folder scans or recalculating formulas multiple times.
Step 7: Combine, Filter, and Load
I append all cleaned regional files into one final table, filter for valid transactions, and load it to Excel (or Power BI) as a single unified data set.
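The final assembly is short — the regional query names and the filter conditions below are placeholders for your own:

```
let
    // Each name refers to a cleaned regional query from the earlier steps
    AllRegions = Table.Combine({East, West, North, South, Central, LATAM}),
    // Keep only real sales with a usable date
    ValidOnly = Table.SelectRows(AllRegions, each
        [TxnType] = "Sale" and [CleanDate] <> null)
in
    ValidOnly
```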
The Result
- Time saved: Down from 2 days to 30 minutes
- Errors avoided: No more mismatched customer names or mystery dates
- Scalability: New region added last month? Dropped their file in the folder and it worked out of the box
- Boss reaction: “Wait, this is automated now? Why didn’t we do this sooner?”
