Have you ever pasted a small table into Power Query using Enter Data and noticed the long binary string in the M code? That gibberish is actually a compressed version of your data. What if you could do the reverse? Turn any table into one of those encoded strings?
In this post, you’ll see how Power Query stores Enter Data tables and learn to convert your own tables into that same format. That way, you can keep static data in your queries without relying on a source connection.
Enter Data and Binary Compressed Text
Power Query has a feature called Enter Data. It allows you to paste tabular data into the UI, and Power Query will automatically generate the M code needed to reproduce that table.
This is useful when you want to hard-code a small dataset directly into your query logic. But have you ever looked at the M code behind it and wondered what all that binary gibberish means?
Entering a Simple Table
Suppose you enter the following table via the Enter Data window:

Once you press OK, Power Query generates code like this:
Table.FromRows(
    Json.Document(
        Binary.Decompress(
            Binary.FromText(
                "i45WSlTSUTI21DU00jUyMDIBcgwNlGJ1opWSgEwDQ10gAoqbAjlGEPFkENNU18AMpt7UVCk2FgA=",
                BinaryEncoding.Base64
            ),
            Compression.Deflate
        )
    ),
    let
        _t = ((type nullable text) meta [Serialized.Text = true])
    in
        type table [ID = _t, Date = _t, Amount = _t]
)
That long Base64 string? It’s a compressed representation of your data. Power Query converts the table to JSON, compresses it, encodes it in Base64, and stores it inside your query. No external source needed.
This is how Enter Data works behind the scenes. It creates the table below.

Important
Notice that this table has different columns and values, and that each column is of type text.
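If you want to see this pipeline in action outside of Power Query, you can reproduce it in other languages too. Below is a minimal Python sketch (the sample rows and function name are my own, not from Power Query) that mirrors the same three steps: serialize the rows to JSON, compress with raw Deflate, and Base64-encode the result.

```python
import base64
import json
import zlib

def encode_rows(rows):
    """Mirror the Enter Data encoding: JSON -> raw Deflate -> Base64 text."""
    json_bytes = json.dumps(rows, separators=(",", ":")).encode("utf-8")
    # wbits=-15 produces a raw Deflate stream without a zlib header,
    # which matches Compression.Deflate in Power Query.
    compressor = zlib.compressobj(9, zlib.DEFLATED, -15)
    deflated = compressor.compress(json_bytes) + compressor.flush()
    return base64.b64encode(deflated).decode("ascii")

# Enter Data stores every value as text, so all sample values are strings.
rows = [["1", "2023-01-01", "100"], ["2", "2023-01-02", "250"]]
encoded = encode_rows(rows)
print(encoded)
```

Because compression libraries can differ in how aggressively they deflate, the exact string may not match Power Query's byte for byte, but it decompresses to the same JSON.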
Reversing the Logic: From Table to Binary
What if you want to do the opposite?
Imagine you have a table already loaded in Power Query (from any source — Excel, SQL, etc.), and you want to convert that table into a Base64-encoded string, just like the Enter Data feature does.
Why would you want that?
- To embed static data directly into your queries
- To share a query without requiring access to the original source
- To copy/paste data into other reports without external dependencies
Here’s the code that converts a table to a binary string:
let
    tbl = Source,                                      // Any table you want to encode
    rows = Table.ToRows(tbl),                          // Convert to a list of rows
    json = Json.FromValue(rows),                       // Serialize to JSON (as binary)
    bin1 = Binary.Compress(json, Compression.Deflate), // Compress the JSON
    bin2 = Binary.ToText(bin1, BinaryEncoding.Base64)  // Encode as Base64 text
in
    bin2
Remember that long string we used earlier to create this table? That exact string is now returned, based on nothing but the input table. In our case, that is:
"i45WSlTSUTI21DU00jUyMDIBcgwNlGJ1opWSgEwDQ10gAoqbAjlGEPFkENNU18AMpt7UVCk2FgA="
You can then use it in combination with Binary.FromText and Binary.Decompress (the ones we used earlier) to recreate the table, just like Power Query does internally.
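The decoding direction can be sketched in Python as well. The function below (my own naming, assuming a string produced by Deflate compression as above) mirrors Binary.FromText, Binary.Decompress, and Json.Document in turn:

```python
import base64
import json
import zlib

def decode_rows(base64_text):
    """Mirror Binary.FromText -> Binary.Decompress -> Json.Document:
    Base64 text -> raw Deflate stream -> list of rows."""
    deflated = base64.b64decode(base64_text)
    # wbits=-15: raw Deflate stream, as produced by Compression.Deflate.
    json_bytes = zlib.decompress(deflated, -15)
    return json.loads(json_bytes.decode("utf-8"))

# Round trip with a locally encoded sample (illustrative values only).
compressor = zlib.compressobj(9, zlib.DEFLATED, -15)
sample = compressor.compress(json.dumps([["1", "100"]]).encode("utf-8")) + compressor.flush()
encoded = base64.b64encode(sample).decode("ascii")
print(decode_rows(encoded))  # → [['1', '100']]
```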
Note
The original code also defines the column types. If you omit the type argument in Table.FromRows, Power Query creates a table with generic column names (Column1, Column2, and so on) and no type information, which may not be usable without further adjustments.
Portability
This technique lets you extract data from any source and bake it into your query logic.
That means you can:
- Move your queries between files or environments without worrying about source access.
- Store static datasets as part of your code.
- Send someone a query that contains data but doesn’t require refresh credentials.
There are limitations, of course:
- The data is static; changes in the original source won't be reflected
- You’re manually maintaining a “snapshot” of your table
- For large datasets, this approach can get cumbersome
But for prototyping, sharing, or small tables? It’s a powerful trick.
Custom Function
If you want a custom function that turns your table into this encoded string and also includes the table type, you can use the code below:
(tbl as table) as text =>
let
    // 1) Encode rows to Base64 (Deflate-compressed JSON)
    rowList_lst = Table.ToRows(tbl),
    jsonBin_bin = Json.FromValue(rowList_lst),
    deflatedBin_bin = Binary.Compress(jsonBin_bin, Compression.Deflate),
    base64_txt = Binary.ToText(deflatedBin_bin, BinaryEncoding.Base64),

    // 2) Build the table-type text from the input's schema
    schema_tbl = Type.TableSchema(Value.Type(tbl)),
    nameKind_tbl = Table.SelectColumns(schema_tbl, {"Name", "Kind"}),
    typePairs_lst = List.Transform(
        Table.ToRows(nameKind_tbl),
        each _{0} & " = " & Text.Lower(_{1})
    ),
    typeSpec_txt = "type table [" & Text.Combine(typePairs_lst, ", ") & "]",

    // 3) Compose the final code snippet
    code_txt =
        Text.Format(
            "Table.FromRows( Json.Document( Binary.Decompress( Binary.FromText(""#{0}"", BinaryEncoding.Base64), Compression.Deflate ) ), #(lf) #{1} )",
            { base64_txt, typeSpec_txt }
        )
in
    code_txt
When you apply this function to a table, you receive a text value. You can copy-paste that text into a new query to rebuild the table, column types included.
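For comparison, the same snippet-building logic can be sketched in Python. Since Python has no equivalent of Type.TableSchema, the column kinds are passed in by hand here as (name, kind) pairs; treat the function name and schema format as assumptions for illustration only.

```python
import base64
import json
import zlib

def table_to_m(rows, schema):
    """Build an M snippet like the custom function above.
    `schema` is a list of (column_name, kind) pairs, e.g. ("ID", "text")."""
    # Encode rows: JSON -> raw Deflate -> Base64 (matches Compression.Deflate).
    compressor = zlib.compressobj(9, zlib.DEFLATED, -15)
    deflated = compressor.compress(json.dumps(rows).encode("utf-8")) + compressor.flush()
    b64 = base64.b64encode(deflated).decode("ascii")
    # Build the table-type text from the supplied schema.
    type_spec = "type table [" + ", ".join(
        f"{name} = {kind.lower()}" for name, kind in schema
    ) + "]"
    # Compose the final M code snippet.
    return (
        'Table.FromRows( Json.Document( Binary.Decompress( '
        f'Binary.FromText("{b64}", BinaryEncoding.Base64), '
        'Compression.Deflate ) ),\n'
        f'{type_spec} )'
    )

snippet = table_to_m(
    [["1", "100"], ["2", "250"]],
    [("ID", "text"), ("Amount", "text")],
)
print(snippet)
```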
Conclusion
Embedding data as a Base64-encoded string in Power Query is a fun trick that gives you control over static tables in a portable way. By understanding how Enter Data works behind the scenes, you can reverse-engineer or recreate that same compressed binary format yourself.
This means you can build reusable, self-contained queries that don’t rely on a live connection. Just remember, the data won’t auto-update. It’s a fixed snapshot.