OData through Power BI
Extracting data
In Power Query, there are two ways to call an OData API:
OData.Feed
OData.Feed("https://<companyname>.dynamics.com/data/")
This tends to be slow. It returns a navigation table listing all of the tables that are available; to access the data, you navigate to and expand the relevant tables.
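For example, a minimal sketch of navigating to one entity after the feed loads (the entity name CustomersV3 is illustrative, and the exact navigation step Power Query generates may differ in your environment):
let
    // Load the navigation table of all exposed entities
    Source = OData.Feed("https://<companyname>.dynamics.com/data/"),
    // Navigate to one entity; Power Query generates a step like this when you click through
    Customers = Source{[Name = "CustomersV3", Signature = "table"]}[Data]
in
    Customers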
Apparently you can pass query parameters to it, but I tend to avoid doing so with this function because the syntax is complicated.
Web.Contents
Web.Contents(
    "https://<companyname>.dynamics.com/data/",
    [
        RelativePath = "<tablename>",
        Query = [
            #"$select" = "<field1>,<field2>",
            #"$filter" = "<filterexpression>",
            #"$count" = "true",
            #"$odata.maxpagesize" = "<page size as text>",
            #"$skip" = "<skip n rows as text>",
            #"$top" = "<top n rows as text>"
        ]
    ]
)
This tends to be faster than OData.Feed. It hits a single table only and allows actual OData query options in the request. It returns a text (JSON) response that needs to be parsed and converted to a table.
Note that the tables that are accessible through OData are not the same as the raw system tables, and carry less information (fewer fields). If they need to be changed at all, that has to be done as a modification to the F&O environment.
Remember, when using Web.Contents, to use the proper syntax for parameters (RelativePath and Query) instead of dynamically composing the URL with the & operator. It will not refresh in Power BI Service otherwise (see here).
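As a sketch of the difference (the table name and $top value are just placeholders):
let
    // Dynamically composed URL: works in Power BI Desktop, but the Service
    // treats it as a dynamic data source and refuses to refresh it
    bad = Web.Contents("https://<companyname>.dynamics.com/data/" & "<tablename>" & "?$top=10"),
    // Static base URL with RelativePath and Query: refreshes fine in the Service
    good = Web.Contents(
        "https://<companyname>.dynamics.com/data/",
        [ RelativePath = "<tablename>", Query = [ #"$top" = "10" ] ]
    )
in
    good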
Note also that, for Dynamics F&O, there seems to be a 10,000 row limit (i.e. the maxpagesize) when requesting data through Web.Contents.
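For example, a small helper function along these lines works (a sketch; it assumes odataUrl is defined elsewhere in the query as the base /data/ URL):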
(entity as text, fields as nullable list) =>
    let
        // odataUrl is the base /data/ URL, defined elsewhere (e.g. as a parameter query)
        Source =
            if fields = null or List.Count(fields) = 0
            then Web.Contents(odataUrl, [ RelativePath = entity ])
            else Web.Contents(odataUrl, [ RelativePath = entity, Query = [ #"$select" = Text.Combine(fields, ",") ] ]),
        // Parse the JSON response and keep the "value" array of records
        json = Json.Document(Source),
        value = json[value],
        table = Table.FromRecords(value)
    in
        table
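If this is saved as a query named, say, GetEntity (the name is arbitrary), it can be invoked like GetEntity("<tablename>", {"<field1>", "<field2>"}), or with null as the second argument to fetch all fields.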
Extracting big data
Using Power Query
For big tables that are beyond the API limit, do this:
- Ask the API how many rows are in the table
- Make a list of numbers so that the data set is segmented, e.g. if the API limit is 100 and there are 550 rows, it creates the array [0, 100, 200, 300, 400, 500]. The algorithm then skips this number at every iteration, effectively getting the first 100, then the next 100, and so on until it has everything (see the snippet after this list)
- Pull in all of the segments
- Combine them all together
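For example, the segmentation array from the 550-row example can be generated like this:
// returns {0, 100, 200, 300, 400, 500}: one starting offset per page of 100 rows
List.Generate(() => 0, each _ < 550, each _ + 100)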
Note that, if you are working in Power BI and publishing to Power BI Service, you cannot use this as a function; the Service will refuse it on the grounds that it is a dynamic data source. Paste the whole algorithm into each query instead.
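Written out as a function (usable in Power BI Desktop, or anywhere the dynamic-data-source restriction does not apply), the algorithm looks like this. The options parameter is a record of extra query parameters such as $select or $filter: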
(url as text, entity as text, interval as number, options as record) =>
    let
        // Ask the API how many rows are in the entity ($top=0 so no data is returned)
        count = Json.Document(
            Web.Contents(
                url,
                [
                    RelativePath = entity,
                    Query = [ #"$top" = "0", #"$count" = "true" ] & options
                ]
            )
        )[#"@odata.count"],
        // Build the list of offsets to skip: 0, interval, 2 * interval, ...
        initArray = List.Transform(
            List.Generate(() => 0, each _ < count, each _ + interval),
            each Text.From(_)
        ),
        // Pull each segment and keep the "value" array of records
        get = List.Transform(
            initArray,
            each Json.Document(
                Web.Contents(
                    url,
                    [
                        RelativePath = entity,
                        Query = [
                            #"$skip" = _,
                            #"$top" = Text.From(interval)
                        ] & options
                    ]
                )
            )[value]
        ),
        // Combine the segments into one list and convert to a table
        combine = List.Combine(get),
        tabulate = Table.FromRecords(combine)
    in
        tabulate
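If the function above is saved as a query named, say, GetAllRows (the name is arbitrary), an invocation might look like this, using the 10,000-row F&O page size mentioned earlier:
GetAllRows(
    "https://<companyname>.dynamics.com/data/",
    "<tablename>",
    10000,
    [ #"$select" = "<field1>,<field2>" ]
)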
Using Power Automate
The same technique as above is also possible using Power Automate. However, the in-built OData connector does not support the $count parameter, so there is no way of "pre-counting" how many records to expect. Thus, we need to predict how many cycles we will need to loop through (or find out how many records are in the entity first, by some other means).
In the example below, we will cycle through 5 pages to get all of the data. The maxpagesize is 10,000.
- Create an array called 'Index': [0, 1, 2, 3, 4]
- Initialise an integer variable called 'count' (it needs to start at or above the maxpagesize, e.g. 10,000, so that the first iteration pulls data)
- For every element in 'Index':
  - Compose 'Skip', which is the current value of 'Index' times the maxpagesize: mul(items('For_every_index'), 10000)
  - If variable 'count' is less than the maxpagesize:
    - Do nothing. This means that we've reached the last page, because the number of records returned was less than the maxpagesize
  - Otherwise, start pulling data:
    - Use the action 'List items present in table', part of the 'Fin & Ops (Dynamics 365)' premium connector
    - As parameters, set Top Count as the maxpagesize, and set Skip Count as outputs('Skip')
    - Set variable 'count' to the count of the data: length(outputs('List_items_present_in_table').body.value)
    - If the data has any rows:
      - Write it into a SharePoint file, or whatever else you want to do with it
    - Otherwise, do nothing (it has no rows)
Enums
OData APIs like the one for Dynamics F&O have fields in their data entities that are enums. The implication is that you cannot use simple string syntax to refer to them in filter expressions.
For example, if there is a field called Status and it is an enum containing the values Active, On Hold, and Inactive, it will not suffice to specify a filter condition that reads
$filter=Status eq 'Active'
This is because Status is not of type string.
You need to go to the metadata, which is a large, comprehensive XML file, and find the exact type for that field. The metadata is at https://<url>/data/$metadata. Look for a block that reads something like:
<Property Name="Status" Type="Microsoft.Dynamics.DataEntities.Status">
    ...
    <EnumMember>Microsoft.Dynamics.OData.Core.V1.AXType/Enum</EnumMember>
</Property>
From here, we can see the Type property for this field. The filter expression should thus be:
$filter=Status eq Microsoft.Dynamics.DataEntities.Status'Active'
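Putting this together with Web.Contents (a sketch; the table name is a placeholder, and the enum type should be whatever the metadata shows for your field):
Web.Contents(
    "https://<companyname>.dynamics.com/data/",
    [
        RelativePath = "<tablename>",
        // Enum values are qualified with the full type name taken from the metadata
        Query = [ #"$filter" = "Status eq Microsoft.Dynamics.DataEntities.Status'Active'" ]
    ]
)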