Supercharge your Microsoft Fabric development with the fabric_cat_tools library
fabric_cat_tools

fabric_cat_tools is a Python library intended for use in Microsoft Fabric notebooks. It was originally created to hold functions for migrating semantic models to Direct Lake mode. However, it quickly became apparent that such a library could support many other useful activities involving semantic models, reports, lakehouses, and Fabric in general. As such, the library contains a variety of functions, ranging from running Vertipaq Analyzer or the Best Practice Analyzer against a semantic model, to checking whether any lakehouse tables hit Direct Lake guardrails, to accessing the Tabular Object Model, and more!

Instructions for migrating import/DirectQuery semantic models to Direct Lake mode can be found here.

If you encounter any issues, please raise a bug.

If you have ideas for new features/functions, please request a feature.

Install the .whl file in a Fabric notebook

%pip install "https://raw.githubusercontent.com/m-kovalsky/fabric_cat_tools/main/fabric_cat_tools-0.4.1-py3-none-any.whl"

Once installed, run this code to import the library into your notebook

import fabric_cat_tools as fct
from fabric_cat_tools.TOM import connect_semantic_model

Load fabric_cat_tools into a custom Fabric environment

An even better way to ensure the fabric_cat_tools library is available in your workspace/notebooks is to load it as a library in a custom Fabric environment. If you do this, you will not have to run the above '%pip install' code every time in your notebook. Please follow the steps below.

Create a custom environment

  1. Navigate to your Fabric workspace
  2. Click 'New' -> More options
  3. Within 'Data Science', click 'Environment'
  4. Name your environment, click 'Create'

Add fabric_cat_tools as a library to the environment

  1. Download the latest fabric_cat_tools library
  2. Within 'Custom Libraries', click 'upload'
  3. Upload the .whl file which was downloaded in step 1
  4. Click 'Save' at the top right of the screen
  5. Click 'Publish' at the top right of the screen
  6. Click 'Publish All'

Update your notebook to use the new environment (must wait for the environment to finish publishing)

  1. Navigate to your Notebook
  2. Select your newly created environment within the 'Environment' drop down in the navigation bar at the top of the notebook

Function Categories

Semantic Model

Report

Model Optimization

Direct Lake Migration

Direct Lake

Lakehouse

Helper Functions

'All' functions for non-parent objects within TOM

'Add' functions

'Set' functions

'Remove' functions

'Used-in' and dependency functions

Vertipaq Analyzer data functions

Perspectives

Annotations

Extended Properties

Misc

Functions

cancel_dataset_refresh

Cancels the refresh of a semantic model which was executed via the Enhanced Refresh API.

import fabric_cat_tools as fct
fct.cancel_dataset_refresh(
            dataset = 'AdventureWorks',
            #request_id = None,
            #workspace = None
            )

Parameters

dataset str

Required; Name of the semantic model.

request_id str

Optional; The request id of a semantic model refresh. Defaults to finding the latest active refresh of the semantic model.

workspace str

Optional; The workspace where the semantic model resides.

Returns

A printout stating the success/failure of the operation.


check_fallback_reason

Shows the reason a table in a Direct Lake semantic model would fallback to Direct Query.

Note

This function is only relevant to semantic models in Direct Lake mode.

import fabric_cat_tools as fct
fct.check_fallback_reason(
            dataset = 'AdventureWorks',
            #workspace = None
            )

Parameters

dataset str

Required; Name of the semantic model.

workspace str

Optional; The workspace where the semantic model resides.

Returns

Pandas dataframe showing the tables in the semantic model and their fallback reason.


clear_cache

Clears the cache of a semantic model.

import fabric_cat_tools as fct
fct.clear_cache(
            dataset = 'AdventureWorks',
            #workspace = '' 
            )

Parameters

dataset str

Required; Name of the semantic model.

workspace str

Optional; The workspace where the semantic model resides.

Returns

A printout stating the success/failure of the operation.


clone_report

Makes a clone of a Power BI report.

import fabric_cat_tools as fct
fct.clone_report(
            report = 'MyReport',
            cloned_report = 'MyNewReport',
            #workspace = None,
            #target_workspace = None,
            #target_dataset = None
            )

Parameters

report str

Required; Name of the report to be cloned.

cloned_report str

Required; Name of the new report.

workspace str

Optional; The workspace where the original report resides.

target_workspace str

Optional; The workspace where the new report will reside. Defaults to using the workspace in which the original report resides.

target_dataset str

Optional; The semantic model from which the new report will be connected. Defaults to using the semantic model used by the original report.

Returns

A printout stating the success/failure of the operation.


control_fallback

Set the DirectLakeBehavior for a semantic model.

Note

This function is only relevant to semantic models in Direct Lake mode.

import fabric_cat_tools as fct
fct.control_fallback(
            dataset = 'AdventureWorks',
            direct_lake_behavior = 'DirectLakeOnly',
            #workspace = None
            )

Parameters

dataset str

Required; Name of the semantic model.

direct_lake_behavior str

Required; Setting for Direct Lake Behavior. Options: ('Automatic', 'DirectLakeOnly', 'DirectQueryOnly').

workspace str

Optional; The workspace where the semantic model resides.

Returns

A printout stating the success/failure of the operation.
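
Since direct_lake_behavior accepts only the three values listed above, a small client-side guard can catch typos before the call ever reaches the service. This helper is purely illustrative and not part of fabric_cat_tools:

```python
# Illustrative guard for the 'direct_lake_behavior' argument; not part of
# fabric_cat_tools. The valid options are taken from the parameter list above.
VALID_BEHAVIORS = ('Automatic', 'DirectLakeOnly', 'DirectQueryOnly')

def check_direct_lake_behavior(value: str) -> str:
    # Raise early on an invalid (e.g. misspelled or miscased) option.
    if value not in VALID_BEHAVIORS:
        raise ValueError(
            f"direct_lake_behavior must be one of {VALID_BEHAVIORS}, got {value!r}"
        )
    return value
```

For example, check_direct_lake_behavior('DirectLakeOnly') returns the value unchanged, while 'directlakeonly' raises a ValueError.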


create_blank_semantic_model

Creates a new blank semantic model (no tables/columns etc.).

import fabric_cat_tools as fct
fct.create_blank_semantic_model(
            dataset = 'AdventureWorks',
            #workspace = None
            )

Parameters

dataset str

Required; Name of the semantic model.

compatibility_level int

Optional; Setting for the compatibility level of the semantic model. Default value: 1605.

workspace str

Optional; The workspace where the semantic model resides.

Returns

A printout stating the success/failure of the operation.


create_pqt_file

Dynamically generates a Power Query Template file based on the semantic model. The .pqt file is saved within the Files section of your lakehouse.

import fabric_cat_tools as fct
fct.create_pqt_file(
            dataset = 'AdventureWorks',
            #file_name = 'PowerQueryTemplate',
            #workspace = None
            )

Parameters

dataset str

Required; Name of the import/DirectQuery semantic model.

file_name str

Optional; Name of the Power Query Template (.pqt) file to be created.

workspace str

Optional; The workspace where the semantic model resides.

Returns

A printout stating the success/failure of the operation.


create_report_from_reportjson

Creates a report based on a report.json file (and an optional themes.json file).

import fabric_cat_tools as fct
fct.create_report_from_reportjson(
            report = 'MyReport',
            dataset = 'AdventureWorks',
            report_json = '',
            #theme_json = '',
            #workspace = None
            )

Parameters

report str

Required; Name of the report.

dataset str

Required; Name of the semantic model to connect to the report.

report_json Dict or str

Required; The report.json file to be used to create the report.

theme_json Dict or str

Optional; The theme.json file to be used for the theme of the report.

workspace str

Optional; The workspace where the semantic model resides.

Returns

A printout stating the success/failure of the operation.


create_semantic_model_from_bim

Creates a new semantic model based on a Model.bim file.

import fabric_cat_tools as fct
fct.create_semantic_model_from_bim(
            dataset = 'AdventureWorks',
            bim_file = '',
            #workspace = ''
            )

Parameters

dataset str

Required; Name of the semantic model.

bim_file Dict or str

Required; The model.bim file to be used to create the semantic model.

workspace str

Optional; The workspace where the semantic model resides.

Returns

A printout stating the success/failure of the operation.


create_shortcut_onelake

Creates a shortcut to a delta table in OneLake.

import fabric_cat_tools as fct
fct.create_shortcut_onelake(
            table_name = 'DimCalendar',
            source_lakehouse = 'Lakehouse1',
            source_workspace = 'Workspace1',
            destination_lakehouse = 'Lakehouse2',
            #destination_workspace = '',
            shortcut_name = 'Calendar'
            )

Parameters

table_name str

Required; The table name for which a shortcut will be created.

source_lakehouse str

Required; The lakehouse in which the table resides.

source_workspace str

Required; The workspace where the source lakehouse resides.

destination_lakehouse str

Required; The lakehouse where the shortcut will be created.

destination_workspace str

Optional; The workspace in which the shortcut will be created. Defaults to the 'source_workspace' parameter value.

shortcut_name str

Optional; The name of the shortcut 'table' to be created. Defaults to the 'table_name' parameter value.

Returns

A printout stating the success/failure of the operation.


create_warehouse

Creates a warehouse in Fabric.

import fabric_cat_tools as fct
fct.create_warehouse(
            warehouse = 'MyWarehouse',
            workspace = None
            )

Parameters

warehouse str

Required; Name of the warehouse.

description str

Optional; Description of the warehouse.

workspace str

Optional; The workspace where the warehouse will reside.

Returns

A printout stating the success/failure of the operation.


delete_shortcut

Deletes a OneLake shortcut.

import fabric_cat_tools as fct
fct.delete_shortcut(
            shortcut_name = 'DimCalendar',
            lakehouse = 'Lakehouse1',
            workspace = 'Workspace1'
            )

Parameters

shortcut_name str

Required; The name of the OneLake shortcut to delete.

lakehouse str

Optional; The lakehouse in which the shortcut resides.

workspace str

Optional; The workspace where the lakehouse resides.

Returns

A printout stating the success/failure of the operation.


direct_lake_schema_compare

Checks that all the tables in a Direct Lake semantic model map to tables in their corresponding lakehouse and that the columns in each table exist.

Note

This function is only relevant to semantic models in Direct Lake mode.

import fabric_cat_tools as fct
fct.direct_lake_schema_compare(
            dataset = 'AdventureWorks',
            workspace = '',
            #lakehouse = '',
            #lakehouse_workspace = ''
            )

Parameters

dataset str

Required; Name of the semantic model.

workspace str

Optional; The workspace where the semantic model resides.

lakehouse str

Optional; The lakehouse used by the Direct Lake semantic model.

lakehouse_workspace str

Optional; The workspace in which the lakehouse resides.

Returns

Shows tables/columns which exist in the semantic model but do not exist in the corresponding lakehouse.


direct_lake_schema_sync

Shows/adds columns which exist in the lakehouse but do not exist in the semantic model (only for tables in the semantic model).

Note

This function is only relevant to semantic models in Direct Lake mode.

import fabric_cat_tools as fct
fct.direct_lake_schema_sync(
     dataset = 'AdvWorks',
     add_to_model = True,
    #workspace = '',
    #lakehouse = '',
    #lakehouse_workspace = ''
    )

Parameters

dataset str

Required; Name of the semantic model.

add_to_model bool

Optional; Adds columns which exist in the lakehouse but do not exist in the semantic model. No new tables are added. Default value: False.

workspace str

Optional; The workspace where the semantic model resides.

lakehouse str

Optional; The lakehouse used by the Direct Lake semantic model.

lakehouse_workspace str

Optional; The workspace in which the lakehouse resides.

Returns

A list of columns which exist in the lakehouse but not in the Direct Lake semantic model. If 'add_to_model' is set to True, a printout stating the success/failure of the operation is returned.


export_model_to_onelake

Exports a semantic model's tables to delta tables in the lakehouse. Creates shortcuts to the tables if a lakehouse is specified.

Important

This function requires:

XMLA read/write to be enabled on the Fabric capacity.

OneLake Integration feature to be enabled within the semantic model settings.

import fabric_cat_tools as fct
fct.export_model_to_onelake(
            dataset = 'AdventureWorks',
            workspace = None,
            destination_lakehouse = 'Lakehouse2',
            destination_workspace = 'Workspace2'
            )

Parameters

dataset str

Required; Name of the semantic model.

workspace str

Optional; The workspace where the semantic model resides.

destination_lakehouse str

Optional; The lakehouse where shortcuts will be created to access the delta tables created by the export. If the lakehouse specified does not exist, one will be created with that name. If no lakehouse is specified, shortcuts will not be created.

destination_workspace str

Optional; The workspace in which the lakehouse resides.

Returns

A printout stating the success/failure of the operation.


export_report

Exports a Power BI report to a file in your lakehouse.

import fabric_cat_tools as fct
fct.export_report(
            report = 'AdventureWorks',
            export_format = 'PDF',
            #file_name = None,
            #bookmark_name = None,
            #page_name = None,
            #visual_name = None,
            #workspace = None
            )
import fabric_cat_tools as fct
fct.export_report(
            report = 'AdventureWorks',
            export_format = 'PDF',
            #file_name = 'Exports/MyReport',
            #bookmark_name = None,
            #page_name = 'ReportSection293847182375',
            #visual_name = None,
            #workspace = None
            )
import fabric_cat_tools as fct
fct.export_report(
            report = 'AdventureWorks',
            export_format = 'PDF',
            #page_name = 'ReportSection293847182375',
            #report_filter = "'Product Category'[Color] in ('Blue', 'Orange') and 'Calendar'[CalendarYear] <= 2020",
            #workspace = None
            )
import fabric_cat_tools as fct
fct.export_report(
            report = 'AdventureWorks',
            export_format = 'PDF',
            #page_name = ['ReportSection293847182375', 'ReportSection4818372483347'],
            #workspace = None
            )
import fabric_cat_tools as fct
fct.export_report(
            report = 'AdventureWorks',
            export_format = 'PDF',
            #page_name = ['ReportSection293847182375', 'ReportSection4818372483347'],
            #visual_name = ['d84793724739', 'v834729234723847'],
            #workspace = None
            )

Parameters

report str

Required; Name of the report.

export_format str

Required; The format in which to export the report. See this link for valid formats: https://learn.microsoft.com/rest/api/power-bi/reports/export-to-file-in-group#fileformat. For image formats, enter the file extension in this parameter, not 'IMAGE'.

file_name str

Optional; The name of the file to be saved within the lakehouse. Do not include the file extension. Defaults to the 'report' parameter value.

bookmark_name str

Optional; The name (GUID) of a bookmark within the report.

page_name str or list of str

Optional; The name (GUID) of the report page.

visual_name str or list of str

Optional; The name (GUID) of a visual. If you specify this parameter you must also specify the page_name parameter.

report_filter str

Optional; A report filter to be applied when exporting the report. Syntax is user-friendly. See above for examples.

workspace str

Optional; The workspace where the report resides.

Returns

A printout stating the success/failure of the operation.


generate_embedded_filter

Converts a filter condition into the filter syntax used in an embedded Power BI report URL.

import fabric_cat_tools as fct
fct.generate_embedded_filter(
            filter = "'Product'[Product Category] = 'Bikes' and 'Geography'[Country Code] in (3, 6, 10)"       
            )

Parameters

filter str

Required; The filter condition to convert. See the example above for syntax.
Returns

A string containing the filter converted into embedded filter syntax.
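
To illustrate the kind of conversion involved, here is a rough, standalone sketch that rewrites a "'Table'[Column]" reference into the Table/Column form used by embedded URL filters, escaping spaces as _x0020_. The exact output of generate_embedded_filter may differ; this is not the library's implementation:

```python
import re

def to_embedded_filter_sketch(flt: str) -> str:
    # Rewrite 'Table'[Column] references as Table/Column, escaping spaces
    # with _x0020_ as Power BI URL filters require.
    def repl(m):
        table = m.group(1).replace(' ', '_x0020_')
        column = m.group(2).replace(' ', '_x0020_')
        return f"{table}/{column}"
    out = re.sub(r"'([^']+)'\[([^\]]+)\]", repl, flt)
    # Map the '=' comparison to the URL operator 'eq'.
    return out.replace(' = ', ' eq ')

print(to_embedded_filter_sketch("'Product'[Product Category] = 'Bikes'"))
# Product/Product_x0020_Category eq 'Bikes'
```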


get_direct_lake_guardrails

Shows the guardrails for when Direct Lake semantic models will fallback to Direct Query based on Microsoft's online documentation.

import fabric_cat_tools as fct
fct.get_direct_lake_guardrails()

Parameters

None

Returns

A table showing the Direct Lake guardrails by SKU.


get_directlake_guardrails_for_sku

Shows the guardrails for Direct Lake based on the SKU used by your workspace's capacity.

Use the result of the 'get_sku_size' function as the input for this function's 'sku_size' parameter.

import fabric_cat_tools as fct
fct.get_directlake_guardrails_for_sku(
            sku_size = ''
            )

Parameters

sku_size str

Required; SKU size of a workspace/capacity.

Returns

A table showing the Direct Lake guardrails for the given SKU.


get_direct_lake_lakehouse

Identifies the lakehouse used by a Direct Lake semantic model.

Note

This function is only relevant to semantic models in Direct Lake mode.

import fabric_cat_tools as fct
fct.get_direct_lake_lakehouse(
            dataset = 'AdventureWorks',
            #workspace = '',
            #lakehouse = '',
            #lakehouse_workspace = ''            
            )

Parameters

dataset str

Required; Name of the semantic model.

workspace str

Optional; The workspace where the semantic model resides.

lakehouse str

Optional; Name of the lakehouse used by the semantic model.

lakehouse_workspace str

Optional; The workspace where the lakehouse resides.


get_direct_lake_sql_endpoint

Identifies the SQL Endpoint used by a Direct Lake semantic model.

Note

This function is only relevant to semantic models in Direct Lake mode.

import fabric_cat_tools as fct
fct.get_direct_lake_sql_endpoint(
            dataset = 'AdventureWorks',
            #workspace = None
            )

Parameters

dataset str

Required; Name of the semantic model.

workspace str

Optional; The workspace where the semantic model resides.

Returns

A string containing the SQL Endpoint ID for a Direct Lake semantic model.


get_lakehouse_columns

Shows the tables and columns of a lakehouse and their respective properties.

import fabric_cat_tools as fct
fct.get_lakehouse_columns(
            lakehouse = 'AdventureWorks',
            #workspace = None
            )

Parameters

lakehouse str

Optional; The lakehouse name.

workspace str

Optional; The workspace where the lakehouse resides.

Returns

A pandas dataframe showing the tables/columns within a lakehouse and their properties.


get_lakehouse_tables

Shows the tables of a lakehouse and their respective properties. Option to include additional properties relevant to Direct Lake guardrails.

import fabric_cat_tools as fct
fct.get_lakehouse_tables(
        lakehouse = 'MyLakehouse',
        workspace = 'NewWorkspace',
        extended = True,
        count_rows = True)

Parameters

lakehouse str

Optional; The lakehouse name.

workspace str

Optional; The workspace where the lakehouse resides.

extended bool

Optional; Adds the following additional table properties ['Files', 'Row Groups', 'Table Size', 'Parquet File Guardrail', 'Row Group Guardrail', 'Row Count Guardrail']. Also indicates the SKU for the workspace and whether guardrails are hit. Default value: False.

count_rows bool

Optional; Adds an additional column showing the row count of each table. Default value: False.

export bool

Optional; If specified as True, the resulting dataframe will be exported to a delta table in your lakehouse.

Returns

A pandas dataframe showing the delta tables within a lakehouse and their properties.
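
Because the result is a pandas dataframe, it can be filtered like any other. The sketch below uses a toy stand-in dataframe (the column names come from the 'extended' parameter description above; the values and the comparison logic are assumptions for illustration, since the function already flags guardrail hits itself):

```python
import pandas as pd

# Toy stand-in for the dataframe returned by fct.get_lakehouse_tables(extended=True).
# Column names are taken from the docs above; the values are made up.
df = pd.DataFrame({
    'Table Name': ['DimCalendar', 'FactSales'],
    'Files': [12, 9000],
    'Parquet File Guardrail': [5000, 5000],
})

# Flag tables whose parquet file count exceeds the guardrail for the SKU.
over = df[df['Files'] > df['Parquet File Guardrail']]
print(over['Table Name'].tolist())
# ['FactSales']
```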


get_measure_dependencies

Shows all dependencies for all measures in a semantic model.

import fabric_cat_tools as fct
fct.get_measure_dependencies(
            dataset = 'AdventureWorks',
            #workspace = None
            )

Parameters

dataset str

Required; Name of the semantic model.

workspace str

Optional; The workspace where the semantic model resides.

Returns

A pandas dataframe showing all dependencies for all measures in the semantic model.


get_model_calc_dependencies

Shows all dependencies for all objects in a semantic model.

import fabric_cat_tools as fct
fct.get_model_calc_dependencies(
            dataset = 'AdventureWorks',
            #workspace = None
            )

Parameters

dataset str

Required; Name of the semantic model.

workspace str

Optional; The workspace where the semantic model resides.

Returns

A pandas dataframe showing all dependencies for all objects in the semantic model.


get_object_level_security

Shows a list of columns used in object level security.

import fabric_cat_tools as fct
fct.get_object_level_security(
        dataset = 'AdventureWorks',
        workspace = '')

Parameters

dataset str

Optional; The semantic model name.

workspace str

Optional; The workspace where the semantic model resides.

Returns

A pandas dataframe showing the columns used in object level security within a semantic model.


get_report_json

Gets the report.json file content of a Power BI report.

import fabric_cat_tools as fct
fct.get_report_json(
            report = 'MyReport',
            #workspace = None
            )
import fabric_cat_tools as fct
fct.get_report_json(
            report = 'MyReport',
            #workspace = None,
            save_to_file_name = 'MyFileName'
            )

Parameters

report str

Required; Name of the report.

workspace str

Optional; The workspace where the report resides.

save_to_file_name str

Optional; Specifying this parameter will save the report.json file to your lakehouse with the file name of this parameter.

Returns

The report.json file for a given Power BI report.


get_semantic_model_bim

Extracts the Model.bim file for a given semantic model.

import fabric_cat_tools as fct
fct.get_semantic_model_bim(
            dataset = 'AdventureWorks',
            #workspace = None
            )
import fabric_cat_tools as fct
fct.get_semantic_model_bim(
            dataset = 'AdventureWorks',
            #workspace = None,
            save_to_file_name = 'MyFileName'
            )

Parameters

dataset str

Required; Name of the semantic model.

workspace str

Optional; The workspace where the semantic model resides.

save_to_file_name str

Optional; Specifying this parameter will save the model.bim file to your lakehouse with the file name of this parameter.

Returns

The model.bim file for a given semantic model.


get_shared_expression

Dynamically generates the M expression used by a Direct Lake model for a given lakehouse.

import fabric_cat_tools as fct
fct.get_shared_expression(
            lakehouse = '',
            #workspace = '' 
            )

Parameters

lakehouse str

Optional; The lakehouse name.

workspace str

Optional; The workspace where the lakehouse resides.

Returns

A string showing the expression which can be used to connect a Direct Lake semantic model to its SQL Endpoint.


get_sku_size

Shows the SKU size for a workspace.

import fabric_cat_tools as fct
fct.get_sku_size(
            workspace = '' 
            )

Parameters

workspace str

Optional; The workspace name.

Returns

A string containing the SKU size for a workspace.


import_vertipaq_analyzer

Imports and visualizes the vertipaq analyzer info from a saved .zip file in your lakehouse.

import fabric_cat_tools as fct
fct.import_vertipaq_analyzer(
          folder_path = '/lakehouse/default/Files/VertipaqAnalyzer',
          file_name = 'Workspace Name-DatasetName.zip'
          )

Parameters

folder_path str

Required; Folder within your lakehouse in which the .zip file containing the vertipaq analyzer info has been saved.

file_name str

Required; File name of the file which contains the vertipaq analyzer info.
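
The sample above suggests a '<workspace name>-<dataset name>.zip' naming pattern for the exported file. That pattern is an inference from the example, not a documented contract, but a tiny helper can assemble it:

```python
def vertipaq_zip_name(workspace: str, dataset: str) -> str:
    # Assumed naming pattern, inferred from the example above:
    # '<workspace name>-<dataset name>.zip'
    return f"{workspace}-{dataset}.zip"

print(vertipaq_zip_name('Workspace Name', 'DatasetName'))
# Workspace Name-DatasetName.zip
```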


launch_report

Shows a Power BI report within a Fabric notebook.

import fabric_cat_tools as fct
fct.launch_report(
          report = 'MyReport',
          #workspace = None
          )

Parameters

report str

Required; The name of the report.

workspace str

Optional; The name of the workspace in which the report resides.


list_dashboards

Shows the dashboards within the workspace.

import fabric_cat_tools as fct
fct.list_dashboards(
            #workspace = '' 
            )

Parameters

workspace str

Optional; The workspace name.

Returns

A pandas dataframe showing the dashboards which exist in the workspace.


list_dataflow_storage_accounts

Shows the dataflow storage accounts.

import fabric_cat_tools as fct
fct.list_dataflow_storage_accounts()

Parameters

None

Returns

A pandas dataframe showing the accessible dataflow storage accounts.


list_direct_lake_model_calc_tables

Shows the calculated tables and their respective DAX expression for a Direct Lake model (which has been migrated from import/DirectQuery).

Note

This function is only relevant to semantic models in Direct Lake mode.

import fabric_cat_tools as fct
fct.list_direct_lake_model_calc_tables(
            dataset = 'AdventureWorks',
            #workspace = '' 
            )

Parameters

dataset str

Required; Name of the semantic model.

workspace str

Optional; The workspace where the semantic model resides.

Returns

A pandas dataframe showing the calculated tables which were migrated to Direct Lake and whose DAX expressions are stored as model annotations.


list_lakehouses

Shows the properties associated with lakehouses in a workspace.

import fabric_cat_tools as fct
fct.list_lakehouses(
            workspace = None
            )

Parameters

workspace str

Optional; The workspace where the lakehouse resides.

Returns

A pandas dataframe showing the properties of all lakehouses in a workspace.


list_semantic_model_objects

Shows a list of semantic model objects.

import fabric_cat_tools as fct
fct.list_semantic_model_objects(
            dataset = 'AdvWorks',
            workspace = None
            )

Parameters

dataset str

Required; Name of the import/DirectQuery semantic model.

workspace str

Optional; The workspace where the semantic model resides.

Returns

A dataframe showing a list of objects in the semantic model.


list_shortcuts

Shows the shortcuts within a lakehouse. Note: the API behind this function is not yet available; the function will work as expected once the API is officially released.

import fabric_cat_tools as fct
fct.list_shortcuts(
            lakehouse = 'MyLakehouse',
            #workspace = '' 
            )

Parameters

lakehouse str

Optional; Name of the lakehouse.

workspace str

Optional; The workspace where the lakehouse resides.

Returns

A pandas dataframe showing the shortcuts which exist in a given lakehouse and their properties.


list_warehouses

Shows the warehouses within a workspace.

import fabric_cat_tools as fct
fct.list_warehouses(
            #workspace = None
            )

Parameters

workspace str

Optional; The workspace name.

Returns

A pandas dataframe showing the warehouses which exist in a given workspace and their properties.


measure_dependency_tree

Shows a measure dependency tree of all dependent objects for a measure in a semantic model.

import fabric_cat_tools as fct
fct.measure_dependency_tree(
            dataset = 'AdventureWorks',
            measure_name = 'Sales Amount',
            #workspace = '' 
            )

Parameters

dataset str

Required; Name of the semantic model.

measure_name str

Required; Name of the measure to use for building a dependency tree.

workspace str

Optional; The workspace where the semantic model resides.

Returns

A tree view showing the dependencies for a given measure within the semantic model.


migrate_calc_tables_to_lakehouse

Creates delta tables in your lakehouse based on the DAX expression of a calculated table in an import/DirectQuery semantic model. The DAX expression encapsulating the calculated table logic is stored in the new Direct Lake semantic model as model annotations.

Note

This function is specifically relevant for import/DirectQuery migration to Direct Lake.

import fabric_cat_tools as fct
fct.migrate_calc_tables_to_lakehouse(
            dataset = 'AdventureWorks',
            new_dataset = 'AdventureWorksDL',
            #workspace = '',
            #new_dataset_workspace = '',
            #lakehouse = '',
            #lakehouse_workspace = ''
            )

Parameters

dataset str

Required; Name of the import/DirectQuery semantic model.

new_dataset str

Required; Name of the Direct Lake semantic model.

workspace str

Optional; The workspace where the semantic model resides.

new_dataset_workspace str

Optional; The workspace to be used by the Direct Lake semantic model.

lakehouse str

Optional; The lakehouse to be used by the Direct Lake semantic model.

lakehouse_workspace str

Optional; The workspace where the lakehouse resides.

Returns

A printout stating the success/failure of the operation.


migrate_calc_tables_to_semantic_model

Creates new tables in the Direct Lake semantic model based on the lakehouse tables created using the 'migrate_calc_tables_to_lakehouse' function.

Note

This function is specifically relevant for import/DirectQuery migration to Direct Lake.

import fabric_cat_tools as fct
fct.migrate_calc_tables_to_semantic_model(
            dataset = 'AdventureWorks',
            new_dataset = 'AdventureWorksDL',
            #workspace = '',
            #new_dataset_workspace = '',
            #lakehouse = '',
            #lakehouse_workspace = ''
            )

Parameters

dataset str

Required; Name of the import/DirectQuery semantic model.

new_dataset str

Required; Name of the Direct Lake semantic model.

workspace str

Optional; The workspace where the semantic model resides.

new_dataset_workspace str

Optional; The workspace to be used by the Direct Lake semantic model.

lakehouse str

Optional; The lakehouse to be used by the Direct Lake semantic model.

lakehouse_workspace str

Optional; The workspace where the lakehouse resides.

Returns

A printout stating the success/failure of the operation.


migrate_field_parameters

Migrates field parameters from one semantic model to another.

Note

This function is specifically relevant for import/DirectQuery migration to Direct Lake.

import fabric_cat_tools as fct
fct.migrate_field_parameters(
            dataset = 'AdventureWorks',
            new_dataset = '',
            #workspace = '',
            #new_dataset_workspace = ''
            )

Parameters

dataset str

Required; Name of the import/DirectQuery semantic model.

new_dataset str

Required; Name of the Direct Lake semantic model.

workspace str

Optional; The workspace where the semantic model resides.

new_dataset_workspace str

Optional; The workspace to be used by the Direct Lake semantic model.

Returns

A printout stating the success/failure of the operation.


migrate_model_objects_to_semantic_model

Adds the rest of the model objects (besides tables/columns) and their properties to a Direct Lake semantic model based on an import/DirectQuery semantic model.

Note

This function is specifically relevant for import/DirectQuery migration to Direct Lake.

import fabric_cat_tools as fct
fct.migrate_model_objects_to_semantic_model(
            dataset = 'AdventureWorks',
            new_dataset = 'AdventureWorksDL',
            #workspace = '',
            #new_dataset_workspace = ''
            )

Parameters

dataset str

Required; Name of the import/DirectQuery semantic model.

new_dataset str

Required; Name of the Direct Lake semantic model.

workspace str

Optional; The workspace where the semantic model resides.

new_dataset_workspace str

Optional; The workspace to be used by the Direct Lake semantic model.

Returns

A printout stating the success/failure of the operation.


migrate_tables_columns_to_semantic_model

Adds tables/columns to the new Direct Lake semantic model based on an import/DirectQuery semantic model.

Note

This function is specifically relevant for import/DirectQuery migration to Direct Lake.

import fabric_cat_tools as fct
fct.migrate_tables_columns_to_semantic_model(
            dataset = 'AdventureWorks',
            new_dataset = 'AdventureWorksDL',
            #workspace = '',
            #new_dataset_workspace = '',
            #lakehouse = '',
            #lakehouse_workspace = ''
            )

Parameters

dataset str

Required; Name of the import/DirectQuery semantic model.

new_dataset str

Required; Name of the Direct Lake semantic model.

workspace str

Optional; The workspace where the semantic model resides.

new_dataset_workspace str

Optional; The workspace to be used by the Direct Lake semantic model.

lakehouse str

Optional; The lakehouse to be used by the Direct Lake semantic model.

lakehouse_workspace str

Optional; The workspace where the lakehouse resides.

Returns

A printout stating the success/failure of the operation.


migration_validation

Shows the objects in the original semantic model and whether they were migrated successfully.

import fabric_cat_tools as fct
fct.migration_validation(
            dataset = 'AdvWorks',
            new_dataset = 'AdvWorksDL',
            workspace = None,
            new_dataset_workspace = None
            )

Parameters

dataset str

Required; Name of the import/DirectQuery semantic model.

new_dataset str

Required; Name of the Direct Lake semantic model.

workspace str

Optional; The workspace where the semantic model resides.

new_dataset_workspace str

Optional; The workspace to be used by the Direct Lake semantic model.

Returns

A dataframe showing a list of objects and whether they were successfully migrated. Also shows the % of objects which were migrated successfully.


model_bpa_rules

Shows the default Best Practice Rules for semantic models used by the run_model_bpa function.

import fabric_cat_tools as fct
fct.model_bpa_rules()

Returns

A pandas dataframe showing the default semantic model best practice rules.


optimize_lakehouse_tables

Runs the OPTIMIZE function over the specified lakehouse tables.

import fabric_cat_tools as fct
fct.optimize_lakehouse_tables(
            tables = ['Sales', 'Calendar'],
            #lakehouse = None,
            #workspace = None
        )
import fabric_cat_tools as fct
fct.optimize_lakehouse_tables(
            tables = None,
            #lakehouse = 'MyLakehouse',
            #workspace = None
        )

Parameters

tables str or list of str

Required; Name(s) of the lakehouse delta table(s) to optimize. If 'None' is entered, all of the delta tables in the lakehouse will be queued to be optimized.

lakehouse str

Optional; Name of the lakehouse.

workspace str

Optional; The workspace where the lakehouse resides.

Returns

A printout stating the success/failure of the operation.


refresh_calc_tables

Recreates the delta tables in the lakehouse based on the DAX expressions stored as model annotations in the Direct Lake semantic model.

Note

This function is only relevant to semantic models in Direct Lake mode.

import fabric_cat_tools as fct
fct.refresh_calc_tables(
            dataset = 'AdventureWorks',
            #workspace = None
            )

Parameters

dataset str

Required; Name of the semantic model.

workspace str

Optional; The workspace where the semantic model resides.

Returns

A printout stating the success/failure of the operation.


refresh_semantic_model

Performs a refresh on a semantic model.

import fabric_cat_tools as fct
fct.refresh_semantic_model(
    dataset = 'AdventureWorks',
    refresh_type = 'full',
    workspace = None
)
import fabric_cat_tools as fct
fct.refresh_semantic_model(
    dataset = 'AdventureWorks',
    tables = ['Sales', 'Geography'],
    workspace = None
)
import fabric_cat_tools as fct
fct.refresh_semantic_model(
    dataset = 'AdventureWorks',
    partitions = ["'Sales'[Sales - 2024]", "'Sales'[Sales - 2023]"],
    workspace = None
)
import fabric_cat_tools as fct
fct.refresh_semantic_model(
    dataset = 'AdventureWorks',
    tables = ['Geography'],
    partitions = ["'Sales'[Sales - 2024]", "'Sales'[Sales - 2023]"],
    workspace = None
)

Parameters

dataset str

Required; Name of the semantic model. If no tables/partitions are specified, the entire semantic model is refreshed.

tables str or list of str

Optional; Tables to refresh.

partitions str or list of str

Optional; Partitions to refresh. Must be in "'Table'[Partition]" format.

refresh_type str

Optional; Type of processing to perform. Options: ('full', 'automatic', 'dataOnly', 'calculate', 'clearValues', 'defragment'). Default value: 'full'.

retry_count int

Optional; Number of retry attempts. Default is 0.

workspace str

Optional; The workspace where the semantic model resides.

Returns

A printout stating the success/failure of the operation.
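
The partition references above follow a "'Table'[Partition]" convention. A small helper can build them consistently (a sketch based only on the format shown in these examples; `partition_ref` is illustrative and not part of the library):

```python
def partition_ref(table: str, partition: str) -> str:
    # Build a partition reference in the "'Table'[Partition]" format
    # expected by the partitions parameter of refresh_semantic_model.
    return f"'{table}'[{partition}]"

# e.g. refresh the 2023 and 2024 partitions of the 'Sales' table
partitions = [partition_ref('Sales', f'Sales - {year}') for year in (2023, 2024)]
```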


report_rebind

Rebinds a report to a semantic model.

import fabric_cat_tools as fct
fct.report_rebind(
            report = '',
            dataset = '',
            #report_workspace = '',
            #dataset_workspace = ''
            )

Parameters

report str

Required; Name of the report.

dataset str

Required; Name of the semantic model to rebind to the report.

report_workspace str

Optional; The workspace where the report resides.

dataset_workspace str

Optional; The workspace where the semantic model resides.

Returns

A printout stating the success/failure of the operation.


report_rebind_all

Rebinds all reports in a workspace which are bound to a specific semantic model to a new semantic model.

import fabric_cat_tools as fct
fct.report_rebind_all(
            dataset = '',
            new_dataset = '',
            #dataset_workspace = '' ,
            #new_dataset_workspace = '' ,
            #report_workspace = '' 
            )

Parameters

dataset str

Required; Name of the semantic model currently bound to the reports.

new_dataset str

Required; Name of the semantic model to rebind to the reports.

dataset_workspace str

Optional; The workspace where the original semantic model resides.

new_dataset_workspace str

Optional; The workspace where the new semantic model resides.

report_workspace str

Optional; The workspace where the reports reside.

Returns

A printout stating the success/failure of the operation.


resolve_lakehouse_name

Returns the name of the lakehouse for a given lakehouse Id.

import fabric_cat_tools as fct
fct.resolve_lakehouse_name(
        lakehouse_id = '',
        #workspace = '' 
        )

Parameters

lakehouse_id UUID

Required; UUID object representing a lakehouse.

workspace str

Optional; The workspace where the lakehouse resides.

Returns

A string containing the lakehouse name.


resolve_lakehouse_id

Returns the ID of a given lakehouse.

import fabric_cat_tools as fct
fct.resolve_lakehouse_id(
        lakehouse = 'MyLakehouse',
        #workspace = '' 
        )

Parameters

lakehouse str

Required; Name of the lakehouse.

workspace str

Optional; The workspace where the lakehouse resides.

Returns

A string containing the lakehouse ID.


resolve_dataset_id

Returns the ID of a given semantic model.

import fabric_cat_tools as fct
fct.resolve_dataset_id(
        dataset = 'MyDataset',
        #workspace = '' 
        )

Parameters

dataset str

Required; Name of the semantic model.

workspace str

Optional; The workspace where the semantic model resides.

Returns

A string containing the semantic model ID.


resolve_dataset_name

Returns the name of a given semantic model ID.

import fabric_cat_tools as fct
fct.resolve_dataset_name(
        dataset_id = '',
        #workspace = '' 
        )

Parameters

dataset_id UUID

Required; UUID object representing a semantic model.

workspace str

Optional; The workspace where the semantic model resides.

Returns

A string containing the semantic model name.


resolve_report_id

Returns the ID of a given report.

import fabric_cat_tools as fct
fct.resolve_report_id(
        report = 'MyReport',
        #workspace = '' 
        )

Parameters

report str

Required; Name of the report.

workspace str

Optional; The workspace where the report resides.

Returns

A string containing the report ID.


resolve_report_name

Returns the name of a given report ID.

import fabric_cat_tools as fct
fct.resolve_report_name(
        report_id = '',
        #workspace = '' 
        )

Parameters

report_id UUID

Required; UUID object representing a report.

workspace str

Optional; The workspace where the report resides.

Returns

A string containing the report name.


run_dax

Runs a DAX query against a semantic model.

import fabric_cat_tools as fct
fct.run_dax(
            dataset = 'AdventureWorks',
            dax_query = "EVALUATE 'Internet Sales'",
            #user_name = None,
            #workspace = None
            )

Parameters

dataset str

Required; Name of the semantic model.

dax_query str

Required; The DAX query to be executed.

user_name str

Optional; The user name under which the DAX query is executed.

workspace str

Optional; The workspace where the semantic model resides.

Returns

A pandas dataframe with the results of the DAX query.


run_model_bpa

Runs the Best Practice Rules against a semantic model.

import fabric_cat_tools as fct
fct.run_model_bpa(
        dataset = 'AdventureWorks',
        #workspace = None
        )

Parameters

dataset str

Required; Name of the semantic model.

rules_dataframe DataFrame

Optional; A pandas dataframe including rules to be analyzed.

workspace str

Optional; The workspace where the semantic model resides.

return_dataframe bool

Optional; Returns a pandas dataframe instead of the visualization.

export bool

Optional; Exports the results to a delta table in the lakehouse.

Returns

A visualization showing objects which violate each Best Practice Rule by rule category.


save_as_delta_table

Saves a dataframe as a delta table in the lakehouse

import fabric_cat_tools as fct
fct.save_as_delta_table(
            dataframe = df,
            delta_table_name = 'MyNewTable',
            write_mode = 'overwrite',
            lakehouse = None,
            workspace = None
            )
import fabric_cat_tools as fct
fct.save_as_delta_table(
            dataframe = df,
            delta_table_name = 'MyNewTable',
            write_mode = 'append',
            lakehouse = None,
            workspace = None
            )

Parameters

dataframe DataFrame

Required; The dataframe to save as a delta table.

delta_table_name str

Required; The name of the delta table to save the dataframe.

write_mode str

Required; Options: 'append' or 'overwrite'.

lakehouse str

Optional; The name of the lakehouse in which the delta table will be saved. Defaults to the default lakehouse attached to the notebook.

workspace str

Optional; The workspace where the lakehouse resides. Defaults to the workspace in which the notebook resides.

Returns

A printout stating the success/failure of the operation.


show_unsupported_direct_lake_objects

Returns a list of a semantic model's objects which are not supported by Direct Lake based on official documentation.

import fabric_cat_tools as fct
fct.show_unsupported_direct_lake_objects(
        dataset = 'AdventureWorks',
        #workspace = None
        )

Parameters

dataset str

Required; Name of the semantic model.

workspace str

Optional; The workspace where the semantic model resides.

Returns

3 pandas dataframes showing objects (tables/columns/relationships) within the semantic model which are currently not supported by Direct Lake mode.


translate_semantic_model

Translates names, descriptions, display folders for all objects in a semantic model.

import fabric_cat_tools as fct
fct.translate_semantic_model(
            dataset = 'AdventureWorks',
            languages = ['it_IT', 'fr-FR'],
            #workspace = None
            )
import fabric_cat_tools as fct
fct.translate_semantic_model(
            dataset = 'AdventureWorks',
            languages = ['it_IT', 'fr-FR'],
            exclude_characters = '_-',
            #workspace = None
            )

Parameters

dataset str

Required; Name of the semantic model.

languages str or list of str

Required; Language code(s) to translate.

exclude_characters str

Optional; Any character in this string will be replaced by a space when given to the AI translator.

workspace str

Optional; The workspace where the semantic model resides.

Returns

A printout stating the success/failure of the operation.
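
The effect of exclude_characters can be illustrated with a short sketch (an illustration of the documented behavior, not the library's internal code):

```python
def prepare_for_translation(text: str, exclude_characters: str) -> str:
    # Per the exclude_characters parameter, each listed character is
    # replaced by a space before the text is given to the AI translator.
    for ch in exclude_characters:
        text = text.replace(ch, ' ')
    return text

prepare_for_translation('Sales_Amount-YTD', '_-')   # 'Sales Amount YTD'
```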


update_direct_lake_model_lakehouse_connection

Remaps a Direct Lake semantic model's SQL Endpoint connection to a new lakehouse.

Note

This function is only relevant to semantic models in Direct Lake mode.

import fabric_cat_tools as fct
fct.update_direct_lake_model_lakehouse_connection(
            dataset = '',
            #lakehouse = '',
            #workspace = ''
            )

Parameters

dataset str

Required; Name of the semantic model.

lakehouse str

Optional; Name of the lakehouse.

workspace str

Optional; The workspace where the semantic model resides.

lakehouse_workspace str

Optional; The workspace where the lakehouse resides.

Returns

A printout stating the success/failure of the operation.


update_direct_lake_partition_entity

Remaps a table (or tables) in a Direct Lake semantic model to a table in a lakehouse.

Note

This function is only relevant to semantic models in Direct Lake mode.

import fabric_cat_tools as fct
fct.update_direct_lake_partition_entity(
            dataset = 'AdventureWorks',
            table_name = 'Internet Sales',
            entity_name = 'FACT_InternetSales',
            #workspace = '',
            #lakehouse = '',
            #lakehouse_workspace = ''            
            )
import fabric_cat_tools as fct
fct.update_direct_lake_partition_entity(
            dataset = 'AdventureWorks',
            table_name = ['Internet Sales', 'Geography'],
            entity_name = ['FACT_InternetSales', 'DimGeography'],
            #workspace = '',
            #lakehouse = '',
            #lakehouse_workspace = ''            
            )

Parameters

dataset str

Required; Name of the semantic model.

table_name str or list of str

Required; Name of the table in the semantic model.

entity_name str or list of str

Required; Name of the lakehouse table to be mapped to the semantic model table.

workspace str

Optional; The workspace where the semantic model resides.

lakehouse str

Optional; Name of the lakehouse.

lakehouse_workspace str

Optional; The workspace where the lakehouse resides.

Returns

A printout stating the success/failure of the operation.
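
When lists are passed, table_name and entity_name are paired positionally (as in the example above), so both lists must be the same length. A hypothetical helper (not part of the library) makes that contract explicit:

```python
def pair_mappings(table_names: list[str], entity_names: list[str]) -> list[tuple[str, str]]:
    # table_name and entity_name act as parallel lists: the first semantic
    # model table maps to the first lakehouse table, and so on.
    if len(table_names) != len(entity_names):
        raise ValueError('table_name and entity_name must have the same length')
    return list(zip(table_names, entity_names))

pair_mappings(['Internet Sales', 'Geography'], ['FACT_InternetSales', 'DimGeography'])
```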


update_item

Updates the name and/or description of a Fabric item.

import fabric_cat_tools as fct
fct.update_item(
            item_type = 'Lakehouse',
            current_name = 'MyLakehouse',
            new_name = 'MyNewLakehouse',
            #description = 'This is my new lakehouse',
            #workspace = None
            )

Parameters

item_type str

Required; Type of item to update. Valid options: 'DataPipeline', 'Eventstream', 'KQLDatabase', 'KQLQueryset', 'Lakehouse', 'MLExperiment', 'MLModel', 'Notebook', 'Warehouse'.

current_name str

Required; Current name of the item.

new_name str

Required; New name of the item.

description str

Optional; New description of the item.

workspace str

Optional; The workspace where the item resides.

Returns

A printout stating the success/failure of the operation.


vertipaq_analyzer

Extracts the vertipaq analyzer statistics from a semantic model.

import fabric_cat_tools as fct
fct.vertipaq_analyzer(
        dataset = 'AdventureWorks',
        #workspace = '',
        export = None
        )
import fabric_cat_tools as fct
fct.vertipaq_analyzer(
        dataset = 'AdventureWorks',
        #workspace = '',
        export = 'zip'
        )
import fabric_cat_tools as fct
fct.vertipaq_analyzer(
        dataset = 'AdventureWorks',
        #workspace = '',
        export = 'table'
        )

Parameters

dataset str

Required; Name of the semantic model.

workspace str

Optional; The workspace where the semantic model resides.

export str

Optional; Specifying 'zip' will export the results to a zip file in your lakehouse (which can be imported using the import_vertipaq_analyzer function). Specifying 'table' will export the results to delta tables (appended) in your lakehouse. Default value: None.

lakehouse_workspace str

Optional; The workspace in which the lakehouse used by a Direct Lake semantic model resides.

read_stats_from_data bool

Optional; If True, the function retrieves Column Cardinality and Missing Rows by reading the data directly: via DAX for import/DirectQuery semantic models, and via a Spark query against the lakehouse for Direct Lake semantic models.

Returns

A visualization of the Vertipaq Analyzer statistics.


warm_direct_lake_cache_perspective

Warms the cache of a Direct Lake semantic model by running a simple DAX query against the columns in a perspective.

Note

This function is only relevant to semantic models in Direct Lake mode.

import fabric_cat_tools as fct
fct.warm_direct_lake_cache_perspective(
        dataset = 'AdventureWorks',
        perspective = 'WarmCache',
        add_dependencies = True,
        #workspace = None
        )

Parameters

dataset str

Required; Name of the semantic model.

perspective str

Required; Name of the perspective which contains objects to be used for warming the cache.

add_dependencies bool

Optional; Includes object dependencies in the cache warming process.

workspace str

Optional; The workspace where the semantic model resides.

Returns

A printout stating the success/failure of the operation.


warm_direct_lake_cache_isresident

Performs a refresh on the semantic model and puts the columns which were in memory prior to the refresh back into memory.

Note

This function is only relevant to semantic models in Direct Lake mode.

import fabric_cat_tools as fct
fct.warm_direct_lake_cache_isresident(
        dataset = 'AdventureWorks',
        #workspace = None
        )

Parameters

dataset str

Required; Name of the semantic model.

workspace str

Optional; The workspace where the semantic model resides.

Returns

A printout stating the success/failure of the operation.


fabric_cat_tools.TOM Functions

connect_semantic_model

Forms the connection to the Tabular Object Model (TOM) for a semantic model.

with connect_semantic_model(dataset = 'AdventureWorks', workspace = None, readonly = True) as tom:
    ...
with connect_semantic_model(dataset = 'AdventureWorks', workspace = None, readonly = False) as tom:
    ...

Parameters

dataset str

Required; The name of the semantic model.

workspace str

Optional; The name of the workspace in which the semantic model resides. Defaults to the workspace in which the notebook resides.

readonly bool

Optional; If True, connects to TOM in read-only mode. If False, enables read/write mode, and any changes made within the context are saved to the semantic model. Default value: True.

add_calculated_column

Adds a calculated column to a table within a semantic model.

import fabric_cat_tools as fct
from fabric_cat_tools.TOM import connect_semantic_model

with connect_semantic_model(dataset = 'AdventureWorks', workspace = None, readonly = False) as tom:
    tom.add_calculated_column(
        table_name = 'Segment',
        column_name = 'Business Segment',
        expression = '',
        data_type = 'String'
    )

Parameters

table_name str

Required; The name of the table where the column will be added.

column_name str

Required; The name of the calculated column.

expression str

Required; The DAX expression for the calculated column.

data_type str

Required; The data type of the calculated column.

format_string str

Optional; The format string for the column.

hidden bool

Optional; Sets the column to be hidden if True. Default value: False.

description str

Optional; The description of the column.

display_folder str

Optional; The display folder for the column.

data_category str

Optional; The data category of the column.

key bool

Optional; Marks the column as the primary key of the table. Default value: False.

summarize_by str

Optional; Sets the value for the Summarize By property of the column.

Returns


add_calculated_table

Adds a calculated table to a semantic model.

import fabric_cat_tools as fct
from fabric_cat_tools.TOM import connect_semantic_model

with connect_semantic_model(dataset = 'AdventureWorks', workspace = None, readonly = False) as tom:
    tom.add_calculated_table(
        name = 'Segment',
        expression = ''
    )

Parameters

name str

Required; The name of the table.

expression str

Required; The DAX expression for the table.

description str

Optional; The description of the table.

data_category str

Optional; The data category of the table.

hidden bool

Optional; Sets the table to be hidden if True. Default value: False.

Returns


add_calculated_table_column

Adds a calculated table column to a calculated table within a semantic model.

import fabric_cat_tools as fct
from fabric_cat_tools.TOM import connect_semantic_model

with connect_semantic_model(dataset = 'AdventureWorks', workspace = None, readonly = False) as tom:
    tom.add_calculated_table_column(
        table_name = 'Segment',
        column_name = 'Business Segment',
        source_column = '',
        data_type = 'String'
    )

Parameters

table_name str

Required; The name of the table in which the column will reside.

column_name str

Required; The name of the column.

source_column str

Required; The source column for the column.

data_type str

Required; The data type of the column.

format_string str

Optional; The format string of the column.

hidden bool

Optional; Sets the column to be hidden if True. Default value: False.

description str

Optional; The description of the column.

display_folder str

Optional; The display folder for the column.

data_category str

Optional; The data category of the column.

key bool

Optional; Marks the column as the primary key of the table. Default value: False.

summarize_by str

Optional; Sets the value for the Summarize By property of the column.

Returns


add_calculation_group

Adds a calculation group to a semantic model.

import fabric_cat_tools as fct
from fabric_cat_tools.TOM import connect_semantic_model

with connect_semantic_model(dataset = 'AdventureWorks', workspace = None, readonly = False) as tom:
    tom.add_calculation_group(
        name = 'Segment',
        precedence = 1
    )

Parameters

name str

Required; The name of the calculation group.

precedence int

Optional; The precedence of the calculation group.

description str

Optional; The description of the calculation group.

hidden bool

Optional; Sets the calculation group to be hidden if True. Default value: False.

Returns


add_calculation_item

Adds a calculation item to a calculation group within a semantic model.

import fabric_cat_tools as fct
from fabric_cat_tools.TOM import connect_semantic_model

with connect_semantic_model(dataset = 'AdventureWorks', workspace = None, readonly = False) as tom:
    tom.add_calculation_item(
        table_name = 'Segment',
        calculation_item_name = 'YTD',
        expression = "CALCULATE(SELECTEDMEASURE(), DATESYTD('Date'[Date]))"
    )

Parameters

table_name str

Required; The name of the table.

calculation_item_name str

Required; The name of the calculation item.

expression str

Required; The DAX expression encapsulating the logic of the calculation item.

ordinal int

Optional; The ordinal of the calculation item.

format_string_expression str

Optional; The format string expression for the calculation item.

description str

Optional; The description of the calculation item.

Returns


add_data_column

Adds a data column to a table within a semantic model.

import fabric_cat_tools as fct
from fabric_cat_tools.TOM import connect_semantic_model

with connect_semantic_model(dataset = 'AdventureWorks', workspace = None, readonly = False) as tom:
    tom.add_data_column(
        table_name = 'Segment',
        column_name = 'Business Segment',
        source_column = '',
        data_type = 'String'
    )

Parameters

table_name str

Required; The name of the table in which the column will exist.

column_name str

Required; The name of the column.

source_column str

Required; The name of the column in the source.

data_type str

Required; The data type of the column.

format_string str

Optional; The format string of the column.

hidden bool

Optional; Sets the column to be hidden if True. Default value: False.

description str

Optional; The description of the column.

display_folder str

Optional; The display folder for the column.

data_category str

Optional; The data category of the column.

key bool

Optional; Marks the column as the primary key of the table. Default value: False.

summarize_by str

Optional; Sets the value for the Summarize By property of the column.

Returns


add_entity_partition

Adds an entity partition to a table in a semantic model. Entity partitions are used for tables within Direct Lake semantic models.

import fabric_cat_tools as fct
from fabric_cat_tools.TOM import connect_semantic_model

with connect_semantic_model(dataset = 'AdventureWorks', workspace = None, readonly = False) as tom:
    tom.add_entity_partition(
        table_name = 'Sales',
        entity_name = 'Fact_Sales'
    )

Parameters

table_name str

Required; The name of the table in which to place the entity partition.

entity_name str

Required; The name of the lakehouse table.

expression str

Optional; The expression to use for the partition. This defaults to using the existing 'DatabaseQuery' expression within the Direct Lake semantic model.

description str

Optional; The description of the partition.

Returns


add_expression

Adds an expression to a semantic model.

import fabric_cat_tools as fct
from fabric_cat_tools.TOM import connect_semantic_model

with connect_semantic_model(dataset = 'AdventureWorks', workspace = None, readonly = False) as tom:
    tom.add_expression(
        name = 'DatabaseQuery',
        expression = 'let...'
    )

Parameters

name str

Required; The name of the expression.

expression str

Required; The M-code encapsulating the logic for the expression.

description str

Optional; The description of the expression.

Returns


add_field_parameter

Adds a field parameter to a semantic model.

import fabric_cat_tools as fct
from fabric_cat_tools.TOM import connect_semantic_model

with connect_semantic_model(dataset = 'AdventureWorks', workspace = None, readonly = False) as tom:
    tom.add_field_parameter(
        table_name = 'Segment',
        objects = ["'Product'[Product Category]", "[Sales Amount]", "'Geography'[Country]"]
    )

Parameters

table_name str

Required; The name of the field parameter (this becomes the name of the table added to the semantic model).

objects list of str

Required; A list of columns/measures to place in the field parameter. Columns must be fully qualified (i.e. "'Table Name'[Column Name]") and measures must be unqualified (i.e. "[Measure Name]").

Returns


add_hierarchy

Adds a hierarchy to a table within a semantic model.

import fabric_cat_tools as fct
from fabric_cat_tools.TOM import connect_semantic_model

with connect_semantic_model(dataset = 'AdventureWorks', workspace = None, readonly = False) as tom:
    tom.add_hierarchy(
        table_name = 'Geography',
        hierarchy_name = 'Geo Hierarchy',
        columns = ['Continent', 'Country', 'City']
    )

Parameters

table_name str

Required; The name of the table in which the hierarchy will reside.

hierarchy_name str

Required; The name of the hierarchy.

columns list of str

Required; A list of columns to use in the hierarchy. Must be ordered from the top of the hierarchy down (i.e. ["Continent", "Country", "City"]).

levels list of str

Optional; A list of levels to use in the hierarchy. These will be the displayed name (instead of the column names). If omitted, the levels will default to showing the column names.

hierarchy_description str

Optional; The description of the hierarchy.

hierarchy_hidden bool

Optional; Sets the hierarchy to be hidden if True. Default value: False.

Returns


add_m_partition

Adds an M-partition to a table within a semantic model.

import fabric_cat_tools as fct
from fabric_cat_tools.TOM import connect_semantic_model

with connect_semantic_model(dataset = 'AdventureWorks', workspace = None, readonly = False) as tom:
    tom.add_m_partition(
        table_name = 'Segment',
        partition_name = 'Segment',
        expression = 'let...',
        mode = 'Import'
    )

Parameters

table_name str

Required; The name of the table in which the partition will reside.

partition_name str

Required; The name of the M partition.

expression str

Required; The M-code encapsulating the logic of the partition.

mode str

Optional; The storage mode for the partition. Default value: 'Import'.

description str

Optional; The description of the partition.

Returns


add_measure

Adds a measure to the semantic model.

import fabric_cat_tools as fct
from fabric_cat_tools.TOM import connect_semantic_model

with connect_semantic_model(dataset = 'AdventureWorks', workspace = None, readonly = False) as tom:
    tom.add_measure(
        table_name = 'Sales',
        measure_name = 'Sales Amount',
        expression = "SUM('Sales'[SalesAmount])",
        format_string = '$,00'
    )

Parameters

table_name str

Required; The name of the table in which the measure will reside.

measure_name str

Required; The name of the measure.

expression str

Required; The DAX expression encapsulating the logic of the measure.

format_string str

Optional; The format string of the measure.

hidden bool

Optional; Sets the measure to be hidden if True. Default value: False.

description str

Optional; The description of the measure.

display_folder str

Optional; The display folder for the measure.

Returns


add_perspective

Adds a perspective to the semantic model.

import fabric_cat_tools as fct
from fabric_cat_tools.TOM import connect_semantic_model

with connect_semantic_model(dataset = 'AdventureWorks', workspace = None, readonly = False) as tom:
    tom.add_perspective(
        perspective_name = 'Marketing'
    )

Parameters

perspective_name str

Required; The name of the perspective.

Returns


add_relationship

Adds a relationship to the semantic model.

import fabric_cat_tools as fct
from fabric_cat_tools.TOM import connect_semantic_model

with connect_semantic_model(dataset = 'AdventureWorks', workspace = None, readonly = False) as tom:
    tom.add_relationship(
        from_table = 'Sales',
        from_column = 'ProductKey',
        to_table = 'Product',
        to_column = 'ProductKey',
        from_cardinality = 'Many',
        to_cardinality = 'One',
        is_active = True
    )

Parameters

from_table str

Required; The name of the table on the 'from' side of the relationship.

from_column str

Required; The name of the column on the 'from' side of the relationship.

to_table str

Required; The name of the table on the 'to' side of the relationship.

to_column str

Required; The name of the column on the 'to' side of the relationship.

from_cardinality str

Required; The cardinality of the 'from' side of the relationship. Options: ['Many', 'One', 'None'].

to_cardinality str

Required; The cardinality of the 'to' side of the relationship. Options: ['Many', 'One', 'None'].

cross_filtering_behavior str

Optional; Setting for the cross filtering behavior of the relationship. Options: ('Automatic', 'OneDirection', 'BothDirections'). Default value: 'Automatic'.

is_active bool

Optional; Setting for whether the relationship is active or not. Default value: True.

security_filtering_behavior str

Optional; Setting for the security filtering behavior of the relationship. Options: ('None', 'OneDirection', 'BothDirections'). Default value: 'OneDirection'.

rely_on_referential_integrity bool

Optional; Setting for whether the relationship relies on referential integrity. Default value: False.

Returns
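
Several of the parameters above only accept a fixed set of values. The following plain-Python sketch (not part of fabric_cat_tools) shows a pre-flight check built from the option lists documented above, which can catch an invalid value before the model is touched:

```python
# Option sets as documented for add_relationship. The validator itself is
# illustrative only; fabric_cat_tools performs its own handling internally.
CARDINALITY = {'Many', 'One', 'None'}
CROSS_FILTERING = {'Automatic', 'OneDirection', 'BothDirections'}
SECURITY_FILTERING = {'None', 'OneDirection', 'BothDirections'}

def validate_relationship_args(
    from_cardinality,
    to_cardinality,
    cross_filtering_behavior='Automatic',
    security_filtering_behavior='OneDirection',
):
    """Raise ValueError if any option value is outside its documented set."""
    checks = [
        ('from_cardinality', from_cardinality, CARDINALITY),
        ('to_cardinality', to_cardinality, CARDINALITY),
        ('cross_filtering_behavior', cross_filtering_behavior, CROSS_FILTERING),
        ('security_filtering_behavior', security_filtering_behavior, SECURITY_FILTERING),
    ]
    for name, value, allowed in checks:
        if value not in allowed:
            raise ValueError(f"{name} must be one of {sorted(allowed)}, got {value!r}")
    return True

validate_relationship_args('Many', 'One')  # valid: returns True
```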


add_role

Adds a role to the semantic model.

import fabric_cat_tools as fct
from fabric_cat_tools.TOM import connect_semantic_model

with connect_semantic_model(dataset = 'AdventureWorks', workspace = None, readonly = False) as tom:
    tom.add_role(
        role_name = 'Reader'
    )

Parameters

role_name str

Required; The name of the role.

model_permission str

Optional; The model permission of the role. Default value: 'Reader'.

description str

Optional; The description of the role.

Returns


add_table

Adds a table to the semantic model.

import fabric_cat_tools as fct
from fabric_cat_tools.TOM import connect_semantic_model

with connect_semantic_model(dataset = 'AdventureWorks', workspace = None, readonly = False) as tom:
    tom.add_table(
        name = 'Sales',
        description = 'This is the sales table.',
        hidden = False
    )

Parameters

name str

Required; The name of the table.

description str

Optional; The description of the table.

data_category str

Optional; The data category of the table.

hidden bool

Optional; Sets the table to be hidden if True. Default value: False.

Returns


add_to_perspective

Adds an object to a perspective.

import fabric_cat_tools as fct
from fabric_cat_tools.TOM import connect_semantic_model

with connect_semantic_model(dataset = 'AdventureWorks', workspace = None, readonly = False) as tom:
    tom.add_to_perspective(
        object = tom.model.Tables['Sales'].Measures['Sales Amount'],
        perspective_name = 'Marketing'
    )

Parameters

object

Required; The TOM object.

perspective_name str

Required; The name of the perspective.

Returns


add_translation

Adds a translation language to the semantic model.

import fabric_cat_tools as fct
from fabric_cat_tools.TOM import connect_semantic_model

with connect_semantic_model(dataset = 'AdventureWorks', workspace = None, readonly = False) as tom:
    tom.add_translation(
       language = 'it-IT'
    )

Parameters

language str

Required; The language code to add to the semantic model.

Returns


all_calculation_items

Outputs a list of all calculation items within all calculation groups in the semantic model.

import fabric_cat_tools as fct
from fabric_cat_tools.TOM import connect_semantic_model

with connect_semantic_model(dataset = 'AdventureWorks', workspace = None, readonly = True) as tom:
    for c in tom.all_calculation_items():
        print(c.Name)

Parameters

None

Returns


all_columns

Outputs a list of all columns within all tables in the semantic model.

import fabric_cat_tools as fct
from fabric_cat_tools.TOM import connect_semantic_model

with connect_semantic_model(dataset = 'AdventureWorks', workspace = None, readonly = True) as tom:
    for c in tom.all_columns():
        print(c.Name)

Parameters

None

Returns


all_hierarchies

Outputs a list of all hierarchies within all tables in the semantic model.

import fabric_cat_tools as fct
from fabric_cat_tools.TOM import connect_semantic_model

with connect_semantic_model(dataset = 'AdventureWorks', workspace = None, readonly = True) as tom:
    for h in tom.all_hierarchies():
        print(h.Name)

Parameters

None

Returns


all_levels

Outputs a list of all levels within all hierarchies within all tables in the semantic model.

import fabric_cat_tools as fct
from fabric_cat_tools.TOM import connect_semantic_model

with connect_semantic_model(dataset = 'AdventureWorks', workspace = None, readonly = True) as tom:
    for l in tom.all_levels():
        print(l.Name)

Parameters

None

Returns


all_measures

Outputs a list of all measures within all tables in the semantic model.

import fabric_cat_tools as fct
from fabric_cat_tools.TOM import connect_semantic_model

with connect_semantic_model(dataset = 'AdventureWorks', workspace = None, readonly = True) as tom:
    for m in tom.all_measures():
        print(m.Name)

Parameters

None

Returns


all_partitions

Outputs a list of all partitions within all tables in the semantic model.

import fabric_cat_tools as fct
from fabric_cat_tools.TOM import connect_semantic_model

with connect_semantic_model(dataset = 'AdventureWorks', workspace = None, readonly = True) as tom:
    for p in tom.all_partitions():
        print(p.Name)

Parameters

None

Returns


all_rls

Outputs a list of all row level security objects within all roles of the semantic model.

import fabric_cat_tools as fct
from fabric_cat_tools.TOM import connect_semantic_model

with connect_semantic_model(dataset = 'AdventureWorks', workspace = None, readonly = True) as tom:
    for r in tom.all_rls():
        print(r.Name)

Parameters

None

Returns
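
The all_* helpers above all follow the same pattern: walk every table (or role) in the model and yield its child objects as one flat sequence. A runnable plain-Python sketch of that flattening, using dicts as stand-ins for TOM objects (the real helpers operate on the Tabular Object Model, not dicts):

```python
# Stand-in for a model: table name -> child collections.
model = {
    'Sales':   {'columns': ['SalesAmount', 'ProductKey'], 'measures': ['Sales Amount']},
    'Product': {'columns': ['ProductKey', 'Color'],       'measures': []},
}

def all_columns(model):
    """Yield every column in every table, like tom.all_columns()."""
    for table in model.values():
        yield from table['columns']

def all_measures(model):
    """Yield every measure in every table, like tom.all_measures()."""
    for table in model.values():
        yield from table['measures']

print(sorted(all_columns(model)))  # -> ['Color', 'ProductKey', 'ProductKey', 'SalesAmount']
```

Because the helpers are iterators, they compose naturally with loops and comprehensions, as in the `for c in tom.all_columns():` examples above.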


cardinality

Obtains the cardinality of a column within a semantic model.

import fabric_cat_tools as fct
from fabric_cat_tools.TOM import connect_semantic_model

with connect_semantic_model(dataset = 'AdventureWorks', workspace = None, readonly = True) as tom:
    tom.cardinality(column = tom.model.Tables['Product'].Columns['Color'])

Parameters

column

Required; The TOM column object.

Returns


clear_annotations

Removes all annotations on a given object within a semantic model.

import fabric_cat_tools as fct
from fabric_cat_tools.TOM import connect_semantic_model

with connect_semantic_model(dataset = 'AdventureWorks', workspace = None, readonly = False) as tom:
    tom.clear_annotations(object = tom.model.Tables['Product'].Columns['Color'])
import fabric_cat_tools as fct
from fabric_cat_tools.TOM import connect_semantic_model

with connect_semantic_model(dataset = 'AdventureWorks', workspace = None, readonly = False) as tom:
    tom.clear_annotations(object = tom.model.Tables['Product'])

Parameters

object

Required; The TOM object.

Returns


clear_extended_properties

Removes all extended properties on a given object within a semantic model.

import fabric_cat_tools as fct
from fabric_cat_tools.TOM import connect_semantic_model

with connect_semantic_model(dataset = 'AdventureWorks', workspace = None, readonly = False) as tom:
    tom.clear_extended_properties(object = tom.model.Tables['Product'].Columns['Color'])

Parameters

object

Required; The TOM object.

Returns


data_size

Obtains the data size of a column within a semantic model.

import fabric_cat_tools as fct
from fabric_cat_tools.TOM import connect_semantic_model

with connect_semantic_model(dataset = 'AdventureWorks', workspace = None, readonly = True) as tom:
    tom.data_size(column = tom.model.Tables['Product'].Columns['Color'])

Parameters

column

Required; The TOM column object.

Returns


depends_on

Shows the objects on which a given object depends.

import fabric_cat_tools as fct
from fabric_cat_tools.TOM import connect_semantic_model

with connect_semantic_model(dataset = 'AdventureWorks', workspace = None, readonly = True) as tom:

    dep = fct.get_model_calc_dependencies(dataset = 'AdventureWorks', workspace = None)
    tom.depends_on(
        object = tom.model.Tables['Product'].Columns['Color'],
        dependencies = dep
    )

Parameters

object

Required; The TOM object.

dependencies

Required; A dataframe showing the model's calculation dependencies.

Returns


dictionary_size

Obtains the dictionary size of a column within a semantic model.

import fabric_cat_tools as fct
from fabric_cat_tools.TOM import connect_semantic_model

with connect_semantic_model(dataset = 'AdventureWorks', workspace = None, readonly = True) as tom:
    tom.dictionary_size(column = tom.model.Tables['Product'].Columns['Color'])

Parameters

column

Required; The TOM column object.

Returns


fully_qualified_measures

Shows all fully-qualified measures referenced by a given measure's DAX expression.

import fabric_cat_tools as fct
from fabric_cat_tools.TOM import connect_semantic_model

with connect_semantic_model(dataset = 'AdventureWorks', workspace = None, readonly = True) as tom:

    dep = fct.get_model_calc_dependencies(dataset = 'AdventureWorks', workspace = None)
    tom.fully_qualified_measures(
        object = tom.model.Tables['Sales'].Measures['Sales Amount'],
        dependencies = dep
    )

Parameters

object

Required; The TOM object.

dependencies

Required; A dataframe showing the model's calculation dependencies.

Returns


get_annotation_value

Obtains the annotation value for a given object's annotation in a semantic model.

import fabric_cat_tools as fct
from fabric_cat_tools.TOM import connect_semantic_model

with connect_semantic_model(dataset = 'AdventureWorks', workspace = None, readonly = True) as tom:
    tom.get_annotation_value(
        object = tom.model.Tables['Product'].Columns['Color'],
        name = 'MyAnnotation'
    )

Parameters

object

Required; The TOM object.

name str

Required; The name of the annotation.

Returns


get_annotations

Obtains all of the annotations for a given object in a semantic model.

import fabric_cat_tools as fct
from fabric_cat_tools.TOM import connect_semantic_model

with connect_semantic_model(dataset = 'AdventureWorks', workspace = None, readonly = True) as tom:
    tom.get_annotations(
        object = tom.model.Tables['Product'].Columns['Color']
    )

Parameters

object

Required; The TOM object.

Returns


get_extended_properties

Obtains all of the extended properties for a given object in a semantic model.

import fabric_cat_tools as fct
from fabric_cat_tools.TOM import connect_semantic_model

with connect_semantic_model(dataset = 'AdventureWorks', workspace = None, readonly = True) as tom:
    tom.get_extended_properties(
        object = tom.model.Tables['Product'].Columns['Color']
    )

Parameters

object

Required; The TOM object.

Returns


get_extended_property_value

Obtains the extended property value for an object's extended property.

import fabric_cat_tools as fct
from fabric_cat_tools.TOM import connect_semantic_model

with connect_semantic_model(dataset = 'AdventureWorks', workspace = None, readonly = True) as tom:
    tom.get_extended_property_value(
        object = tom.model.Tables['Product'].Columns['Color'],
        name = 'MyExtendedProperty'
    )

Parameters

object

Required; The TOM object.

name str

Required; The name of the extended property.

Returns


in_perspective

Identifies whether an object is in a given perspective.

import fabric_cat_tools as fct
from fabric_cat_tools.TOM import connect_semantic_model

with connect_semantic_model(dataset = 'AdventureWorks', workspace = None, readonly = True) as tom:
    tom.in_perspective(
        object = tom.model.Tables['Product'].Columns['Color'],
        perspective_name = 'Marketing'
    )

Parameters

object

Required; The TOM object.

perspective_name str

Required; The name of the perspective.

Returns


is_direct_lake

Identifies whether a semantic model is in Direct Lake mode.

import fabric_cat_tools as fct
from fabric_cat_tools.TOM import connect_semantic_model

with connect_semantic_model(dataset = 'AdventureWorks', workspace = None, readonly = True) as tom:
    print(tom.is_direct_lake())

Parameters

None

Returns

True/False


is_field_parameter

Identifies whether a table is a field parameter.

import fabric_cat_tools as fct
from fabric_cat_tools.TOM import connect_semantic_model

with connect_semantic_model(dataset = 'AdventureWorks', workspace = None, readonly = True) as tom:
    print(tom.is_field_parameter(
        table_name = 'Parameter'
    ))

Parameters

table_name str

Required; The name of the table.

Returns

True/False


records_per_segment

Obtains the records per segment of a partition within a semantic model.

import fabric_cat_tools as fct
from fabric_cat_tools.TOM import connect_semantic_model

with connect_semantic_model(dataset = 'AdventureWorks', workspace = None, readonly = True) as tom:
    tom.records_per_segment(
        object = tom.model.Tables['Sales'].Partitions['Sales - 2024']
    )

Parameters

object

Required; The TOM object.

Returns


referenced_by

Shows the objects referenced by a given object in a semantic model.

import fabric_cat_tools as fct
from fabric_cat_tools.TOM import connect_semantic_model

with connect_semantic_model(dataset = 'AdventureWorks', workspace = None, readonly = True) as tom:

    dep = fct.get_model_calc_dependencies(dataset = 'AdventureWorks', workspace = None)
    tom.referenced_by(
        object = tom.model.Tables['Product'].Columns['Color'],
        dependencies = dep
    )

Parameters

object

Required; The TOM object.

dependencies

Required; A dataframe showing the model's calculation dependencies.

Returns
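
Conceptually, depends_on and referenced_by are the two directions of the same lookup over the dependencies dataframe returned by fct.get_model_calc_dependencies. A runnable sketch using a plain list of records in place of the dataframe (the field names here are illustrative, not the dataframe's actual column names):

```python
# Toy dependency records: each row says "object references something".
deps = [
    {'object': '[Sales Amount]', 'references': "'Sales'[SalesAmount]"},
    {'object': '[Margin]',       'references': '[Sales Amount]'},
    {'object': '[Margin]',       'references': '[Total Cost]'},
]

def depends_on(obj):
    """Everything obj directly references, like tom.depends_on()."""
    return [d['references'] for d in deps if d['object'] == obj]

def referenced_by(obj):
    """Everything that directly references obj, like tom.referenced_by()."""
    return [d['object'] for d in deps if d['references'] == obj]

print(depends_on('[Margin]'))            # -> ['[Sales Amount]', '[Total Cost]']
print(referenced_by('[Sales Amount]'))   # -> ['[Margin]']
```

referenced_by is the useful direction for impact analysis: before removing or renaming an object, it shows what would break.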


remove_annotation

Removes the annotation from an object in a semantic model.

import fabric_cat_tools as fct
from fabric_cat_tools.TOM import connect_semantic_model

with connect_semantic_model(dataset = 'AdventureWorks', workspace = None, readonly = False) as tom:
    tom.remove_annotation(
        object = tom.model.Tables['Product'].Columns['Color'],
        name = 'MyAnnotation'
    )

Parameters

object

Required; The TOM object.

name str

Required; The name of the annotation.

Returns


remove_extended_property

Removes the extended property from an object in a semantic model.

import fabric_cat_tools as fct
from fabric_cat_tools.TOM import connect_semantic_model

with connect_semantic_model(dataset = 'AdventureWorks', workspace = None, readonly = False) as tom:
    tom.remove_extended_property(
        object = tom.model.Tables['Product'].Columns['Color'],
        name = 'MyExtendedProperty'
    )

Parameters

object

Required; The TOM object.

name str

Required; The name of the extended property.

Returns


remove_from_perspective

Removes an object (table, column, measure or hierarchy) from a perspective.

import fabric_cat_tools as fct
from fabric_cat_tools.TOM import connect_semantic_model

with connect_semantic_model(dataset = 'AdventureWorks', workspace = None, readonly = False) as tom:
    tom.remove_from_perspective(
        object = tom.model.Tables['Product'].Columns['Color'],
        perspective_name = 'Marketing'
    )

Parameters

object

Required; The TOM object.

perspective_name str

Required; The name of the perspective.

Returns


remove_object

Removes an object from a semantic model.

import fabric_cat_tools as fct
from fabric_cat_tools.TOM import connect_semantic_model

with connect_semantic_model(dataset = 'AdventureWorks', workspace = None, readonly = False) as tom:
    tom.remove_object(
        object = tom.model.Tables['Product'].Columns['Color']
    )

Parameters

object

Required; The TOM object.

Returns


remove_translation

Removes a translation for an object in a semantic model.

import fabric_cat_tools as fct
from fabric_cat_tools.TOM import connect_semantic_model

with connect_semantic_model(dataset = 'AdventureWorks', workspace = None, readonly = False) as tom:
    tom.remove_translation(
        object = tom.model.Tables['Product'].Columns['Color'],
        language = 'it-IT'
    )

Parameters

object

Required; The TOM object.

language str

Required; The language code.

Returns


remove_vertipaq_annotations

Removes the annotations set using the set_vertipaq_annotations function.

import fabric_cat_tools as fct
from fabric_cat_tools.TOM import connect_semantic_model

with connect_semantic_model(dataset = 'AdventureWorks', workspace = None, readonly = False) as tom:
    tom.remove_vertipaq_annotations()

Parameters

None

Returns


row_count

Obtains the row count of a table or partition within a semantic model.

import fabric_cat_tools as fct
from fabric_cat_tools.TOM import connect_semantic_model

with connect_semantic_model(dataset = 'AdventureWorks', workspace = None, readonly = True) as tom:
    tom.row_count(
        object = tom.model.Tables['Product']
    )

Parameters

object

Required; The TOM object.

Returns


set_annotation

Sets an annotation on an object within a semantic model.

import fabric_cat_tools as fct
from fabric_cat_tools.TOM import connect_semantic_model

with connect_semantic_model(dataset = 'AdventureWorks', workspace = None, readonly = False) as tom:
    tom.set_annotation(
        object = tom.model.Tables['Product'].Columns['Color'],
        name = 'MyAnnotation',
        value = '1'
    )

Parameters

object

Required; The TOM object.

name str

Required; The annotation name.

value str

Required; The annotation value.

Returns


set_direct_lake_behavior

Sets the DirectLakeBehavior property for a Direct Lake semantic model.

import fabric_cat_tools as fct
from fabric_cat_tools.TOM import connect_semantic_model

with connect_semantic_model(dataset = 'AdventureWorks', workspace = None, readonly = False) as tom:
    tom.set_direct_lake_behavior(
        direct_lake_behavior = 'DirectLakeOnly'
    )

Parameters

direct_lake_behavior str

Required; The DirectLakeBehavior value. Options: ['Automatic', 'DirectLakeOnly', 'DirectQueryOnly'].

Returns


set_extended_property

Sets an extended property on an object within the semantic model.

import fabric_cat_tools as fct
from fabric_cat_tools.TOM import connect_semantic_model

with connect_semantic_model(dataset = 'AdventureWorks', workspace = None, readonly = False) as tom:
    tom.set_extended_property(
        object = tom.model.Tables['Product'].Columns['Color'],
        extended_property_type = 'Json',
        name = 'MyExtendedProperty',
        value = '{...}'
    )

Parameters

object

Required; The TOM object.

extended_property_type str

Required; The type of extended property to set. Options: ['Json', 'String'].

name str

Required; The name of the extended property.

value str

Required; The value of the extended property.

Returns


set_is_available_in_mdx

Sets the IsAvailableInMDX property on a column in a semantic model.

import fabric_cat_tools as fct
from fabric_cat_tools.TOM import connect_semantic_model

with connect_semantic_model(dataset = 'AdventureWorks', workspace = None, readonly = False) as tom:
    tom.set_is_available_in_mdx(
        table_name = 'Sales',
        column_name = 'SalesAmount',
        value = False
    )

Parameters

table_name str

Required; The name of the table in which the column resides.

column_name str

Required; The name of the column.

value bool

Required; The value to set for the IsAvailableInMDX property.

Returns


set_ols

Sets object level security for a given role/column within a semantic model.

import fabric_cat_tools as fct
from fabric_cat_tools.TOM import connect_semantic_model

with connect_semantic_model(dataset = 'AdventureWorks', workspace = None, readonly = False) as tom:
    tom.set_ols(
        role_name = 'Reader',
        table_name = 'Geography',
        column_name = 'Country',
        permission = 'None'
    )

Parameters

role_name str

Required; The name of the role.

table_name str

Required; The name of the table.

column_name str

Required; The name of the column.

permission str

Required; The permission for a given column. Options: ['Read', 'None', 'Default'].

Returns


set_rls

Sets the row level security expression for a given role/table within a semantic model.

import fabric_cat_tools as fct
from fabric_cat_tools.TOM import connect_semantic_model

with connect_semantic_model(dataset = 'AdventureWorks', workspace = None, readonly = False) as tom:
    tom.set_rls(
        role_name = 'Reader',
        table_name = 'UserGeography',
        filter_expression = "'UserGeography'[UserEmail] = USERPRINCIPALNAME()"
    )

Parameters

role_name str

Required; The name of the role.

table_name str

Required; The name of the table to place row level security.

filter_expression str

Required; The DAX expression containing the row level security logic.

Returns


set_summarize_by

Sets the Summarize By property on a column in a semantic model.

import fabric_cat_tools as fct
from fabric_cat_tools.TOM import connect_semantic_model

with connect_semantic_model(dataset = 'AdventureWorks', workspace = None, readonly = False) as tom:
    tom.set_summarize_by(
        table_name = 'Geography',
        column_name = 'Country',
        value = 'None'
    )

Parameters

table_name str

Required; The name of the table in which the column resides.

column_name str

Required; The name of the column.

value str

Required; The SummarizeBy property value of the column. Options: ['Default', 'None', 'Sum', 'Min', 'Max', 'Count', 'Average', 'DistinctCount'].

Returns


set_translation

Sets the translation value for an object in a semantic model.

import fabric_cat_tools as fct
from fabric_cat_tools.TOM import connect_semantic_model

with connect_semantic_model(dataset = 'AdventureWorks', workspace = None, readonly = False) as tom:
    tom.set_translation(
        object = tom.model.Tables['Geography'],
        language = 'it-IT',
        property = 'Name',
        value = 'Geografia'
    )

Parameters

object

Required; The TOM object.

language str

Required; The language code in which to translate the object property.

property str

Required; The property to translate. One of the following values: ['Name', 'Description', 'Display Folder'].

value str

Required; The translation value.

Returns


set_vertipaq_annotations

Saves Vertipaq Analyzer statistics as annotations on objects in the semantic model.

import fabric_cat_tools as fct
from fabric_cat_tools.TOM import connect_semantic_model

with connect_semantic_model(dataset = 'AdventureWorks', workspace = None, readonly = False) as tom:
    tom.set_vertipaq_annotations()

Parameters

None

Returns


total_size

Obtains the total size (in bytes) of a table or column within a semantic model.

import fabric_cat_tools as fct
from fabric_cat_tools.TOM import connect_semantic_model

with connect_semantic_model(dataset = 'AdventureWorks', workspace = None, readonly = True) as tom:
    tom.total_size(
        object = tom.model.Tables['Sales'].Columns['SalesAmount']
    )

Parameters

object

Required; The TOM object.

Returns

The total size (in bytes) of the object.


unqualified_columns

Shows all unqualified columns referenced by a given measure's DAX expression.

import fabric_cat_tools as fct
from fabric_cat_tools.TOM import connect_semantic_model

with connect_semantic_model(dataset = 'AdventureWorks', workspace = None, readonly = True) as tom:

    dep = fct.get_model_calc_dependencies(dataset = 'AdventureWorks', workspace = None)
    tom.unqualified_columns(
        object = tom.model.Tables['Sales'].Measures['Sales Amount'],
        dependencies = dep
    )

Parameters

object

Required; The TOM object.

dependencies

Required; A dataframe showing the model's calculation dependencies.

Returns


used_in_calc_item

Identifies the calculation items which reference a given object.

import fabric_cat_tools as fct
from fabric_cat_tools.TOM import connect_semantic_model

with connect_semantic_model(dataset = 'AdventureWorks', workspace = None, readonly = True) as tom:

    dep = fct.get_model_calc_dependencies(dataset = 'AdventureWorks', workspace = None)
    tom.used_in_calc_item(
        object = tom.model.Tables['Product'].Columns['Color'],
        dependencies = dep
    )

Parameters

object

Required; The TOM object.

dependencies

Required; A dataframe showing the model's calculation dependencies.

Returns


used_in_hierarchies

Identifies the hierarchies which reference a given column.

import fabric_cat_tools as fct
from fabric_cat_tools.TOM import connect_semantic_model

with connect_semantic_model(dataset = 'AdventureWorks', workspace = None, readonly = True) as tom:
    tom.used_in_hierarchies(
        column = tom.model.Tables['Geography'].Columns['City']
    )

Parameters

column

Required; The TOM column object.

Returns


used_in_levels

Identifies the levels which reference a given column.

import fabric_cat_tools as fct
from fabric_cat_tools.TOM import connect_semantic_model

with connect_semantic_model(dataset = 'AdventureWorks', workspace = None, readonly = True) as tom:
    tom.used_in_levels(
        column = tom.model.Tables['Geography'].Columns['City']
    )

Parameters

column

Required; The TOM column object.

Returns


used_in_relationships

Identifies the relationships which use a given table/column.

import fabric_cat_tools as fct
from fabric_cat_tools.TOM import connect_semantic_model

with connect_semantic_model(dataset = 'AdventureWorks', workspace = None, readonly = True) as tom:
    tom.used_in_relationships(
        object = tom.model.Tables['Geography'].Columns['GeographyID']
    )
import fabric_cat_tools as fct
from fabric_cat_tools.TOM import connect_semantic_model

with connect_semantic_model(dataset = 'AdventureWorks', workspace = None, readonly = True) as tom:
    tom.used_in_relationships(
        object = tom.model.Tables['Geography']
    )

Parameters

object

Required; The TOM object.

Returns


used_in_rls

Identifies the filter expressions which reference a given object.

import fabric_cat_tools as fct
from fabric_cat_tools.TOM import connect_semantic_model

with connect_semantic_model(dataset = 'AdventureWorks', workspace = None, readonly = True) as tom:

    dep = fct.get_model_calc_dependencies(dataset = 'AdventureWorks', workspace = None)
    tom.used_in_rls(
        object = tom.model.Tables['Product'].Columns['Color'],
        dependencies = dep
    )

Parameters

object

Required; The TOM object.

dependencies

Required; A dataframe showing the model's calculation dependencies.

Returns


used_in_sort_by

Identifies the column used for sorting a given column.

import fabric_cat_tools as fct
from fabric_cat_tools.TOM import connect_semantic_model

with connect_semantic_model(dataset = 'AdventureWorks', workspace = None, readonly = True) as tom:
    tom.used_in_sort_by(
        column = tom.model.Tables['Geography'].Columns['City']
    )

Parameters

column

Required; The TOM column object.

Returns


used_size

Obtains the used size of a hierarchy or relationship within a semantic model.

import fabric_cat_tools as fct
from fabric_cat_tools.TOM import connect_semantic_model

with connect_semantic_model(dataset = 'AdventureWorks', workspace = None, readonly = True) as tom:
    tom.used_size(
        object = tom.model.Tables['Geography'].Hierarchies['Geo Hierarchy']
    )

Parameters

object

Required; The TOM object.

Returns


Direct Lake migration

The following process automates the migration of an import/DirectQuery model to a new Direct Lake model. The first step is specifically applicable to models which use Power Query to perform data transformations. If your model does not use Power Query, you must migrate the base tables used in your semantic model to a Fabric lakehouse.

Check out Nikola Ilic's terrific blog post on this topic!

Check out my blog post on this topic!

Prerequisites

  • Make sure you enable XMLA Read/Write for your capacity
  • Make sure you have a lakehouse in a Fabric workspace
  • Enable the following setting: Workspace -> Workspace Settings -> General -> Data model settings -> Users can edit data models in the Power BI service

Instructions

  1. Download this notebook. Use version 0.2.1 or higher only.
  2. Make sure you are in the 'Data Engineering' persona. Click the icon at the bottom left corner of your Workspace screen and select 'Data Engineering'
  3. In your workspace, select 'New -> Import notebook' and import the notebook from step 1.
  4. Add your lakehouse to your Fabric notebook
  5. Follow the instructions within the notebook.

The migration process

Note

The first 4 steps are only necessary if you have logic in Power Query. Otherwise, you will need to migrate your semantic model source tables to lakehouse tables.

  1. The first step of the notebook creates a Power Query Template (.pqt) file which eases the migration of Power Query logic to Dataflows Gen2.
  2. After the .pqt file is created, sync files from your OneLake file explorer
  3. Navigate to your lakehouse (this is critical!). From your lakehouse, create a new Dataflows Gen2, and import the Power Query Template file. Doing this step from your lakehouse will automatically set the destination for all tables to this lakehouse (instead of having to manually map each one).
  4. Publish the Dataflow Gen2 and wait for it to finish creating the delta lake tables in your lakehouse.
  5. Back in the notebook, the next step will create your new Direct Lake semantic model with the name of your choice, taking all the relevant properties from the original semantic model and refreshing/framing your new semantic model.

Note

As of version 0.2.1, calculated tables are also migrated to Direct Lake (as data tables with their DAX expression stored as model annotations in the new semantic model). Additionally, Field Parameters are migrated as they were in the original semantic model (as a calculated table).

  6. Finally, you can easily rebind all reports which use the import/DQ semantic model to the new Direct Lake semantic model in one click.

Completing these steps will do the following:

  • Offload your Power Query logic to Dataflows Gen2 inside of Fabric (where it can be maintained and development can continue).
  • Dataflows Gen2 will create delta tables in your Fabric lakehouse. These tables can then be used for your Direct Lake model.
  • Create a new semantic model in Direct Lake mode containing all the standard tables and columns, calculation groups, measures, relationships, hierarchies, roles, row level security, perspectives, and translations from your original semantic model.
  • Viable calculated tables are migrated to the new semantic model as data tables. Delta tables are dynamically generated in the lakehouse to support the Direct Lake model. The calculated table DAX logic is stored as model annotations in the new semantic model.
  • Field parameters are migrated to the new semantic model as they were in the original semantic model (as calculated tables). Any calculated columns used in field parameters are automatically removed in the new semantic model's field parameter(s).
  • Non-supported objects are not transferred (i.e. calculated columns, relationships using columns with unsupported data types etc.).
  • Reports which use your original semantic model will be rebound to your new semantic model.