
Exporting Dataverse Schema: A PowerShell Script for Table and Column Documentation

  • Writer: Sofia Ng
  • 6 min read

Managing a Microsoft Dataverse environment can be complex, especially when you need to document your custom tables and columns. Whether you're conducting an audit, planning a migration, or simply need comprehensive documentation of your schema, having a reliable way to export your table and column metadata is invaluable.

In this post, I'll walk you through a PowerShell script that efficiently extracts table and column information from your Dataverse environment, filtered by publisher prefix, and exports it to CSV files for easy analysis and documentation.


What This Script Does

This PowerShell script connects to your Dataverse environment and:

  • Fetches all custom tables with a specific publisher prefix (e.g., "csm_")

  • Extracts comprehensive table metadata including display names, descriptions, and system properties

  • Retrieves detailed column information including data types, constraints, and validation rules

  • Exports everything to CSV files for easy analysis in Excel or other tools


Key Features

Filtered Export

The script focuses on tables with your specified prefix, making it perfect for:

  • Documenting solution-specific tables

  • Auditing custom development work

  • Preparing migration documentation

Comprehensive Metadata

For tables, it captures:

  • Logical and display names

  • Descriptions

  • Schema names

  • Primary key information

  • Activity and privacy flags

For columns, it includes:

  • All basic properties (names, types, descriptions)

  • Type-specific constraints (max length, precision, min/max values)

  • System flags (primary key, required level, etc.)

  • Date format specifications

Error Handling

The script includes error handling for:

  • Authentication issues

  • API rate limiting

  • Malformed OData responses

  • Network connectivity problems


Prerequisites

Before running the script, ensure you have:

  1. PowerShell 5.1 or later

  2. MSAL.PS module installed:

     Install-Module MSAL.PS -Scope CurrentUser

  3. An Azure AD app registration with appropriate permissions

  4. System Administrator or System Customizer role in Dataverse
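
A pre-flight check along these lines (a convenience sketch, not part of the script itself) can verify the first two items before you start:

```powershell
# Warn early if the PowerShell version or the MSAL.PS module is missing.
if ($PSVersionTable.PSVersion -lt [Version]'5.1') {
    Write-Warning "PowerShell 5.1 or later is required."
}
if (-not (Get-Module -ListAvailable -Name MSAL.PS)) {
    Write-Warning "MSAL.PS not found. Run: Install-Module MSAL.PS -Scope CurrentUser"
}
```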


Configuration

The script requires minimal configuration at the top:

$EnvUrl       = "https://yourorg.crm6.dynamics.com"
$TenantId     = "your-tenant-id"
$ClientId     = "your-client-id"
$ClientSecret = "your-client-secret"
$Prefix       = "csm_"  # Your publisher prefix

How It Works

1. Authentication

The script uses the client credentials flow to authenticate with your Dataverse environment. This approach is ideal for automated scenarios and doesn't require user interaction.

2. Table Discovery

It queries the EntityDefinitions collection, filtering for tables that match your specified prefix. This ensures you only get the custom tables relevant to your solution.
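
Condensed from the full script at the end of this post: the entity definitions come back in bulk, and the prefix filter is applied client-side with Where-Object. The sample objects here stand in for the real API response:

```powershell
$Prefix = "csm_"

# Sample records standing in for the EntityDefinitions response.
$allEntities = @(
    [PSCustomObject]@{ LogicalName = "account" }
    [PSCustomObject]@{ LogicalName = "csm_order" }
    [PSCustomObject]@{ LogicalName = "csm_invoice" }
)

# Keep only tables whose LogicalName starts with the publisher prefix.
$entities = $allEntities | Where-Object { $_.LogicalName -like ($Prefix + "*") }
```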

3. Column Enumeration

For each discovered table, the script:

  • Fetches all attributes using the base AttributeMetadata properties

  • Makes additional API calls to get type-specific metadata (like max length for strings, precision for decimals)

  • Combines all information into comprehensive column records
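
The "additional API calls" in the second step are cast reads: the attribute URL is suffixed with the derived metadata type name. Building one such URL, with placeholder environment, table, and column names:

```powershell
$base   = "https://yourorg.crm6.dynamics.com/api/data/v9.2"   # placeholder environment
$table  = "csm_order"                                         # placeholder table
$column = "csm_name"                                          # placeholder column
$cast   = "Microsoft.Dynamics.CRM.StringAttributeMetadata"    # derived type for String

# The cast segment after the attribute path tells Dataverse which
# derived metadata properties (here MaxLength) to expose.
$url = ("{0}/EntityDefinitions(LogicalName='{1}')/Attributes(LogicalName='{2}')/{3}" -f $base, $table, $column, $cast) + '?$select=MaxLength'
```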

4. Data Export

Both tables and columns are sorted logically and exported to CSV files with UTF-8 encoding, ensuring compatibility with Excel and other analysis tools.
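
That step is a one-liner per file. A minimal, self-contained sketch with sample rows (the temp path is for illustration only):

```powershell
# Sample records standing in for the real table metadata.
$tables = @(
    [PSCustomObject]@{ LogicalName = "csm_order";   DisplayName = "Order" }
    [PSCustomObject]@{ LogicalName = "csm_invoice"; DisplayName = "Invoice" }
)

# Sort, then write UTF-8 CSV without the type-information header line.
$path = Join-Path ([IO.Path]::GetTempPath()) "tables.csv"
$tables | Sort-Object LogicalName |
    Export-Csv -NoTypeInformation -Encoding UTF8 -Path $path
```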


Advanced Features

Type-Specific Metadata Retrieval

One of the script's most powerful features is its ability to cast attribute metadata to derived types. This allows it to retrieve:

  • String/Memo fields: Maximum length constraints

  • Decimal/Double/Money fields: Precision and min/max values

  • Integer fields: Min/max value ranges

  • DateTime fields: Format specifications

Pagination Handling

The script automatically handles OData pagination, ensuring you get complete results even for environments with hundreds of tables and thousands of columns.
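
The pattern is: request a page, append its value array, follow @odata.nextLink until it is absent. The sketch below swaps the live API for a hypothetical $FetchPage scriptblock so the loop can be seen (and run) on its own:

```powershell
function Get-AllPages {
    param([scriptblock]$FetchPage, [string]$Url)
    $items = @()
    $next  = $Url
    do {
        $resp   = & $FetchPage $next         # one GET per page
        $items += $resp.value                # accumulate this page's records
        $next   = $resp.'@odata.nextLink'    # $null on the last page
    } while ($next)
    $items
}

# Fake two-page source standing in for the Dataverse Web API.
$pages = @{
    "page1" = @{ value = 1, 2; '@odata.nextLink' = "page2" }
    "page2" = @{ value = 3;    '@odata.nextLink' = $null }
}
$all = Get-AllPages -FetchPage { param($u) $pages[$u] } -Url "page1"
```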

NextLink Sanitisation

A unique feature is the Sanitise-DvNextLink function, which cleans up malformed pagination URLs that sometimes occur in Dataverse responses. This makes the script more resilient in production environments.


Output Files

The script generates two CSV files:

tables.csv

Contains one row per custom table with columns like:

  • LogicalName, DisplayName, Description

  • SchemaName, PrimaryIdAttribute

  • IsActivity, IsPrivate, IsCustomEntity flags

columns.csv

Contains one row per column with details such as:

  • Table and column names (logical and display)

  • AttributeType and system flags

  • Type-specific constraints (MaxLength, Precision, etc.)

  • Required level and validation information
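
Once exported, the file drops straight into a PowerShell analysis pipeline as well as Excel. A sketch with two sample rows (ConvertFrom-Csv stands in for Import-Csv on the real file):

```powershell
# Two sample rows in the shape the script writes to columns.csv.
$csv = @"
TableLogicalName,ColumnLogicalName,AttributeType,MaxLength
csm_order,csm_name,String,100
csm_order,csm_total,Money,
"@

$columns = $csv | ConvertFrom-Csv

# Column count per table, the kind of summary an auditor asks for first.
$byTable = $columns | Group-Object TableLogicalName |
    Select-Object Name, Count
```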


Use Cases

This script is particularly valuable for:

Documentation Projects

Generate comprehensive schema documentation for compliance or knowledge transfer purposes.

Migration Planning

Understand your current schema before moving to a new environment or upgrading solutions.

Solution Analysis

Analyse the complexity and structure of custom solutions, identify unused fields, or plan optimisation efforts.

Compliance Audits

Provide auditors with detailed schema information including field types, constraints, and descriptions.



Conclusion

This PowerShell script provides a robust, automated way to document your Dataverse schema. By focusing on your custom tables and providing comprehensive metadata extraction, it saves hours of manual documentation work while ensuring accuracy and completeness.

The script's error handling and pagination features make it suitable for production environments, while its CSV output format ensures compatibility with your existing analysis and documentation tools.

Whether you're a Dataverse administrator, solution architect, or developer, having reliable schema export capabilities is essential for maintaining and evolving your Dataverse solutions effectively.


Script


#region ---------- CONFIG ----------
$EnvUrl       = "https://xxxxxxx.crm6.dynamics.com"   # Dataverse environment URL
$TenantId     = "xxxxxxx-xxxxx-xxxxx-xxxxx-xxxxxx"
$ClientId     = "xxxxxxx-xxxxx-xxxxx-xxxxx-xxxxxx"
$ClientSecret = "xxxxxxxxxxxxxxxxxxxxxxxxxxxx"

# Filter: only tables with this prefix
$Prefix = "csm_"   # publisher prefix

# Optional: include annotations (labels are flattened separately below)
$IncludeAnnotations = $false

# Output paths (can be local or UNC paths)
$TablesPath  = "C:\Users\dataverse\tables.csv"
$ColumnsPath = "C:\Users\reports\dataverse\columns.csv"
#endregion



#region ---------- SETUP ----------
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
Add-Type -AssemblyName System.Web   # needed for [System.Web.HttpUtility] below

# Auth (client credentials) — requires: Install-Module MSAL.PS -Scope CurrentUser
$secure = ($ClientSecret | ConvertTo-SecureString -AsPlainText -Force)
$token  = (Get-MsalToken -TenantId $TenantId -ClientId $ClientId -ClientSecret $secure -Scopes "$EnvUrl/.default").AccessToken

$base = "$EnvUrl/api/data/v9.2"
$hdrs = @{
    Authorization      = "Bearer $token"
    "OData-MaxVersion" = "4.0"
    "OData-Version"    = "4.0"
    Accept             = "application/json"
}

# Prefer headers
$prefer = @("odata.maxpagesize=500")
if ($IncludeAnnotations) { $prefer += 'odata.include-annotations="*"' }
$hdrs["Prefer"] = ($prefer -join ", ")
#endregion


#region ---------- HELPERS ----------
function Get-Label {
    param([object]$labelObj)
    if ($null -eq $labelObj) { return $null }
    $loc = $labelObj.LocalizedLabels
    if ($null -eq $loc -or $loc.Count -eq 0) { return $null }
    ($loc | Select-Object -First 1).Label
}

function Invoke-DvGet {
    param(
        [Parameter(Mandatory=$true)][string]$Url,
        [Parameter(Mandatory=$true)][hashtable]$Headers
    )
    Invoke-RestMethod -Method GET -Uri $Url -Headers $Headers -ErrorAction Stop
}


function Sanitise-DvNextLink {
    param([string]$nextLink)
    $allowed = @('$select','$filter','$orderby','$expand','$skiptoken','api-version')
    $uri = [System.Uri]$nextLink
    $qs  = [System.Web.HttpUtility]::ParseQueryString($uri.Query)

    # Keep only the query parameters Dataverse understands
    $newQs = New-Object System.Collections.Specialized.NameValueCollection
    foreach ($k in $qs.Keys) {
        if ($allowed -contains $k) { $newQs.Add($k, $qs[$k]) }
    }

    # Rebuild the URL with the filtered, re-encoded query string
    $builder = [System.UriBuilder]$uri
    $pairs = @()
    foreach ($k in $newQs.Keys) {
        $pairs += ("{0}={1}" -f [System.Web.HttpUtility]::UrlEncode($k), [System.Web.HttpUtility]::UrlEncode($newQs[$k]))
    }
    $builder.Query = ($pairs -join '&')
    $builder.Uri.AbsoluteUri
}


function Get-All {
    param([Parameter(Mandatory=$true)][string]$Url)

    $items = @()
    $next  = $Url

    do {
        try {
            $resp = Invoke-DvGet -Url $next -Headers $hdrs
        }
        catch {
            Write-Warning ("GET failed on: {0}" -f $next)
            Write-Warning $_.Exception.Message

            # Retry once with a sanitised nextLink before giving up
            $sanitised = Sanitise-DvNextLink -nextLink $next
            if ($sanitised -ne $next) {
                Write-Verbose ("Retrying with sanitised nextLink: {0}" -f $sanitised)
                $resp = Invoke-DvGet -Url $sanitised -Headers $hdrs
                $next = $sanitised
            } else {
                throw
            }
        }

        if ($resp.value) { $items += $resp.value }
        $next = $resp.'@odata.nextLink'
        if ($next) { $next = Sanitise-DvNextLink -nextLink $next }
    } while ($next)

    $items
}


# Map AttributeType -> derived metadata type name and its $select
$AttributeTypeToDerived = @{
    "String"   = @{ Type = "Microsoft.Dynamics.CRM.StringAttributeMetadata";   Select = "MaxLength" }
    "Memo"     = @{ Type = "Microsoft.Dynamics.CRM.MemoAttributeMetadata";     Select = "MaxLength" }
    "Decimal"  = @{ Type = "Microsoft.Dynamics.CRM.DecimalAttributeMetadata";  Select = "Precision,MinValue,MaxValue" }
    "Double"   = @{ Type = "Microsoft.Dynamics.CRM.DoubleAttributeMetadata";   Select = "Precision,MinValue,MaxValue" }
    "Integer"  = @{ Type = "Microsoft.Dynamics.CRM.IntegerAttributeMetadata";  Select = "MinValue,MaxValue" }
    "Money"    = @{ Type = "Microsoft.Dynamics.CRM.MoneyAttributeMetadata";    Select = "Precision,MinValue,MaxValue" }
    "DateTime" = @{ Type = "Microsoft.Dynamics.CRM.DateTimeAttributeMetadata"; Select = "Format" }
}


function Get-TypeSpecificMeta {
    param(
        [Parameter(Mandatory=$true)][string]$entityLogicalName,
        [Parameter(Mandatory=$true)][string]$attributeLogicalName,
        [Parameter(Mandatory=$true)][string]$attributeType
    )

    if (-not $AttributeTypeToDerived.ContainsKey($attributeType)) {
        return @{}
    }

    $info  = $AttributeTypeToDerived[$attributeType]
    $dtype = $info.Type
    $dsel  = $info.Select

    # Casted read of the single attribute (no $expand)
    $url = ("{0}/EntityDefinitions(LogicalName='{1}')/Attributes(LogicalName='{2}')/{3}?`$select={4}" -f $base, $entityLogicalName, $attributeLogicalName, $dtype, $dsel)

    try {
        $r = Invoke-DvGet -Url $url -Headers $hdrs
    }
    catch {
        return @{}   # skip if cast not available
    }

    $out = @{}
    foreach ($k in ($dsel -split ',')) {
        $trim = $k.Trim()
        if ($r.PSObject.Properties.Name -contains $trim) {
            $out[$trim] = $r.$trim
        }
    }
    $out
}
#endregion


#region ---------- FETCH TABLES (filtered to prefix) ----------
$entitySelect = "LogicalName,DisplayName,Description,SchemaName,PrimaryIdAttribute,CollectionSchemaName,IsActivity,IsPrivate,IsCustomEntity"
$allEntities  = Get-All ("$base/EntityDefinitions" + '?$select=' + $entitySelect)

$entities = $allEntities | Where-Object {
    $_.LogicalName -like ($Prefix + "*")
}

$tables = $entities | ForEach-Object {
    [PSCustomObject]@{
        LogicalName        = $_.LogicalName
        DisplayName        = Get-Label $_.DisplayName
        Description        = Get-Label $_.Description
        SchemaName         = $_.SchemaName
        PrimaryIdAttribute = $_.PrimaryIdAttribute
        CollectionName     = $_.CollectionSchemaName
        IsActivity         = $_.IsActivity
        IsPrivate          = $_.IsPrivate
        IsCustomEntity     = $_.IsCustomEntity
    }
}
#endregion


#region ---------- FETCH COLUMNS (base + derived cast per attribute) ----------
$columns = @()

# Base properties available on AttributeMetadata
$attrBaseSelect = "LogicalName,AttributeType,SchemaName,DisplayName,Description,IsValidForRead,IsPrimaryId,IsPrimaryName,IsLogical,RequiredLevel"

foreach ($e in $entities) {
    $baseUrl = ("$base/EntityDefinitions(LogicalName='{0}')/Attributes" -f $e.LogicalName) + ('?$select=' + $attrBaseSelect)
    $attrs   = Get-All $baseUrl

    foreach ($a in $attrs) {
        # Type-specific follow-up (if any)
        $typeMeta = Get-TypeSpecificMeta -entityLogicalName $e.LogicalName -attributeLogicalName $a.LogicalName -attributeType $a.AttributeType

        $maxLen    = if ($typeMeta.ContainsKey('MaxLength')) { $typeMeta['MaxLength'] } else { $null }
        $precision = if ($typeMeta.ContainsKey('Precision')) { $typeMeta['Precision'] } else { $null }
        $minValue  = if ($typeMeta.ContainsKey('MinValue'))  { $typeMeta['MinValue'] }  else { $null }
        $maxValue  = if ($typeMeta.ContainsKey('MaxValue'))  { $typeMeta['MaxValue'] }  else { $null }
        $format    = if ($typeMeta.ContainsKey('Format'))    { $typeMeta['Format'] }    else { $null }

        $columns += [PSCustomObject]@{
            TableLogicalName  = $e.LogicalName
            TableDisplayName  = Get-Label $e.DisplayName

            ColumnLogicalName = $a.LogicalName
            ColumnDisplayName = Get-Label $a.DisplayName
            ColumnDescription = Get-Label $a.Description

            AttributeType     = $a.AttributeType
            IsPrimaryId       = $a.IsPrimaryId
            IsPrimaryName     = $a.IsPrimaryName
            IsLogical         = $a.IsLogical
            IsValidForRead    = $a.IsValidForRead
            RequiredLevel     = if ($null -ne $a.RequiredLevel) { $a.RequiredLevel.Value } else { $null }

            MaxLength = $maxLen
            Precision = $precision
            MinValue  = $minValue
            MaxValue  = $maxValue
            Format    = $format
        }
    }
}
#endregion


#region ---------- OUTPUT ----------
$tables  | Sort-Object LogicalName | Export-Csv -NoTypeInformation -Encoding UTF8 -Path $TablesPath
$columns | Sort-Object TableLogicalName, ColumnLogicalName | Export-Csv -NoTypeInformation -Encoding UTF8 -Path $ColumnsPath

Write-Host "Wrote $TablesPath and $ColumnsPath"
#endregion
