Author | Chrissy LeMaire (@cl), netnerds.net |
Availability | Windows, Linux, macOS |
Aliases : Write-DbaDataTable
Want to see the source code for this command? Check out Write-DbaDbTableData on GitHub.
Want to see the Bill Of Health for this command? Check out Write-DbaDbTableData.
Performs high-speed bulk inserts of data into SQL Server tables using SqlBulkCopy.
Imports data from various sources (CSV files, DataTables, DataSets, PowerShell objects) into SQL Server tables using SqlBulkCopy for optimal performance. This command handles the heavy lifting of data type conversion from PowerShell to SQL Server, automatically creates missing tables when needed, and provides fine-grained control over bulk copy operations. Commonly used for data migration, ETL processes, and importing large datasets where INSERT statements would be too slow.
Write-DbaDbTableData -SqlInstance <DbaInstanceParameter>
[-SqlCredential <PSCredential>]
[-Database <Object>]
-InputObject <Object>
[-Table] <String>
[[-Schema] <String>]
[-BatchSize <Int32>]
[-NotifyAfter <Int32>]
[-AutoCreateTable]
[-NoTableLock]
[-CheckConstraints]
[-FireTriggers]
[-KeepIdentity]
[-KeepNulls]
[-Truncate]
[-BulkCopyTimeOut <Int32>]
[-ColumnMap <Hashtable>]
[-EnableException]
[-UseDynamicStringLength]
[-WhatIf]
[-Confirm]
[<CommonParameters>]
PS C:\> $DataTable = Import-Csv C:\temp\customers.csv
PS C:\> Write-DbaDbTableData -SqlInstance sql2014 -InputObject $DataTable -Table mydb.dbo.customers
Performs a bulk insert of all the data in customers.csv into database mydb, schema dbo, table customers. A progress bar will be shown as rows are inserted. If the destination table does not exist, the import will be halted.
PS C:\> $tableName = "MyTestData"
PS C:\> $query = "SELECT name, create_date, owner_sid FROM sys.databases"
PS C:\> $dataset = Invoke-DbaQuery -SqlInstance 'localhost,1417' -SqlCredential $containerCred -Database master -Query $query -As DataSet
PS C:\> $dataset | Write-DbaDbTableData -SqlInstance 'localhost,1417' -SqlCredential $containerCred -Database tempdb -Table $tableName -AutoCreateTable
Pulls data from a SQL Server instance and then performs a bulk insert of the dataset to a new, auto-generated table tempdb.dbo.MyTestData.
PS C:\> $DataTable = Import-Csv C:\temp\customers.csv
PS C:\> Write-DbaDbTableData -SqlInstance sql2014 -InputObject $DataTable -Table mydb.dbo.customers -AutoCreateTable -Confirm
Performs a bulk insert of all the data in customers.csv. If mydb.dbo.customers does not exist, it will be created with inefficient but forgiving data types.
Prompts for confirmation before a variety of steps.
PS C:\> $DataTable = Import-Csv C:\temp\customers.csv
PS C:\> Write-DbaDbTableData -SqlInstance sql2014 -InputObject $DataTable -Table mydb.dbo.customers -Truncate
Performs a bulk insert of all the data in customers.csv. Prior to importing into mydb.dbo.customers, the user is informed that the table will be truncated and is asked for confirmation. The user is then prompted again to perform the import.
PS C:\> $DataTable = Import-Csv C:\temp\customers.csv
PS C:\> Write-DbaDbTableData -SqlInstance sql2014 -InputObject $DataTable -Database mydb -Table customers -KeepNulls
Performs a bulk insert of all the data in customers.csv into mydb.dbo.customers. Because Schema was not specified, dbo was used. NULL values in the destination table will be preserved.
PS C:\> $passwd = (Get-Credential NoUsernameNeeded).Password
PS C:\> $AzureCredential = New-Object System.Management.Automation.PSCredential("AzureAccount", $passwd)
PS C:\> $DataTable = Import-Csv C:\temp\customers.csv
PS C:\> Write-DbaDbTableData -SqlInstance AzureDB.database.windows.net -InputObject $DataTable -Database mydb -Table customers -KeepNulls -SqlCredential $AzureCredential -BulkCopyTimeOut 300
This performs the same operation as the previous example, but against an Azure SQL Database instance using the required credentials.
PS C:\> $process = Get-Process
PS C:\> Write-DbaDbTableData -InputObject $process -SqlInstance sql2014 -Table "[[DbName]]].[Schema.With.Dots].[`"[Process]]`"]" -AutoCreateTable
Creates a table based on the Process object, which has over 60 columns, converted from PowerShell data types to SQL Server data types. After the table is created, a bulk insert is performed to add the process information to the table.
Writes the results of Get-Process to a table named [Process] in a schema named Schema.With.Dots in a database named [DbName].
The table, schema, and database names must each be wrapped in square brackets [ ].
Special characters like " must be escaped with a ` character, and any literal ] character must be escaped by doubling it.
This is an example of the type conversion in action. All process properties are converted, including special types like TimeSpan. Script properties are resolved before the type conversion starts, thanks to ConvertTo-DbaDataTable.
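If you want to control that conversion step yourself, you can run ConvertTo-DbaDataTable explicitly before the import. A minimal sketch, with placeholder instance and table names:
PS C:\> $dataTable = Get-Process | ConvertTo-DbaDataTable
PS C:\> Write-DbaDbTableData -SqlInstance sql2014 -InputObject $dataTable -Database tempdb -Table ProcessSnapshot -AutoCreateTable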
PS C:\> $server = Connect-DbaInstance -SqlInstance SRV1
PS C:\> $server.Invoke("CREATE TABLE tempdb.dbo.test (col1 INT, col2 VARCHAR(100))")
PS C:\> $data = Invoke-DbaQuery -SqlInstance $server -Query "SELECT 123 AS value1, 'Hello world' AS value2" -As DataSet
PS C:\> $data | Write-DbaDbTableData -SqlInstance $server -Table 'tempdb.dbo.test' -ColumnMap @{ value1 = 'col1' ; value2 = 'col2' }
The dataset column 'value1' is inserted into SQL column 'col1' and dataset column 'value2' is inserted into SQL column 'col2'. All other columns are ignored and will therefore contain NULL or their default values.
-SqlInstance
The target SQL Server instance or instances.
Alias | |
Required | True |
Pipeline | false |
Default Value |
-InputObject
Accepts various data formats including DataTable, DataSet, CSV files, or PowerShell objects for bulk insertion.
Use DataSet for optimal performance as all records import in a single SqlBulkCopy call. DataTable also performs well but avoid piping directly as it converts to slower DataRow processing.
PowerShell objects are automatically converted to DataTable format before import.
Alias | DataTable |
Required | True |
Pipeline | true (ByValue) |
Default Value |
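As a sketch of the faster pattern (the file, instance, and table names are placeholders), convert first and pass the result through the parameter rather than the pipeline:
PS C:\> $dataTable = Import-Csv C:\temp\customers.csv | ConvertTo-DbaDataTable
PS C:\> Write-DbaDbTableData -SqlInstance sql2014 -InputObject $dataTable -Database mydb -Table customers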
-Table
Specifies the destination table using one-, two-, or three-part naming (database.schema.table). Supports temp tables with the # prefix.
Wrap names containing special characters in square brackets, for example [Schema.With.Dots].[TableName]; a literal ] inside brackets must be doubled. Three-part names override the -Database parameter.
Combine with -AutoCreateTable to create missing tables, though manual table creation provides better data type control.
Alias | |
Required | True |
Pipeline | false |
Default Value |
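For instance, assuming $dataTable already holds the rows (all names below are placeholders), a three-part name and a temp table import might look like:
PS C:\> Write-DbaDbTableData -SqlInstance sql2014 -InputObject $dataTable -Table mydb.dbo.customers
PS C:\> Write-DbaDbTableData -SqlInstance sql2014 -InputObject $dataTable -Database tempdb -Table '#staging' -AutoCreateTable
In the first call, the mydb portion of the three-part name takes precedence over any -Database value.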
-SqlCredential
Login to the target instance using alternative credentials. Accepts PowerShell credentials (Get-Credential).
Windows Authentication, SQL Server Authentication, Active Directory - Password, and Active Directory - Integrated are all supported.
For MFA support, please use Connect-DbaInstance.
Alias | |
Required | False |
Pipeline | false |
Default Value |
-Database
Specifies the target database for the bulk insert operation. Required when using one- or two-part table names.
Use this when you need to target a specific database different from the default connection database.
Alias | |
Required | False |
Pipeline | false |
Default Value |
-Schema
Sets the schema for the destination table when not specified in the table name. Defaults to 'dbo'.
Use this when working with non-default schemas or when security policies require specific schema targeting.
Alias | |
Required | False |
Pipeline | false |
Default Value | dbo |
-BatchSize
Controls how many rows are sent to SQL Server in each batch operation. Defaults to 50,000 rows.
Lower values (5,000-10,000) work better for wide tables or limited memory, while higher values improve performance for narrow tables with sufficient resources.
Alias | |
Required | False |
Pipeline | false |
Default Value | 50000 |
-NotifyAfter
Determines how frequently progress notifications appear during the import operation. Defaults to every 5,000 rows.
Set higher for less frequent updates on large imports, or lower for more granular progress tracking on smaller datasets.
Alias | |
Required | False |
Pipeline | false |
Default Value | 5000 |
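As an illustrative tuning sketch combining -BatchSize and -NotifyAfter (the values shown are starting points, not recommendations for every workload):
PS C:\> Write-DbaDbTableData -SqlInstance sql2014 -InputObject $dataTable -Database mydb -Table customers -BatchSize 10000 -NotifyAfter 50000
This sends 10,000 rows per batch and reports progress every 50,000 rows, which keeps progress output quiet on multi-million-row loads.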
-AutoCreateTable
Automatically creates the destination table when it doesn't exist, using data types inferred from the source data.
Convenient for quick imports but creates generic data types like NVARCHAR(MAX). For production use, manually create tables with appropriate data types and constraints.
Alias | |
Required | False |
Pipeline | false |
Default Value | False |
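A sketch of the production-friendly alternative, creating the table yourself before the import (the table definition is illustrative):
PS C:\> Invoke-DbaQuery -SqlInstance sql2014 -Database mydb -Query "CREATE TABLE dbo.customers (CustomerId INT NOT NULL, Name NVARCHAR(200), SignupDate DATE)"
PS C:\> Write-DbaDbTableData -SqlInstance sql2014 -InputObject $dataTable -Database mydb -Table customers
This gives you exact data types, nullability, and constraints instead of the NVARCHAR(MAX) columns -AutoCreateTable would generate.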
-NoTableLock
Disables the default TABLOCK hint during bulk insert operations, allowing concurrent access to the destination table.
Use when importing to tables that need concurrent read access, though this may reduce import performance compared to the default exclusive lock.
Alias | |
Required | False |
Pipeline | false |
Default Value | False |
-CheckConstraints
Enforces check constraints during the bulk insert operation instead of the default behavior of bypassing them.
Use when data integrity validation is critical, though this reduces import performance. Constraints are normally checked after bulk operations complete.
Alias | |
Required | False |
Pipeline | false |
Default Value | False |
-FireTriggers
Executes INSERT triggers during the bulk copy operation instead of bypassing them for performance.
Essential when triggers maintain audit trails, calculated fields, or related table updates. Significantly impacts import speed but preserves all database logic.
Alias | |
Required | False |
Pipeline | false |
Default Value | False |
-KeepIdentity
Preserves identity column values from the source data instead of generating new sequential values.
Critical for maintaining referential integrity when importing related tables or restoring data with existing identity dependencies.
Alias | |
Required | False |
Pipeline | false |
Default Value | False |
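For example, to carry existing identity values across when copying a table between databases (all names are placeholders):
PS C:\> $rows = Invoke-DbaQuery -SqlInstance sql2014 -Database sourcedb -Query "SELECT CustomerId, Name FROM dbo.customers" -As DataTable
PS C:\> Write-DbaDbTableData -SqlInstance sql2014 -InputObject $rows -Database targetdb -Table customers -KeepIdentity
Without -KeepIdentity, the destination would assign fresh identity values and any foreign keys pointing at CustomerId would break.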
-KeepNulls
Maintains NULL values from source data instead of replacing them with column default values.
Use when NULL has specific business meaning in your data or when you need to preserve exact source data representation including missing values.
Alias | |
Required | False |
Pipeline | false |
Default Value | False |
-Truncate
Removes all existing data from the destination table before performing the bulk insert operation.
Useful for refreshing tables with new data while maintaining table structure, indexes, and permissions. Always prompts for confirmation before execution.
Alias | |
Required | False |
Pipeline | false |
Default Value | False |
-BulkCopyTimeOut
Sets the maximum time in seconds to wait for the bulk copy operation to complete. Defaults to 5,000 seconds.
Increase for very large datasets or slow storage systems. Set to 0 for unlimited timeout when importing millions of rows.
Alias | |
Required | False |
Pipeline | false |
Default Value | 5000 |
-ColumnMap
Defines custom mapping between source and destination columns using a hashtable when automatic column mapping fails.
Use when column names differ between source and target, or when you need to import only specific columns. Format: @{ 'SourceColumn' = 'DestinationColumn' }.
Alias | |
Required | False |
Pipeline | false |
Default Value |
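For instance, if the source objects expose FirstName and LastName but the destination columns are fname and lname (hypothetical names):
PS C:\> Write-DbaDbTableData -SqlInstance sql2014 -InputObject $dataTable -Database mydb -Table customers -ColumnMap @{ FirstName = 'fname'; LastName = 'lname' }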
-EnableException
By default, when something goes wrong we try to catch it, interpret it and give you a friendly warning message.
This avoids overwhelming you with "sea of red" exceptions, but is inconvenient because it basically disables advanced scripting.
Using this switch turns this "nice by default" feature off and enables you to catch exceptions with your own try/catch.
Alias | |
Required | False |
Pipeline | false |
Default Value | False |
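A minimal try/catch sketch using this switch (instance and table names are placeholders):
PS C:\> try { Write-DbaDbTableData -SqlInstance sql2014 -InputObject $dataTable -Database mydb -Table customers -EnableException } catch { Write-Warning "Import failed: $_" }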
-UseDynamicStringLength
Creates string columns with lengths based on source data MaxLength property instead of defaulting to NVARCHAR(MAX).
Improves storage efficiency and query performance when AutoCreateTable is used, but requires source data to provide accurate length information.
Alias | |
Required | False |
Pipeline | false |
Default Value | False |
-WhatIf
If this switch is enabled, no actions are performed but informational messages will be displayed that explain what would happen if the command were to run.
Alias | wi |
Required | False |
Pipeline | false |
Default Value |
-Confirm
If this switch is enabled, you will be prompted for confirmation before executing any operations that change state.
Alias | cf |
Required | False |
Pipeline | false |
Default Value |