Hi there

I have written a PS script to export a batch of tables listed in a .txt file to CSV:

$server = "XXX"
$database = "XXX"
$TableFile = 'XXX.txt'

#Declare Connection Variables
$connectionTemplate = "Data Source=$server;Integrated Security=SSPI;Initial Catalog=$database;"
$connectionString = [string]::Format($connectionTemplate, $server, $database)
$connection = New-Object System.Data.SqlClient.SqlConnection
$connection.ConnectionString = $connectionString
$command = New-Object System.Data.SqlClient.SqlCommand

# Loop through all tables and export a CSV of the Table Data
Get-Content $TableFile | ForEach-Object {
    $queryData = "SELECT * FROM XXX.[$_] WITH (NOLOCK)"

    #Specify the output location of your dump file
    $extractFile = "XXX\$_.csv"

    $command.CommandText = $queryData
    $command.Connection = $connection

    $SqlAdapter = New-Object System.Data.SqlClient.SqlDataAdapter
    $SqlAdapter.SelectCommand = $command
    $DataSet = New-Object System.Data.DataSet
    $SqlAdapter.Fill($DataSet)
    $connection.Close()

    $DataSet.Tables[0] | Export-Csv $extractFile -NoTypeInformation -Delimiter "|"
    #write-output $queryData
}

This works great until the table is more than a few GB - a 10GB table can take a day to export.

Is there anything I can change to significantly reduce this, such as exporting in batches, using a DataStream, etc.?

Thanks in advance
Steve
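Not part of the original post, but a likely cause of the slowdown is that SqlDataAdapter.Fill buffers the entire table into an in-memory DataSet before anything is written. A minimal sketch of the streaming alternative the question hints at, using SqlDataReader to write rows as they arrive: the connection string, schema, table name, and output path below are placeholders, not values from the original script.

```powershell
# Sketch: stream rows with SqlDataReader instead of buffering a whole DataSet.
# All names below (server, database, table, path) are hypothetical placeholders.
Add-Type -AssemblyName System.Data

$connectionString = "Data Source=XXX;Integrated Security=SSPI;Initial Catalog=XXX;"
$table       = "MyTable"                 # hypothetical table name
$extractFile = "C:\export\$table.csv"    # hypothetical output path

$connection = New-Object System.Data.SqlClient.SqlConnection $connectionString
$command = $connection.CreateCommand()
$command.CommandText = "SELECT * FROM dbo.[$table] WITH (NOLOCK)"
$command.CommandTimeout = 0              # no timeout for long-running reads

$connection.Open()
$reader = $command.ExecuteReader()
# Large write buffer (1 MB) so disk writes happen in big chunks
$writer = New-Object System.IO.StreamWriter($extractFile, $false, [System.Text.Encoding]::UTF8, 1MB)
try {
    # Header row from the reader's column names
    $cols = 0..($reader.FieldCount - 1) | ForEach-Object { $reader.GetName($_) }
    $writer.WriteLine(($cols -join '|'))

    # Data rows, written one at a time as they stream from the server;
    # memory use stays flat regardless of table size
    $values = New-Object object[] $reader.FieldCount
    while ($reader.Read()) {
        [void]$reader.GetValues($values)
        $writer.WriteLine(($values -join '|'))
    }
}
finally {
    $writer.Dispose()
    $reader.Dispose()
    $connection.Close()
}
```

Note this sketch does not quote or escape field values the way Export-Csv does, so it assumes the data contains no embedded pipe characters or newlines; add quoting if that assumption does not hold. For raw throughput, SQL Server's bcp utility (with queryout) is also worth benchmarking against any PowerShell approach.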