
I have a SQL Server table with 2 GB of data. I can generate a script file for it, but the script file is too large to open. I want to split that script file into multiple smaller script files.


closed as unclear what you're asking by Mark Storey-Smith, Paul White, Kin, bluefeet, RolandoMySQLDBA Jan 12 at 1:23


You need to add more detail about what you are attempting to do. Are you getting an error message? Are you attempting to script the data or the table design? What have you tried? –  Max Vernon Jan 10 at 6:15
Are you trying to open the script file with Management Studio? –  vonPryz Jan 10 at 10:55
See if this answer helps you. –  Kin Jan 10 at 14:25

2 Answers

As @vonPryz said, my guess is that you are trying to open a huge .sql file in SSMS. The files SSMS generates are not the best in the world; they are quite verbose by default, so they grow large quickly.

As you have most likely seen already, SSMS does not handle big files well: it fails to parse them properly. I would suggest not generating script files for your data at all, since this is not the equivalent of a MySQL dump. Use full backups or, if you really want only one table, use bcp exports.
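For the full-backup route, a backup and restore can be driven from sqlcmd. A minimal sketch, assuming Windows authentication; the server, database, and path names here are placeholders for your environment:

```shell
# Back up the whole database on the source server (illustrative names)
sqlcmd -S myServer\myInstance -E -Q "BACKUP DATABASE MyDatabase TO DISK = 'C:\temp\MyDatabase.bak' WITH INIT"

# Restore it on the target server from the copied file
sqlcmd -S otherServer\otherInstance -E -Q "RESTORE DATABASE MyDatabase FROM DISK = 'C:\temp\MyDatabase.bak'"
```

Depending on the target server's file layout, the RESTORE may also need a WITH MOVE clause to relocate the data and log files.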

Here's the link to the documentation for the bcp utility. It is easy to use, and you should have no trouble importing the data on another server.
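For a single table, a bcp export and import might look like the following. This is a sketch: the server, database, table, and file names are hypothetical, and the flags assume Windows authentication (-T) and native format (-n):

```shell
# Export the table's data in native format from the source server
bcp MyDatabase.dbo.MyBigTable out C:\temp\MyBigTable.dat -S myServer\myInstance -T -n

# Import the file on the target server; the table must already exist there
bcp MyDatabase.dbo.MyBigTable in C:\temp\MyBigTable.dat -S otherServer\otherInstance -T -n -b 10000
```

The -b flag on the import commits rows in batches (10,000 here), which keeps the transaction log manageable for a 2 GB table.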


Some time ago I wrote a PowerShell script that splits a huge SQL script into more manageable chunks. (The reason was to export a whole database from a 2008 R2 instance to a 2008 system in a different organization.)

The smaller files can be executed with Management Studio or by running sqlcmd -i. The script is below. It uses a limit of 25,000 rows per chunk, which in my test case produced .sql files of about 6 MB each. Adjust $MAX to suit your particular needs.

$src = "c:\temp\myBigScript.sql"
$dst = "c:\temp\mySmallScript-{0}.sql"
$reader = new-object IO.StreamReader($src)
$count = 1
$filename = ($dst -f ($count)) 

$row = 0
$MAX = 25000

$sb = new-object Text.StringBuilder
# Read the source row by row into a buffer
while(($line = $reader.ReadLine()) -ne $null){
    $row++
    [void]$sb.Append($line)
    [void]$sb.Append([Environment]::NewLine)

    # Till $MAX rows are read
    if($row -ge $MAX){
        # Write the buffer contents into a file and reset the buffer
        add-content -path $filename -value $($sb.ToString())
        $sb = new-object Text.StringBuilder
        ++$count
        $filename = ($dst -f ($count))
        $row = 0
    }
}
# Flush any remaining rows that did not fill a complete chunk
if($sb.Length -gt 0){
    add-content -path $filename -value $sb.ToString()
}
$reader.Close()

As for running the files against the target server, a for /l loop in a cmd session works well. Change the loop's upper limit from 8 to the actual number of files.

for /l %i in (1 1 8) do sqlcmd -S myServer\myInstance -E -i mySmallScript-%i.sql
