Batch file to compress and copy a folder

Recently, I needed to compress all contents of a folder (including subfolders) to an archive file, and copy it to a remote (network) location, all from a command line. In other words, I wanted to do this:

compressAndCopyFolder <sourceFolder> <destinationFolder> <archiveFileName>

Here’s how each parameter would work:

  • sourceFolder is the folder, along with all subfolders, to be added to the archive.
  • destinationFolder is the folder where the archive would be created.
  • archiveFileName is the file name of the archive.

I prefer 7zip for archiving, so I could have done this simply using one command:

"7z.exe" a -r "%destinationFolder%\%archiveFileName%.7z" "%sourceFolder%\*"

The problem with this is that it is inefficient to work with a large archive file over the network — there’s a lot of chatter going on to add files to an archive file. Much faster to create the archive locally, then copy the final archive file to the network. So, instead of one command, we have two:

@echo off
if "%~3"=="" (
	echo usage: 7zfolder ^<sourcefolder^> ^<destinationfolder^> ^<destinationfilename^>
	echo Note: The suffix .7z will be added at the end of destinationFilename.
	goto :eof
)
if not exist "%~1" (
	echo ERROR: Source folder does not exist: %1
	goto :eof
)
if not exist "%~2" (
	echo ERROR: Destination folder does not exist: %2
	goto :eof
)
if exist "%~2\%~3.7z" (
	echo ERROR: Destination file already exists in destination folder: %2\%3.7z
	goto :eof
)

"c:\Program Files\7-Zip\7z.exe" a -r "%temp%\%~3.7z" "%~1\*"
copy "%temp%\%~3.7z" "%~2\%~3.7z" /z

Now, from a command line, I just do this…

7zfolder c:\mysource \\myserver\mydest archive

… and all files in c:\mysource will be added to an archive \\myserver\mydest\archive.7z.

Don’t you just love batch files?

Separating SQL script files generated by Microsoft SQL (by type)

There’s one great feature in SQL: the “Generate Scripts” command. Unfortunately, it has one limitation: the default filenames of scripts look something like this:

dbo.MyTable.Table.sql
dbo.MyProcedure.StoredProcedure.sql

I’d much prefer the filenames to match the object name, without the owner (‘dbo’) or object type. In other words, I’d prefer the above two files to look like this:

MyTable.sql
MyProcedure.sql
How do we get from point A to point B without a lot of manual file copies and renames? We use the FOR command!

First, create a subdirectory for each object type (Table, StoredProcedure, UserDefinedFunction, View, Schema, Trigger, and User), then run the following from a command prompt.

for %i in (*.User.sql) do for /f "tokens=1-3 delims=." %j in ("%i") do move %i %k\%j.%l
for %i in (*.Schema.sql) do for /f "tokens=1-3 delims=." %j in ("%i") do move %i %k\%j.%l
for %i in (*.Trigger.sql) do for /f "tokens=1-3 delims=." %j in ("%i") do move %i %k\%j.%l
for %i in (*.sql) do for /f "tokens=1-4 delims=." %j in ("%i") do move %i %l\%k.%m

The commands above look for each file with a *.sql extension in the current directory. Each file is moved to a directory named after its object type, and renamed so it contains only the object name and the .sql extension. So much cleaner now!

Note that this will not work if any folder in the full path to the SQL files has a period in it!

Updated 2009.11.10 to support Trigger, User, and Schema scripts.

Automatically generate (partial) XML format files for BCP

I’ve been working with a lot of raw data files lately — data files that come in some delimited format which I must move into a SQL database. Most of the files are CSV files, and some of them have over 100 columns.

Using BCP to import CSV files into SQL is a pleasure. The only annoyance is writing those XML format files. Actually, the annoyance is not writing them, it’s going through the CSV files (which, of course, are undocumented) to determine what delimiters are used in each row.

Here’s a sample of what I mean — the first line of one of the files:

a,b,c,"d","e",f,g,h
Fortunately, most CSV files are consistent in their use of quotes, but going through dozens of columns to determine the terminators is a pain. The terminators in the above example aren’t just commas; they could also be quotes. Column three, for example, ends with a comma followed by a quote.

To generate the <record> section of my format files, I wrote the following script, which reads the first line of a text file, finds each comma, determines if it is preceded or followed by a quote, and generates the appropriate XML.

//get filename
var args = WScript.Arguments;
if (args.length == 0)
{
	WScript.Echo('Usage: cscript getdelims.js <filename>');
	WScript.Quit(1);
}

var filename = args(0);

//read the first line of the file
var fso = new ActiveXObject("Scripting.FileSystemObject");
var file = fso.OpenTextFile(filename, 1);
var contents = file.ReadLine();
file.Close();
file = null;
fso = null;

//find each comma, and check whether a quote precedes or follows it
var cnt = 0;
for (var i = 1; i < contents.length; i++)
{
	if ( contents.substr(i,1) != ',' ) continue;
	cnt++;
	var delim = ",";
	if ( contents.substr(i-1,1) == '"' )
		delim = '&quot;,';
	if ( i+1 < contents.length && contents.substr(i+1,1) == '"' )
		delim += '&quot;';
	WScript.Echo('\t<FIELD ID="' + cnt + '" xsi:type="CharTerm" TERMINATOR="' + delim + '" />');
}

The output can be copy/pasted right into your format file. The example content above would generate the following.

        <FIELD ID="1" xsi:type="CharTerm" TERMINATOR="," />
        <FIELD ID="2" xsi:type="CharTerm" TERMINATOR="," />
        <FIELD ID="3" xsi:type="CharTerm" TERMINATOR=",&quot;" />
        <FIELD ID="4" xsi:type="CharTerm" TERMINATOR="&quot;,&quot;" />
        <FIELD ID="5" xsi:type="CharTerm" TERMINATOR="&quot;," />
        <FIELD ID="6" xsi:type="CharTerm" TERMINATOR="," />
        <FIELD ID="7" xsi:type="CharTerm" TERMINATOR="," />

A few minutes writing a script, and I won’t be looking at CSV files with too many columns ever again (at least, not for the reason of writing XML format files).
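For what it’s worth, the same scan is easy to express in Python if Windows Script Host isn’t handy. This is a sketch of the same logic, not the script I actually use; the function name and sample line are mine.

```python
def field_terminators(line):
    """Return the BCP terminator for each comma in the line,
    XML-escaping any quote that sits just before or after it."""
    terms = []
    for i, ch in enumerate(line):
        if ch != ',':
            continue
        delim = ','
        if i > 0 and line[i - 1] == '"':
            delim = '&quot;,'
        if i + 1 < len(line) and line[i + 1] == '"':
            delim += '&quot;'
        terms.append(delim)
    return terms

# Emit the <FIELD> elements for a sample first line of a CSV file
first = 'a,b,c,"d","e",f,g,h'
for n, t in enumerate(field_terminators(first), start=1):
    print('\t<FIELD ID="%d" xsi:type="CharTerm" TERMINATOR="%s" />' % (n, t))
```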

Scripting SQL database builds

A project I’m working on requires importing data from hundreds of megabytes of delimited text files into a staging database, then converting and migrating the data into an operations database. (Forgive the naming conventions. The “staging” database is used as an intermediary between the text files and the final database. The “operations” database is the one the application uses.)

What’s the best way to automate this? Ideally, I need to be able to re-run the import at any time, preferably with a single command. The process should purge the staging database, rebuild tables (since schemas may change), reload all data, and run the necessary migration scripts. Since the final product is merely a web interface to the imported data, there is no harm in wiping and recreating everything from scratch, as there is no “user data” of note. (Actually there is, but it is isolated from this process.)

The solution I came up with used a series of SQL scripts executed using sqlcmd (part of MS SQL 2005; use osql for older versions) and data import tasks using bcp (part of MS SQL), all tied together using batch files.
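The driver itself is mostly a matter of running scripts and loads in a fixed order. A rough sketch of the command sequence it produces (script, server, and table names here are made up, not from the actual project): one sqlcmd call per SQL script, then one bcp call per data file.

```python
def build_commands(sql_scripts, data_files, server='localhost', db='Staging'):
    """Return the ordered command lines for one full rebuild:
    run each SQL script with sqlcmd, then bulk-load each file with bcp."""
    cmds = ['sqlcmd -S %s -d %s -i %s' % (server, db, s) for s in sql_scripts]
    for table, path in data_files:
        # -T trusted connection, -c character-mode data
        cmds.append('bcp %s.dbo.%s in %s -S %s -T -c' % (db, table, path, server))
    return cmds

# Print the rebuild sequence for two scripts and one data file
for cmd in build_commands(['purge.sql', 'tables.sql'],
                          [('Customers', 'customers.txt')]):
    print(cmd)
```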

Toggling your HOSTS file

I use my laptop (a highly-recommended Toshiba Portege M100) for most of my development work. Typically, I develop on my laptop (a sandbox environment), then push data to another server (either a staging or production server). One of the problems I run into is my ever-changing physical location. When I’m at home, certain Web sites need internal IP addresses; at other places, they need external IP addresses.

The solution is to toggle the use of your HOSTS file. I entered all the static entries into my HOSTS file, and created a small batch file to “toggle” the HOSTS file on and off. (When the HOSTS file exists, it is used, so renaming it to something other than HOSTS – with no file extension – turns it “off”.)

The batch file for toggling the HOSTS file is:

@echo off
if exist c:\windows\system32\drivers\etc\hosts goto :hosts_off
ren c:\windows\system32\drivers\etc\hosts.toggle hosts
goto :end
:hosts_off
ren c:\windows\system32\drivers\etc\hosts hosts.toggle
:end
ipconfig /flushdns

The batch file simply changes the file from hosts to hosts.toggle and back again. The command ipconfig /flushdns is executed at the end to ensure the DNS cache is flushed.