FileHash generator to match Silverlight manifest

The above is the source of the code below. I needed it to check that the hash generated in a Silverlight manifest was indeed correct. One of my customers was having trouble installing on a 64-bit Windows 7 system, although the same build worked fine elsewhere. In the end we used a different whitelabel build (same code, different graphics and therefore different checksums).

using System;
using System.IO;
using System.Security.Cryptography;

namespace FileHashSample
{
    public class FileHash
    {
        public string ComputeHash(string filePath)
        {
            string filePathNormalized = Path.GetFullPath(filePath);
            using (SHA1 sha = new SHA1Managed())
            using (FileStream fs = new FileStream(filePathNormalized, FileMode.Open, FileAccess.Read))
            {
                byte[] byteHash = sha.ComputeHash(fs);
                return Convert.ToBase64String(byteHash);
            }
        }

        public static void Main(string[] args)
        {
            if (args.Length == 0)
            {
                Console.WriteLine("Please enter a file path");
                return;
            }
            string filePath = Path.GetFullPath(args[0]);
            FileHash objFileHash = new FileHash();
            Console.WriteLine("File path is {0}", filePath);
            Console.WriteLine("File hash is {0}", objFileHash.ComputeHash(filePath));
        }
    }
}
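The manifest hash produced above is just the SHA-1 digest of the file, Base64-encoded. As a cross-check (a sketch, assuming openssl and base64 are on the path; sample.bin is a throwaway file created for the demonstration):

```shell
# Make a small file to hash
printf 'hello' > sample.bin
# SHA-1 digest, Base64-encoded -- the same value the C# ComputeHash returns
openssl dgst -sha1 -binary sample.bin | base64
# prints: qvTGHdzF6KLavt4PO0gs2a6pQ00=
```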

How to simulate a tfs build locally

When TFS is used as a build server it builds Visual Studio projects differently from the way the IDE does.
Typically it will dump all of the build artifacts into a single folder (which breaks almost any post-build script).
However, if you are building web applications it has a clever option that produces a useful web application folder with everything that needs to be deployed.
This happens for any web project where the output directory is redirected.

msbuild Whatever.sln /p:Configuration=Release /p:OutDir=c:\builds\

This is useful when you have a web site project in Visual Studio 2010 (say a Silverlight web project) that you would like to deploy, but don't have enough space on the build server to install the right tools.
This is quite powerful, but it would be even more useful if the documentation for it were more obvious.
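A sketch of the resulting layout (Whatever.sln and c:\builds are placeholders; _PublishedWebsites is the folder name MSBuild conventionally uses when OutDir is redirected):

```shell
msbuild Whatever.sln /p:Configuration=Release /p:OutDir=c:\builds\
# Ordinary projects drop their artifacts directly into c:\builds\
# Web application projects are additionally laid out, ready to deploy, under:
#   c:\builds\_PublishedWebsites\<ProjectName>\
```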

Simple Powershell database script

I know that SQL Server 2008 has a PowerShell cmdlet (Invoke-Sqlcmd) that does something similar to this.
However, this works without any additional code.
It also works really well in a psake script.
This is a SQL script runner that treats each file as a single group of commands, executing each GO-delimited batch in turn.
$a = New-Module -ScriptBlock {
    function Exec-Sql
    {
        param([System.String] $filename, [System.String] $connectionString)
        if (![System.IO.File]::Exists($filename))
        {
            Write-Host "No such file as $filename"
        }
        else
        {
            Write-Host $filename
            $connection = New-Object System.Data.SqlClient.SqlConnection -ArgumentList $connectionString
            $connection.Open()
            $command = ""
            foreach ($line in [System.IO.File]::ReadAllLines($filename))
            {
                if ($line.Trim().ToLower() -eq "go")
                {
                    # A GO line ends the current batch: execute it and start a new one
                    $cmd = $connection.CreateCommand()
                    $cmd.CommandText = $command
                    $null = $cmd.ExecuteNonQuery()
                    $command = ""
                }
                else
                {
                    $command += "`n$line"
                }
            }
            if ($command -ne "")
            {
                # Execute any trailing batch not terminated by GO
                $cmd = $connection.CreateCommand()
                $cmd.CommandText = $command
                $null = $cmd.ExecuteNonQuery()
            }
            $connection.Close()
        }
    }
}

Retro Tools Still Useful

Of late I have been finding that I am using the older Unix tools more and more frequently (on Windows...). They seem to solve the problem presented in a minimal fashion, without artificial limits or attempts to be too helpful.

For example, there is awk. It solves the same kind of parsing problems that Excel does with its text-to-columns option. awk is designed to operate on tabular data and perform repetitive operations on it.

This can effectively provide queries and transforms of, say, a CSV file in a similar manner to what XSLT provides for XML.

gawk -F": " "{print $2}" sort2.txt

The above takes a file whose fields are separated by a colon and a space, and prints the second column.
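A minimal runnable sketch of the same idea (sample.txt and its contents are invented here):

```shell
# Create a small two-line sample file with ": "-separated fields
printf 'first: alpha\nsecond: beta\n' > sample.txt
# Split each line on ": " and print the second field
awk -F': ' '{print $2}' sample.txt
# prints:
# alpha
# beta
```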

I have also been experimenting with the vim editor.

I started using vim a while ago when my Eee PC Linux netbook did not have Emacs installed (and I had no internet connection with which to get it).

I found it very useful when I was presented with a 2MB XML document that I needed to inspect. Opening an XML document in VS2010 is fine provided the document is not both large and on a single line. The VS2010 editor tried to be clever and do graphical things with the document, but this took all of the processing power of the machine (which is itself impressive, since this was my work quad-core desktop). Vim, however, opened the file painlessly and permitted searches, which is all I really needed. I even found a vim script that inserts a newline after a given pattern, which allowed me to turn the monster XML into something that VS2010 could look at without hanging.

Subsequently I have been combining vim and PowerShell. PowerShell can be started in a UNC path, and vim can be started from within PowerShell. This combination overcomes the historical DOS limitation of forcing you to map a drive letter if you want a command prompt.

This may look like Unix but is actually PowerShell:

ls -fil NLog.dll -r | % { $_.VersionInfo } | less

It lists the name and file version of a given file across a range of directories.
I had found that we had an eclectic mix of file versions.
Here is a useful article on why vim is so powerful.
VS2010 has a great free add-on called VsVim. This is a partial (yet useful) implementation of vim within the VS editor. [You can switch it off if it becomes a problem or a non-vim user wants to use your machine.] It can speed up certain kinds of editing operations once you remember the vim editor commands.