The behavior of FlagsAttribute is probably not what you expect

Let’s create another enum:

enum Foo
{
    A,
    B,
    C,
    D
}

You add the FlagsAttribute:

[FlagsAttribute]
enum Foo
{
    A,
    B,
    C,
    D
}

This means you want to use the enum as flags, so you can combine the members. For example:

Foo foo = Foo.B | Foo.C | Foo.D;

Later, you pass this value on, and you want to test for the presence of Foo.A:

// foo is the same foo as previous 
var hasA = (foo & Foo.A) == Foo.A;

Console.WriteLine("hasA: {0}", hasA);

You would expect hasA to be false. Is it? It's not:

hasA: True

How come? Applying the FlagsAttribute doesn't actually DO anything to the values the compiler generates for your enum members.
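To see why, here's a sketch of what the first enum really amounts to, with the values the compiler assigns implicitly:

enum Foo
{
    A,  // 0
    B,  // 1
    C,  // 2
    D   // 3 -- note: 3 is B | C, not a distinct bit!
}

// foo = Foo.B | Foo.C | Foo.D  =  1 | 2 | 3  =  3
// foo & Foo.A                  =  3 & 0      =  0
// and 0 == Foo.A, so hasA is true for ANY value of foo.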

As per the documentation, you still need to assign the values yourself:

Define enumeration constants in powers of two, that is, 1, 2, 4, 8, and so on. This means the individual flags in combined enumeration constants do not overlap.

So we update our enum:

[FlagsAttribute]
enum Foo
{
    A = 1,
    B = 2,
    C = 4,
    D = 8
}

and then we test our code again:

Foo foo = Foo.B | Foo.C | Foo.D;
var hasA = (foo & Foo.A) == Foo.A;

Console.WriteLine("hasA: {0}", hasA);

And the result is:

hasA: False

Success!

Hope you have a good one,

-Kristof

PS: please note that I should have added a None enum member, as per the documentation:

Use None as the name of the flag enumerated constant whose value is zero. You cannot use the None enumerated constant in a bitwise AND operation to test for a flag because the result is always zero.
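For completeness, here's a small sketch (reusing the member names from this post) of what that would look like, and of what None is and isn't good for:

[FlagsAttribute]
enum Foo
{
    None = 0,
    A = 1,
    B = 2,
    C = 4,
    D = 8
}

// Testing for a flag still uses a bitwise AND against that flag:
Foo foo = Foo.B | Foo.C;
bool hasB = (foo & Foo.B) == Foo.B;   // true

// None is only useful for an equality check ("no flags set at all"),
// because foo & Foo.None is always zero:
bool isEmpty = foo == Foo.None;       // false here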

Foreach now captures variables! (Access to modified closure)

Foreach has changed in C# 5.0!

Consider the following piece of code in C# < 5.0:

public class Test
{
    public static void Main()
    {
        var words = new[] { "foo", "bar", "baz", "beer" };
        var actions = new List<Action>();
        foreach (string word in words)
        {
            actions.Add(() => Console.WriteLine(word));
        }

        actions.ForEach(e => e());
    }
}

What will this print?

Some of you will recognize the warning that ReSharper shows on line 9 (the actions.Add call):

Access to foreach variable in closure. May have different behaviour when compiled with different versions of compiler

Notice the second sentence and remember this warning; we'll get back to it!

Now go ahead, try and run this in Visual Studio 2010. This will be your result:

beer beer beer beer

While I do love beer, this is not what I expect.

So how do we fix it? Well, either let ReSharper fix it (Alt+Enter -> Enter), or manually capture the current word in a separate variable:

public class Test
{
    public static void Main()
    {
        var words = new[] { "foo", "bar", "baz", "beer" };
        var actions = new List<Action>();
        foreach (string word in words)
        {
            string temp = word;
            actions.Add(() => Console.WriteLine(temp));
        }

        actions.ForEach(e => e());
    }
}

Problem solved. The code above has identical results in Visual Studio 2012.

However…

Using the first piece of code (without the temp variable) in Visual Studio 2012, the result is as follows:

foo bar baz beer

Wait what?

The compiler has changed (note that even when targeting .NET 3.5, 4, or 4.5, Visual Studio 2012 uses the new C# 5.0 compiler!).

This means that our loop variable word is now logically declared inside each iteration of the foreach loop, instead of once outside it.
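Conceptually, the C# 5.0 expansion of the loop looks something like this (a sketch, not the exact code the compiler emits):

using (var enumerator = ((IEnumerable<string>)words).GetEnumerator())
{
    while (enumerator.MoveNext())
    {
        string word = enumerator.Current; // a fresh 'word' for every iteration in C# 5.0
        actions.Add(() => Console.WriteLine(word));
    }
}
// In C# 4.0 and earlier, 'string word' was declared once, outside the while loop,
// so every lambda captured the same variable.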

This change can be found in the C# 5.0 spec, pages 247-248, which is on your machine when you've installed VS2012 (not Express), in: C:\Program Files (x86)\Microsoft Visual Studio 11.0\VC#\Specifications\1033

If v was declared outside of the while loop, it would be shared among all iterations, and its value after the for loop would be the final value, 13, which is what the invocation of f would print. Instead, because each iteration has its own variable v, the one captured by f in the first iteration will continue to hold the value 7, which is what will be printed. (Note: earlier versions of C# declared v outside of the while loop.)

Note 1: read the file to get the meaning of 'v' and those values (like 13 and 7).
Note 2: I've tweeted a few people to ask for the spec to be put online.

While this is not necessarily a problem for projects moving from 2010 to 2012, it can be an issue when you are round-tripping, for example in mixed teams: developers using 2012 need to keep the old behavior in mind.

Worse, if your build system is not on 2012 yet, its results will differ from what you see locally!

Watch out for this!

Have a good one,

-Kristof

When using an enum in PowerShell, use the member’s name, not the member’s value

Consider the following enum in C#:

enum State
{
    Started,
    Stopped,
    Unknown
}

Note that I have not added an explicit value for the enum members. They will be generated by the compiler. As stated in the C# spec:

… its associated value is set implicitly, as follows:

  • If the enum member is the first enum member declared in the enum type, its associated value is zero.
  • Otherwise, the associated value of the enum member is obtained by increasing the associated value of the textually preceding enum member by one. This increased value must be within the range of values that can be represented by the underlying type, otherwise a compile-time error occurs.

Found at http://www.microsoft.com/en-us/download/details.aspx?id=7029, page 400-401 (I can’t find the version for 4.5 though…).
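Applied to the State enum above, that means the compiler effectively generates:

enum State
{
    Started = 0,  // first member: zero
    Stopped = 1,  // previous + 1
    Unknown = 2   // previous + 1
}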

Now what are the consequences of this? Consider the following piece of PowerShell:

$result = $serviceController.GetServiceStatus()
if($result -eq 1)
{
    $serviceController.StartService()
}

This will work, because PowerShell implicitly converts the int to the actual enum member.

However, since we are assuming the value, things can go wrong. In the next version you add extra members, for example to represent a starting or stopping service:

enum State
{
    Starting,
    Started,
    Stopping,
    Stopped,
    Unknown
}

Since all the values have now shifted, when you run your PowerShell again you end up starting the service when it's already started 😉.

Solution?

First of all (as a consumer), use the enum’s member name instead of its value:

$result = $serviceController.GetServiceStatus()
if($result -eq [MyLib.State]::Stopped)
{
    $serviceController.StartService()
}

This ensures that you compare against the value of Stopped, whatever that value happens to be.

As the developer of a library you should ensure that you never mess up the order of an enum, either by only adding new members at the end, or (preferred) by setting the values yourself:

enum State
{
    Started = 0,
    Stopped = 1,
    Unknown = 2,
}

Which, after adding the new members, becomes:

enum State
{
    Starting = 3,
    Started = 0,
    Stopping = 4,
    Stopped = 1,
    Unknown = 2,
}

And because the values are explicit, you can also freely reorder the members, for example so the numbers read sequentially:

enum State
{
    Started = 0,
    Stopped = 1,
    Unknown = 2,
    Starting = 3,
    Stopping = 4,
}

Hope you have a good one,

-Kristof

ServerConnection and Login failed for user. Reason: Attempting to use an NT account name with SQL Server Authentication

Today I had to work with the ServerConnection class.

This class lets you specify the connection details that the Server class will use.

So usage would be like this:

# http://sqlblog.com/blogs/allen_white/archive/2008/04/28/create-database-from-powershell.aspx
[System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SMO')  
[System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.ConnectionInfo')  

$sqlServer = "server"
$username = "username"
$password= "password"

$serverConnection = new-object Microsoft.SqlServer.Management.Common.ServerConnection($sqlServer, $username, $password)

$server = new-object Microsoft.SqlServer.Management.Smo.Server($serverConnection)

Write-Host ("SQL Version: {0}" -f $server.Information.Version)

$server.databases | % { Write-Host $_.Name }

Now this works for SQL accounts, but not for domain accounts.

My username was in the form of DOMAIN\username, but that failed.

Checking the SQL Server log yielded this:

Login failed for user. Reason: Attempting to use an NT account name with SQL Server Authentication
Login failed for user. Reason: Attempting to use an NT account name with SQL Server Authentication

So to use a domain account with this object you need to create the $serverConnection like this, AND you need to specify your username in the fully qualified form: username@domain.com (UPN). Entering DOMAIN\username doesn't seem to work.

$sqlServer = "server"
$username = "[email protected]"
$password= "password"

$serverConnection = new-object Microsoft.SqlServer.Management.Common.ServerConnection($sqlServer)

$serverConnection.ConnectAsUser = $true
$serverConnection.ConnectAsUsername = $username
$serverConnection.ConnectAsUserPassword = $password

You need to set the ConnectAsUser property to true and supply the domain account through ConnectAsUsername and ConnectAsUserPassword.

When I connect with those options I get the access I need.

You can verify it by executing the following query:

$serverConnection.ExecuteScalar("SELECT SUSER_NAME() as [foo]")

Which nicely yields:

username@domain.com

Have a good one,

-Kristof

TeamCity reporting that it's out of disk space

I installed TeamCity on a server (just the professional edition).

The server was installed under D:\TeamCity, with its (only) agent under D:\TeamCity\buildAgent.

I then started creating a project. At one point I modified the VCS checkout rules, and then things went downhill.

First of all, a VCS root is not under any kind of revision control the way build configurations are, which makes it a pain in the *** to track your changes. Even the checkout rules (which are build-configuration specific) aren't.

Anyway, this was the error:

Free disk space requirement
[18:18:07][Free disk space requirement] Removing files to meet 3.0Gb of free disk space required for directory D:\TeamCity\buildAgent\work\cc94054af2903311 (only 0.0b is free now).
[18:18:07][Free disk space requirement] Removing files to meet 3.0Gb of free disk space required for directory D:\TeamCity\buildAgent\temp (only 12.7Gb is free now).
[18:18:07][Free disk space requirement] Free disk space requirement of 3.0Gb could not be met for directory D:\TeamCity\buildAgent\work\cc94054af2903311 (only 0.0b is free)
[18:18:07][Free disk space requirement] Free disk space requirement of 3.0Gb could not be met for directory D:\TeamCity\buildAgent\temp (only 12.7Gb is free)

Weird errors, because the disk actually had more than 3.0 GB free (not 3.0Gb; that's a typo on their side).

I checked all the disks, thinking maybe it tried to write temp files to C:, but that one had enough space too.

Upon further investigation I saw that I had made a typo in the checkout rules of a particular VCS root, which caused the failure:

I set the rule like this:

+:Foo=Bar

As you can see, it's missing the >.

So the correct one should be:

+:Foo=>Bar

It’s very annoying they don’t have error checking for that.

Have a good one,

-Kristof

ASP.NET MVC Mobile: where is my style?

I was playing with ASP.NET MVC Mobile to start something quickly for my phone.

I then updated the NuGet packages, and my style was GONE.

Seems that jQuery 1.9.* is incompatible with jQuery.Mobile 1.2.

According to the jQuery Mobile website, 1.3 is out, so an updated NuGet package should be available soon. Until then, you can revert from jQuery 1.9.* to 1.8.3 with the following commands in your Package Manager Console:

uninstall-package jQuery -Force # ignore that some have dependencies on jQuery
install-package jQuery -Version 1.8.3

[Screenshot: jQuery downgrade in the Package Manager Console]

Have a good one,

-Kristof

WebClient not sending credentials? Here’s why!

TL;DR version here.
This post applies to more than just GitHub, read the rest to see the behavior!

I was playing with the GitHub API (more specifically generating a new OAuth token).

So what you need to do, as per the documentation, is to post a certain JSON string to https://api.github.com/authorizations, with Basic Authentication. I’m going to use the WebClient class for this.

This is the JSON string that you need to post:

{
  "scopes": [
    "repo"
  ],
  "note": "API test"
}

Now this is the code I used:

var content = new
{
    scopes = new[] { "repo" },
    note = "API test",
};

var webClient = new WebClient
{
    Credentials = new NetworkCredential("*****", "*****"),
};

// JsonConvert is from Newtonsoft.Json, very handy!
string serializedObject = JsonConvert.SerializeObject(content);

string reply = webClient.UploadString(new Uri("https://api.github.com/authorizations"), "POST", serializedObject);

dynamic deserializedReply = JsonConvert.DeserializeObject(reply);

Console.WriteLine(deserializedReply.token);
Console.ReadLine();

However, when using this piece of code I always get a 404 Not Found.

Reading through the GitHub API documentation yields the following:

There are three ways to authenticate through GitHub API v3. Requests that require authentication will return 404, instead of 403, in some places. This is to prevent the accidental leakage of private repositories to unauthorized users.

(emphasis mine)

So there's a good chance that we just hit GitHub's security through obscurity, and that we would normally get a 403.

In fact, I tested it with GitHub Enterprise, and that one just returns a 403; that's how I figured out that the URL I was calling is correct (GitHub Enterprise doesn't hide information the way regular GitHub does):

[Screenshot: 403 Forbidden from a GitHub Enterprise instance]

Let's try the same code on a simple IIS website with Basic Authentication:

[Screenshot: a simple IIS site with Basic Authentication enabled]

I then simplified the code to just download the contents of the website, with a simple GET:

var webClient = new WebClient
{
    Credentials = new NetworkCredential("*****", "*****"),
};

string reply = webClient.DownloadString(new Uri("http://localhost/CredentialTest"));

Console.WriteLine(reply);
Console.ReadLine();

So that gives me the response (Default.aspx contains ‘Hi, it works’).

Now what’s going on? Is it the POST that conflicts?

Using the same code, but instead of DownloadString, I upload an arbitrary piece of text with UploadString, which uses POST by default.

var webClient = new WebClient
{
    Credentials = new NetworkCredential("*****", "*****"),
};

string reply = webClient.UploadString(new Uri("http://localhost/CredentialTest/Default.aspx"), "somerandomstuff");

Console.WriteLine(reply);
Console.ReadLine();

Please note that I POST directly to Default.aspx; IIS doesn't allow POSTing to directories (I'm sure you can enable it).

Anyway, this also works.

Next step? I was wondering whether WebClient only sends the credentials when it detects that the machines are in the same domain or workgroup.

Let’s find out with Fiddler.

I first monitored the flow from the console app to IIS, and I was surprised to see that there were actually two requests. What's even weirder is that the first request doesn't send the credentials (note that I still use POST, to mimic our GitHub code):

[Screenshot: first request to IIS, without an Authorization header]

Instead of returning a 403 on the file, IIS nicely returns a 401 with the WWW-Authenticate header:

[Screenshot: first response, a 401 with the WWW-Authenticate header]

The WebClient is then smart enough to resend the request WITH the credentials:

[Screenshot: second request, this time with the Authorization header]

And then the server nicely responds with a 200 and the contents are sent (the screenshot shows just the headers):

[Screenshot: second response, 200 OK]

This flow is always the same, whether it is a GET or a POST (which is awkward when you want to POST a 2 GB file: you send it all, the server replies 401, and you have to send the 2 GB again…).
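If that double round trip (and double upload) bothers you and the server does send a proper 401 challenge, one option is to subclass WebClient and enable PreAuthenticate on the underlying HttpWebRequest; after the first successful challenge, later requests then send the Authorization header up front. A sketch (note this won't help with GitHub, which never sends the challenge):

using System;
using System.Net;

public class PreAuthenticatingWebClient : WebClient
{
    protected override WebRequest GetWebRequest(Uri address)
    {
        var request = base.GetWebRequest(address);
        var httpRequest = request as HttpWebRequest;
        if (httpRequest != null)
        {
            // After one successful 401/Authorization round trip, credentials are
            // sent proactively on subsequent requests to the same URI space.
            httpRequest.PreAuthenticate = true;
        }
        return request;
    }
}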

Now that we know that our WebClient is behaving correctly, I decided to go and look at the request and response from GitHub:

The first request (like with IIS) doesn't contain the Authorization header:

[Screenshot: first GitHub request, no Authorization header]

However, in contrast to IIS, GitHub doesn't play nice. It doesn't send a 401 with a WWW-Authenticate header; it just returns a 404 (or a 403 on GitHub Enterprise).

[Screenshot: the first and only response from GitHub, a 404]

For GitHub it is perfectly valid to send a 404 if it doesn’t want to disclose information.

The only problem is that WebClient never gets a challenge it knows how to respond to, so we need to do the work ourselves!

We need to manually inject the Authorization header on the very first request when calling GitHub (or any web server that behaves like this):

var content = new
{
    scopes = new[] { "repo" },
    note = "API test",
};

var webClient = new WebClient();

// Replace webClient.Credentials = new NetworkCredential("*****", "*****") with these two lines:

// Create the credentials: a base64 encoding of username:password
string credentials = Convert.ToBase64String(Encoding.ASCII.GetBytes("*****" + ":" + "*****"));

// Inject this string as the Authorization header
webClient.Headers[HttpRequestHeader.Authorization] = string.Format("Basic {0}", credentials);

// Continue as you are used to!
string serializedObject = JsonConvert.SerializeObject(content);

string reply = webClient.UploadString(new Uri("https://api.github.com/authorizations"), "POST", serializedObject);

dynamic deserializedReply = JsonConvert.DeserializeObject(reply);

Console.WriteLine(deserializedReply.token);
Console.ReadLine();

And we have our token!

[Image: Borat — Great success!]

Have a good one!

-Kristof

PowerShell, Where-Object and capturing the output

Again, one of those things I couldn't figure out, because there's just no decent documentation on PowerShell.

Try this:

$foo = SomethingThatReturnsAList

$foo = $foo | Where-Object { $_.Name -like "*something*" }

PassOnFooToSomethingElse -List $foo

Awesome, every sane developer would think: right, that will work. Except in PowerShell.

It appears that $foo on line 5 (the PassOnFooToSomethingElse call) is $null.

WHAT?

Okay, let's surround it with parentheses; maybe that will solve it:

$foo = ( $foo | Where-Object { $_.Name -like "*something*" } )

But that doesn’t work either.

Now for the solution. I'd love to point you to a link on the world wide web where you can view the documentation, but honestly, I just can't find it…

The solution is to wrap the pipeline on the 3rd line (the Where-Object assignment) in a subexpression, $( … ), like this:

$foo = $( $foo | Where-Object { $_.Name -like "*something*" } )

And that works.

Happy coding & have a good one,

-Kristof

PowerShell Remove-Item and symbolic links

Let's say you've got a symbolic link that points to another folder, and you want to delete that link through PowerShell. Prepare for some weird stuff! Consider the following test script:

New-Item "SymbolicTest" -Type Directory
Set-Location "SymbolicTest"
New-Item "Source" -Type Directory
Set-Location "Source"
New-Item "Test1.txt"  -Type File
New-Item "Test2.txt"  -Type File
New-Item "Test3.txt"  -Type File
Set-Location "../"
# now some cmd since Powershell can't natively create symbolic links
cmd /c mklink /J `"Target`" `"Source`" 
ls

Let's check the structure with dir in PowerShell:

[Screenshot: dir in PowerShell]

Okay, so no difference: both are listed as plain directories. Now let's check with cmd:

[Screenshot: dir in cmd]

Okay, weird: cmd does understand it and shows Target as a junction. Now let's try to remove the Target link. Remember, we only want to delete the link, not the contents of the Target folder (and thus implicitly not the contents of the Source folder). Remove-Item is the cmdlet for deleting items, so let's try it out:

Remove-Item .\Target

This yields the following output:

[Screenshot: Remove-Item prompting to delete the children of Target]

Wait, what? It wants to delete the children. Just for the sake of it I typed Y, which yielded another error, prompting me to use -Force:

[Screenshot: Remove-Item erroring and asking for -Force]

Okay, again, this time with -Force:

[Screenshot: Target gone, but the contents of Source gone as well]

Target was gone, but so were the contents of Source. Ouch. Now what? Well, reading through the documentation of Remove-Item I found nothing about symbolic links.

So what now? It seems that PowerShell has no knowledge of symbolic links; they simply weren't taken into account. It just follows the link as if it were a normal directory.

So what's the solution? I decided to use good old cmd (from PowerShell!):

cmd /c rmdir .\Target

rmdir works!

Marvelous. Problem solved!

Actually, in retrospect, since I needed to use cmd to create the link, it would have made more sense to also just delete the link with cmd!
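As an aside, if you're scripting this from C# rather than PowerShell, .NET does expose the information PowerShell ignores here: a junction carries the ReparsePoint attribute, and a plain non-recursive Directory.Delete should remove just the link. A small sketch:

using System;
using System.IO;

class RemoveJunction
{
    static void Main()
    {
        var target = new DirectoryInfo(@".\Target");

        // A junction (or symbolic link) carries the ReparsePoint attribute.
        bool isLink = (target.Attributes & FileAttributes.ReparsePoint) != 0;

        if (isLink)
        {
            // Non-recursive delete removes the link itself, not the Source contents.
            Directory.Delete(target.FullName);
            Console.WriteLine("Link removed; Source is untouched.");
        }
    }
}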

Have a good one,

-Kristof