Exchange Web Services

I’ve been tinkering with the Exchange Managed API (v1.1) for a few days and have to say it’s FAR easier than I expected (at least for my simple tasks). I know it’s an old technology, but I have to say simple things amuse me.

Although I’ll be tinkering with Exchange 2010 soon, I’ve been doing some initial work against Exchange Online and so far, so good. In this day and age I *shouldn’t* be surprised when something is quick and easy (particularly a high-level API in a high-level language), but I am.

I was expecting to be able to point Visual Studio at a WSDL location such as http://blah blah/EWS/exchange.wsdl (or whatever), but so far I’ve had to initialise the connection in code.




public static ExchangeService ConnectMeDammit()
{
    // Specify the Exchange version.
    ExchangeService service = new ExchangeService(ExchangeVersion.Exchange2010_SP1);

    // Set up credentials for login.
    service.Credentials = new WebCredentials("", "mypassword");

    // Try to detect the server via the email address.
    service.AutodiscoverUrl("", RedirectionUrlValidationCallback);

    return service;
}




and hey presto… I have a service connection which opens up a bunch of functionality, far more than I need.
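For completeness, AutodiscoverUrl takes a redirection-validation callback; the usual pattern (per the EWS Managed API documentation) is to only trust redirects that point at HTTPS endpoints:

```csharp
// Standard validation pattern from the EWS documentation: only follow
// autodiscover redirections to HTTPS URLs.
private static bool RedirectionUrlValidationCallback(string redirectionUrl)
{
    Uri redirectionUri = new Uri(redirectionUrl);
    return redirectionUri.Scheme == "https";
}
```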


Send an email… no problem:

EmailMessage message = new EmailMessage(service);
message.Subject = "super secret subject";
message.Body = new MessageBody("Do I have a nice body?");
message.ToRecipients.Add("");
message.Send();




Sending, querying folders, deleting: all available.
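Querying a folder, for instance, is a single FindItems call on the service; a quick sketch (using the Managed API’s own names) that lists recent inbox subjects:

```csharp
// List the subjects of the 10 most recent items in the inbox.
FindItemsResults<Item> results = service.FindItems(WellKnownFolderName.Inbox, new ItemView(10));
foreach (Item item in results)
{
    Console.WriteLine(item.Subject);
}
```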



Azure pet project

Over the last few weeks I’ve been putting the final touches to a pet project of mine (mentioned previously on my old blog).

Basically it’s a large in-memory tree that accepts a bunch of key/value pairs and performs a depth-first comparison against the tree to see what matches. It’s all first-year comp sci material, but I still see definite potential for using it at work.
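The actual implementation isn’t shown here, but the general idea can be sketched with a toy version (all names hypothetical): each node tests one key/value pair, and a depth-first walk collects the alerts whose whole path of conditions the input satisfies.

```csharp
using System.Collections.Generic;

// Hypothetical sketch, not the real engine: a node matches when the input
// contains this node's key/value pair.
class AlertNode
{
    public string Key;
    public string Value;
    public string AlertName;                  // set on nodes that complete an alert
    public List<AlertNode> Children = new List<AlertNode>();

    // Depth-first walk: descend only through nodes whose condition holds.
    public IEnumerable<string> Match(IDictionary<string, string> input)
    {
        string v;
        if (!input.TryGetValue(Key, out v) || v != Value)
            yield break;
        if (AlertName != null)
            yield return AlertName;
        foreach (AlertNode child in Children)
            foreach (string hit in child.Match(input))
                yield return hit;
    }
}
```

Each alert is a root-to-leaf path of conditions; pushing a document’s key/value pairs through Match on the root returns the name of every alert whose conditions all hold.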

The last piece of the puzzle has been to “Azure-ize” it. Given the memory requirements aren’t huge (1 GB of RAM is fine) and the CPU isn’t ever really stressed, a “small” Azure instance does the job nicely.

Then came the disappointment. I’d used Azure queues many times in the past, but always between various Azure compute instances. In those cases 500 ops per second wasn’t an issue. The catch now is that the pushes/pops go over the internet.


I think I got a maximum of 2 pushes per second. Given that I originally wrote this alerting engine (as I call it) to provide a high-speed matching/alerting framework, this was a major blow. I *assume* (not proved) that this is because all queue messages are copied to 3 locations before the synchronous operation returns. This of course takes time.

Sooo, to get around this issue I’m switching the input Azure queue to a plain REST interface (which will be good enough); the output results can still be a queue. This is mostly coded, and I believe it will easily outperform anything we currently have.
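Not the actual code, but the shape of that REST input side is simple enough to sketch with HttpListener (endpoint and wiring hypothetical): accept the key/value pairs via HTTP POST and hand them straight to the in-memory matcher, skipping the queue round trip entirely.

```csharp
using System.IO;
using System.Net;

// Hypothetical minimal REST input endpoint for the alerting engine.
class RestInput
{
    static void Main()
    {
        HttpListener listener = new HttpListener();
        listener.Prefixes.Add("http://+:8080/alerts/");   // assumed endpoint
        listener.Start();
        while (true)
        {
            HttpListenerContext context = listener.GetContext();
            string body = new StreamReader(context.Request.InputStream).ReadToEnd();
            // ... parse the key/value pairs and run them through the matching tree ...
            context.Response.StatusCode = 200;
            context.Response.Close();
        }
    }
}
```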


There are a number of situations where I can see this simple alert/matching framework providing some real benefit:

1) Matching documents (k/v pairs) as they are indexed by search engines (ESP/FS4SP/Lucene etc.).

2) Being used by SharePoint as documents are uploaded, providing a notification mechanism for the admins (if required).

3) Providing a good opportunity at work to merge both Azure and Search into a real customer installation.