
Could not successfully convert UCS-2 string to UTF-8

We were performing a massive data migration at work from MSSQL to MySQL using the Migration Wizard in MySQL Workbench. When performing the step that copied the data from one server to the other, we got the following error for some nvarchar columns:

Could not successfully convert UCS-2 string to UTF-8

Alternatively, we also got the error Error during charset conversion of wstring: No error. This infuriating error took us an hour and a half to diagnose and work around. I finally stumbled upon a MySQL Workbench bug report on the MySQL bugs website: http://bugs.mysql.com/bug.php?id=69629.

The problem, it turns out, lies with Microsoft’s ODBC drivers:

This bug is triggered due to improper UCS-2 strings being generated by Microsoft’s ODBC drivers.

And the solution is to select the FreeTDS ODBC Drivers in the first step of the migration wizard of MySQL Workbench. The migration then works perfectly with no further encoding complaints. I hope this helps someone else out with this frustrating problem!

FreeTDS ODBC Drivers

The SMS That Started It All

Today I want to show you the SMS that started it all and began my career as a software developer.


One day in October 2010 my manager Jacob and I sat down to discuss a database solution for our company. We had unsuccessfully tried to use another company’s solution for our business needs but found it lacking, and we were already running an Access database for some of our data. At this time I had no experience with databases whatsoever, not even the basics such as tables, columns and primary keys. Jacob wanted to gauge my interest in learning about databases and Visual Basic so we could develop our own solution. Needless to say I was very keen. (more…)

Failed to build gem native extension on OSX Mavericks

When installing certain gems on Mac OS X Mavericks you may run into issues where either the gem install or bundle install command throws the error “Failed to build gem native extension”, usually in extconf.rb. This has happened to me specifically with the json and mysql2 gems. Other gems may work exactly as expected.

The reason for this odd behaviour is that in the latest release, Xcode 5.1, Apple treats any unknown command line options as errors. The Xcode 5.1 release notes read:

“The Apple LLVM compiler in Xcode 5.1 treats unrecognized command-line options as errors. This issue has been seen when building both Python native extensions and Ruby Gems, where some invalid compiler options are currently specified. Projects using invalid compiler options will need to be changed to remove those options. To help ease that transition, the compiler will temporarily accept an option to downgrade the error to a warning: -Wno-error=unused-command-line-argument-hard-error-in-future”

What this means is that if any gems use invalid command line arguments in their build or install process, the error “Failed to build gem native extension” will be thrown. As a temporary fix, you can try running either bundle install or gem install with the following argument, which downgrades the error to a warning:

ARCHFLAGS=-Wno-error=unused-command-line-argument-hard-error-in-future gem install {gem_name_here}
ARCHFLAGS=-Wno-error=unused-command-line-argument-hard-error-in-future bundle install

This should clear up most errors. The fix is temporary because Apple have said they will be removing support for the argument in the future, which means any gems with invalid arguments will need to update their build commands.

mysql2

This fix did not work when installing the mysql2 gem; I got the same error, but it required an extra step to fix. I had to run the command brew install mysql first. Afterwards I could just run bundle install and the problem was fixed!

Another Year, Another Redesign

We are in the future of 2014, and since it’s been about a year since I changed this blog’s design, I thought the time was ripe to do it again! It has taken me quite a while longer than I would have liked this time, but I have a good excuse for that in the form of the birth of my son Patrick Noel Allen Brennan!

I’ve based this design on themes I’ve seen for Octopress, with a large area to place more importance on the content of the blog. I’ve also added a sidebar back in because the site is now 100% width, with a focus on search. I’ve been using a great search plugin called Relevanssi that replaces the default WordPress search, and adds features like indexing, result weighting and search hit highlighting.

For the sidebar icons I used the Simple Icons from http://simpleicons.org/, and once again I’ve used the excellent highlightjs for syntax highlighting of code snippets, for example on this JavaScript I use to highlight the currently selected page:

var page = window.location.href;
if (page.indexOf('archive') != -1) {
	var link = document.getElementById('nav_archive');
	link.className = 'selected';
} else if (page.indexOf('about') != -1) {
	var link = document.getElementById('nav_about');
	link.className = 'selected';
} else {
	var link = document.getElementById('nav_articles');
	link.className = 'selected';
}

Let me know what you think! Here is a comparison of old vs. new!

New

New Design

Old
Old Design

Add Open With Sublime Text 2 to Windows Context Menu

Just a quick tip: you can use this .bat file to add Open with Sublime Text 2 to the Windows right-click context menu. You will probably need administrator privileges to run it.

@echo off
SET st2Path=C:\Program Files\Sublime Text 2\sublime_text.exe

rem add it for all file types
@reg add "HKEY_CLASSES_ROOT\*\shell\Open with Sublime Text 2" /t REG_SZ /v "" /d "Open with Sublime Text 2" /f
@reg add "HKEY_CLASSES_ROOT\*\shell\Open with Sublime Text 2" /t REG_EXPAND_SZ /v "Icon" /d "%st2Path%,0" /f
@reg add "HKEY_CLASSES_ROOT\*\shell\Open with Sublime Text 2\command" /t REG_SZ /v "" /d "%st2Path% \"%%1\"" /f

rem add it for folders
@reg add "HKEY_CLASSES_ROOT\Folder\shell\Open with Sublime Text 2" /t REG_SZ /v "" /d "Open with Sublime Text 2" /f
@reg add "HKEY_CLASSES_ROOT\Folder\shell\Open with Sublime Text 2" /t REG_EXPAND_SZ /v "Icon" /d "%st2Path%,0" /f
@reg add "HKEY_CLASSES_ROOT\Folder\shell\Open with Sublime Text 2\command" /t REG_SZ /v "" /d "%st2Path% \"%%1\"" /f
pause

ContentResult or ViewResult Response Based on Context

In .NET MVC there are times when you may want to return either a ContentResult or a ViewResult depending on the situation. For example, you may want to return an application/json result for an API request, but a structured view with timing information (such as when using MiniProfiler) when visiting the route in the browser.

I had to find out how to construct the different types of results separately first. The main properties that need to be set for a new ViewResult object are the ViewName and the ViewData. ViewData has a Model property where you can set the model that you would like to return with the view. The SimpleResponse model is a simple model that just has a JSON property that I use to display the JSON output of an API route in a nicely-formatted way.

var view = new ViewResult();
view.ViewName = "~/Views/Sys/Response.cshtml";
view.ViewData.Model = new Models.SimpleResponse() { JSON = APIResponse.Serialize(obj) }; //This just uses Newtonsoft.Json to serialize the response
return view;

For a ContentResult, you only need to set the content type and the content of the response. Because the API I am developing uses JSON responses, the content type is application/json; charset=UTF-8.

var content = new ContentResult();
content.Content = Serialize(obj);
content.ContentType = "application/json; charset=UTF-8";
return content;

Now, we can put this all together into a practical example. Inside a class called APIResponse I have a method called Resolve.

/// <summary>
/// Figures out whether the response should be a ContentResult or
/// a ViewResult. If the request is local and the request contenttype
/// is not "application/json; charset=UTF-8", then that means a developer
/// is looking at the page in a browser. This renders the Response.cshtml
/// View with MiniProfiler so query statistics can be observed.
/// 
/// Otherwise, a ContentResult is returned with the JSON serialized object
/// which is passed in from the controller.
/// </summary>
/// <param name="obj">The object to serialize to JSON from the controller.</param>
public static ActionResult Resolve(object obj)
{
	if (HttpContext.Current.Request.IsLocal && HttpContext.Current.Request.ContentType != "application/json; charset=UTF-8")
	{
		var view = new ViewResult();
		view.ViewName = "~/Views/Sys/Response.cshtml";
		view.ViewData.Model = new Models.SimpleResponse() { JSON = WOLASAPI.lib.APIResponse.Serialize(obj) };
		return view;
	}
	else
	{
		var content = new ContentResult();
		content.Content = Serialize(obj);
		content.ContentType = "application/json; charset=UTF-8";
		return content;
	}
}

This method can then be called from your controller like so:

public ActionResult GetPerson(int id) {
    var person = db.person.Find(id);
    return APIResponse.Resolve(person);
}

Custom UTC DateTime Model Binding for MVC

When sending UTC DateTimes via HTTP POST to a .NET Web API or MVC route, the dates are converted by the model binder into the server’s local time before being inserted into the database or used for anything else. This is because .NET MVC handles DateTime model binding differently for HTTP POST and GET.

The initially surprising piece of information is that it transpires that it actually matters whether you have set the HTTP method to be a GET or a POST.

There are perfectly valid reasons for this that are described in the article that this quote is from, but sometimes you want to keep the DateTimes as UTC to be inserted into the database over HTTP POST. Thankfully, this is fairly easy to do by implementing custom UTC DateTime model binding. (more…)
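
As a rough idea of the approach, here is a minimal sketch of a UTC-preserving DateTime binder for classic MVC; the class name and the registration shown are illustrative rather than the exact code from the full post:

using System;
using System.Globalization;
using System.Web.Mvc;

// Sketch of a custom binder that keeps incoming DateTimes in UTC.
public class UtcDateTimeModelBinder : IModelBinder
{
    public object BindModel(ControllerContext controllerContext, ModelBindingContext bindingContext)
    {
        var result = bindingContext.ValueProvider.GetValue(bindingContext.ModelName);
        if (result == null || string.IsNullOrEmpty(result.AttemptedValue))
            return null;

        // AssumeUniversal + AdjustToUniversal keeps the posted value in UTC
        // instead of converting it to the server's local time.
        DateTime parsed;
        if (DateTime.TryParse(result.AttemptedValue, CultureInfo.InvariantCulture,
            DateTimeStyles.AssumeUniversal | DateTimeStyles.AdjustToUniversal, out parsed))
        {
            return parsed;
        }

        return null;
    }
}

The binder would then be registered for DateTime (and DateTime?) in Application_Start:

ModelBinders.Binders.Add(typeof(DateTime), new UtcDateTimeModelBinder());
ModelBinders.Binders.Add(typeof(DateTime?), new UtcDateTimeModelBinder());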

Enable Gzip Compression in IIS

For an API I am building, I needed to enable Gzip compression in IIS Express for JSON and came across: http://stackoverflow.com/questions/10102743/gzip-response-on-iis-express.

There are just a few commands to run. This example is for IIS Express, but the same commands work in IIS:

cd %PROGRAMFILES%\IIS Express
appcmd set config -section:urlCompression /doDynamicCompression:true
appcmd set config /section:staticContent /+[fileExtension='.json',mimeType='application/json']
appcmd.exe set config -section:system.webServer/httpCompression /+"dynamicTypes.[mimeType='application/json',enabled='True']" /commit:apphost

Restart IIS or IIS Express afterwards. Gzip compression is very beneficial; it lowered response sizes for JSON API requests dramatically, sometimes by more than 50%.
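
For context, those appcmd calls end up enabling roughly the following under <system.webServer> in applicationHost.config. This is a sketch of the resulting configuration rather than something from the original post:

<!-- Illustrative sketch of the sections the appcmd commands above modify -->
<system.webServer>
  <urlCompression doDynamicCompression="true" />
  <staticContent>
    <mimeMap fileExtension=".json" mimeType="application/json" />
  </staticContent>
  <httpCompression>
    <dynamicTypes>
      <add mimeType="application/json" enabled="true" />
    </dynamicTypes>
  </httpCompression>
</system.webServer>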

Service Timestamp Issue for DocuSign

I ran into this service timestamp issue with our server that sends Envelopes to DocuSign, the e-signature service we use at work. The Envelopes are sent to a SOAP web service endpoint from a Windows service hosted on an Amazon EC2 instance. We got this error out of nowhere; it can occur for any web service, not just DocuSign’s.

An error was discovered processing the header -- WSE065: Creation time of the timestamp is in the future.
This typically indicates lack of synchronization between sender and receiver clocks.
Make sure the clocks are synchronized or use the timeToleranceInSeconds element in the microsoft.web.services3 configuration section to adjust tolerance for lack of clock synchronization.

Server stack trace: at System.ServiceModel.Channels.ServiceChannel.HandleReply(ProxyOperationRuntime operation, ProxyRpc& rpc)
at System.ServiceModel.Channels.ServiceChannel.Call(String action, Boolean oneway, ProxyOperationRuntime operation, Object[] ins, Object[] outs, TimeSpan timeout)
at System.ServiceModel.Channels.ServiceChannelProxy.InvokeService(IMethodCallMessage methodCall, ProxyOperationRuntime operation) 
at System.ServiceModel.Channels.ServiceChannelProxy.Invoke(IMessage message)
Exception rethrown at [0]: at Microsoft.VisualBasic.CompilerServices.Symbols.Container.InvokeMethod(Method TargetProcedure, Object[] Arguments, Boolean[] CopyBack, BindingFlags Flags)
at Microsoft.VisualBasic.CompilerServices.NewLateBinding.ObjectLateGet(Object Instance, Type Type, String MemberName, Object[] Arguments, String[] ArgumentNames, Type[] TypeArguments, Boolean[] CopyBack)
at Microsoft.VisualBasic.CompilerServices.NewLateBinding.LateGet(Object Instance, Type Type, String MemberName, Object[] Arguments, String[] ArgumentNames, Type[] TypeArguments, Boolean[] CopyBack)
at DocuSignLibrary.Document.SendToDocuSign(Envelope Document, List`1 Recipients, Object Envelope

I found the solution in the article Automatic time update in Windows (sync with internet time a.k.a. NTP server). I checked the server time and it was about 5 minutes ahead of the correct UTC time. To fix it, all I had to do was right-click the clock and go to Adjust Date/Time > Internet Time > Change Settings > Update Now. That fixed the service error and Envelopes continued to send, but we still have no idea why the server’s clock keeps slipping out of sync. If anyone else has experienced this problem, let me know in the comments!

.NET MVC 4 Model Binding Null on Post

This was an extremely frustrating problem I came across when building an API using .NET MVC 4. I was using HTTP POST to send a model to the API as JSON, relying on the built-in model binding to correctly interpret the JSON and convert it to a model. However, for one particular model none of the properties were binding correctly. After a lot of head scratching and threatening my computer screen, I found this post by Anders Åberg:

.NET MVC 4 Model Binding null on POST

Basically, take this model as an example of how the problem occurs. It represents a message in some sort of chat system:

public class ChatMessage
{
    public int ID { get; set; }
    public string Message { get; set; }
    public bool Read { get; set; }
}

And say your API has a controller method that accepts an HTTP POST request to add a new message:

[HttpPost]
public ActionResult Send(ChatMessage message)
{
    var id = Chat.Send(message);
    return Content(id);
}

And then the JSON POST body would look something like this:

{
	"ID": null,
	"Message": "Hey how's it going?",
	"Read": false
}

You would expect that the model would be bound correctly and that all the properties would be filled in, right? Right?! WRONG! This is because I’ve named the parameter for the Send method the same thing as one of the model’s properties, Message. This really confuses the model binder: it tries to bind just the message value from the request to the model, instead of treating it as a property of the model.

There are two things you can do to get around this:

  1. Don’t name any of your method parameters the same thing as any of your model’s properties, e.g. newMessage instead of message (see the sketch below).
  2. Don’t name your model properties anything similar to the name of the model class, as you may want to use the class name as a variable name, e.g. MessageText instead of Message.
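
As a rough sketch of the first workaround, renaming the parameter is all it takes. The name newMessage is just an illustrative choice, and Chat.Send is the same helper as in the example above:

[HttpPost]
public ActionResult Send(ChatMessage newMessage)
{
    // The parameter no longer collides with the model's Message property,
    // so ID, Message and Read all bind from the JSON body as expected.
    var id = Chat.Send(newMessage);
    return Content(id);
}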