New Earswick Tag Rugby Tournament July 2009

July 14, 2009

winoes at York July 2009

Originally uploaded by Wertster

Our merry band of rugby bandits descended on the turf at our latest tag-rugby tournament in York on the 11th July, hosted by New Earswick RLFC. Despite a low turn-out we got to work and had a great time, winning overall in what was a really good-natured day. Hopefully we’ll get a bigger turn-out next year. Our next stop is the Winoes Tour to the Anglet Beach Rugby Festival in Biarritz on the 25th July, where we’ll joust with our French cousins in the shadow of what I hear is a well-stocked beer tent.


The CI, TDD and OO Love Triangle

July 14, 2009

I would not describe myself as a purist in any aspect of my life, least of all my software engineering practices. I’ve been re-establishing myself as a coder after a long while in the more abstract realms of architecture and design, and on the return path I’m seeing things in a somewhat different light. I think it’s an interesting observation, hence this post.

Object Orientation was an approach I previously adopted to partition my problem-space into a group of components – so I could start hacking, rather than putting on the ‘everything is an object and so is its mother’ spectacles. I never quite got my head around why colleagues and ‘real developers’ would sit for hours debating the 99th level of object decomposition, or whether an object had to be so fine-grained that it should actually contain no code.

My return journey to the world of software development has intersected with new (I say ‘new’ given I was a full-on hacker in the mid-90s, so this is new to me) and emerging trends around Continuous Integration and Test-Driven Development, and I have to say these intersections have made a huge impression on me, to the extent that I’m now passionate about the benefits these techniques bring. Maybe I’m so pro-CI and pro-TDD because I lived through the appalling software engineering practices of the 80s and early 90s, or maybe it’s just because it makes so…much…sense.

So to my point. Continuous Integration has a simple benefit: you know when your code-base has been tainted, and the more frequently you get to know that, the better. I’ll avoid quoting the scientists who waffle about cost exponentials for fixing defects today versus tomorrow. No – it just…makes…so…much…sense! I didn’t need a PhD in chin-rubbing to work that out.

Next, Test-Driven Development. Right, this one took time, just a little time, to get my head around, even though I was convinced that CI was the way to go, no questions asked. I found myself asking the stereotypical questions like “won’t I be 50% as productive if I spend half my time writing tests for the code I am writing?”. Let me just pause while I give myself 10 lashes for being so narrow-minded. THWACK, THWACK, THWACK, THWACK, THWACK, THWACK, THWACK, THWACK, THWACK, THWACK…and one for luck THHHHHWACK!!

It’s a natural inclination to feel like test-cases are just code that you’d traditionally throw into the code-base, but when you consider that you’re actually building inherent certainty into the code-base at every step of the way, the lightbulb goes on in a big way. When I sat back and appreciated that I wouldn’t have to run huge amounts of code to verify and exercise a small method I’d just changed, it…just…makes…so…much…sense. All the regression-testing scenarios I was carrying round in my head! The lack of repeatability in my approach to testing!! The gradual feeling of dread I’d become used to after deploying a stable version of code, in case I had to, god forbid, change it at some point?! It all cries out for an approach that locks down every function you implement as you implement it, so you can purge all that baggage from your poor little cranial walnut. It…just…makes…so…much…sense. I was soon a TDD convert, and only at that point did it strike me like a bolt from the blue…finally my life made sense…finally I could truly embrace the love-triangle of CI->TDD->OO.
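The ‘lock down every function’ idea is nothing more exotic than a unit test per behaviour. A minimal sketch using Ruby’s Test::Unit (the Wallet class and its API are invented here purely for illustration):

```ruby
require 'test/unit'

# Hypothetical class under test: a deliberately small, focused unit.
class Wallet
  attr_reader :pence

  def initialize
    @pence = 0
  end

  def deposit(pence)
    raise ArgumentError, 'deposit must be positive' if pence <= 0
    @pence += pence
  end
end

# The 'lock-down': each behaviour pinned by a repeatable test, so a
# later change that breaks it fails the CI build immediately.
class WalletTest < Test::Unit::TestCase
  def test_deposit_accumulates
    w = Wallet.new
    w.deposit(150)
    w.deposit(50)
    assert_equal 200, w.pence
  end

  def test_rejects_non_positive_deposit
    assert_raise(ArgumentError) { Wallet.new.deposit(0) }
  end
end
```

Trivial, yes – but multiply it across every method in the code-base and you’ve swapped the dread in your head for a regression suite that runs itself.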

The effectiveness of my CI/TDD regime relied totally on the level of granularity I could descend to with my class definitions, and the simplification of each and every function to its thinnest possible form. Real, fundamental Object Orientation finally became the underpinning foundation – notice I place it in that underpinning role – as I’ve only now seen OO as an enabler for the ‘common-sense’ and ‘value-adding’ techniques of CI and TDD. I’m a little weird that way – but I found my way into rigorous OO not via brain-washing from the chin-rubbing ranks of the “my object is more objecty than yours” brigade, but through the simple realisation that I can squeeze ever more value from my CI/TDD framework if I force my test-cases into finer and finer levels of granularity (within reason of course, I ain’t lost my roots yet!)

So there you have it. CI drives me to TDD. TDD drives me to appreciate why OO should exist, NOT vice versa…

Ruby Code Quality and Metric_Fu

July 9, 2009

I’ve been putting together a continuous integration framework for my Ruby projects and was very pleased to find the great work by Jake Scruggs by the name of Metric_Fu. On the whole this aggregation of tools covers all the key ‘quality’ bases and generates some elegant graphical reports that can simply be published alongside other build artifacts in CruiseControl.rb, for example. Installation is very simple on the whole, but I hit some issues with the code and had to apply some class-method overrides to get it to work.
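For reference, the wiring itself is tiny: the gem’s instructions boil down to one line in the project Rakefile (a sketch of my setup; gem and task names as per the versions listed below):

```ruby
# Rakefile fragment: pull in metric_fu and the rake tasks it defines
require 'metric_fu'
```

after which `rake metrics:all` runs the whole aggregation and drops the HTML reports into the configured output folder.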


My base environment is OSX 10.5.7, Ruby 1.8.6, Gem 1.3.3. I installed the gems jscruggs-metric_fu (1.1.1), rcov, flog (2.1.2), flay (1.3.0), reek (1.1.3), roodi (1.4.0) and Saikuro (1.1.0). Following the installation instructions I was complete in short order, and ran the rake task as per the example.

The first problem I hit was a method-naming issue where a Flog#flog_files method was being called which didn’t exist in the versions I was using. I ended up with a quick-n-dirty fix which involved amending the /gems/flog-2.1.2/bin/flog script to remove the call to flog_files, replacing it with the correct method name, flog:

flogger = Flog.new options
flogger.flog ARGV

The second problem I hit was a floating-point, divide-by-zero issue in the MetricFu::Generator#round_to_tenths method. I don’t profess to understand the core issue here, but the value ‘NaN’ was appearing in the calculation – so I added a ‘very’ primitive override to filter this condition (elegance is not my middle name, if you hadn’t realised already):

module MetricFu
  class Generator
    def round_to_tenths(decimal)
      decimal = 0.0 if decimal.to_s.eql?('NaN')
      (decimal.to_i * 10).round / 10.0
    end
  end
end
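For anyone wondering where a ‘NaN’ string can come from in the first place: Ruby floats follow IEEE 754, so a zero-divided-by-zero in a ratio calculation yields NaN silently rather than raising, and NaN’s string form is exactly ‘NaN’ – which is what the override keys off. A quick sketch of the behaviour:

```ruby
nan = 0.0 / 0.0            # IEEE 754: float 0.0/0.0 is NaN, no exception raised
puts nan.to_s              # => "NaN"
puts nan.nan?              # => true

# The override's guard condition in isolation:
decimal = nan
decimal = 0.0 if decimal.to_s.eql?('NaN')
puts decimal               # => 0.0
```

(`decimal.nan?` would arguably be the tidier test, but the string comparison does the job.)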

Next I hit an issue with rcov not returning any results into the dashboard. I checked this out and located an error in the rcov.txt file generated as part of the execution. This simply informed me that the rcov command line generated within the metric_fu rcov generator class was incorrect. After a little experimentation I worked out that the --include-file parameter was the issue…so I again redefined the problematic class method in my Rakefile to remove it:

module MetricFu
  class Rcov
    def emit
      FileUtils.rm_rf(MetricFu::Rcov.metric_directory, :verbose => false)
      test_files = FileList[*MetricFu.rcov[:test_files]].join(' ')
      rcov_opts = MetricFu.rcov[:rcov_opts].join(' ')
      output = ">> #{MetricFu::Rcov.metric_directory}/rcov.txt"
      #PROBLEM: `rcov --include-file #{test_files} #{rcov_opts} #{output}`
      `rcov #{test_files} #{rcov_opts} #{output}`
    rescue LoadError
      if RUBY_PLATFORM =~ /java/
        puts 'running in jruby - rcov tasks not available'
        puts 'sudo gem install rcov # if you want the rcov tasks'
      end
    end
  end
end

Next I hit a problem when modifying my declared MetricFu::Configuration object to change the location of the analysis outputs, to reduce the pollution in my project folder hierarchy (yep…I probably do have some deeper issues). I had added the following lines to the sample provided by Jake:

MetricFu::Configuration.run do |config|
  config.base_directory = ENV['CC_BUILD_ARTIFACTS'] || 'quality'
  config.scratch_directory = File.join(config.base_directory, 'scratch')
  config.output_directory = File.join(config.base_directory, 'output')
  config.data_directory = File.join(config.base_directory, '_data')

With this new folder structure I also had to add a tweak to the Saikuro configuration entry in the same sample, to make sure the Saikuro outputs are picked up by the report aggregator. Without this tweak I was just getting the blank Saikuro template page offered by Metric_Fu:

config.saikuro = { :output_directory =>
                     File.join(config.scratch_directory, 'saikuro'),

This change caused a failure in the MetricFu::Graph#generate method, which on closer inspection was hardwired to expect a depth of three folders in the locations specified for these outputs. So again I re-declared this class in my Rakefile to make the specific change necessary to support any level of nesting. This fixed the problem and my folder feng-shui returned to equilibrium…mmmmmmm.

module MetricFu
  class Graph
    def generate
      puts "Generating graphs"
      Dir[File.join(MetricFu.data_directory, '*.yml')].sort.each do |metric_file|
        puts "Generating graphs for #{metric_file}"
        #PROBLEM: date = metric_file.split('/')[3].split('.')[0]
        date = metric_file.split('/').last.split('.')[0]
        metrics = YAML::load(File.read(metric_file))
        self.clazz.each do |grapher|
          grapher.get_metrics(metrics, date)
        end
      end
      self.clazz.each do |grapher|
        grapher.graph!
      end
    end
  end
end

So my installation of metric_fu works perfectly now and I’m very impressed with the output it’s providing, so long as I don’t dwell too much on the fact that it’s telling me my code quality is somewhere between criminal and insane!!!

I hope these notes may help others experiencing similar frustration in trying to get the install working…and again my thanks to Jake Scruggs for this piece of goodness.

Cheers all !

Bridlington Beach Rugby 2009

June 30, 2009

Post tournament Winoes

Originally uploaded by Wertster

Our veteran team of ex-ex players, affectionately known as the Leeds Winoes, made another showing at the Brid beach-tag tournament on the 27th June. We made a close 3rd place behind West Leeds (2nd) and Bridlington-A (1st) in a great competition. Next stop is our first ‘international’ in Biarritz, France on the 25th July when we take on the best of the European circuit. Might have recovered by then.

VMWare Fusion XP VM Losing DNS !

October 25, 2008

I’ve been running OSX VMware Fusion 1.x with XP SP2 & SP3 for over a year and it’s been ROCK SOLID! I run a web-connected OSX host, and an XP VM VPN’d into a corporate network all day, every day, and I have not had a single problem. Until this week…when my calm seas were interrupted…

Out of the blue I noticed that the XP SP3 VM was failing to resolve DNS queries. OK. Why? I ran a number of diagnostics, checked driver versions, and everything checked out in terms of VM integrity. No idea. The worst kind of problem…

I checked out the net and located numerous threads deliberating over the XP “Unable to flush DNS cache” type of error, with long and elaborate threads falling into the detail of comparing router firmware versions and other such infinite variables. Eject…Eject…

It was only when I ran the VMware packet sniffer on the OSX host that I could see the XP VM was requesting DNS, and the responses from the OSX host were being dispatched. From my understanding of low-level IP it appeared that all was performing as expected. However, I then started thinking about reasons why UDP packets were being neglected by the XP VM’s IP stack…BINGO!

I then checked out the Windows XP Event Viewer under the Security event list, and there I saw all my DNS responses (from the OSX host) arriving back at my XP VM as UDP packets, all being summarily discarded by my failed/corrupted firewall. A couple of minutes later, having run the ‘support’ utility from the firewall supplier, the flood-gates opened and UDP/DNS was back in business.

Symptoms I encountered in XP:

  1. DNS resolution (i.e. ping by hostname) within the XP VM failed, but direct IP addressing (i.e. ping by IP address) worked OK.
  2. nslookup in the console returned ‘no response from server’ errors in response to queries.
  3. Right-clicking on the network connection icon in XP and executing Repair proceeded through all steps apart from the final DNS cache flush, at which point ‘Unable to repair connection’ was returned.
  4. ipconfig /registerdns failed with a non-specific error.

In hindsight the symptoms all point to the UDP return-path and the firewall, but verifying the request path with the OSX VMware vmnet-sniffer utility (located in the /Library/Application Support/VMWare Fusion folder) made this a whole lot simpler.

Where a Semantic Contract Fits…

October 10, 2008

I’ve been posting about the rise of the informal semantic contract relating to web-services, and the deficiencies of XML Schema in adequately communicating the capability of anything other than a trivial service. Formalising a semantic contract by enriching a baseline structural contract (WSDL/XSD) with semantic or content-based constraints effectively creates a smaller window of well-formedness, through which a consumer must pass their payload when issuing a request. Other factors, such as incremental implementation of a complex business service ‘behind’ a generalised service interface, compound the need for a semantic contract.
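As a toy illustration (the booking document and its rule are invented here, not drawn from any real service): a schema can assert that both fields below are well-formed dates, but it cannot express that the range must be coherent. That narrower window of well-formedness is the semantic contract, and it has to be enforced in code or in a richer rules layer. A sketch in Ruby using the stdlib REXML parser:

```ruby
require 'rexml/document'
require 'date'

# Structurally valid: any plausible XSD would accept both elements as xs:date.
payload = <<XML
<booking>
  <startDate>2008-10-20</startDate>
  <endDate>2008-10-15</endDate>
</booking>
XML

doc        = REXML::Document.new(payload)
start_date = Date.parse(doc.elements['booking/startDate'].text)
end_date   = Date.parse(doc.elements['booking/endDate'].text)

# The semantic constraint sits beyond what the structural contract can say:
semantically_valid = end_date >= start_date
puts semantically_valid   # => false: schema-valid, yet semantically malformed
```

The payload sails through structural validation yet is nonsense to the business service – exactly the gap the semantic contract is there to close.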

To clarify the relationship between structural and semantic, I happened upon a great picture which I’ve annotated…

Validating XML with Schema in Ruby

October 10, 2008

I posted a long time back about my troubles in finding a way of performing schema validation in ruby (see Ruby and Xml Schema). At that time I was using REXML and only able to perform well-formedness checks based on basic structural integrity, but had no way to take an XSD and validate an instance document.

I’m pleased to say that there is a way to do this now, namely libxml-ruby. It’s available as a gem (gem install libxml-ruby) and the process is pretty simple:

  document = LibXML::XML::Document.file(@xml_filename)
  schema = LibXML::XML::Schema.new(@xsd_filename)
  result = document.validate_schema(schema) do |message, flag|
    puts message
  end

I’ve found this to be a very neat piece of code for dealing with the kind of schema integrity checking I’m looking for, and as I blend this with a number of other Java-based parsers using the Ruby Java Bridge I get a pretty good, consistent perspective on validity.