Rails’ named scopes are fantastic for making readable code with efficient SQL behind the scenes. Given a Person model, you could define a scope called ‘adult’ which could add an SQL WHERE for age>18, and let you write pretty code like Person.adult to find all adults. Great! Chaining them together gets even better with code like Person.adult.male.with_glasses.
The problem comes if you do this in a single action with lots of different chained scopes to show different data. You end up with a whole bunch of SQL queries, which can get expensive. In the system I’m currently building, which includes geospatial queries, this gets far too slow to use. So, instead, let’s get one big list of objects and filter them in Ruby when we need them.
Simple enough. The only thing is, the ActiveRecord chaining syntax is lovely. I don’t want to write tons of chained selects with blocks and all that faff. Instead, let’s extend Array to give us the syntax we like. The trick is to wrap an Array inside another class and then use method_missing to delegate. We then make sure that returned objects aren’t Arrays, but our new type instead.
It’s best explained with the code:
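Something like this sketch, assuming a Person with an age attribute (FilteredList and its adult filter are illustrative names, not necessarily the originals):

```ruby
# A wrapper around Array that delegates unknown methods to the
# underlying items via method_missing, and re-wraps any Array results
# so that calls can be chained just like ActiveRecord scopes.
class FilteredList
  def initialize(items)
    @items = items
  end

  # A "scope" implemented as a plain in-memory filter, no SQL involved.
  def adult
    FilteredList.new(@items.select { |person| person.age > 18 })
  end

  def method_missing(name, *args, &block)
    result = @items.send(name, *args, &block)
    result.is_a?(Array) ? FilteredList.new(result) : result
  end

  def respond_to_missing?(name, include_private = false)
    @items.respond_to?(name, include_private) || super
  end
end
```

Load the records once with something like FilteredList.new(Person.all.to_a), and every subsequent chained call filters in Ruby rather than issuing another query.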
The other day I started reading The Geek Manifesto, by Mark Henderson, and, given my particular interests, I started near the end with the chapter on Geeks and Greens before starting properly at the beginning.
I strongly suggest reading it, but to summarise, the traditional green movement is generally unscientific and deeply political, and this has led to counterproductive politicisation and scepticism of scientific issues like climate change.
The greens latched onto a few things correctly by accident, like climate change research, but are still vastly wrong on many others. They come with too much junk science to be credible, and this leaks into the real science that they accidentally picked up. The association of the Green Party, for instance, with homeopathy and other alternative medicine approaches, is just sad in this day and age (although I believe it changed recently, it wouldn’t surprise me if the party still includes many who do believe this junk).
Add to that that many traditional greens have far-left politics which puts a lot of people off, and you end up with the conclusion that they *cannot* be trusted to be the main voice on important matters of science like climate change, which should essentially be an apolitical issue.
Whether it’s NGOs like Greenpeace or Friends of the Earth, or political parties like the Green Party here in the UK, they are just too embedded in their history to be able to deal with this properly. They can’t give up their hatred of nuclear power, for instance, or their opposition to GM science, both of which are probably going to be essential tools for our future, without alienating their core membership.
I agree with Mark that a new approach is needed, but I don’t think those organisations can be rescued ‘from the inside’ as he suggests. They can carry on if they like, but even in the unlikely event that they change their views, the public’s perception of them would still be the same. We need something new, something reality-based, something that aims to have the clean high-tech future that I personally want.
Even the word ‘green’ is now rendered useless by these viewpoints, so we have to just drop it and talk about ‘sustainability’ and ‘clean’ instead. If someone could come up with a catchy name for it that’s half as good as ‘green’ though, that would be lovely.
The newly-named cleanweb is a strange place. It’s all about social change, better sustainability, and other such noble goals, but it’s shackled by the demands of today’s economy; people with mortgages need proper jobs and investors need returns. The potential is enormous, but the opportunity to really explore it feels tiny. Some companies (like AMEE) do explore this space, but are still bound to go with the most commercially-viable opportunities. There’s very little space for blue-sky thinking.
I have a gut feeling that anything that truly changes behaviour will have to do so without the prospect of financial payback. By buying into the current assumptions about how the world should run, you’re maybe holding back the really ground-breaking ideas. There are some things that will work, sure, but thousands of ideas that could make the world better will never get off the ground because of a lack of business model.
What we need is a social-good model instead. We don’t need more space for entrepreneurs, but instead we need space for people who want to change the world for the better to do so. Not for money, but because it’s a good and noble and important thing to do. We need some way of supporting and developing great ideas that come out of events like the London Green Hackathon, even if they don’t have a way to make money for someone (though of course some do and that’s great).
We need a Cleanweb R&D Foundation of some kind, with a remit to simply make the world better and try things out. That would be a place that could really attack some truly innovative ideas. And it would be a hell of a lot of fun.
Somehow, this year feels different.
Maybe it’s because of the increase in public awareness of the problems with the world at the end of 2011. A greater understanding that governments and people don’t run the world any more, that we have given money too much power, that we are running full tilt at a cliff-edge; it feels like it’s adding up.
Charlie Brooker said recently that 2011 felt like a series finale; I’m not sure about that, it felt to me more like the buildup in the penultimate episode. 2012 seems like the actual last episode when the big events will happen (probably with a few cliff-hangers left over for 2013).
This impending feeling that big events are coming could be imaginary of course, but I hope not. We need big events now (and I don’t mean the obvious 2012 fake-apocalypse fantasy). We are racing so fast towards collapse that we have to do something now, or we will be a cautionary tale in the Cockroach Book Of Ancient Civilisations. We still have the ability to save ourselves (just); I hope that this year brings the will to engage with the problem.
We need to grow up, realise the full-spectrum danger we are in, and do something about it. We need to mature as a species, and fast. We need to get over our selfish approach and make a better society for the future, together.
I am a software engineer; that makes me next to useless in a collapsed society. My daughter quite literally needs a functioning, liberal, caring society to keep her alive. Moving to a farmhouse, becoming self-sufficient and letting everyone else collapse is no good for me. A functioning technological society is an essential life support system.
With all that in mind, I am going into this year with an objective to make best use of the skills I have to attack the big problems where I can, and if it doesn’t work out, to fail magnificently having done my best.
I hope you’ll join me. We’ve got a hell of a job to do.
The last week has been an interesting one for online energy monitoring. Not content with deprecating Powermeter a few weeks ago, Google decided to properly kill it off. And then, as if to say ‘us too!’ (as they did with the initial launch), Microsoft killed Hohm.
Both these efforts were unsuccessful because the companies that put them out weren’t serious about them. There is a place for experimentation, but if you’re a massive company, even your small experiments seem larger than other people’s entire companies. By half-heartedly grabbing the attention around energy monitoring, these experiments crowded out the people who are building a business around this, like Currentcost, Enio, Pachube and many more.
The fact is that online energy monitoring is not dead - the demise of these services will simply provide more room for other companies to innovate without being stepped on. Neither Powermeter nor Hohm was doing anything particularly innovative that hadn’t been done many times before anyway! Powermeter also suffered from being a silo of data, not a hub. You could put data in, but not get it out, which is a bit pointless. A truly useful service in this space needs to act as an aggregator, letting data flow through it so that other innovations can cluster around it, like the Social Meter project, for instance.
Innovation needs openness, and by smothering a potentially innovative space with their distinctly non-innovative services, MS and Google did nobody any favours. We are better off without them.
Recently, I was writing tests for a piece of code, and came up against some strange problems. The code talks to an external web service using RestClient (1.6.1), so of course the web service is mocked for the tests, in this case using Mocha (0.9.8).
The problem came when I tried to test failure modes; for instance, what happens when the service comes back with a 401 due to bad credentials? RestClient raises an exception for this containing the response, so the mock has to do the same.
stub(:code => 401, :body => nil)
So far so good. The exception is raised, and all is well. The problem comes when we try to access the response code later on in our error-handling code:
rescue RestClient::Exception => e
  puts e.response.code
Seems reasonable, right? The response object should be a Mocha::Mock that responds to #code and gives back 401. Unfortunately not. We get a ‘stack level too deep’ error thrown inside a recursive method_missing call. Huh?
This really should work, so let’s take a look inside the gems. It turns out that RestClient mixes something special into the response object (in this case the mock) for compatibility. The gem used to return Net::HTTPResponse objects, not RestClient::Response objects, so in order to retain compatibility with old code, it mixes the behaviour of Net::HTTPResponse into responses contained inside exceptions. This is taken from restclient/exceptions.rb:
# Compatibility : make the Response act like a
# Net::HTTPResponse when needed
module ResponseForException
  def method_missing symbol, *args
    if net_http_res.respond_to?(symbol)
      warn "[warning] ..."
      net_http_res.send symbol, *args
    else
      super
    end
  end
end

class Exception < RuntimeError
  attr_accessor :message, :response

  def initialize response = nil, initial_response_code = nil
    @response = response
    @initial_response_code = initial_response_code

    # compatibility: this make the exception behave like
    # a Net::HTTPResponse
    response.extend ResponseForException if response
  end
end
As you can see, it does this by adding its own method_missing, which accesses the net_http_res method of the response. In our mock, that doesn’t exist, so of course method_missing is called ad infinitum. So, let’s add it:
stub(:code => 401, :body => nil, :net_http_res => nil)
Better. Does it work now? Hm. No. It’s still digging a method_missing hole to the Earth’s core.
The real problem is that the method_missing that is mixed in by RestClient::Exception is masking the one in Mocha::Mock, and that one is what does the actual handling of the stub code that we pass in. So, even though we added net_http_res up there, the bit of code that handles it is never called, so we’re still stuffed.
Now, there’s not a lot we can do about that without rewriting the gems. If I were going to do that, I’d try to make sure that RestClient::Exception doesn’t mask existing method_missing implementations, but instead aliases them and makes sure they are still called. I’m not sure how easy that would be (I’ve not tried it), but it would be worth a shot.
I would try to make some general rule here about making sure your code plays nice with other code, but to be honest, I doubt you can consider every case where someone might be subverting your code by giving it things like mock objects (which could implement the mocking in any number of ways) instead of real ones. I guess there will always be a gap somewhere.
The solution I hit upon in the end was to change the stub. Instead of using Mocha, which relies on method_missing to implement its stubbing, I changed to OpenStruct, which doesn’t, and is sufficient in this case.
OpenStruct.new(:code => 401, :body => nil, :net_http_res => nil)
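Why that works can be shown without RestClient at all: OpenStruct defines real accessor methods for each attribute you pass in, so dispatch never falls through to the mixed-in method_missing (ResponseForException is again paraphrased from the gem source quoted above):

```ruby
require 'ostruct'

# Paraphrased shim from restclient/exceptions.rb.
module ResponseForException
  def method_missing(symbol, *args)
    if net_http_res.respond_to?(symbol)
      net_http_res.send(symbol, *args)
    else
      super
    end
  end
end

response = OpenStruct.new(:code => 401, :body => nil, :net_http_res => nil)
response.extend ResponseForException

# #code is a real singleton method defined by OpenStruct at
# construction time, so the mixin's method_missing is never consulted.
puts response.code  # => 401
```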
Yay, by using a stub that doesn’t require method_missing to operate, it all works nicely now! My tests are saved and I can get on with some real work… about time.
Last year, I went to the Rewired State Carbon & Energy hack day (#rscarbon), which was a great event. I went along with an idea, and even better actually got to build it! We got a great team together, and despite initial technical teething problems, had a demo running in time for the show and tell at the end (by the skin of our teeth). I’ve been meaning to write it up for ages, but instead, I decided to make a video.
In 2006, I wrote to my MP about the Legislative and Regulator Reform Bill (now Act). The bill contained extremely dangerous powers for Ministers to change anything they liked, giving them the ability to get away with pretty much anything without a proper vote. I campaigned against it, with many others, through Save Parliament.
Anyway, my MP happens to be Francis Maude, who is now Minister for the Cabinet Office. It turns out that his government is now advancing a bill with similar powers, the Public Bodies Bill. Now, I admit I’ve not had the chance to follow this in detail, so I don’t know if it’s as bad, or if it’s more limited. However, others who know more might be interested to know what Francis Maude had to say about those sorts of powers four years ago.
“I am particularly concerned at the potential for Parliament to be bypassed by the order-making powers contained in Part 1 of the Bill. These powers are extremely constitutionally significant.”
“It is imperative that the circumstances in which these powers can be used are limited and clearly set out in the Bill.”
The full scans are available on my Flickr stream, for anyone who is interested.
Maybe the PBB is consistent with what he said 4 years ago, but maybe not. I’d be interested to know.