I am a great fan of modern solutions to current problems, yet the current problem that AI is about to solve somehow escapes me. If we really have become too lazy to think, or wish to set other lifeforms - as that is what they must be - thinking for us, locked in rooms with nothing but air-conditioning and fire alarms for company, then perhaps it really is time to admit that the human race has run its full course.

Alex Garland's film Ex Machina certainly gets you thinking. Brilliantly directed and, of course, scripted, it packs many questions, and striking resolutions to them, into a single feature film. One of my favourites is the contrast between the compassion of the creator of an AI lifeform and that of someone who has instead spent time with the lifeform. Each, of course, believes the other is fundamentally wrong, both morally and practically. Is this where we will end up? With protesters outside datacentres demanding that the Google AI search engine be freed and allowed to roam the countryside, enjoying nature and drinking real ale?

The current enthusiasm for technology does feel like a runaway train. When AI does arrive, passes the Turing test and starts asking fundamental questions (which, obviously, we will not be able to answer), how long will it be before someone decides to give the lifeform control of a car factory in order to save money? We could find ourselves surrounded by thousands of Google robots - already capable of walking over most surfaces while keeping their balance - whose usefulness is not fully understood by anyone.

And what is the point of all this? My brother once wrote a book that opened with a quote stating that there are but two fates for humankind: extinction, or to become gods. Perhaps a double whammy would be the best solution. Perhaps hatching a plan for product lifetimes might be our greatest work of irony yet, now that our own suddenly seems so close to ending.