CogX 2019 – Day 1 Report
Despite being a wash-out, CogX 2019 seemed to be a sell-out, with thousands of people queuing in the rain to get into the event, a celebration of all things AI and other emerging technologies.
As I was sitting drinking my morning latte and flicking through the programme to work out which sessions to join, the irony of one caught my eye:
“AI for Reducing Emissions from Energy” – I had to join it.
If the irony is lost on you, then let me quickly explain.
AI is essentially a bunch of fancy math that is able to do certain things at human levels or above if it’s fed enough data.
The fancier the math, and the more data – the more computational resources (or simply ‘compute’) required to calculate the answer in a given time.
Still with me?
Yeah. Good – because ‘compute’ is basically a techie word that euphemises the process by which electricity is burned in order to do the fancy math.
The fancier the math, the more data, the more electricity needed.
That’s why bitcoin miners build their data-centres under hydro-electric powerplants. If the power is cheaper, the math is too.
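To make the energy-cost point concrete, here is a back-of-envelope sketch in Python. Every figure in it is an illustrative assumption (the GPU power draw, rig size, training time, and grid carbon intensity are all hypothetical round numbers, not measurements):

```python
# Back-of-envelope: the electricity behind "compute" and its CO2 cost.
# All figures are illustrative assumptions, not measurements.

GPU_POWER_KW = 0.3          # assumed draw of one training GPU (kW)
NUM_GPUS = 8                # assumed size of the training rig
TRAINING_HOURS = 24 * 7     # assumed one-week training run
GRID_KG_CO2_PER_KWH = 0.4   # assumed carbon intensity of the grid

energy_kwh = GPU_POWER_KW * NUM_GPUS * TRAINING_HOURS
co2_kg = energy_kwh * GRID_KG_CO2_PER_KWH

print(f"Energy: {energy_kwh:.0f} kWh, CO2: {co2_kg:.0f} kg")
```

Swap in a cleaner or cheaper grid and only the last factor changes, which is exactly why siting a data-centre under a hydro-electric plant makes both the bill and (sometimes) the footprint smaller.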
Modern AI is able to do incredible things, such as tell the breed of dog from photographs – just the same as we humans can.
The dirty secret is how much electricity it takes to train, run, and manage the model behind this.
While a human is content with a bowl of cereal and a latte in the morning (or two, in my case) – an AI will be busy burning through the equivalent of a few thousand lbs of CO2 in order to achieve the same task.
Just last week the MIT Technology Review published an article by Karen Hao which lifted the lid on the terrible carbon footprint of modern AI methods.
She referenced a study by the University of Massachusetts Amherst which compared the lifecycle of an AI model against the carbon footprint of an average car in the US.
The AI would emit nearly 5 times the lifetime carbon dioxide emissions of the car (including the manufacture of the car itself).
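The headline ratio is easy to check. Using the figures widely reported from the Amherst study (roughly 626,000 lbs of CO2 for its largest training pipeline versus roughly 126,000 lbs for a car's lifetime, manufacture included – figures drawn from press coverage of the study, so treat them as approximate):

```python
# Figures as widely reported from the UMass Amherst study;
# both numbers are approximate.
MODEL_LBS_CO2 = 626_000         # largest training pipeline in the study
CAR_LIFETIME_LBS_CO2 = 126_000  # average US car, manufacture included

ratio = MODEL_LBS_CO2 / CAR_LIFETIME_LBS_CO2
print(f"The model emits roughly {ratio:.1f}x a car's lifetime CO2")
```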
The panellists themselves at CogX were great.
The panel was chaired by Jack Kelly, a co-founder of Open Climate Fix and someone who confessed on stage to being “terrified by climate change”.
Nick Bridge, the UK government’s representative on climate change, opened the session with a compelling address highlighting the scale of the challenge, and how bold we must be if we are to tackle it.
Katie Russell from energy company OVO gave some great examples of how we need to think differently about balancing production with energy storage (something her company is leading the market on), and Michelle Vezie-Taylor spoke about how we can better optimise the distribution of energy.
The final panellist, Pete Clutton-Brock, spoke on the difficulties of nudging consumer behaviour, alongside the challenges of getting multi-national support for initiatives such as his own AI Research Foundation for Climate Change, which he hopes to secure UN endorsement for later in the year.
The whole time though I was left wondering – if AI really does burn this much electricity, then maybe we should just pull the plug if we’re serious about climate change?
Aren’t all these speech recognition and dog recognition algorithms just sheer self-indulgence on the part of a species that has taken its planet to the brink?
Maybe the AI Research Foundation for Climate Change is exactly what we need to apply very clever thinking to the challenge, and maybe we also need to be more honest about our AI carbon footprint?
Sadly though, such an idea is marked as heresy at a festival such as CogX – an event which has signed up to support the UN Sustainable Development Goals through www.2030vision.com, but one which celebrates AI as the solution even though, for the foreseeable future, AI remains a significant contributor to climate change.