The idea that humans can dominate and rule over nature emerged only recently in human history, yet it has swept the planet, changing and degrading its natural systems. But where does this idea come from, how has it shaped human history, and what will come after its collapse amid the climate crisis?