• vivendi@programming.dev

    According to https://arxiv.org/abs/2405.21015

    The absolute most monstrous, energy-guzzling model tested needed 10 MW of power to train.

    Most models need less than that, and non-frontier models can even be trained on gaming hardware with comparatively little energy consumption.

    That paper, by the way, reports a 2.4x year-over-year increase in model training compute, BUT it doesn’t cover DeepSeek, which rocked the western AI world with comparatively little training cost (~2.7M GPU hours in total)
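    To put the GPU-hours figure in perspective, here’s a rough back-of-envelope conversion to energy. The per-GPU draw (~0.35 kW for an H800-class card) and the datacenter overhead factor (PUE ~1.3) are my assumptions, not numbers from the paper:

    ```python
    # Back-of-envelope energy estimate for a DeepSeek-scale training run.
    # Assumed inputs (not from the cited paper):
    gpu_hours = 2.7e6     # total GPU hours, per the comment above
    kw_per_gpu = 0.35     # assumed average draw per H800-class GPU
    pue = 1.3             # assumed datacenter overhead (power usage effectiveness)

    energy_kwh = gpu_hours * kw_per_gpu * pue
    print(f"~{energy_kwh / 1e6:.1f} GWh")  # roughly 1.2 GWh under these assumptions
    ```

    Under those assumptions the whole run lands around a gigawatt-hour, which is why the one-time training cost is smaller than people intuit compared to ongoing inference at scale.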

    Some companies offset their model training’s environmental damage with renewables and whatever bullshit, so the ongoing daily usage cost matters more than the huge one-time cost at the start (drop by drop an ocean is formed - Persian proverb)