Overall, when tested on 40 prompts, DeepSeek was found to have similar energy efficiency to the Meta model, but it tended to generate much longer responses and therefore used 87% more energy.
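In other words, per-response energy tracks response length when per-token cost is about the same. A quick back-of-the-envelope sketch in Python (the per-token energy and token counts below are made-up illustrative values, not figures from the test):

```python
# Illustrative only: per-token energy and response lengths are made-up,
# chosen so the ratio matches the reported ~87% gap.
joules_per_token = 2.0            # assume both models cost about the same per token

llama_tokens_per_reply = 400      # assumed shorter replies
deepseek_tokens_per_reply = 748   # assumed ~1.87x longer replies

llama_energy = llama_tokens_per_reply * joules_per_token
deepseek_energy = deepseek_tokens_per_reply * joules_per_token

print(f"DeepSeek uses {deepseek_energy / llama_energy - 1:.0%} more energy per reply")
# -> DeepSeek uses 87% more energy per reply
```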

      • peanuts4life@lemmy.blahaj.zone · 2 days ago

        Yes, sorry, where I live it’s pretty normal for cars to be diesel powered. What I meant by my comparison was that a train, when measured uncritically, uses more energy to run than a car because of its size and behavior, but that when compared fairly, the train has obvious gains and tradeoffs.

        DeepSeek, as a ~600b model, is more efficient than the 400b Llama model (a fairer size comparison) because it’s a mixture-of-experts model with fewer active parameters, and even when run in the R1 reasoning configuration it is probably still more efficient than a dense model of comparable intelligence.
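        To put the active-parameter point in numbers, here’s a rough Python sketch comparing forward-pass compute per token, using the common ~2 FLOPs per active parameter rule of thumb. The ~37b active / ~670b total split is DeepSeek-V3/R1’s published configuration; the per-answer token counts are purely hypothetical, just to show how reasoning-length can eat into the per-token advantage:

        ```python
        # Rough compute-per-token comparison: dense model vs. mixture-of-experts.
        # Uses the common estimate of ~2 FLOPs per active parameter per generated token.

        def flops_per_token(active_params: float) -> float:
            return 2 * active_params

        llama_dense = flops_per_token(405e9)   # dense 405b: every parameter is active
        deepseek_moe = flops_per_token(37e9)   # MoE: ~670b total, ~37b active per token

        print(f"dense 405b : {llama_dense:.1e} FLOPs/token")
        print(f"MoE  ~670b : {deepseek_moe:.1e} FLOPs/token")
        print(f"per-token advantage for the MoE: ~{llama_dense / deepseek_moe:.0f}x")

        # The catch: R1's reasoning traces emit many more tokens per answer, which
        # eats into the per-token advantage. Hypothetical lengths, for illustration:
        llama_tokens, r1_tokens = 400, 4000
        dense_per_answer = llama_dense * llama_tokens
        moe_per_answer = deepseek_moe * r1_tokens
        print(f"per-answer: dense/MoE ~ {dense_per_answer / moe_per_answer:.1f}x")
        ```

        On these assumed numbers the MoE is roughly an order of magnitude cheaper per token, and still comes out slightly ahead per answer even when it writes ten times as many tokens.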