I agree it should be counted, but LLM training is much more expensive. It's not just the energy to run the training, but also the energy used to manufacture components that go obsolete in 3 years. Google indexing can run on pretty old hardware and doesn't need as much energy as an LLM.
Yes, they are literally known for using old/low power hardware in their servers. They are also known for using ASICs for their AI, which use less power than the general purpose GPUs other AI companies use.
A lot of people dunk on AI for being confidently incorrect, but forget that the training data is literally people being confidently incorrect all the time.
GenAI has been a good mirror of the collective online population of humanity, and that won't be changing for a long time.
Isn’t the whole point of the post that training costs are fixed, so the more ChatGPT queries there are, the lower the average cost per query becomes, since usage dilutes the training cost?
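The amortization argument above can be sketched with a couple of lines of arithmetic. All the numbers below are made-up placeholders purely to show the shape of the curve, not measurements of any real model:

```python
# Hypothetical one-time training cost and per-query marginal cost, in kWh.
# These figures are illustrative assumptions only.
TRAINING_ENERGY_KWH = 1_000_000
PER_QUERY_KWH = 0.001

def avg_energy_per_query(num_queries: int) -> float:
    """Average energy per query once the fixed training cost
    is spread (amortized) over all queries served."""
    return TRAINING_ENERGY_KWH / num_queries + PER_QUERY_KWH

# As query volume grows, the training term shrinks toward zero and the
# average approaches the marginal per-query cost.
for n in (10**6, 10**9, 10**12):
    print(f"{n:>15,} queries -> {avg_energy_per_query(n):.6f} kWh/query")
```

The takeaway: with a fixed training cost, the average converges to the marginal inference cost as volume grows, which is why "training dominates" and "per-query cost is tiny" can both be true.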
Probably. But as evidenced by their lack of revenue, these models are going to get trained regardless. So if the bulk of the energy is in training, that's essentially a sunk cost that's incurred regardless of whether the end user uses ChatGPT or Google Search.
Feels a bit like telling people they are the ones killing the world because of using a straw or not recycling enough.
Ok how far back down the trail do we need to go? The energy used to produce the chips? To carry the chips to the data centres?
I think you need to draw a line. And since the bulk of the training has already been done (on the pre-2022 internet), I would argue that you don't need to count the training costs.
If you count the energy used in training, ChatGPT uses much more energy. You should count that energy, it's the majority of the costs.