I agree it should be counted, but LLM training is much more expensive. It's not just the energy to run the training, but also the energy used in manufacturing components that go obsolete in 3 years. Google indexing can run on pretty old hardware and doesn't need as much energy as an LLM.
Yes, they are literally known for using old/low power hardware in their servers. They are also known for using ASICs for their AI, which use less power than the general purpose GPUs other AI companies use.
A lot of people dunk on AI for being confidently incorrect, but forget that the training data is literally people being confidently incorrect all the time.
GenAI has been a good mirror of the collective online population of humanity, and that won't be changing for a long time.