Impressive comparison. ⚡
Sam Altman recently responded to criticism about AI energy use with an argument that made many people pause.
People often talk about how much electricity it takes to create and train an AI model.
But we rarely compare it with what it takes to "train" a human.
Think about it:
→ roughly 20 years of life
→ food, education systems, and infrastructure
→ centuries of accumulated human knowledge
His point is simple.
The fair comparison is not AI training vs a human answering one question.
It is the energy required for a trained AI model to answer a question compared with the energy used by a trained human brain.
Once the model is trained, he believes AI may already be approaching similar or even better energy efficiency per query.
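The per-query framing above can be put as a back-of-envelope calculation. The figures below (an AI query on the order of 0.3 Wh, a human brain drawing roughly 20 W, about a minute of thinking per answer) are illustrative assumptions for the sketch, not measurements:

```python
# Back-of-envelope energy comparison: one AI query vs one human-answered question.
# All numbers are illustrative assumptions, not measured values.

AI_WH_PER_QUERY = 0.3      # assumed energy per chatbot query, in watt-hours
BRAIN_POWER_W = 20.0       # rough power draw of a human brain, in watts
SECONDS_PER_ANSWER = 60.0  # assumed thinking time per question, in seconds

def human_wh_per_answer(power_w: float, seconds: float) -> float:
    """Watt-hours a human brain spends producing one answer."""
    return power_w * seconds / 3600.0

human_wh = human_wh_per_answer(BRAIN_POWER_W, SECONDS_PER_ANSWER)
print(f"AI query:     ~{AI_WH_PER_QUERY:.2f} Wh")
print(f"Human answer: ~{human_wh:.2f} Wh")
```

Under these assumptions the two land in the same ballpark (a few tenths of a watt-hour each), which is the whole point of the amortized, per-query comparison.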
Working in AI, I find this framing interesting.
Maybe the real debate is not simply how much energy AI uses.
Maybe it is how we compare intelligence and efficiency.



