๐—œ๐—บ๐—ฝ๐—ฟ๐—ฒ๐˜€๐˜€๐—ถ๐˜ƒ๐—ฒ ๐—ฐ๐—ผ๐—บ๐—ฝ๐—ฎ๐—ฟ๐—ถ๐˜€๐—ผ๐—ป. โšก

Sam Altman recently responded to criticism about ๐—”๐—œ ๐—ฒ๐—ป๐—ฒ๐—ฟ๐—ด๐˜† ๐˜‚๐˜€๐—ฒ with an argument that made many people pause.

People often talk about how much electricity it takes to ๐—ฐ๐—ฟ๐—ฒ๐—ฎ๐˜๐—ฒ ๐—ฎ๐—ป๐—ฑ ๐˜๐—ฟ๐—ฎ๐—ถ๐—ป ๐—ฎ๐—ป ๐—”๐—œ ๐—บ๐—ผ๐—ฑ๐—ฒ๐—น.

But we rarely compare it with what it takes to “train” a human.

Think about it:

โ†’ roughly ๐Ÿฎ๐Ÿฌ ๐˜†๐—ฒ๐—ฎ๐—ฟ๐˜€ ๐—ผ๐—ณ ๐—น๐—ถ๐—ณ๐—ฒ

โ†’ food, education systems, and infrastructure

โ†’ centuries of accumulated human knowledge

His point is simple.

The fair comparison is not ๐—”๐—œ ๐˜๐—ฟ๐—ฎ๐—ถ๐—ป๐—ถ๐—ป๐—ด vs a human answering one question.

It is the energy required for a ๐˜๐—ฟ๐—ฎ๐—ถ๐—ป๐—ฒ๐—ฑ ๐—”๐—œ ๐—บ๐—ผ๐—ฑ๐—ฒ๐—น to answer a question compared with the energy used by a ๐˜๐—ฟ๐—ฎ๐—ถ๐—ป๐—ฒ๐—ฑ ๐—ต๐˜‚๐—บ๐—ฎ๐—ป ๐—ฏ๐—ฟ๐—ฎ๐—ถ๐—ป.

Once the model is trained, he believes AI may already be approaching similar or even better ๐—ฒ๐—ป๐—ฒ๐—ฟ๐—ด๐˜† ๐—ฒ๐—ณ๐—ณ๐—ถ๐—ฐ๐—ถ๐—ฒ๐—ป๐—ฐ๐˜† ๐—ฝ๐—ฒ๐—ฟ ๐—พ๐˜‚๐—ฒ๐—ฟ๐˜†.

Working in AI, I find this framing interesting.

Maybe the real debate is not simply how much energy AI uses.

Maybe it is ๐—ต๐—ผ๐˜„ ๐˜„๐—ฒ ๐—ฐ๐—ผ๐—บ๐—ฝ๐—ฎ๐—ฟ๐—ฒ ๐—ถ๐—ป๐˜๐—ฒ๐—น๐—น๐—ถ๐—ด๐—ฒ๐—ป๐—ฐ๐—ฒ ๐—ฎ๐—ป๐—ฑ ๐—ฒ๐—ณ๐—ณ๐—ถ๐—ฐ๐—ถ๐—ฒ๐—ป๐—ฐ๐˜†.

#AI

#OpenAI
