@Teapode
Mistakes in the video<br><a href="https://www.youtube.com/watch?v=3__HO-akNC8&t=260">4:20</a> CPUs use around 300 W, and GPUs also use around 300 W. There is no difference. <br><a href="https://www.youtube.com/watch?v=3__HO-akNC8&t=290">4:50</a> You can run a large language model on your laptop, locally. You don't need a datacenter for that. <br><a href="https://www.youtube.com/watch?v=3__HO-akNC8&t=330">5:30</a> That training is for investors only. Models don't get much better with 1000x more data in them. And now that 95% of content is generated by AI, AI is getting trained on AI, and it doesn't work anymore. So older, pre-AI models, trained on people and not in an AI echo chamber, could be better. <br><a href="https://www.youtube.com/watch?v=3__HO-akNC8&t=600">10:00</a> and the rest of the video: the US has a shitty grid; datacenters have nothing to do with it :)<br><a href="https://www.youtube.com/watch?v=3__HO-akNC8&t=1140">19:00</a> the grid is so shitty that companies are building their own power plants