> At the hardware layer, we are seeing a transition from CPUs to GPUs to TPUs, which are more efficient for training massive (30B+ parameter) models.

Why isn’t this a main investment opportunity? I.e., Google and Amazon, who are leading the pack with TPUs.