
    AI GPUs will soon need more power than a small country, as HBM memory growth spirals out of control




    • Future AI memory chips could demand more power than entire industrial zones combined
    • 6TB of memory in one GPU sounds amazing until you see the power draw
    • HBM8 stacks are impressive in theory, but terrifying in practice for any energy-conscious enterprise

    The relentless drive to expand AI processing power is ushering in a new era for memory technology, but it comes at a cost that raises practical and environmental concerns, experts have warned.

    Research by the Korea Advanced Institute of Science & Technology (KAIST) and the Terabyte Interconnection and Package Laboratory (TERA) suggests that by 2035, AI GPU accelerators equipped with 6TB of HBM could become a reality.
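    To see why 6TB of HBM per GPU alarms energy planners, a rough back-of-envelope sketch helps. Every figure below (per-stack capacity, per-stack power, cluster size) is an illustrative assumption, not a number from the KAIST/TERA research:

```python
# Illustrative estimate of HBM power draw at GPU and cluster scale.
# All per-stack figures are hypothetical assumptions for illustration,
# not values from the KAIST/TERA roadmap.

STACK_CAPACITY_GB = 384      # assumed capacity of one future HBM stack
STACK_POWER_W = 180          # assumed power draw of one stack
GPU_MEMORY_TB = 6            # 6TB per GPU, as projected for 2035

stacks_per_gpu = (GPU_MEMORY_TB * 1024) // STACK_CAPACITY_GB
memory_power_per_gpu_kw = stacks_per_gpu * STACK_POWER_W / 1000

gpus = 100_000               # assumed size of a large AI training cluster
cluster_memory_mw = gpus * memory_power_per_gpu_kw / 1000

print(f"Stacks per GPU: {stacks_per_gpu}")
print(f"Memory power per GPU: {memory_power_per_gpu_kw:.2f} kW")
print(f"Memory-only draw for {gpus:,} GPUs: {cluster_memory_mw:.0f} MW")
```

    Under these assumed figures, memory alone draws nearly 3kW per GPU, and a single large cluster's memory approaches 300MW — before counting the compute dies, which is the scale at which comparisons to a small country's grid demand start to apply.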



