    What is a Mixture of Experts model?


    Mixture of Experts (MoE) is an AI architecture that seeks to reduce cost and improve performance by dividing a model's internal processing workload across a number of smaller sub-models. A lightweight routing (or "gating") network decides which sub-models handle each input, so only a fraction of the model's total parameters are active for any given request.

    The concept first appeared in a 1991 paper co-authored by Geoffrey Hinton of the University of Toronto, one of the pioneers of AI. Strictly speaking, these smaller MoE models are not experts; they are simply discrete neural networks that are each given sub-tasks to perform in order to complete the main task.
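    To make the routing idea concrete, below is a minimal sketch of an MoE layer in plain Python with NumPy. The dimensions, the number of experts, the top-k value, and the tanh expert networks are all illustrative assumptions, not the design of any particular model; the point is only to show how a gating network selects a few experts and mixes their outputs.

```python
import numpy as np

rng = np.random.default_rng(0)

D_MODEL = 8      # hypothetical hidden size (illustrative only)
N_EXPERTS = 4    # number of expert sub-networks
TOP_K = 2        # experts consulted per token

# Each "expert" is a small feed-forward network with its own weights.
expert_weights = [rng.standard_normal((D_MODEL, D_MODEL)) * 0.1
                  for _ in range(N_EXPERTS)]

# The gating (router) network scores every expert for a given input.
gate_weights = rng.standard_normal((D_MODEL, N_EXPERTS)) * 0.1

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def moe_layer(x):
    """Route a single token vector x through its top-k experts."""
    scores = softmax(x @ gate_weights)            # router probability per expert
    top = np.argsort(scores)[-TOP_K:]             # indices of the best-scoring experts
    top_scores = scores[top] / scores[top].sum()  # renormalise over the chosen experts

    # Only the selected experts run, which is where the compute saving comes from.
    out = np.zeros(D_MODEL)
    for weight, idx in zip(top_scores, top):
        out += weight * np.tanh(x @ expert_weights[idx])
    return out

token = rng.standard_normal(D_MODEL)
print(moe_layer(token))
```

    In this sketch only 2 of the 4 experts do any work for a given token, even though all 4 sets of weights exist in memory, which mirrors how MoE trades a larger total parameter count for a smaller per-input compute cost.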
