How MediaTek is building AI from Edge to Cloud

    A ChatGPT query takes nearly 10 times more electricity to process than a Google search. This growth comes at a cost: the processes required to run generative AI applications are far more power-intensive than previous innovations. At the same time, continued development requires ever more complex training models and ever-expanding budgets. As a result, data centre power demand is predicted to grow 160 per cent by 2030 as AI goes mainstream.

    “One of the biggest challenges is energy consumption,” says Adam King, Vice President and General Manager of client computing at MediaTek. “Companies are building new generative AI models, and training and running them takes enormous amounts of energy.”

    Alexandru Cioba, Senior Research Scientist at MediaTek, says that developers are hitting a glass ceiling in terms of power and data consumption – and he foresees a potentially more serious blockage. “The larger these training models become, the more hours and manpower it will take to make them succeed,” he says. “We’re coming to a point where a team of experts will only be able to iterate on a model five to ten times during their working lives – and these intervals are going to get longer.”

    One step towards addressing these challenges could come via edge computing, a model that brings data processing and storage as close to the data source as possible, often onto the device itself. Doing so reduces reliance on connectivity and saves power. Smart cars offer an example: safety-critical computing such as seatbelt activation or autonomous emergency braking could occur onboard, smart traffic management or collision warnings might take place at the edge, and general updates would remain in the cloud.
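    The tiered placement described above can be sketched in code. This is a minimal, hypothetical illustration only: the tier names, task examples, and latency threshold are assumptions for clarity, not MediaTek specifications.

```python
from enum import Enum

class Tier(Enum):
    ONBOARD = "onboard"  # safety-critical work runs on the device itself
    EDGE = "edge"        # latency-sensitive work runs on nearby edge nodes
    CLOUD = "cloud"      # delay-tolerant work runs in the data centre

# Hypothetical placement rule: safety-critical tasks stay onboard;
# tasks needing fast round trips go to the edge; the rest go to the cloud.
# The 50 ms threshold is an illustrative assumption.
def place_task(max_latency_ms: float, safety_critical: bool) -> Tier:
    if safety_critical:
        return Tier.ONBOARD   # e.g. autonomous emergency braking
    if max_latency_ms < 50:
        return Tier.EDGE      # e.g. collision warnings
    return Tier.CLOUD         # e.g. general software updates

print(place_task(5, safety_critical=True).value)     # onboard
print(place_task(20, safety_critical=False).value)   # edge
print(place_task(5000, safety_critical=False).value) # cloud
```

    In practice, the placement decision would also weigh power budget, connectivity, and privacy, but the principle is the same: the more critical and latency-sensitive the task, the closer it runs to the device.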