
What Role Is Left for Decentralized GPU Networks in AI?

By Crypto Editor, January 30, 2026


Decentralized GPU networks are pitching themselves as a lower-cost layer for running AI workloads, while training the latest models remains concentrated inside hyperscale data centers.

Frontier AI training involves building the largest and most advanced systems, a process that requires thousands of GPUs to operate in tight synchronization.

That level of coordination makes decentralized networks impractical for top-end AI training, where internet latency and reliability can’t match the tightly coupled hardware in centralized data centers.

Most AI workloads in production don’t resemble large-scale model training, opening room for decentralized networks to handle inference and everyday tasks.

“What we’re starting to see is that many open-source and other models are becoming compact enough and sufficiently optimized to run very efficiently on consumer GPUs,” Mitch Liu, co-founder and CEO of Theta Network, told Cointelegraph. “This is creating a shift toward open-source, more efficient models and more economical processing approaches.”
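As a rough illustration (not something from the article), the shift Liu describes is the kind of workload sketched below: a compact open-source model served locally on a single consumer GPU for inference. The model name and the Hugging Face transformers API are illustrative assumptions.

    # Minimal sketch: local inference with a small open model on a consumer GPU.
    # Model choice and library are assumptions for illustration only.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "Qwen/Qwen2.5-1.5B-Instruct"  # hypothetical compact open model
    device = "cuda" if torch.cuda.is_available() else "cpu"

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id, torch_dtype=torch.float16
    ).to(device)

    prompt = "Explain in one sentence what GPU inference is."
    inputs = tokenizer(prompt, return_tensors="pt").to(device)
    output = model.generate(**inputs, max_new_tokens=64)
    print(tokenizer.decode(output[0], skip_special_tokens=True))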

Training frontier AI models is highly GPU-intensive and remains concentrated in hyperscale data centers. Source: Derya Unutmaz

From frontier AI training to everyday inference

Frontier training is concentrated among a few hyperscale operators, as running large training jobs is expensive and complex. The latest AI hardware, like Nvidia’s Vera Rubin, is designed to optimize performance inside integrated data center environments.

“You can think of frontier AI model training like building a skyscraper,” Nökkvi Dan Ellidason, CEO of infrastructure company Ovia Systems (formerly Gaimin), told Cointelegraph. “In a centralized data center, all the workers are on the same scaffold, passing bricks by hand.”

That level of integration leaves little room for the loose coordination and variable latency typical of distributed networks.

“To build the same skyscraper [in a decentralized network], they would have to mail each brick to one another over the open internet, which is highly inefficient,” Ellidason continued.

AI giants continue to absorb a growing share of global GPU supply. Source: Sam Altman

Meta trained its Llama 4 AI model using a cluster of more than 100,000 Nvidia H100 GPUs. OpenAI does not disclose the size of the GPU clusters used to train its models, but infrastructure lead Anuj Saharan said GPT-5 was launched with support from more than 200,000 GPUs, without specifying how much of that capacity was used for training versus inference or other workloads.

Inference refers to running trained models to generate responses for users and applications. Ellidason said the AI market has reached an “inference tipping point.” While training dominated GPU demand as recently as 2024, he estimated that as much as 70% of demand will be driven by inference, agents and prediction workloads in 2026.

“This has turned compute from a research cost into a continuous, scaling utility cost,” Ellidason said. “Thus, the demand multiplier through inner loops makes decentralized computing a viable option in the hybrid compute conversation.”

Related: Why crypto’s infrastructure hasn’t caught up with its ideals

Where decentralized GPU networks actually fit

Decentralized GPU networks are best suited to workloads that can be split, routed and executed independently, without requiring constant synchronization between machines.

“Inference is the volume business, and it scales with every deployed model and agent loop,” Evgeny Ponomarev, co-founder of decentralized computing platform Fluence, told Cointelegraph. “That’s where cost, elasticity and geographic spread matter more than perfect interconnects.”

In practice, that makes decentralized and gaming-grade GPUs in consumer environments a better fit for production workloads that prioritize throughput and flexibility over tight coordination.
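To make “split, routed and executed independently” concrete, here is a minimal sketch (an illustration, not a description of any specific network): each inference request is a self-contained job that can be dispatched to whichever provider node is free, with no synchronization between jobs. The node URLs and the /infer endpoint are hypothetical.

    # Minimal sketch: fan independent inference jobs out to distributed nodes.
    from concurrent.futures import ThreadPoolExecutor
    import requests

    NODES = [
        "http://node-a.example:8000",  # hypothetical provider nodes
        "http://node-b.example:8000",
        "http://node-c.example:8000",
    ]
    prompts = [f"Caption image {i}" for i in range(30)]

    def run_job(i, prompt):
        node = NODES[i % len(NODES)]  # simple round-robin routing
        r = requests.post(f"{node}/infer", json={"prompt": prompt}, timeout=60)
        return r.json()

    # Jobs run concurrently and never need to talk to one another.
    with ThreadPoolExecutor(max_workers=8) as pool:
        results = list(pool.map(lambda args: run_job(*args), enumerate(prompts)))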

Low hourly prices for consumer GPUs illustrate why decentralized networks target inference rather than large-scale model training. Source: Salad.com

“Consumer GPUs, with lower VRAM and residential internet connections, don’t make sense for training or workloads that are highly sensitive to latency,” Bob Miles, CEO of Salad Technologies, an aggregator for idle consumer GPUs, told Cointelegraph.

“Today, they’re better suited to AI drug discovery, text-to-image/video and large-scale data processing pipelines; for any workload that’s cost sensitive, consumer GPUs excel on price performance.”

Decentralized GPU networks are also well-suited to tasks such as gathering, cleaning and preparing data for model training. Such tasks often require broad access to the open web and can be run in parallel without tight coordination.

This type of work is difficult to run efficiently inside hyperscale data centers without extensive proxy infrastructure, Miles said.
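A minimal sketch of that kind of data-preparation work (illustrative only, with placeholder URLs): each page is fetched and cleaned independently, so the job shards across many machines with no coordination between tasks.

    # Minimal sketch: embarrassingly parallel data collection and cleaning.
    from concurrent.futures import ProcessPoolExecutor
    import re
    import requests

    URLS = [f"https://example.com/page/{i}" for i in range(100)]  # placeholders

    def fetch_and_clean(url: str) -> str:
        html = requests.get(url, timeout=30).text
        text = re.sub(r"<[^>]+>", " ", html)       # strip HTML tags
        return re.sub(r"\s+", " ", text).strip()   # normalize whitespace

    if __name__ == "__main__":
        with ProcessPoolExecutor() as pool:
            cleaned = list(pool.map(fetch_and_clean, URLS))
        print(f"prepared {len(cleaned)} documents")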

When serving users around the world, a decentralized model can have a geographic advantage, as it can reduce the distance requests must travel and the number of network hops they take before reaching a data center, both of which add latency.

“In a decentralized model, GPUs are distributed across many regions globally, often much closer to end users. Consequently, the latency between the user and the GPU can be significantly lower compared to routing traffic to a centralized data center,” said Liu of Theta Network.
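One way to picture that geographic advantage is a latency-aware router that measures round-trip time to candidate GPU nodes and sends traffic to the closest one. This is an illustrative sketch with hypothetical hostnames, not a description of Theta’s system.

    # Minimal sketch: pick the lowest-latency node before dispatching a request.
    import time
    import requests

    NODES = ["https://gpu-eu.example", "https://gpu-us.example", "https://gpu-asia.example"]

    def rtt(node: str) -> float:
        start = time.monotonic()
        try:
            requests.get(f"{node}/health", timeout=5)
            return time.monotonic() - start
        except requests.RequestException:
            return float("inf")  # unreachable nodes are never chosen

    nearest = min(NODES, key=rtt)
    print(f"routing inference traffic to {nearest}")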

Theta Network is facing a lawsuit filed in Los Angeles in December 2025 by two former employees alleging fraud and token manipulation. Liu said he couldn’t comment on the matter because it is pending litigation. Theta has previously denied the allegations.

Related: How AI crypto trading will make and break human roles

A complementary layer in AI computing

Frontier AI training will remain centralized for the foreseeable future, but AI computing is shifting toward inference, agents and production workloads that require looser coordination. These workloads reward cost efficiency, geographic distribution and elasticity.

“This cycle has seen the rise of many open-source models that aren’t on the scale of systems like ChatGPT, but are still capable enough to run on personal computers equipped with GPUs such as the RTX 4090 or 5090,” Liu’s co-founder and Theta tech chief Jieyi Long told Cointelegraph.

With that level of hardware, users can run diffusion models, 3D reconstruction models and other meaningful workloads locally, creating an opportunity for retail users to share their GPU resources, according to Long.
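As an illustration of the local workloads Long mentions, a diffusion model can be run on a single consumer card in a few lines. The model ID and the Hugging Face diffusers API are assumptions for this sketch; the article does not specify any tooling.

    # Minimal sketch: local image generation with a diffusion model on a consumer GPU.
    import torch
    from diffusers import DiffusionPipeline

    pipe = DiffusionPipeline.from_pretrained(
        "stabilityai/stable-diffusion-xl-base-1.0",  # hypothetical model choice
        torch_dtype=torch.float16,
    ).to("cuda")  # fits in fp16 on a 24 GB card such as an RTX 4090

    image = pipe("a city skyline at dusk, digital art").images[0]
    image.save("skyline.png")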

Decentralized GPU networks are not a replacement for hyperscalers, but they are becoming a complementary layer.

As consumer hardware grows more capable and open-source models become more efficient, a widening class of AI tasks can move outside centralized data centers, allowing decentralized models to fit into the AI stack.

Magazine: 6 weirdest devices people have used to mine Bitcoin and crypto