OpenXLA, an open source project to accelerate and simplify machine learning


OpenXLA is a jointly developed, open source ML compiler ecosystem

Recently, the major companies working in the field of machine learning introduced the OpenXLA project, aimed at developing tools to compile and optimize models for machine learning systems.

The project has taken on the development of tools that unify the compilation of models prepared in the TensorFlow, PyTorch, and JAX frameworks for efficient training and execution on different GPUs and specialized accelerators. Companies such as Google, NVIDIA, AMD, Intel, Meta, Apple, Arm, Alibaba, and Amazon have joined the joint work on the project.

The OpenXLA Project provides a state-of-the-art ML compiler that can scale amid the complexity of ML infrastructure. Its fundamental pillars are performance, scalability, portability, flexibility, and extensibility for users. With OpenXLA, we aspire to unlock the real potential of AI by accelerating its development and delivery.

OpenXLA enables developers to compile and optimize models from all leading ML frameworks for efficient training and serving on a wide variety of hardware. Developers using OpenXLA will see significant improvements in training time, performance, and serving latency, and ultimately in time-to-market and compute cost.
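In practice, this compile-and-optimize flow can be tried from any of the supported frameworks. The sketch below, assuming the `jax` Python package is installed, uses `jax.jit` (whose backend is the XLA compiler) on an illustrative `predict` function:

```python
import jax
import jax.numpy as jnp

def predict(params, x):
    """A tiny linear model; XLA can fuse the multiply and add into one kernel."""
    w, b = params
    return w * x + b

# jax.jit hands the traced computation to XLA, which compiles it once
# for the current backend (CPU, GPU, or TPU) and reuses the result.
compiled_predict = jax.jit(predict)

params = (jnp.float32(2.0), jnp.float32(1.0))
print(compiled_predict(params, jnp.float32(3.0)))  # 7.0
```

The same user code runs unchanged on any backend XLA supports; only the compiled kernel differs.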

It is expected that, through the joint efforts of leading research teams and community representatives, it will be possible to stimulate the development of machine learning systems and solve the problem of infrastructure fragmentation across different frameworks and teams.

OpenXLA makes it possible to implement effective support for various hardware, regardless of the framework on which the machine learning model is built. OpenXLA is expected to reduce model training time, improve performance, reduce latency, lower computing overhead, and shorten time to market.

OpenXLA consists of three main components, whose code is distributed under the Apache 2.0 license:

  1. XLA (Accelerated Linear Algebra) is a compiler that lets you optimize machine learning models for high-performance execution on different hardware platforms, including GPUs, CPUs, and specialized accelerators from various vendors.
  2. StableHLO is a baseline specification and implementation of a set of High-Level Operations (HLOs) for use in machine learning models. It acts as a layer between machine learning frameworks and the compilers that transform models to run on specific hardware. Layers for generating models in the StableHLO format are provided for the PyTorch, TensorFlow, and JAX frameworks. The MHLO suite is used as the foundation of StableHLO, extended with support for serialization and version control.
  3. IREE (Intermediate Representation Execution Environment) is a compiler and runtime that translates machine learning models into a universal intermediate representation based on the MLIR (Multi-Level Intermediate Representation) format of the LLVM project. Its features include ahead-of-time compilation, support for flow control, the ability to use dynamic elements in models, optimization for different CPUs and GPUs, and low overhead.
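To make the StableHLO layer concrete, the following sketch (again assuming the `jax` package is installed; `square_sum` is an illustrative function) lowers a JAX computation and prints the intermediate module that a backend compiler would consume:

```python
import jax
import jax.numpy as jnp

def square_sum(x):
    return jnp.sum(x * x)

# .lower() stops after the framework-to-StableHLO step, before any
# backend-specific compilation takes place.
lowered = jax.jit(square_sum).lower(jnp.ones((4,), jnp.float32))

# Prints an MLIR module; recent JAX versions emit the stablehlo dialect.
print(lowered.as_text())
```

Because this text format is versioned and serializable, the same module can be handed to different hardware backends.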

Among the main benefits of OpenXLA, it is noted that optimal performance is achieved without having to dive into writing device-specific code, and that it provides out-of-the-box optimizations, including simplification of algebraic expressions, efficient memory allocation, and execution scheduling that takes into account the reduction of peak memory usage and overhead.

Another advantage is the simplified scaling and parallelization of computations. It is enough for a developer to add annotations to a subset of critical tensors, from which the compiler can automatically generate code for parallel computation.
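A minimal sketch of that annotation-driven parallelism, using JAX's sharding API (assumes `jax` is installed; `scaled_sum` and the `"batch"` axis name are illustrative; it runs on a single CPU and splits the array across more devices when present):

```python
import numpy as np
import jax
import jax.numpy as jnp
from jax.sharding import Mesh, NamedSharding, PartitionSpec

# One mesh axis spanning all local devices (a lone CPU still works).
mesh = Mesh(np.array(jax.devices()), axis_names=("batch",))

# The annotation: shard the tensor's leading axis over the "batch" axis.
x = jax.device_put(jnp.arange(8.0),
                   NamedSharding(mesh, PartitionSpec("batch")))

@jax.jit
def scaled_sum(v):
    # The compiler partitions this computation per device and inserts
    # the cross-device reduction automatically.
    return jnp.sum(v * 2.0)

print(scaled_sum(x))  # 56.0
```

Only the input placement was annotated; the function body itself contains no parallelism logic.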

It is also emphasized that portability is ensured through support for multiple hardware platforms, such as AMD and NVIDIA GPUs, x86 and ARM CPUs, Google TPU ML accelerators, AWS Trainium and Inferentia, Graphcore IPUs, and the Cerebras Wafer-Scale Engine.

There is also support for plugging in extensions that implement additional functionality, such as writing low-level machine learning primitives using CUDA, HIP, SYCL, Triton, and other parallel computing languages, as well as the possibility of manually tuning bottlenecks in models.

Finally, if you are interested in learning more about it, you can consult the details at the following link.

