
Continual pre-training of LLMs for financial domains | Amazon Web Services


Large language models (LLMs) are generally trained on large, publicly available datasets that are domain agnostic. For example, Meta's Llama models are trained on datasets such as CommonCrawl, C4, Wikipedia, and ArXiv. These datasets encompass a broad range of topics and domains. Although the resulting models yield amazingly good results for general tasks, such as text generation and entity recognition, there is evidence that models trained with domain-specific datasets can further improve LLM performance. For example, the training data used for BloombergGPT is 51% domain-specific documents, including financial news, filings, and other financial materials. The resulting LLM outperforms LLMs trained on non-domain-specific datasets when tested on finance-specific tasks. The authors of BloombergGPT concluded that their model outperforms all other models tested on four of the five financial tasks. The model provided even better performance when tested on Bloomberg's internal financial tasks, by a wide margin: as much as 60 points better (out of 100). Although you can learn more about the comprehensive evaluation results in the paper, the following sample captured from the BloombergGPT paper can give you a glimpse of the benefit of training LLMs on financial domain-specific data. As shown in the example, the BloombergGPT model provided correct answers while the other, non-domain-specific models struggled:

Input: CEO of Assicurazioni Generali?
BloombergGPT: Philippe Donnet
GPT-NeoX: Antonio De Lorenzo, Simone Gambarini, Enrico Zanetti
FLAN-T5-XXL: John M Forsyth, Christopher K Peters, {empty string}

Input: CEO of Silicon Valley Bank?
BloombergGPT: Greg Becker
GPT-NeoX: John Hodor, Mike Fridgen, Peter R Briger
FLAN-T5-XXL: Christian Klein, John M Forsyth, John M Forsyth

Input: CEO of IHS Markit Ltd?
BloombergGPT: Lance Uggla
GPT-NeoX: John F Lechleiter, David Steward, Niall O Broin
FLAN-T5-XXL: William M Hawkins, John M Forsyth, John M Forsyth

Input: CEO of Walgreens Boots Alliance?
BloombergGPT: Rosalind Brewer
GPT-NeoX: Mark V Dei, Stefano Pessina, Stefano Pessina
FLAN-T5-XXL: Christian Klein, John M Forsyth, John M Forsyth

Input: CEO of Citigroup Inc?
BloombergGPT: Jane Fraser
GPT-NeoX: Michael L Corbat, Michael L Corbat, Michael L Corbat
FLAN-T5-XXL: Christian Sewing, John M Forsyth, John M Forsyth

This post provides guidance on training LLMs specifically for the finance domain. We cover the following key areas:

  • Data collection and preparation – Guidance on sourcing and curating relevant financial data for effective model training
  • Continual pre-training vs. fine-tuning – When to use each technique to optimize your LLM's performance
  • Efficient continual pre-training – Strategies to streamline the continual pre-training process, saving time and resources

This post brings together the expertise of the applied science research team within Amazon Finance Technology and the AWS Worldwide Specialist team for the Global Financial Industry. Some of the content is based on the paper Efficient Continual Pre-training for Building Domain Specific Large Language Models.

Collecting and preparing financial data

Domain continual pre-training requires a large-scale, high-quality, domain-specific dataset. The following are the main steps of domain dataset curation:

  • Identify data sources – Potential data sources for a domain corpus include the open web, Wikipedia, books, social media, and internal documents.
  • Apply domain data filters – Because the ultimate goal is to curate a domain corpus, you might need to apply additional steps to filter out samples that are irrelevant to the target domain. This reduces useless corpus for continual pre-training and reduces training cost.
  • Preprocess – You might consider a series of preprocessing steps to improve data quality and training efficiency. For example, certain data sources can contain a fair number of noisy tokens; deduplication is considered a useful step to improve data quality and reduce training cost.
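As an illustration of the deduplication step, the following sketch removes exact duplicates by content hashing. This is a simplified stand-in, not the paper's pipeline; catching near-duplicates (for example with locality-sensitive hashing) requires more machinery:

```python
import hashlib

def deduplicate(docs):
    """Drop exact duplicates from a corpus by hashing normalized content.
    A cheap first pass; it does not catch near-duplicates."""
    seen, unique = set(), []
    for doc in docs:
        digest = hashlib.sha256(doc.strip().lower().encode()).hexdigest()
        if digest not in seen:
            seen.add(digest)
            unique.append(doc)
    return unique
```

For example, `deduplicate(["A b", "a B", "c"])` keeps only the first casing variant of the duplicated text.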

To develop financial LLMs, you can use two important data sources: News CommonCrawl and SEC filings. An SEC filing is a financial statement or other formal document submitted to the US Securities and Exchange Commission (SEC). Publicly listed companies are required to file various documents regularly, which creates a large number of documents over the years. News CommonCrawl is a dataset released by CommonCrawl in 2016. It contains news articles from news sites all over the world.

News CommonCrawl is available on Amazon Simple Storage Service (Amazon S3) in the commoncrawl bucket at crawl-data/CC-NEWS/. You can get the lists of files using the AWS Command Line Interface (AWS CLI) and the following command:

aws s3 ls --recursive s3://commoncrawl/crawl-data/CC-NEWS/

In Efficient Continual Pre-training for Building Domain Specific Large Language Models, the authors use a URL and keyword-based approach to filter financial news articles from generic news. Specifically, the authors maintain a list of important financial news outlets and a set of keywords related to financial news. An article is identified as financial news if either it comes from a financial news outlet or any of the keywords show up in its URL. This simple yet effective approach enables you to identify financial news not only from dedicated financial news outlets but also from the finance sections of generic news outlets.
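A minimal sketch of such a URL and keyword filter follows. The outlet domains and keywords below are hypothetical placeholders for illustration, not the lists curated by the authors:

```python
from urllib.parse import urlparse

# Placeholder lists for illustration only; the authors' actual outlet
# list and keyword set are not reproduced here.
FINANCIAL_OUTLETS = {"bloomberg.com", "ft.com", "reuters.com"}
FINANCIAL_KEYWORDS = {"finance", "market", "stock", "economy", "earnings"}

def is_financial_news(url: str) -> bool:
    """Flag an article as financial news if it comes from a known financial
    outlet or if any finance keyword appears in its URL."""
    parsed = urlparse(url)
    domain = parsed.netloc.lower().removeprefix("www.")
    if domain in FINANCIAL_OUTLETS:
        return True
    path = (parsed.netloc + parsed.path).lower()
    return any(keyword in path for keyword in FINANCIAL_KEYWORDS)
```

The keyword branch is what lets the filter catch finance sections of general outlets, for example a `/markets/` path on a general news site.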

SEC filings are available online through SEC's EDGAR (Electronic Data Gathering, Analysis, and Retrieval) database, which provides open data access. You can scrape the filings from EDGAR directly, or use APIs in Amazon SageMaker with a few lines of code, for any period of time and for a large number of tickers (that is, the SEC-assigned identifier). To learn more, refer to SEC Filing Retrieval.

The following table summarizes the key details of both data sources.

.        | News CommonCrawl   | SEC Filing
Coverage | 2016–2022          | 1993–2022
Size     | 25.8 billion words | 5.1 billion words

The authors go through a few extra preprocessing steps before the data is fed into a training algorithm. First, because SEC filings contain noisy text due to the removal of tables and figures, the authors remove short sentences that are deemed to be table or figure labels. Second, a locality-sensitive hashing algorithm is applied to deduplicate the news articles and filings. For SEC filings, deduplication is done at the section level instead of the document level. Lastly, the documents are concatenated into one long string, tokenized, and the tokenization is chunked into pieces of the maximum input length supported by the model to be trained. This improves the throughput of continual pre-training and reduces the training cost.
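The final concatenate-tokenize-chunk step can be sketched as follows. The `tokenize` callable and the separator token are stand-ins for a real tokenizer, introduced here only for illustration:

```python
def chunk_corpus(documents, tokenize, max_len, sep_token=0):
    """Concatenate tokenized documents into one long token stream, then
    split the stream into fixed-size chunks of the model's maximum input
    length. A trailing remainder shorter than max_len is dropped."""
    stream = []
    for doc in documents:
        stream.extend(tokenize(doc))
        stream.append(sep_token)  # mark the document boundary
    return [stream[i:i + max_len]
            for i in range(0, len(stream) - max_len + 1, max_len)]
```

For instance, two already-tokenized documents `[1, 2, 3]` and `[4, 5]` with `max_len=3` yield the chunks `[1, 2, 3]` and `[0, 4, 5]`; packing every chunk to full length is what improves training throughput.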

Continual pre-training vs. fine-tuning

Most available LLMs are general purpose and lack domain-specific abilities. Domain LLMs have shown considerable performance in medical, finance, and science domains. For an LLM to acquire domain-specific knowledge, there are four methods: training from scratch, continual pre-training, instruction fine-tuning on domain tasks, and Retrieval Augmented Generation (RAG).

With traditional models, fine-tuning is usually used to create task-specific models for a domain. This means maintaining multiple models for multiple tasks such as entity extraction, intent classification, sentiment analysis, or question answering. With the advent of LLMs, the need to maintain separate models has become obsolete thanks to techniques like in-context learning or prompting. This saves the effort required to maintain a stack of models for related but distinct tasks.

Intuitively, you can train LLMs from scratch with domain-specific data. Although most of the work to create domain LLMs has focused on training from scratch, it is prohibitively expensive. For example, the GPT-4 model cost over $100 million to train. These models are trained on a mix of open domain data and domain data. Continual pre-training can help models acquire domain-specific knowledge without incurring the cost of pre-training from scratch, because you pre-train an existing open domain LLM on only the domain data.

With instruction fine-tuning on a task, you can't make the model acquire domain knowledge, because the LLM only acquires the domain information contained in the instruction fine-tuning dataset. Unless a very large instruction fine-tuning dataset is used, it is not enough to acquire domain knowledge. Sourcing high-quality instruction datasets is usually challenging, which is the reason to use LLMs in the first place. Also, instruction fine-tuning on one task can affect performance on other tasks (as seen in this paper). However, instruction fine-tuning is more cost-effective than either of the pre-training alternatives.

The following figure compares traditional task-specific fine-tuning vs. the in-context learning paradigm with LLMs.

RAG is the most effective way of guiding an LLM to generate responses grounded in a domain. Although it can guide a model to generate responses by providing facts from the domain as auxiliary information, it doesn't make the model acquire the domain-specific language, because the LLM still relies on a non-domain language style to generate the responses.

Continual pre-training is a middle ground between pre-training from scratch and instruction fine-tuning in terms of cost, while being a strong alternative for gaining domain-specific knowledge and style. It can provide a general model over which further instruction fine-tuning on limited instruction data can be performed. Continual pre-training can be a cost-effective strategy for specialized domains where the set of downstream tasks is large or unknown and labeled instruction tuning data is limited. In other scenarios, instruction fine-tuning or RAG might be more suitable.

To learn more about fine-tuning, RAG, and model training, refer to Fine-tune a foundation model, Retrieval Augmented Generation (RAG), and Train a Model with Amazon SageMaker, respectively. In this post, we focus on efficient continual pre-training.

Efficient continual pre-training methodology

Continual pre-training consists of the following methodology:

  • Domain-Adaptive Continual Pre-training (DACP) – In the paper Efficient Continual Pre-training for Building Domain Specific Large Language Models, the authors continually pre-train the Pythia language models on a financial corpus to adapt them to the finance domain. The objective is to create financial LLMs by feeding data from the whole finance domain into an open-sourced model. Because the training corpus contains all the curated datasets in the domain, the resulting model should acquire finance-specific knowledge, thereby becoming a versatile model for various financial tasks. This results in the FinPythia models.
  • Task-Adaptive Continual Pre-training (TACP) – The authors continually pre-train the models on labeled and unlabeled task data to tailor them for specific tasks. In certain circumstances, developers may prefer models that deliver better performance on a group of in-domain tasks over a domain-generic model. TACP is designed as continual pre-training aiming to enhance performance on targeted tasks, without requirements for labeled data. Specifically, the authors continually pre-train the open-sourced models on the task tokens (without labels). The main limitation of TACP lies in constructing task-specific LLMs rather than foundation LLMs, owing to the sole use of unlabeled task data for training. Although DACP uses a much larger corpus, it is prohibitively expensive. To balance these limitations, the authors propose two approaches that aim to build domain-specific foundation LLMs while preserving superior performance on target tasks:
  • Efficient Task-Similar DACP (ETS-DACP) – The authors propose selecting a subset of the financial corpus that is highly similar to the task data using embedding similarity. This subset is used for continual pre-training to make it more efficient. Specifically, the authors continually pre-train the open-sourced LLM on a small corpus extracted from the financial corpus that is close to the target tasks in distribution. This can help improve task performance because the model is adapted to the distribution of the task tokens even though labeled data is not required.
  • Efficient Task-Agnostic DACP (ETA-DACP) – The authors propose using metrics like perplexity and token type entropy that don't require task data to select samples from the financial corpus for efficient continual pre-training. This approach is designed to deal with scenarios where task data is unavailable or more versatile domain models for the broader domain are preferred. The authors adopt two dimensions to select data samples that are important for obtaining domain information from a subset of the pre-training domain data: novelty and diversity. Novelty, measured by the perplexity recorded by the target model, refers to information that was unseen by the LLM before. Data with high novelty indicates novel knowledge for the LLM, and such data is viewed as more difficult to learn. This updates generic LLMs with intensive domain knowledge during continual pre-training. Diversity, on the other hand, captures the diversity of the distributions of token types in the domain corpus, which has been documented as a useful feature in research on curriculum learning for language modeling.
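The two selection dimensions can be illustrated with toy metrics. In this sketch a unigram model stands in for the target LLM when scoring novelty (perplexity), and Shannon entropy over token types serves as the diversity proxy; both are simplifications of the metrics used in the paper:

```python
import math
from collections import Counter

def unigram_perplexity(tokens, unigram_probs, eps=1e-9):
    """Novelty proxy: perplexity of a sample under a (toy) unigram model
    standing in for the target LLM. Unseen tokens get a floor probability,
    so unfamiliar text scores higher, that is, as 'more novel'."""
    nll = -sum(math.log(unigram_probs.get(t, eps)) for t in tokens) / len(tokens)
    return math.exp(nll)

def token_entropy(tokens):
    """Diversity proxy: Shannon entropy of the token-type distribution.
    A sample with a richer mix of token types scores higher."""
    counts = Counter(tokens)
    total = len(tokens)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())
```

ETA-DACP-ppl would rank corpus samples by the first score and ETA-DACP-ent by the second, then keep the highest-scoring fraction within the training budget.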

The following figure compares an example of ETS-DACP (left) vs. ETA-DACP (right).

Two sampling schemes are adopted to actively select data points from the curated financial corpus: hard sampling and soft sampling. The former is done by first ranking the financial corpus by the corresponding metrics and then selecting the top-k samples, where k is predetermined according to the training budget. For the latter, the authors assign sampling weights to each data point according to the metric values, and then randomly sample k data points to meet the training budget.
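A minimal sketch of the two sampling schemes, assuming each data point already has a metric score attached:

```python
import random

def hard_sample(scores, k):
    """Hard sampling: rank data points by metric score, keep the top-k."""
    ranked = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)
    return ranked[:k]

def soft_sample(scores, k, rng=None):
    """Soft sampling: draw k distinct points with probability proportional
    to their metric scores (sampling without replacement)."""
    rng = rng or random.Random()
    pool = list(range(len(scores)))
    weights = [scores[i] for i in pool]
    chosen = []
    for _ in range(min(k, len(pool))):
        r = rng.uniform(0, sum(weights))
        acc = 0.0
        for j, w in enumerate(weights):
            acc += w
            if r <= acc:
                break
        chosen.append(pool.pop(j))
        weights.pop(j)
    return chosen
```

Hard sampling is deterministic given the scores; soft sampling trades some of the score signal for broader coverage of the corpus.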

Results and analysis

The authors evaluate the resulting financial LLMs on an array of financial tasks to investigate the efficacy of continual pre-training:

  • Financial Phrase Bank – A sentiment classification task on financial news.
  • FiQA SA – A sentiment classification task based on financial news and headlines.
  • Headline – A binary classification task on whether a headline about a financial entity contains certain information.
  • NER – A financial named entity extraction task based on the credit risk assessment sections of SEC reports. Words in this task are annotated with PER, LOC, ORG, and MISC.

Because the financial LLMs are not instruction fine-tuned, the authors evaluate the models in a 5-shot setting for each task for the sake of robustness. On average, FinPythia 6.9B outperforms Pythia 6.9B by 10% across the four tasks, which demonstrates the efficacy of domain-specific continual pre-training. For the 1B model, the improvement is less profound, but performance still improves by 2% on average.
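For illustration, a 5-shot prompt for one of the sentiment tasks might be assembled as below. The prompt format is an assumption for this sketch, not the paper's exact template:

```python
def build_few_shot_prompt(examples, query, k=5):
    """Build a k-shot prompt: k labeled demonstrations followed by the
    unlabeled query. `examples` is a list of (sentence, label) pairs."""
    demos = "\n\n".join(f"Sentence: {sent}\nSentiment: {label}"
                        for sent, label in examples[:k])
    return f"{demos}\n\nSentence: {query}\nSentiment:"
```

The model's continuation after the final "Sentiment:" is then compared against the gold label.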

The following figure illustrates the performance difference before and after DACP on both models.

The following figure showcases two qualitative examples generated by Pythia 6.9B and FinPythia 6.9B. For two finance-related questions regarding an investment manager and a financial term, Pythia 6.9B doesn't understand the term or recognize the name, whereas FinPythia 6.9B generates detailed answers correctly. The qualitative examples demonstrate that continual pre-training enables the LLMs to acquire domain knowledge during the process.

The following table compares various efficient continual pre-training approaches. ETA-DACP-ppl is ETA-DACP based on perplexity (novelty), and ETA-DACP-ent is based on entropy (diversity). ETS-DACP-com is similar to DACP, with data selection done by averaging all three metrics. The following are a few takeaways from the results:

  • Data selection methods are efficient – They surpass standard continual pre-training with just 10% of the training data. Efficient continual pre-training, including Task-Similar DACP (ETS-DACP), Task-Agnostic DACP based on entropy (ETA-DACP-ent), and Task-Similar DACP based on all three metrics (ETS-DACP-com), outperforms standard DACP on average despite being trained on only 10% of the financial corpus.
  • Task-aware data selection works best, in line with research on small language models – ETS-DACP records the best average performance among all the methods and, based on all three metrics, records the second-best task performance. This suggests that using unlabeled task data is still an effective approach to boost task performance in the case of LLMs.
  • Task-agnostic data selection is a close second – ETA-DACP-ent follows the performance of the task-aware data selection approach, implying that we can still boost task performance by actively selecting high-quality samples not tied to specific tasks. This paves the way to build financial LLMs for the whole domain while achieving superior task performance.

One critical question regarding continual pre-training is whether it negatively affects performance on non-domain tasks. The authors also evaluate the continually pre-trained model on four widely used generic tasks: ARC, MMLU, TruthQA, and HellaSwag, which measure the abilities of question answering, reasoning, and completion. The authors find that continual pre-training does not adversely affect non-domain performance. For more details, refer to Efficient Continual Pre-training for Building Domain Specific Large Language Models.

Conclusion

This post offered insights into data collection and continual pre-training strategies for training LLMs for the financial domain. You can start training your own LLMs for financial tasks using Amazon SageMaker Training or Amazon Bedrock today.


About the Authors

Yong Xie is an applied scientist in Amazon FinTech. He focuses on developing large language models and Generative AI applications for finance.

Karan Aggarwal is a Senior Applied Scientist with Amazon FinTech focusing on Generative AI for finance use cases. Karan has extensive experience in time-series analysis and NLP, with particular interest in learning from limited labeled data.

Aitzaz Ahmad is an Applied Science Manager at Amazon where he leads a team of scientists building various applications of Machine Learning and Generative AI in finance. His research interests are in NLP, Generative AI, and LLM Agents. He received his PhD in Electrical Engineering from Texas A&M University.

Qingwei Li is a Machine Learning Specialist at Amazon Web Services. He received his Ph.D. in Operations Research after he broke his advisor's research grant account and failed to deliver the Nobel Prize he promised. Currently he helps customers in the financial services industry build machine learning solutions on AWS.

Raghvender Arni leads the Customer Acceleration Team (CAT) within AWS Industries. The CAT is a global cross-functional team of customer-facing cloud architects, software engineers, data scientists, and AI/ML experts and designers that drives innovation via advanced prototyping, and drives cloud operational excellence via specialized technical expertise.
