Stellar inference speed via AutoNAS

Ratings: 0
Episode: 149 of 339
Duration: 42 min
Language: English
Category: Nonfiction

Yonatan Geifman of Deci makes Daniel and Chris buckle up, and takes them on a tour of the ideas behind his amazing new inference platform. It enables AI developers to build, optimize, and deploy blazing-fast deep learning models on any hardware. Don’t blink or you’ll miss it!
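The episode tours the ideas behind neural architecture search (the "NAS" in AutoNAS): automatically searching a space of candidate network shapes for one that scores well while staying under a latency or compute budget. As a rough conceptual sketch only (none of this comes from Deci's actual platform; the search space, the MAC-based latency proxy, and the accuracy proxy are all made up for illustration), a budget-constrained random search might look like:

```python
import random

# Toy latency-aware architecture search. The search space is the depth and
# per-layer width of a dense network; latency is approximated by a
# multiply-accumulate (MAC) count, and "accuracy" by a saturating stand-in.
DEPTHS = [2, 3, 4]
WIDTHS = [64, 128, 256]

def mac_count(widths, in_dim=32, out_dim=10):
    """Multiply-accumulate count of a dense network: a crude latency proxy."""
    dims = [in_dim] + list(widths) + [out_dim]
    return sum(a * b for a, b in zip(dims, dims[1:]))

def accuracy_proxy(widths):
    """Stand-in for validation accuracy: larger nets score higher, saturating."""
    capacity = sum(widths)
    return capacity / (capacity + 256)

def search(budget_macs, trials=200, seed=0):
    """Random search: sample architectures, reject over-budget ones,
    and keep the best-scoring feasible candidate."""
    rng = random.Random(seed)
    best, best_score = None, -1.0
    for _ in range(trials):
        depth = rng.choice(DEPTHS)
        widths = tuple(rng.choice(WIDTHS) for _ in range(depth))
        if mac_count(widths) > budget_macs:  # violates the latency budget
            continue
        score = accuracy_proxy(widths)
        if score > best_score:
            best, best_score = widths, score
    return best, best_score
```

Real NAS systems replace random sampling with far more sample-efficient strategies (evolutionary search, gradient-based relaxations, or learned predictors) and measure latency on the actual target hardware rather than a MAC proxy, but the core loop of "propose, check the budget, score, keep the best" is the same.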

Join the discussion

Changelog++ members save 2 minutes on this episode because they made the ads disappear. Join today!

Sponsors:

• RudderStack – The smart customer data pipeline, made for developers. Connect your whole customer data stack. Warehouse-first, open source Segment alternative.
• SignalWire – Build what’s next in communications with video, voice, and messaging APIs powered by elastic cloud infrastructure. Try it today at signalwire.com and use code SHIPIT for $25 in developer credit.
• Fastly – Our bandwidth partner. Fastly powers fast, secure, and scalable digital experiences. Move beyond your content delivery network to their powerful edge cloud platform. Learn more at fastly.com

Featuring:

• Yonatan Geifman – Website, GitHub, X
• Chris Benson – Website, GitHub, LinkedIn, X
• Daniel Whitenack – Website, GitHub, X

Show Notes:

• Deci
• An Introduction to the Inference Stack and Inference Acceleration Techniques
• Deci and Intel Collaborate to Optimize Deep Learning Inference on Intel’s CPUs
• DeciNets: A New Efficient Frontier for Computer Vision Models
• White paper

Something missing or broken? PRs welcome!
