A local YouTube Q&A Engine using Llama.cpp and Microsoft Phi-3-Mini
The cheapest and easiest way to do Video Question Answering
May 5, 2024 · 9 min read
In my last blog about Microsoft Phi-3-Mini, I discussed how small language models (SLMs) like Phi-3-Mini enable quick experimentation on a user’s local machine. In this blog, we’ll look at how we can prototype a VideoQA engine that runs locally…
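To make "runs locally" concrete, here is a minimal sketch of querying a local Phi-3-Mini model via the `llama-cpp-python` bindings for llama.cpp. The helper formats a question (plus optional transcript context) in Phi-3's chat template; the model path and the `build_phi3_prompt` helper name are assumptions for illustration, not part of the article.

```python
def build_phi3_prompt(question: str, context: str = "") -> str:
    """Wrap a user question (optionally preceded by transcript context)
    in Phi-3's chat template so the model sees a well-formed turn."""
    user_turn = f"{context}\n\n{question}" if context else question
    return f"<|user|>\n{user_turn}<|end|>\n<|assistant|>\n"


if __name__ == "__main__":
    # Hypothetical local model path -- download a Phi-3-Mini GGUF first.
    from llama_cpp import Llama  # pip install llama-cpp-python

    llm = Llama(model_path="phi-3-mini-4k-instruct.Q4_K_M.gguf", n_ctx=4096)
    prompt = build_phi3_prompt("What is the video about?",
                               context="<video transcript goes here>")
    out = llm(prompt, max_tokens=256, stop=["<|end|>"])
    print(out["choices"][0]["text"])
```

Everything runs on the local CPU, which is what makes this the cheap route: no API keys, no per-token billing.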