Jacob Lee from LangChain
In this episode, Rod Rivera interviews Jacob Lee, a founding software engineer at LangChain, about building AI applications in the browser using JavaScript. Jacob shares his background and journey into AI development, as well as the role of LangChain in helping JavaScript developers build powerful apps with reasoning capabilities and LLMs. They discuss the advantages of using JavaScript for AI development, the components and modules needed to build a chat application with LangChain, and the importance of choosing the right database and components for specific use cases. Jacob also shares insights on Python and JavaScript parity in LangChain, serialization and cross-language support, and the future of AI applications at the edge with web LLMs and WebGPU.

Later in the conversation, Jacob discusses the use of local models in JavaScript development and the advantages of the LangChain framework. He highlights the value of running smaller models locally for tasks such as image processing and photo editing, discusses the split between front-end and Node.js development, and suggests offloading smaller tasks to the client side. He emphasizes the ease of getting started with LangChain and recent developments in the framework, including improved compatibility and smaller bundle sizes. Jacob closes with advice for JavaScript developers interested in AI: stay curious and be supportive within the community.
Takeaways
- JavaScript is a powerful language for building AI applications, especially for reaching a wide audience through web browsers and other platforms like Electron and React Native.
- LangChain provides a modular framework for building AI applications with reasoning capabilities and LLMs, allowing developers to easily swap different modules and models in and out.
- JavaScript developers can get started with building AI applications using LangChain by following interactive tutorials and courses, such as the one offered by Scrimba.
- Choosing the right components and databases for AI applications in JavaScript depends on the specific use case and the developer's familiarity with certain technologies, such as SQL or GraphQL.
- Running smaller models locally in JavaScript can improve performance and user experience.
- Offloading smaller tasks to the client side can reduce server load.
- LangChain provides a framework for working with local models in JavaScript.
- LangChain has strong TypeScript support and offers a lightweight bundle size.
- JavaScript developers can get started with AI without extensive knowledge of math and linear algebra.
- Staying up to date with the latest developments in AI and JavaScript is important for success.
- Building in the open and supporting others in the community is crucial for advancing AI in JavaScript.
Episode Transcript
Building AI Applications with JavaScript
Rod Rivera: Welcome to the AI Engineer podcast. Today, I'm thrilled to have Jacob Lee, a founding software engineer at LangChain. Jacob recently wrote an insightful post on building AI applications in the browser. Welcome, Jacob!
Jacob Lee: Thank you, Rod. It's great to be here.
Jacob's Background and Journey to LangChain
Rod Rivera: Before we dive into browser apps with JavaScript, could you tell us about your journey to where you are now?
Jacob Lee: Absolutely. My background is primarily in developer tools and frameworks, not so much in AI. Before joining LangChain, I founded a company called AutoCode, similar to Replit, where we had an online editor and software platform for deploying code serverlessly.
After leaving AutoCode in January, I did some contracting while figuring out my next steps. That's when I realized how amazing LLMs are. I felt a bit left behind by the incredible advancements happening in Python, but then I stumbled upon the original version of LangChain JS. I actually started as a contributor and user before joining full-time to maintain the product.
Rod Rivera: That's fascinating. Coming from a web-based background rather than machine learning, what new concepts did you need to learn or understand for your work with AI applications?
Jacob Lee: One of the most amazing things about this latest AI revolution is that you don't need a deep understanding of how LLMs work or the theory behind transformers. That deeper understanding can help developers write better prompts and know when to use LLMs, but the fact that it's all just a simple API call away is incredibly powerful.
As someone maintaining the project in the LLM space, I've had to pick up some of those foundational pieces. But for application engineers, you just need to get a sense of where and when to use this amazing new technology. The accessibility via API calls or frameworks like LangChain is really powerful.
Introduction to LangChain
Rod Rivera: For those unfamiliar with LangChain, could you explain how it's used?
Jacob Lee: Sure. LangChain is a framework for building powerful apps with reasoning capabilities and LLMs. We're like the plumbing you would use to build with technology like ChatGPT or OpenAI's GPT-4, GPT-3.5, and other models.
Rod Rivera: LangChain is very popular in the Python ecosystem, and you're maintaining the JavaScript/TypeScript version. What would you say to someone starting in the JavaScript world building their applications? How can LangChain help them in their journey?
Jacob Lee: One of the biggest value-adds for new developers is that we help you swap in and out different modules, models, and other pieces like vector stores and embeddings really quickly. You can experiment with OpenAI's models one moment, Anthropic's models another, or Google's in a third. This swappability and ease of trying new technology seamlessly is a big benefit.
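The swappability Jacob describes can be illustrated with a toy sketch: if every provider implements the same interface, application code never changes when you switch models. The class and method names below are hypothetical stand-ins for illustration, not LangChain's actual API:

```javascript
// Hypothetical sketch of the "swappable model" idea: each provider
// implements the same invoke() method, so the application code below
// works with any of them. Names are illustrative, not LangChain's API.
class OpenAIStyleModel {
  invoke(prompt) {
    // A real implementation would call the provider's API here.
    return `openai-style answer to: ${prompt}`;
  }
}

class AnthropicStyleModel {
  invoke(prompt) {
    return `anthropic-style answer to: ${prompt}`;
  }
}

// Application code depends only on the shared interface,
// so swapping providers is a one-line change at the call site.
function answerQuestion(model, question) {
  return model.invoke(question);
}

console.log(answerQuestion(new OpenAIStyleModel(), "What is LangChain?"));
console.log(answerQuestion(new AnthropicStyleModel(), "What is LangChain?"));
```

The design choice is the point: because `answerQuestion` never names a concrete provider, experimenting with a different model is a constructor swap rather than a rewrite.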
We also provide a conceptual and modular framework for working with these apps. There's a firehose of information out there, and LangChain helps put rails on it, guiding you towards good use cases and things you can put into production.
The Future of JavaScript in AI Development
Rod Rivera: Some JavaScript developers are concerned that JavaScript might become less relevant long-term due to the AI innovations happening primarily in Python. What would you say to those who are seeing this massive wave of AI innovation in Python and less so in JavaScript?
Jacob Lee: There's a great saying I've heard many times: "If it can be written in JavaScript, it will be." This is partially due to the power and ubiquity of the browser. Web browsers are one of the most foundational pieces of technology for end-users. If you want to reach a wide audience, you're going to have to use JavaScript at some point.
Given the power of the JavaScript ecosystem, which has spread to building desktop apps with Electron and mobile apps with React Native, that momentum is incredibly powerful. I think you're going to see a lot of this tooling and technology become available in JavaScript, Node, Deno, and the web generally. With tools like LangChain.js, it's now possible to build sophisticated apps in a short amount of time and get bootstrapped up quickly in JavaScript as well.
Getting Started with AI Development in JavaScript
Rod Rivera: If I'm a JavaScript developer who has mostly done frontend web apps, how do I get started with AI development? What would you advise as a first step?
Jacob Lee: At the risk of being a little self-serving, we have great tooling and courses from our friends at Scrimba (scrimba.com). They've created an awesome interactive guide and tutorial on getting started with building with LLMs via LangChain.js. They walk you through the basic concepts of prompts, how prompts format input to the LLM with parameters, and how to combine these pieces to build a starter project.
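The prompt-formatting idea Jacob mentions boils down to substituting named parameters into a template string before it is sent to the model. A minimal sketch of the concept (the `formatPrompt` helper is hypothetical, not code from Scrimba's course or LangChain):

```javascript
// Toy prompt template: replace {placeholders} with values before
// sending the final text to an LLM. Hypothetical helper for illustration.
function formatPrompt(template, values) {
  return template.replace(/\{(\w+)\}/g, (match, key) =>
    key in values ? values[key] : match
  );
}

const template =
  "Answer the question using only the context below.\n" +
  "Context: {context}\n" +
  "Question: {question}";

const prompt = formatPrompt(template, {
  context: "LangChain.js is a JavaScript framework for LLM apps.",
  question: "What is LangChain.js?",
});

console.log(prompt);
```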
One popular starter project is building a conversational chat with your documents or PDFs. This lets you take a large amount of unstructured data and ask meaningful questions about it. Scrimba's platform is great because it lets you edit the code inline and run it yourself, giving you a deep understanding of what the code is doing.
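The first step of such a "chat with your documents" app is splitting the raw text into chunks, usually with some overlap so context isn't lost at the boundaries. A naive fixed-size splitter might look like this; real splitters (including LangChain's) are smarter about sentence and paragraph boundaries, so this sketch is illustrative only:

```javascript
// Naive fixed-size text splitter with overlap. Each chunk shares its
// last `overlap` characters with the start of the next chunk.
function splitText(text, chunkSize = 200, overlap = 50) {
  if (overlap >= chunkSize) throw new Error("overlap must be < chunkSize");
  const chunks = [];
  for (let start = 0; start < text.length; start += chunkSize - overlap) {
    chunks.push(text.slice(start, start + chunkSize));
    if (start + chunkSize >= text.length) break; // last chunk reached
  }
  return chunks;
}

const chunks = splitText(
  "The quick brown fox jumps over the lazy dog. ".repeat(10),
  100,
  20
);
console.log(chunks.length, chunks[0]);
```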
Components of a Basic LLM Application
Rod Rivera: The "chat with your data" example seems to have become the "hello world" of LLM applications. Which modules or components does a developer need to use to start building this type of application? How does LangChain help with this?
Jacob Lee: There are a few key pieces:
- The LLM, which is the brain and central locus for understanding and generating natural language.
- Embeddings, which allow you to represent unstructured text in a format that a computer can search.
- A vector database that lets you search through different natural language chunks and find the most relevant ones.
LangChain helps with the surrounding plumbing: loading documents, splitting them into manageable chunks, and retrieving the relevant ones at query time.
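To make the embeddings-plus-vector-search idea concrete, here is a toy sketch where the "embedding" is just a bag-of-words count and relevance is cosine similarity. Real applications would call a model-based embedding API and a vector database; every function here is a hypothetical stand-in:

```javascript
// Toy "embedding": a bag-of-words count. Real embeddings come from a model.
function embed(text) {
  const counts = {};
  for (const word of text.toLowerCase().match(/\w+/g) ?? []) {
    counts[word] = (counts[word] ?? 0) + 1;
  }
  return counts;
}

// Cosine similarity between two sparse word-count vectors.
function cosineSimilarity(a, b) {
  let dot = 0, normA = 0, normB = 0;
  for (const k in a) {
    normA += a[k] * a[k];
    if (k in b) dot += a[k] * b[k];
  }
  for (const k in b) normB += b[k] * b[k];
  return dot / (Math.sqrt(normA) * Math.sqrt(normB) || 1);
}

// Rank chunks by similarity to the query and return the top matches,
// mimicking what a vector store does at query time.
function search(query, chunks, topK = 1) {
  const queryVec = embed(query);
  return chunks
    .map((chunk) => ({ chunk, score: cosineSimilarity(queryVec, embed(chunk)) }))
    .sort((x, y) => y.score - x.score)
    .slice(0, topK)
    .map((r) => r.chunk);
}

const chunks = [
  "LangChain is a framework for building LLM apps.",
  "Supabase is an open-source Firebase alternative.",
];
console.log(search("What is LangChain?", chunks)); // logs the LangChain chunk
```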
Choosing Components for Your LLM Application
Rod Rivera: LangChain has integrations with many vector databases and other components. For someone starting out, this can be overwhelming. What advice do you have on choosing which database or components to use?
Jacob Lee: Many tutorials will pick one option to focus on. For example, the Scrimba course uses Supabase, a hosted Postgres (SQL) database and open-source Firebase alternative. It has nice features that another database might not, especially if you're already familiar with SQL.
My general advice would be to start simple and use what you're already familiar with from other web dev projects. Each option has its unique strengths:
- Supabase is great if you're familiar with SQL
- MongoDB Atlas is a good choice if you like MongoDB
- Weaviate has a deep GraphQL integration
- Pinecone has been around the longest and has the most market share
LangChain's JavaScript Implementation
Rod Rivera: How close do you try to keep the JavaScript version of LangChain to the Python API? How "JavaScript-native" are you trying to make it?
Jacob Lee: We recently did a big refactor where we split out the core components and interfaces that make up LangChain. We're aiming for one-to-one parity in the long term for these core pieces. As you go up the stack, integrations are never going to be completely cross-compatible because certain things only run in the browser or rely on deep Python libraries that don't have JavaScript analogs.
We've also been working hard on serialization, which is important for our LangSmith observability and tracing platform. You can send runs to it in Python, and it'll actually deserialize into JavaScript in the playground if you want to rerun or debug something.
Recent Developments in LangChain for JavaScript
Rod Rivera: For someone who hasn't checked LangChain in a while, are there any recent developments or exciting new features you'd like to share?
Jacob Lee: We've made portability and the ability to run in web environments a key focus. We see Next.js Edge Functions and Cloudflare Workers as first-class deployment targets. We want to make sure you feel comfortable and confident deploying to these places with LangChain.
We've also done a lot with our core refactor to keep our bundle size small. Eventually, we want to allow people to choose LangChain core along with maybe one or two packages, so you can pick out different integrations one by one if you really want to keep your dependencies small.
The Intersection of JavaScript and AI
Rod Rivera: Where do you see the intersection of JavaScript and AI heading in the next year or two?
Jacob Lee: JavaScript has the most developers and is the most popular programming language in the world, so I expect to see a lot more web apps being created. We're already seeing a lot of popularity and interest in LangChain in the JavaScript ecosystem.
I think there are exciting possibilities with code generation. Vercel's v0 and fine-tuned models are interesting, and I expect to see more cool use cases as fine-tuning flows get more refined.
Open-source models are going to become increasingly important. We recently saw a demo where someone was able to run our agent loop (which originally required GPT-3) using a 7 billion parameter model that could run locally. That kind of progress in just over a year is really exciting.
Advice for Aspiring AI Developers
Rod Rivera: If you could go back in time and talk to your younger self, what advice would you give?
Jacob Lee: I'd advise keeping your head up and maintaining intellectual curiosity about the field you're in. Try to categorize and understand new developments, even if you're not directly working with them in your day job. It's easy to get too focused on what you're currently working on and miss out on amazing new technologies and paradigms.
Rod Rivera: Is there any last thing you'd like to share with our audience? Where can they find you?
Jacob Lee: You can find me on Twitter at @Hacubu. I post a lot about what's new with LangChain and what web devs are working on in the LLM space.
My final advice would be to keep building, be supportive of what others are working on, and try to build in the open. It's very early for all this tech, so building in the open and being supportive of others is really important. It'll take everyone working together to figure out the best use cases and best practices. So keep exploring, keep building, and stay curious.
Rod Rivera: I fully agree, Jacob. It's important to highlight what's being built and who's building it, especially in the smaller JavaScript AI community. Thank you so much for being here today, Jacob. To everyone listening, please check out LangChain, use it, and follow Jacob's account if you're curious about getting started with building AI apps using JavaScript and LangChain.
Jacob Lee: Thank you very much, Rod. Take care.