Let’s be real: documenting stuff isn’t the most exciting part of any job. But when it comes to prompt engineering, it’s an absolute lifesaver. I know it can feel like extra work, but trust me, it’ll save you from major headaches later on.
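To make that a little more concrete, here’s a minimal sketch of what a prompt log could look like in Python. The field names (name, goal, model, temperature, and so on) are just my suggestion, not an official template, so adapt them to whatever your team actually needs to be able to revisit later.

```python
# A minimal sketch of one way to log prompt experiments (field names are my own
# suggestion, not a fixed standard). Keeping each attempt in a structured record
# makes it easy to see later which prompt, model, and settings actually worked.
import csv
from datetime import datetime, timezone

PROMPT_LOG = "prompt_log.csv"
FIELDS = ["name", "goal", "model", "temperature", "prompt", "output", "verdict", "timestamp"]

def log_prompt_attempt(name, goal, model, temperature, prompt, output, verdict):
    """Append a single prompt experiment to a CSV file for later review."""
    with open(PROMPT_LOG, "a", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if f.tell() == 0:  # empty file, so write the header row first
            writer.writeheader()
        writer.writerow({
            "name": name,
            "goal": goal,
            "model": model,
            "temperature": temperature,
            "prompt": prompt,
            "output": output,
            "verdict": verdict,  # e.g. OK / NEEDS WORK / FAIL
            "timestamp": datetime.now(timezone.utc).isoformat(),
        })

# Example usage (the model name and prompt are placeholders):
log_prompt_attempt(
    name="summarize-support-ticket-v3",
    goal="Summarize a customer ticket in two sentences",
    model="example-model",
    temperature=0.2,
    prompt="Summarize the following support ticket in two sentences: ...",
    output="(model response here)",
    verdict="OK",
)
```

A spreadsheet works just as well; the point is that every attempt gets written down with enough context to reproduce it.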
This is the fourth post in the series Prompt Engineering for Business Applications. Prompt engineering is complex and requires careful planning and refinement to achieve the desired results from AI models. As a software engineer at Google with experience in prompt engineering for major businesses, I’m sharing practical learnings in this blog series to help others unlock the power of AI beyond simple tasks.
Lee Boonstra (they/them) has been a presence in the tech world since 2007, wearing many hats along the way: software engineer, prompt engineer, web developer, technical trainer, and developer advocate.
With eight years of experience at Google under their belt, they now work as an SWE Tech Lead in the Google Cloud Office of the CTO, leading innovation projects that aim to disrupt markets and foster collaboration globally. Their expertise in conversational and voice technology, alongside (generative) AI, has led to recognition as a respected public keynote speaker and published author for O’Reilly and Apress. Lee eases tech headaches and celebrates those light bulb moments.
Lee wrote Hands-on Sencha Touch 2 for O’Reilly and, more recently, The Definitive Guide to Conversational AI with Dialogflow and Google Cloud for Apress.