The world of Artificial Intelligence (AI) is buzzing with new possibilities, especially with the rise of generative AI, the technology behind tools like ChatGPT. While we often see exciting new applications emerge, the journey of taking these powerful tools and making them work reliably for billions of people is far from simple. LinkedIn recently shared a fascinating look into how they tackled this challenge with their new AI-powered people search, and it offers a powerful lesson for businesses everywhere: building AI at scale is a marathon, not a sprint, and it requires a carefully crafted "cookbook" of best practices.
Imagine searching for someone on LinkedIn. Traditionally, you'd type in keywords like "doctor" or "marketing expert." But what if you needed someone knowledgeable about curing cancer? The old system would likely only find profiles mentioning "cancer," missing out on brilliant oncologists or researchers who might use different terms. LinkedIn's new AI-powered search is a game-changer. You can now ask complex questions in plain English, like "Who is knowledgeable about curing cancer?" The AI, powered by large language models (LLMs), understands the meaning behind your words. It knows that "cancer" is related to "oncology" and even "genomics research," so it can find relevant people even if their profiles don't use your exact keywords. More importantly, it doesn't just find the world's top experts; it balances that with who is actually useful to you – perhaps a first-degree connection who can introduce you to a leading researcher.
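To make that concrete, here is a deliberately tiny sketch of the two ideas above: semantic matching (scoring by meaning rather than exact keywords) and blending relevance with network proximity. Everything here is hypothetical – the hand-made 3-number "embeddings," the 0.7/0.3 weighting, and the scoring function are illustrative stand-ins, not LinkedIn's actual system.

```python
import math

# Toy embeddings: in a real system these vectors come from a language
# model; here they are hand-picked so related topics point the same way.
EMBEDDINGS = {
    "curing cancer":     [0.90, 0.80, 0.10],
    "oncology":          [0.85, 0.75, 0.15],
    "genomics research": [0.70, 0.90, 0.20],
    "marketing":         [0.10, 0.05, 0.95],
}

def cosine(a, b):
    """Similarity of direction between two vectors, in [-1, 1]."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm

def score(query, profile_topic, connection_degree):
    """Blend semantic relevance with how reachable the person is."""
    relevance = cosine(EMBEDDINGS[query], EMBEDDINGS[profile_topic])
    proximity = 1.0 / connection_degree  # 1st-degree connections rank higher
    return 0.7 * relevance + 0.3 * proximity  # illustrative weights

# An oncologist whose profile never says "cancer" still outranks a
# marketer, and a 1st-degree oncologist outranks a 3rd-degree one.
assert (score("curing cancer", "oncology", 1)
        > score("curing cancer", "oncology", 3)
        > score("curing cancer", "marketing", 1))
```

The key design point is the last line of `score`: relevance alone would surface the world's top experts, while the proximity term is what pulls "actually useful to you" people toward the top.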
This might sound like a natural evolution for AI, but LinkedIn's journey took three years, even after tools like ChatGPT became widely known. This extended timeline highlights a crucial reality for businesses: implementing generative AI in real-world, large-scale settings is incredibly challenging. It's a slow, painstaking process of refining and optimizing. As Wenjing Zhang, LinkedIn's VP of Engineering, puts it, "Don't try to do too much all at once." Instead of building one giant AI system for all of LinkedIn, they focused on mastering one area first: AI-powered job search. This earlier success, which helped job seekers without a four-year degree get hired 10% more often, provided the blueprint – their "cookbook" – for tackling the even bigger challenge of people search for their 1.3 billion users.
What exactly is this "cookbook"? It's not just about the AI models themselves, but a whole, repeatable process for building, training, and refining them for massive user bases. LinkedIn's approach involved several key stages: proving the recipe on a narrower product (AI-powered job search) first, running the new semantic search alongside the trusted keyword system rather than replacing it overnight, and then relentlessly optimizing the models for the cost and speed demands of a platform with over a billion users.
In the current AI landscape, there's a lot of excitement about "AI agents" – systems that can perform complex tasks autonomously. However, LinkedIn's leaders are emphasizing a more grounded approach. Erran Berger, VP of Product Engineering, argues that "the real value for enterprises today lies in perfecting recommender systems, not in chasing 'agentic hype'."
Think of it this way: an AI agent is only as good as the tools it uses. You can have the smartest reasoning AI in the world, but if the underlying search engine it relies on is slow or inaccurate, the agent's performance will suffer. LinkedIn's focus on refining its AI-powered search is about building the best possible "tool" first. Their new system even includes an "intelligent query routing layer" that uses AI to decide whether a complex question should go to the new semantic search or the old, reliable keyword search. This pragmatism ensures that users get the best possible experience right now, while laying the foundation for future agentic capabilities.
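The article describes the routing layer only at a high level, so the sketch below uses a simple hand-written heuristic where LinkedIn presumably uses a learned classifier. The cue words and length threshold are invented for illustration; the point is the shape of the decision, not the rule itself.

```python
# Hypothetical stand-in for an "intelligent query routing layer":
# decide whether a query goes to the new semantic search or the old,
# reliable keyword search.
def route_query(query: str) -> str:
    """Return the backend that should serve this query."""
    words = [w.strip("?,.").lower() for w in query.split()]
    question_cues = {"who", "what", "which", "how", "knowledgeable"}
    # Long, natural-language questions benefit from semantic understanding;
    # short keyword strings are served well (and cheaply) by lexical search.
    if len(words) >= 4 and any(w in question_cues for w in words):
        return "semantic"
    return "keyword"

assert route_query("Who is knowledgeable about curing cancer?") == "semantic"
assert route_query("marketing expert") == "keyword"
```

A router like this is pragmatic in exactly the sense the article means: the expensive new system handles only the queries where it adds value, while everything else stays on the fast, proven path.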
This philosophy means that for many businesses, the immediate focus should be on improving existing AI-powered systems – like recommendation engines, search functionalities, and personalization tools – rather than solely pursuing the latest agent concepts. Durable, strategic advantage comes from mastering the underlying AI pipeline.
LinkedIn's experience provides a clear roadmap for how AI will mature within the enterprise and impact our daily digital lives:
We're moving beyond the novelty of AI. Companies like LinkedIn are proving that the real power lies in taking AI and making it work seamlessly for millions, even billions, of users. This means future AI applications will be deeply integrated into the tools we use every day, often working behind the scenes to make them smarter and more helpful. Expect search engines to become more conversational and understanding, recommendation systems to become uncannily accurate, and productivity tools to offer more intelligent assistance.
The "cookbook" approach – codifying processes, focusing on specific areas first, and relentlessly optimizing – will become standard practice. Businesses won't just hire AI experts; they'll need to build structured methodologies for developing, deploying, and maintaining AI. This includes investing in robust data pipelines, efficient model training techniques like distillation, and scalable infrastructure. The success of AI will depend less on groundbreaking theoretical models and more on the disciplined engineering and operationalization of AI solutions. This means more emphasis on MLOps (Machine Learning Operations) and building reliable, repeatable AI development cycles.
While AI agents capture the imagination, the practical impact of improved recommender systems will be felt much sooner and more broadly. Think about how platforms suggest content, products, or connections. As these systems become more sophisticated, they will drive greater engagement, personalization, and efficiency. They will be the invisible engines powering everything from e-commerce and streaming services to professional networking and educational platforms. These systems are the critical "tools" that future, more advanced AI agents will leverage.
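As a reminder of how unglamorous but effective these "invisible engines" can be, here is a classic people-you-may-know style recommender: rank candidates by how many connections they share with the user. The graph and the mutual-connections heuristic are illustrative, not any platform's actual algorithm.

```python
from collections import Counter

# Toy connection graph: each person maps to the set of their connections.
GRAPH = {
    "alice": {"bob", "carol", "dan"},
    "bob":   {"alice", "carol", "erin"},
    "carol": {"alice", "bob", "erin"},
    "dan":   {"alice"},
    "erin":  {"bob", "carol"},
}

def recommend(user: str, top_k: int = 2):
    """Rank non-connections by their number of mutual connections."""
    counts = Counter()
    for friend in GRAPH[user]:
        for candidate in GRAPH[friend]:
            if candidate != user and candidate not in GRAPH[user]:
                counts[candidate] += 1
    return [name for name, _ in counts.most_common(top_k)]

# Erin shares two mutual connections (bob and carol) with alice,
# so she is the natural suggestion.
assert recommend("alice") == ["erin"]
```

Systems like this are precisely the "tools" the article says future agents will lean on: an agent asked to find an introduction path is only as good as the connection recommender underneath it.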
The LinkedIn story highlights that significant performance gains often come not from the initial AI model, but from post-deployment optimization. Techniques like model distillation, pruning, and clever input summarization will be crucial for making AI affordable and fast enough to serve vast numbers of users. This focus on efficiency will drive innovation in smaller, specialized AI models and more effective hardware utilization (like GPUs). It means AI will become more accessible and less resource-intensive to deploy.
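Of the optimization techniques named above, distillation is the easiest to show in miniature: a small "student" model is trained to match the softened output distribution of a large "teacher," not just the hard labels. The sketch below computes the standard distillation objective (cross-entropy against temperature-softened teacher outputs); the logit values and temperature are made up for illustration.

```python
import math

def softmax(logits, temperature=1.0):
    """Turn raw scores into a probability distribution; higher
    temperature flattens it, exposing the teacher's 'dark knowledge'."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """Cross-entropy between softened teacher and student distributions."""
    p = softmax(teacher_logits, temperature)  # teacher's soft targets
    q = softmax(student_logits, temperature)  # student's predictions
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q))

teacher      = [4.0, 1.0, 0.5]  # hypothetical teacher scores for 3 classes
good_student = [3.8, 1.1, 0.4]  # mimics the teacher's ranking
bad_student  = [0.5, 4.0, 1.0]  # gets the ranking wrong

# Training pushes the student toward the lower-loss behavior.
assert distillation_loss(teacher, good_student) < distillation_loss(teacher, bad_student)
```

The payoff is exactly the one the article describes: once the student reproduces the teacher's behavior, the expensive teacher can be retired from serving, making the feature affordable at billion-user scale.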
For any organization looking to leverage AI effectively, LinkedIn's "cookbook" offers invaluable guidance: start small and master one domain before generalizing, codify what works into a repeatable process, perfect the foundational tools – search, recommendations, data pipelines – before chasing agentic ambitions, and treat post-deployment optimization as a first-class engineering discipline rather than an afterthought.
LinkedIn's journey with its AI-powered people search is a powerful testament to the fact that the true revolution in AI isn't just about creating smarter algorithms, but about the disciplined, pragmatic engineering required to bring those algorithms to life for a global audience. By focusing on a structured approach, relentless optimization, and building robust foundational tools, companies can navigate the complexities of enterprise AI and unlock its transformative potential.