
Core Team
Atoof Shakir
AI Startups and Collaborations
Research Assistant
About
Atoof Shakir works at the intersection of multimodal AI, representation learning, and large-scale human insight extraction. At ETH Zurich and the Agentic Systems Lab, he is developing systems that can interpret vast streams of real-world digital behavior, from short-form video to social media and news, and turn them into structured, actionable understanding. His work combines strong technical depth with an unusually applied orientation: rather than building models in isolation, he is interested in how multimodal agents can operate in noisy, dynamic environments where language, visuals, and human behavior constantly interact.

Before joining ASL, Atoof contributed to state-of-the-art embedding models with broad real-world adoption, including models that have reached millions of downloads on Hugging Face. He has also worked on open-source projects spanning multimodal agentic systems, diffusion models, and real-time music generation. This background gives him a distinctive profile: someone equally comfortable with core model development and with designing systems that can produce tangible value at scale.
Project
Sentiments: Agentic Market Intelligence from Multimodal Media
Sentiments is a multimodal agentic system designed to extract high-value human insight from large-scale digital content. The project analyzes millions of short-form videos, news articles, and social media posts using vision-language models, embedding systems, and vector search to identify patterns in opinion, behavior, and emerging trends. Rather than stopping at passive analysis, the system is built to reason over signals, synthesize findings into concrete recommendations, and support end-to-end decision-making. In practice, this means connecting raw online discourse to strategic actions such as product improvements, messaging changes, audience targeting, or partnership opportunities.

From a scientific perspective, the project explores a fundamental research question: how reliably can multimodal AI systems measure human attitudes and social dynamics at a scale far beyond traditional studies? Sentiments creates an opportunity to test whether large-scale observational data from digital platforms can complement or challenge established methods in social and behavioral science. This includes questions of representation, robustness across populations, and the validity of model-derived insights when compared with survey-based or human-coded approaches. The work is especially interesting because it treats multimodal online content not simply as media, but as a rich behavioral substrate for scientific measurement.

Commercially, Sentiments points toward a new class of intelligence infrastructure for organizations that need to understand markets in real time. Brands, consumer platforms, policy organizations, and research teams increasingly operate in environments where conventional reporting is too slow and too narrow. A system that continuously detects sentiment shifts, identifies emerging opportunities, and translates weak signals into practical recommendations could become deeply valuable across product, marketing, and strategy functions.
Over time, the same architecture could support fine-grained monitoring across regions, demographics, and categories, making it highly relevant wherever fast-moving human preference data drives decision quality.
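The embedding-and-retrieval step at the core of such a pipeline can be sketched in a few lines. This is a purely illustrative toy: the bag-of-words `embed` function, the fixed vocabulary, and the in-memory list index are stand-ins for the learned vision-language embeddings and approximate nearest-neighbor index (e.g. a dedicated vector database) a production system would use.

```python
# Toy sketch of embedding-based retrieval over social posts:
# embed each post as a vector, then rank posts by cosine similarity to a query.
from collections import Counter
import math

# Hypothetical fixed vocabulary; a real system uses a learned embedding model.
VOCAB = ["battery", "camera", "price", "love", "hate", "slow", "fast"]

def embed(text: str) -> list[float]:
    """Toy embedding: L2-normalized bag-of-words counts over VOCAB."""
    counts = Counter(text.lower().split())
    vec = [float(counts[w]) for w in VOCAB]
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]

def cosine(a: list[float], b: list[float]) -> float:
    # Vectors are already normalized, so the dot product is cosine similarity.
    return sum(x * y for x, y in zip(a, b))

posts = [
    "love the camera but the battery is slow",
    "battery life is great and charging is fast",
    "hate the price hike",
]
index = [(p, embed(p)) for p in posts]  # stands in for a vector index

query = embed("battery complaints")
ranked = sorted(index, key=lambda pe: cosine(query, pe[1]), reverse=True)
print(ranked[0][0])  # the post most similar to the query
```

In practice the same pattern scales by swapping the toy pieces for a multimodal encoder and an approximate nearest-neighbor index, which is what makes continuous, large-scale trend monitoring tractable.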
Interested in collaborating?
We are always looking for talented students, researchers, and industry partners.