I have worked with over a dozen companies in the last 9 months to implement AI x GTM systems (signal-based selling, micro-campaigns, AI workflows, modern tooling, etc.).

The #1 lesson I've learned: AI/agentic workflows (and automation) only work if the underlying (foundational) data is SOLID. It doesn't have to be perfect, but it has to be solid.

Evals I use today:
1. COVERAGE across data points
2. ACCURACY of data points
3. COST of the data

Today, you can do almost anything you want with AI and data extraction, then use those data points in an enrichment + tiering model ("Company Fit Score"). Things like:
- Number of engineers
- Number of SDRs hired in the last year
- Date the company's privacy page was last updated
- Are they using any of <list of tools>?
- Are they a high-growth company? (raised >$50M and <50 EEs)
- Industry pulled from LinkedIn
- B2B vs B2C vs both
- Digital native or not
- Large catalog of products on their website or not (eComm)
- Selling directly to consumers or not (eComm)
- *Their* market is ENT or SMB
- Etc. (your imagination is the only limitation here; the more nuanced, the better, and that's the point)

Take these bespoke datapoints and layer them on top of the standard datapoints you're already getting from your B2B data provider (I recommend waterfalling several): things like revenue, location, employee count, funding, industry, website traffic, etc.

The output is magic (to a GTM nerd like me): ~perfectly tiered accounts in your CRM (see image below) that update automatically when new accounts are found, and refresh on a regular cadence.

This helps with everything downstream in GTM, e.g. your ICP mapping, ABM, cutting territories, outbound/signal-based play focus and routing, inbound routing, etc.

AI/agentic workflows (and automation) only work if the underlying (foundational) data is SOLID. And this is a great use case for AI.
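A minimal sketch of what such an enrichment + tiering model could look like in practice. Every field name, weight, and tier cutoff below is an illustrative assumption, not the author's actual scoring model; the point is just the shape: bespoke AI-extracted datapoints layered on top of standard provider datapoints, rolled up into a score and a tier.

```python
# Illustrative "Company Fit Score" sketch.
# All field names, weights, and tier cutoffs are hypothetical.

def company_fit_score(account: dict) -> int:
    score = 0
    # Bespoke, AI-extracted datapoints
    if account.get("num_engineers", 0) >= 50:
        score += 20
    if account.get("sdrs_hired_last_year", 0) >= 5:
        score += 15
    if account.get("is_high_growth"):  # e.g. raised >$50M and <50 employees
        score += 25
    # Standard datapoints from a (waterfalled) B2B data provider
    if account.get("employee_count", 0) >= 100:
        score += 10
    if account.get("funding_usd", 0) >= 10_000_000:
        score += 10
    return score

def tier(score: int) -> str:
    # Cutoffs are arbitrary; tune against your own account base.
    if score >= 60:
        return "Tier 1"
    if score >= 30:
        return "Tier 2"
    return "Tier 3"

acct = {"num_engineers": 80, "sdrs_hired_last_year": 6,
        "is_high_growth": True, "employee_count": 45,
        "funding_usd": 60_000_000}
print(tier(company_fit_score(acct)))  # prints "Tier 1"
```

In a real setup this function would run on a schedule against your CRM so tiers refresh automatically as new accounts land and datapoints change.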
I'm loving this analytical rigor
agreed. foundational data is emerging as the next big battleground. not the workflows layer.
Consistent with every team/project I work with as well. Your data organization and accuracy are directly related to how sophisticated and reliable your agentic workflows can be.
keeping data in our CRM up to date is where we struggle. are you seeing the best teams use scheduled runs with Clay?
Great post Brendan! It's all about good data, AI, and execution. Bloated stacks are even less relevant now in the age of micro-campaigns and real-time data. I came up with the FIR model for account scoring:
- Fit (are they an ideal customer?)
- Intent (are they showing signs of interest?)
- Relationship (is an intro available?)

If an account meets 2 or 3 of the criteria, they're a target account.
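The FIR rule described in this comment (target = at least 2 of the 3 criteria) reduces to a one-liner. A minimal sketch, assuming each criterion has already been evaluated to a boolean upstream:

```python
# Hypothetical sketch of the FIR (Fit / Intent / Relationship) rule:
# an account is a target if it meets at least 2 of the 3 criteria.

def is_target_account(fit: bool, intent: bool, relationship: bool) -> bool:
    return sum([fit, intent, relationship]) >= 2

print(is_target_account(fit=True, intent=True, relationship=False))   # True
print(is_target_account(fit=True, intent=False, relationship=False))  # False
```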
I built a very similar workflow earlier. What worked for me was combining these data points into a single report and then matching them against a detailed ICP document, while also using the same research for personalization.
Date the company's privacy page was last updated? damn you are getting better every day. 100% with you here man.
unobvious data is the alpha today. and the data is only getting more unstructured and multi-modal. we put these in a warehouse to compute trends and send them to a Pinecone node with embeddings. the word "magic" doesn't cut it.
I think we are working on very similar topics. Have you experienced issues when deploying tools to new teams?
Brendan Short (The Signal) We do this natively on top of Snowflake or Google BigQuery as a unified data foundation of your SAM, including bespoke ICP modelling, with selected signals from our infinite signals catalog and weights you can choose, to rank and maintain a live ranking of your SAM/audience. These ongoing signals, via MCP, act as a context layer for agents to observe, reason, and act on inbound and marketing plays with guardrails. Would love to have you see it in action and hear what you think!