New updates on AI-D-Ants:
- Compatible with Databricks and Ollama (let me know if you want more AI endpoints)
- Dashboard feature to visualize your data
- All calculations are also output in table format
- The generated code is saved as well, so you can double-check it
More features are coming soon.
Repo: https://lnkd.in/egUGTAdy
Tahar B.’s Post
Instead of wiring together separate systems and ETL pipelines for data and AI, you can do it all in one Spice runtime. In this demo, Advay Patil queries and accelerates data in Spice, then calls OpenAI's Responses API endpoint from the same interface for additional insights. All it takes is a few lines of YAML! 📺 Watch the full demo: https://hubs.ly/Q03J-lmC0 📖 Docs: https://hubs.ly/Q03J-t7W0
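For a sense of what "a few lines of YAML" means here, a minimal spicepod.yaml sketch is below. The dataset source, table name, and model alias are illustrative assumptions (not taken from the demo); check the linked Spice docs for the exact fields:

```yaml
version: v1
kind: Spicepod
name: sales-insights

datasets:
  # Federated source, accelerated locally in the Spice runtime
  # (connection and table name are placeholders)
  - from: postgres:public.orders
    name: orders
    acceleration:
      enabled: true

models:
  # OpenAI model exposed through the same runtime interface
  - from: openai:gpt-4o
    name: insights
    params:
      openai_api_key: ${ secrets:OPENAI_API_KEY }
```

With a file like this in place, `spice run` starts the runtime with the accelerated dataset and the model available from the same interface.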
Did you know you can now use Snowflake Cortex AISQL in the SELECT clause for dynamic tables? This new capability brings AI-powered insights directly into your pipelines, allowing you to automatically analyze data as it updates. For example, you can use an LLM function like AI_FILTER to classify customer reviews or survey responses as they're ingested. This means you can get instant sentiment analysis on new data without any extra steps. Learn more: https://lnkd.in/gQCTt2gA
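As a rough sketch, a dynamic table using Cortex AISQL functions in its SELECT could look like the following. The table names, target lag, warehouse, and categories are placeholders, and the exact function signatures should be verified against the Snowflake docs:

```sql
-- Dynamic table that enriches raw reviews with AI functions as new rows arrive.
-- review_sentiment, raw_reviews, and analytics_wh are illustrative names.
CREATE OR REPLACE DYNAMIC TABLE review_sentiment
  TARGET_LAG = '5 minutes'
  WAREHOUSE = analytics_wh
AS
SELECT
  review_id,
  review_text,
  SNOWFLAKE.CORTEX.SENTIMENT(review_text) AS sentiment_score,
  AI_CLASSIFY(review_text, ['praise', 'complaint', 'question']) AS category
FROM raw_reviews
-- AI_FILTER keeps only rows the model judges to be genuine customer reviews
WHERE AI_FILTER(PROMPT('Is this a genuine customer review? {0}', review_text));
```

Because it is a dynamic table, Snowflake refreshes the AI-enriched results automatically as raw_reviews changes; no separate job or orchestrator is involved.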
This is wild. I'm not sure it works as a process, but off the top of my head I can't see why it shouldn't: you can run quality checks and apply modifications in later steps. This means declarative AI data transformation pipelines become as simple as writing a SELECT statement. No API integration needed. No separate scheduler needed. No orchestrator needed (dynamic tables determine the correct build order automatically). No separate lineage needed. No separate AI vendor needed. No training for developers needed. No separate monitoring needed. All you need is a CI pipeline and a staging setup. But I have a feeling Snowflake will automate that too, with the new workspace experience, in the near future. Want to hear more about how Snowflake gets data teams at any level to elite-level performance? Visit adesso SE on October 1st at our booth on the Snowflake World Tour. Register today: https://lnkd.in/e_qwGZU2
Snowflake just rolled out something game-changing: AI SQL within dynamic tables. 📊 This means you can now enrich your data pipelines with AI-powered insights as the data is ingested, without waiting for post-processing. Imagine: ❄️ Instantly classifying messy data with functions like AI_CLASSIFY ❄️ Running sentiment analysis on customer reviews or survey responses in real time ❄️ Unlocking smarter, higher-value transformations than traditional ETL Instead of bolting on AI later, you can now bake intelligence directly into your pipelines. #AI #MachineLearning #SnowflakePartner #GotCortex?
Databricks introduced the Data Science Agent, turning their Assistant into an autonomous partner for end-to-end analytics workflows. Here is how it works: toggle Agent Mode, describe your task, and watch it plan and execute multi-step data science workflows autonomously. Link to the blog post in the comments! #databricks #agents #llm --- ♻️ Repost to help your network. And follow Sudarshan Koirala for more. And if you want more material on AI Agents, Data Science, and LLMs in general: 👉 https://lnkd.in/ea2AsS-s
Your business is sitting on terabytes of data, but is that data really working for you? Imagine a supply chain executive asking, “Which vendors are most likely to miss shipments next quarter?” and getting an instant, governed answer. Join Doneyli De Jesus, Principal AI/ML Architect, at the ALL IN AI Conference as he demonstrates how to build enterprise-ready AI Data Agents in Minutes with Snowflake. In this session, you’ll see a live, end-to-end build of a generative data agent using Snowflake Cortex. We'll show you how native vector search simplifies data pipelines, how to leverage a Text-to-SQL engine in minutes, and how to address real-world, governance-sensitive queries without needing an army of engineers. You’ll leave with the tools and know-how to deploy production-ready AI sidekicks that speak your business’s language.
🚀 We’re back at Big Data LDN for another great session! 📅 25 September: Live Demo – Build a Custom Fivetran Connector in 20 Minutes See how to build a connector with our SDK + Anthropic Workbench and bring REST API data into Snowflake — plus create a Streamlit in Snowflake app powered by Cortex AI. 👉 Don’t miss the chance to learn how Fivetran streamlines data movement for modern analytics and AI: https://5tran.co/3HPfExl #BigDataLDN