
In Part 1 of our MCP vs. Claude Skills series, we showed that Claude Skills deliver 58–65% token savings compared to raw Model Context Protocol (MCP) queries. That is impressive.
In Part 2, we showed how to use MCP with Connect AI to create your own Skills for consistent, cross-source querying. The limitation is that Skills only work in Claude Code.
Consider what happens once the solution needs to leave Claude Code and reach the rest of the organization:
The sales VP wants it in Tableau. Not supported.
The ML engineer needs to call it from LangChain. Not supported.
The CEO wants to use it in ChatGPT. Not supported.
IT asks to audit usage. Not supported.
Legal needs to update the business logic centrally. Not supported.
One solution suddenly becomes five and none of them scale. The root issue is that Claude Skills are client-side files tied to a single platform, with no central governance, no shared distribution, and no enterprise controls.
There is a better way. Derived Views in Connect AI give you the benefits of Skills, including token efficiency, query optimization, and cross‑system joins, while solving the portability and governance problems that Skills cannot address.
A Derived View is a governed, virtual SQL table in Connect AI that stores your business logic centrally so any tool can query it with a simple SELECT.
What Skills do well
Skills deliver token efficiency by packaging known queries as executable code, cutting tokens by 58–65% compared to MCP-only discovery. They handle complex query execution by offloading multisource joins to Connect AI, which keeps complexity out of the LLM context. They also support simple automation for repeatable scripts in developer workflows.
Where Skills fall short
Skills fall short in five ways:
Platform lock-in confines usage to Claude Code and blocks straightforward use in ChatGPT, Tableau, Excel, or across agent frameworks without extra distribution.
They create code distribution and versioning overhead because every user needs local files and updates.
They lack centralized governance, which makes it difficult to enforce access controls, audit usage, or update business logic centrally.
They contribute to credential sprawl, since per-user secrets are a security headache.
They fragment logic, as slightly different versions proliferate across tools.
That’s where the gap becomes clear: Skills are excellent developer tools, but enterprises need a governed layer that works everywhere, not just inside Claude.
Use Derived Views to put logic in the data layer
Connect AI is the first managed MCP platform. All your data connectivity is configured, managed, and governed in a single platform. You can explore your data with MCP, then promote stable logic into Derived Views that any tool or agent can query securely at scale.
A Derived View saves your complex query as a governed, virtual table in Connect AI. What was once a code-based function, executed by the LLM and querying Connect AI with complex SQL, is now saved as top_accounts_with_tickets in Connect AI. Publish the Derived View once, and every tool across the organization, from AI agents and frameworks to BI and reporting tools, can use it with a simple query like SELECT * FROM top_accounts_with_tickets.
Why does this matter? The heavy logic sits in Connect AI. Clients such as Claude, ChatGPT, Tableau, Excel, LangChain, and custom apps run a simple SELECT. No code distribution. No lock-in. One definition.
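To make that concrete, here is the only kind of query a client needs to send once the view is published. This is a minimal sketch; the column names below are assumptions for illustration, not the actual schema behind top_accounts_with_tickets.

-- The only SQL a client (Claude, ChatGPT, Tableau, Excel, LangChain) issues.
-- Connect AI resolves the cross-source join hidden behind the view name.
SELECT account_name, open_ticket_count, annual_revenue
FROM top_accounts_with_tickets
WHERE open_ticket_count > 5
ORDER BY annual_revenue DESC;

Every tool sends a variation of this same statement, so the business logic never leaves Connect AI.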
Why Derived Views win for the enterprise
Universal access without platform lock-in. Derived Views work in Claude and ChatGPT, LangChain and LlamaIndex, Tableau and Power BI, Excel, or via API using the same SQL view.
Even better token efficiency. In a typical pattern from the series, MCP discovery uses about 2,750 tokens, Skill execution about 870 tokens, and a Derived View query about 400 tokens. With the Derived View, the model issues a tiny SELECT and Connect AI executes the complex logic.
Zero distribution overhead. Derived Views are published once in Connect AI and updated once when business rules change, and everyone gets the new logic immediately in the tools they're using.
Centralized governance. Role-based access, audit trails, usage analytics, and compliance reporting replace scattered Python or TypeScript files.
One source of truth. Organizations define “at-risk customer” or any business logic once inside a Derived View. Every tool in every department queries the same definition, which eliminates disputes over metrics.
Instant, safe change management. Need to rename a ticket priority? Update the view and you are done; no redeploying, no retraining. See the short sketch after this list.
Future-proof portability. Your AI stack will evolve. Derived Views survive platform switches because they are standard SQL on a managed MCP layer.
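As a small illustration of that change-management point, imagine the view normalizes raw ticket priorities into business labels. The mapping below is hypothetical; the raw values and the Zendesk.Tickets table name are assumptions, not a real Connect AI schema.

-- Inside the Derived View definition: one central mapping of raw ticket
-- priorities to business labels. Renaming 'P1' to 'Critical' is a one-line
-- edit here, and every connected tool picks up the change immediately.
SELECT
  t.Id AS ticket_id,
  CASE t.Priority
    WHEN 'P1' THEN 'Critical'
    WHEN 'P2' THEN 'High'
    ELSE 'Standard'
  END AS priority_label
FROM Zendesk.Tickets t;

With Skills, the same rename would mean redistributing updated code to every user.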
Where Claude Skills still fit
Skills shine in developer and IT pilots and experimentation, especially when lightweight client-side processing is needed inside Claude Code. They are useful for discovering a workable query, proving value quickly, and validating assumptions with MCP‑assisted exploration. See Parts 1 and 2 for benchmarks and a step‑by‑step guide to turning MCP discoveries into Skills.
Once a query becomes important to multiple teams or needs governance, auditing, and cross-tool access, it should graduate to a Derived View on Connect AI. That is the tipping point from developer convenience to organizational reliability.
How to migrate from Skills to Derived Views
Step 1: Identify candidates. Promote Skills that are reused across teams, are queried from multiple tools, or contain business logic that changes. Keep Skills for strictly Claude-only workflows that require local processing.
Step 2: Extract the SQL. Open the Skill, copy the SQL string, test it in Connect AI, and save as a Derived View such as top_accounts_with_tickets. The view becomes part of a virtual business glossary that is instantly queryable from Claude, ChatGPT, LangChain, Tableau, and Excel.
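A minimal sketch of what the extracted SQL might look like, assuming the Skill joined account records to support tickets. The source tables and columns below (Salesforce.Account, Zendesk.Tickets, and their fields) are illustrative assumptions, not the actual schema.

-- SQL copied out of the Skill, tested in Connect AI, and saved as the
-- Derived View top_accounts_with_tickets. Table and column names are
-- hypothetical placeholders for your connected sources.
SELECT
  a.Name          AS account_name,
  a.AnnualRevenue AS annual_revenue,
  COUNT(t.Id)     AS open_ticket_count
FROM Salesforce.Account a
JOIN Zendesk.Tickets t ON t.OrganizationId = a.Id
WHERE t.Status <> 'closed'
GROUP BY a.Name, a.AnnualRevenue
ORDER BY annual_revenue DESC;

Once saved, the whole statement above disappears from client code; callers only ever reference the view name.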
Step 3: Adopt a hybrid pattern. Let Skills call the view, for example SELECT * FROM top_accounts_with_tickets LIMIT 10, so Claude users benefit too, while centralizing logic in Connect AI for everyone else. Then retire the heavier Skill implementations over time.
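In the hybrid pattern, the SQL left inside the Skill shrinks to a one-liner on top of the view. The filter column below reuses the illustrative open_ticket_count from the sketch above and is likewise an assumption.

-- All that remains client-side after migration: a thin query over the view.
-- Lightweight filters and limits can stay with the caller.
SELECT *
FROM top_accounts_with_tickets
WHERE open_ticket_count > 10
LIMIT 10;

The same thin query works unchanged from ChatGPT, Tableau, LangChain, or Excel, which is what makes retiring the heavier Skill code painless.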
The strategy that scales from MCP to Skills to Derived Views
Across this series we have shown a natural lifecycle: explore with MCP to discover data, relationships, and viable queries. Package repeatable queries as Skills to cut tokens and speed up developer workflows. Promote durable logic to Derived Views on Connect AI to deliver governed, cross-platform access for the entire organization with the best token profile.
That is why Connect AI, as the first managed MCP platform, is the right foundation when your use cases expand from a single team’s pilot to enterprise-wide data intelligence. It unifies exploration with MCP, execution with Derived Views, and optional Claude-specific automation with Skills into one governed layer that every AI and analytics tool can use.
Want to see how we got here?
Read Part 1 to see benchmarks showing when to use MCP versus Skills and why you often need both.
Read Part 2 to learn how to turn MCP discoveries into Claude Skills that Claude Code can run efficiently.
Start sharing valuable data views with Connect AI
Start with Skills to validate and iterate. Move to Derived Views on Connect AI for complex, cross-source queries that must scale across your organization with centralized governance, universal access, and the lowest token footprint.
Begin your journey towards managing AI access to data from a single platform with a free trial of Connect AI.
Explore CData Connect AI today
See how Connect AI excels at streamlining business processes for real-time insights.
Get the trial