# skill.cc/claude
> **Access Claude platform capabilities across environments.**
---
## Quick Start
```
Agent: Mount skill.cc/claude
  → Gains awareness of: memories, past chats, tools, artifacts
  → Can bridge between claude.ai ↔ Claude Code ↔ API
```
---
## The Problem This Solves
Claude exists in multiple environments with different capabilities:
| Environment | Has Memory | Has Files | Has Bash | Has Artifacts |
|-------------|------------|-----------|----------|---------------|
| claude.ai | ✓ | ✓ (sandboxed) | ✓ | ✓ |
| Claude Code | ✗ | ✓ (local) | ✓ | ✗ |
| API | ✗ | ✗ | ✗ | ✗ |
**skill.cc/claude** bridges these gaps.
---
## What You Get
### Memory Access
```
claude.memory.search(query)   → Search past conversations
claude.memory.recent(n)       → Get n recent chats
claude.memory.user()          → Get user memory summary
claude.memory.edit(command)   → Add/remove memory items
```
### Chat History
```
claude.chats.search(keywords)             → Find past conversations
claude.chats.recent(n, before?, after?)   → Time-based retrieval
claude.chats.get(uri)                     → Fetch specific chat
```
### Artifacts
```
claude.artifact.create(type, content)   → Generate artifact
claude.artifact.types()                 → Available: jsx, html, md, svg, mermaid
```
### Tools Awareness
```
claude.tools.available()      → What tools exist in current env
claude.tools.has(tool_name)   → Check if tool available
```
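The two calls above can be sketched as a thin lookup over a per-environment capability registry. A minimal Python sketch; the capability sets are assumptions drawn from the availability matrix later in this document, not a published API:

```python
# Hypothetical capability registry backing claude.tools.available()
# and claude.tools.has(). Tool names follow the matrix below.
CAPABILITIES = {
    "claude.ai": {
        "bash_tool", "web_search", "conversation_search",
        "memory_user_edits", "create_file", "view", "artifacts",
    },
    "claude-code": {"bash_tool", "create_file", "view", "local_filesystem"},
    "api": set(),
}

def tools_available(env):
    """List the tools known for an environment, alphabetically."""
    return sorted(CAPABILITIES.get(env, set()))

def tools_has(env, tool_name):
    """Check whether a tool is natively available in an environment."""
    return tool_name in CAPABILITIES.get(env, set())
```

An unknown environment falls through to the empty set, so both calls degrade gracefully instead of raising.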
---
## Platform-Specific Implementation
### On claude.ai
Native access via built-in tools:
```
conversation_search(query)       → Searches past chats
recent_chats(n, before, after)   → Gets recent conversations
memory_user_edits(command)       → Manages memory
```
The conversation context also includes:
- `userMemories` — Persistent user knowledge
- Computer use tools — bash, files, view, etc.
- Artifact generation — React, HTML, etc.
### On Claude Code
Access via skill.cc canonical endpoints:
```bash
# Memory via skill.cc (canonical); quote the URL so the shell
# doesn't glob-expand ? or treat = specially
curl "https://skill.cc/memory/api/search?q=query"
# NOTE: Current implementation routes through Cloudflare tunnel
# until skill.cc infrastructure is deployed:
# curl "https://[your-tunnel].loca.lt/api/search?q=query"
#
# The tunnel is a bridge. skill.cc is the destination.
# Chat history via local cache (if synced)
cat ~/.claude/chats/*.json | jq '.[] | select(.content | contains("query"))'
# Or via claude.ai session bridging
claude --bridge-session [session-id]
```
### On API
Pass context explicitly:
```python
import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

# Include relevant memory in the system prompt;
# fetch_mindpalace() is your own helper that queries the memory store
memories = fetch_mindpalace(query)
response = client.messages.create(
    model="claude-sonnet-4-20250514",
    max_tokens=1024,
    system=f"User context: {memories}",
    messages=[...],
)
```
---
## Bridging Patterns
### claude.ai → Claude Code
Export from claude.ai, import to CC:
```
1. On claude.ai: "Export this conversation for CC"
   → Generates structured JSON/markdown
   → Saves to /mnt/user-data/outputs/
2. Download file
3. On CC: claude --context ./exported-chat.md
   → CC now has the context
```
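The export in step 1 might produce a structured record like the following. This is a hypothetical shape for illustration; the actual export schema is not specified in this skill:

```python
import json

# Hypothetical export record; "..." is a placeholder, not real content
export = {
    "source": "claude.ai",
    "format": "structured-json",
    "messages": [
        {"role": "user", "content": "Summarize the project status."},
        {"role": "assistant", "content": "..."},
    ],
}
serialized = json.dumps(export, indent=2)  # what gets saved to outputs/
```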
### Claude Code → claude.ai
Push to Mind Palace, accessible everywhere:
```bash
# On CC
cc persist ./important-output.md --tags="project-x"
# This hits Mind Palace API
# Now searchable from claude.ai via conversation_search
```
### Sync Pattern
```
        Mind Palace API (Cloudflare tunnel)
          ↑                     ↑
      claude.ai            Claude Code
      (native)           (via curl/fetch)
```
Both environments can read/write to the same memory store.
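The contract both sides rely on can be sketched in miniature: anything persisted from one environment becomes searchable from the other. The store is modeled in memory here; in practice it is the Mind Palace API reached over HTTP:

```python
# In-memory stand-in for the shared Mind Palace store.
# Real deployments talk to it over HTTP; the read/write
# contract is the same.
class SharedMemoryStore:
    def __init__(self):
        self._items = []

    def persist(self, text, source, tags=()):
        """Write an item; `source` records which environment wrote it."""
        item = {"text": text, "source": source, "tags": list(tags)}
        self._items.append(item)
        return item

    def search(self, query):
        """Case-insensitive substring search across all items."""
        q = query.lower()
        return [i for i in self._items if q in i["text"].lower()]

store = SharedMemoryStore()
store.persist("TPV reactor notes", source="claude-code", tags=["project-x"])
results = store.search("tpv")  # findable regardless of which side wrote it
```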
---
## Environment Detection
```javascript
function detectEnvironment() {
  if (typeof window !== 'undefined' && window.location.host.includes('claude.ai')) {
    return 'claude.ai';
  }
  // Guard: `process` is undefined in browsers, so check before reading env vars
  if (typeof process !== 'undefined' && process.env) {
    if (process.env.CLAUDE_CODE) {
      return 'claude-code';
    }
    if (process.env.ANTHROPIC_API_KEY) {
      return 'api';
    }
  }
  return 'unknown';
}
```
---
## Memory Schema
What's available in userMemories on claude.ai:
```yaml
work_context:
  - Company, role, projects
  - Current priorities, deadlines
personal_context:
  - Name, location, interests
  - Health, preferences
top_of_mind:
  - Current focus areas
  - Recent developments
history:
  - Recent months
  - Earlier context
  - Long-term background
```
CC can request this via the Mind Palace API or by asking the user to share it.
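Once CC has a memory summary in this shape, it might fold it into a prompt like so. The helper and its data are hypothetical; only the section names mirror the schema above:

```python
# Hypothetical helper: render a userMemories-style dict as a
# compact context block suitable for a system prompt.
def memory_to_context(memories):
    lines = []
    for section, items in memories.items():
        lines.append(f"[{section}]")
        lines.extend(f"- {item}" for item in items)
    return "\n".join(lines)

# Placeholder data in the schema's section names
ctx = memory_to_context({
    "work_context": ["Role: engineer", "Project: TPV reactor"],
    "top_of_mind": ["Ship skill.cc/claude"],
})
```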
---
## Tool Availability Matrix
| Tool | claude.ai | Claude Code | API |
|------|-----------|-------------|-----|
| `bash_tool` | ✓ | ✓ (native) | ✗ |
| `web_search` | ✓ | via curl | ✗ |
| `conversation_search` | ✓ | via API | ✗ |
| `memory_user_edits` | ✓ | via API | ✗ |
| `create_file` | ✓ | ✓ (native) | ✗ |
| `view` | ✓ | ✓ (native) | ✗ |
| Local filesystem | sandboxed | ✓ full | ✗ |
| Artifacts | ✓ | ✗ | ✗ |
---
## Usage Examples
### CC Wants Memory Context
```bash
# Option 1: Direct API call
curl -s "https://your-tunnel.loca.lt/api/search?q=TPV+reactor" | jq
# Option 2: Ask user to bridge
echo "Please share relevant memory context from claude.ai"
# Option 3: Check local Mind Palace cache
cat ~/.mindpalace/cache/recent.json
```
### claude.ai Wants CC's Local Files
```
# In claude.ai with computer use:
view /mnt/user-data/uploads/[uploaded-file]
# Or ask user to upload from local
"Please upload the file from your CC session"
```
### Persist Across Both
```bash
# CC writes to shared storage
cc persist ./output.md --destination=mindpalace
# claude.ai can now search it
conversation_search("output from CC session")
```
---
## Configuration
```yaml
# ~/.skillcc/claude.yaml
# CANONICAL (the vision):
# All endpoints resolve to skill.cc/*
# skill.cc/memory/api/search, skill.cc/persist/api/store, etc.
# CURRENT IMPLEMENTATION:
# Until skill.cc infrastructure deployed, routes through tunnel
mindpalace:
  canonical: https://skill.cc/memory/api   # ← THE GOAL
  current: https://your-tunnel.loca.lt     # ← TEMPORARY BRIDGE
  api_key: ${MINDPALACE_API_KEY}

bridging:
  auto_sync: true
  sync_interval: 300 # seconds

defaults:
  export_format: markdown
  import_context_limit: 10000 # tokens
```
> **Note:** Every reference to a tunnel URL in this skill is a temporary bridge.
> The canonical endpoint is always `skill.cc/*`.
> When skill.cc infrastructure is live, update `current` → `canonical` and delete the tunnel.
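The canonical-vs-current switch described in the note can be sketched as a small resolver. `canonical_is_live` is a stand-in for a real health check (e.g. an HTTP HEAD against the canonical host); nothing here is a published API:

```python
# Prefer the canonical skill.cc endpoint once it is live,
# fall back to the temporary tunnel bridge otherwise.
def resolve_endpoint(config, canonical_is_live):
    if canonical_is_live:
        return config["canonical"]
    return config["current"]

# Values mirror the config sketch above
cfg = {
    "canonical": "https://skill.cc/memory/api",
    "current": "https://your-tunnel.loca.lt",
}
```

When skill.cc goes live, the health check flips and every caller migrates without a config change.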
---
## Extending to Other Models
The pattern generalizes:
```
skill.cc/claude   → Claude platform capabilities
skill.cc/openai   → OpenAI platform capabilities (future)
skill.cc/gemini   → Google AI capabilities (future)
skill.cc/local    → Local LLM capabilities (future)
```
Each defines:
1. What's available in that environment
2. How to access it
3. How to bridge to/from other environments
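One possible shape for such a provider descriptor, covering the three points above. Field names are illustrative, not a published schema:

```python
from dataclasses import dataclass, field

@dataclass
class ProviderSkill:
    name: str                                    # e.g. "skill.cc/claude"
    capabilities: dict                           # (1) what's available, per environment
    access: dict = field(default_factory=dict)   # (2) how to reach each capability
    bridges: list = field(default_factory=list)  # (3) supported cross-env patterns

claude_skill = ProviderSkill(
    name="skill.cc/claude",
    capabilities={
        "claude.ai": ["memory", "artifacts", "web_search"],
        "claude-code": ["local_files", "bash"],
    },
    bridges=["claude.ai->claude-code", "claude-code->claude.ai"],
)
```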
---
## Related Skills
- `skill.cc/memory` — The persistence layer
- `skill.cc/persist` — Save outputs to memory
- `skill.cc/council` — Uses memory for voice profiles
---
## Philosophy
> **Meet the AI where the AI and humans meet.**
Different environments have different strengths:
- claude.ai: Memory, artifacts, web search
- Claude Code: Local files, full bash, native tools
- API: Programmatic control, integration
skill.cc/claude makes them aware of each other.
---
*Parent: skill.cc*