Claude Code & MCP Masterclass, Module 4

4.3 File System & Cloud Storage MCP Servers

25 min • 3 code blocks • Practice Lab • Quiz (4 questions)


One of Claude Code's most powerful built-in capabilities is reading and editing local files. But what about files stored on Google Drive, Dropbox, or AWS S3? What about a Pakistani client who keeps all their project documents on OneDrive, or a startup that stores user uploads on DigitalOcean Spaces? Without MCP, you download files manually, paste content into Claude, and lose the audit trail. With a Cloud Storage MCP Server, Claude can browse, read, create, and organize files across any storage system — as naturally as it works with your local filesystem.

Section 1: Enhanced Local File System MCP

Claude Code already has file read/write capabilities, but a custom File System MCP lets you add business logic: search by content, organize by type, enforce naming conventions, and log all file operations — useful for agency work where you're managing files for multiple clients.

Create filesystem-mcp-server.py:

python
#!/usr/bin/env python3
import os
import json
import glob
from datetime import datetime
from mcp.server import Server
from mcp.server.stdio import stdio_server
from mcp import types

app = Server("filesystem-pro")

# Restrict to a specific root directory (safety measure)
ALLOWED_ROOT = os.getenv("FS_ROOT", os.path.expanduser("~/projects"))

def is_safe_path(path: str) -> bool:
    """Ensure path is within allowed root"""
    abs_path = os.path.abspath(path)
    return abs_path.startswith(os.path.abspath(ALLOWED_ROOT))

@app.list_tools()
async def list_tools():
    return [
        types.Tool(
            name="search_files",
            description="Search for files by name pattern or content within the project directory",
            inputSchema={
                "type": "object",
                "properties": {
                    "pattern": {"type": "string", "description": "Glob pattern (e.g., '**/*.py') or text to search in file contents"},
                    "search_type": {"type": "string", "enum": ["name", "content"], "description": "Search by filename or file content"}
                },
                "required": ["pattern", "search_type"]
            }
        ),
        types.Tool(
            name="get_directory_tree",
            description="Get a tree view of a directory structure",
            inputSchema={
                "type": "object",
                "properties": {
                    "path": {"type": "string", "description": "Directory path to visualize"},
                    "max_depth": {"type": "integer", "description": "Maximum depth to traverse (default 3)"}
                },
                "required": ["path"]
            }
        ),
        types.Tool(
            name="bulk_rename",
            description="Rename multiple files matching a pattern with a new naming convention",
            inputSchema={
                "type": "object",
                "properties": {
                    "directory": {"type": "string"},
                    "match_pattern": {"type": "string", "description": "Glob pattern to match files"},
                    "rename_rule": {"type": "string", "description": "Python format string for new names, e.g., '{date}_{original}'"}
                },
                "required": ["directory", "match_pattern", "rename_rule"]
            }
        )
    ]

@app.call_tool()
async def call_tool(name: str, arguments: dict):
    if name == "search_files":
        pattern = arguments["pattern"]
        search_type = arguments["search_type"]

        if search_type == "name":
            matches = glob.glob(os.path.join(ALLOWED_ROOT, "**", pattern), recursive=True)
            return [types.TextContent(type="text", text=json.dumps(matches[:100], indent=2))]

        elif search_type == "content":
            results = []
            # Note: content search here is limited to .py files; widen the
            # glob (and filter out binaries) if you need broader coverage
            for filepath in glob.glob(os.path.join(ALLOWED_ROOT, "**", "*.py"), recursive=True):
                if is_safe_path(filepath):
                    try:
                        with open(filepath, "r", encoding="utf-8", errors="ignore") as f:
                            content = f.read()
                            if pattern.lower() in content.lower():
                                results.append({"file": filepath, "preview": content[:200]})
                    except Exception:
                        continue
            return [types.TextContent(type="text", text=json.dumps(results[:50], indent=2))]

    elif name == "get_directory_tree":
        path = arguments["path"]
        max_depth = arguments.get("max_depth", 3)
        if not is_safe_path(path):
            return [types.TextContent(type="text", text="Error: Path outside allowed root")]

        tree = []
        for root, dirs, files in os.walk(path):
            depth = root.replace(path, "").count(os.sep)
            if depth >= max_depth:
                dirs.clear()
                continue
            indent = "  " * depth
            tree.append(f"{indent}{os.path.basename(root)}/")
            for file in files:
                tree.append(f"{indent}  {file}")
        return [types.TextContent(type="text", text="\n".join(tree))]

    elif name == "bulk_rename":
        directory = arguments["directory"]
        if not is_safe_path(directory):
            return [types.TextContent(type="text", text="Error: Path outside allowed root")]

        renamed = []
        today = datetime.now().strftime("%Y-%m-%d")
        for filepath in glob.glob(os.path.join(directory, arguments["match_pattern"])):
            original = os.path.basename(filepath)
            new_name = arguments["rename_rule"].format(date=today, original=original)
            os.rename(filepath, os.path.join(directory, new_name))
            renamed.append({"from": original, "to": new_name})
        return [types.TextContent(type="text", text=json.dumps(renamed, indent=2))]

    return [types.TextContent(type="text", text=f"Error: unknown tool '{name}'")]

async def main():
    # Run the server over stdio so Claude Code can launch it as a subprocess
    async with stdio_server() as (read_stream, write_stream):
        await app.run(read_stream, write_stream, app.create_initialization_options())

if __name__ == "__main__":
    import asyncio
    asyncio.run(main())
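One caveat on the `startswith` comparison used in `is_safe_path`: a plain prefix check also accepts sibling directories such as `~/projects-archive`, because their absolute paths start with the same string as `~/projects`. A `commonpath`-based variant closes that gap. This hardened check is a suggested refinement, not part of the server above:

```python
import os

ALLOWED_ROOT = os.path.abspath(os.path.expanduser("~/projects"))

def is_safe_path(path: str) -> bool:
    """True only if path resolves to ALLOWED_ROOT itself or a descendant of it."""
    abs_path = os.path.abspath(path)
    return os.path.commonpath([abs_path, ALLOWED_ROOT]) == ALLOWED_ROOT

print(is_safe_path(os.path.join(ALLOWED_ROOT, "app", "main.py")))  # True
print(is_safe_path(ALLOWED_ROOT + "-archive/secrets.txt"))         # False: sibling dir
print(is_safe_path("/etc/passwd"))                                 # False: outside root
```

`os.path.commonpath` compares whole path components, so `~/projects-archive` no longer slips through the way it does with a raw string prefix.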

Section 2: Google Drive MCP Server

For Pakistani businesses using Google Workspace (very common — Drive, Docs, Sheets), this MCP server gives Claude access to Drive files:

python
import os

from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/drive.readonly"]
SERVICE_ACCOUNT_FILE = os.getenv("GOOGLE_SERVICE_ACCOUNT_JSON")

def get_drive_service():
    creds = service_account.Credentials.from_service_account_file(
        SERVICE_ACCOUNT_FILE, scopes=SCOPES
    )
    return build("drive", "v3", credentials=creds)

# Tool: list_drive_files
# Searches Google Drive for files matching a query
# Uses Drive's query syntax: "name contains 'invoice'" or "mimeType='application/pdf'"
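The tool sketched in those comments could look like the following. This is a hedged sketch, not the course's official implementation: `build_drive_query` is a hypothetical helper for composing Drive's query syntax, and the Google client imports are kept inside the function so the module loads even before `google-api-python-client` is installed.

```python
import json
import os

def build_drive_query(name_contains: str = "", mime_type: str = "") -> str:
    """Compose a Drive v3 'q' expression, e.g. "name contains 'invoice'"."""
    clauses = []
    if name_contains:
        clauses.append(f"name contains '{name_contains}'")
    if mime_type:
        clauses.append(f"mimeType='{mime_type}'")
    return " and ".join(clauses)

def list_drive_files(query: str, page_size: int = 25) -> str:
    """Search Drive and return matching file metadata as JSON text."""
    # Local imports: keep the module importable without the Google client
    from google.oauth2 import service_account
    from googleapiclient.discovery import build

    creds = service_account.Credentials.from_service_account_file(
        os.getenv("GOOGLE_SERVICE_ACCOUNT_JSON"),
        scopes=["https://www.googleapis.com/auth/drive.readonly"],
    )
    service = build("drive", "v3", credentials=creds)
    response = service.files().list(
        q=query,
        pageSize=page_size,
        fields="files(id, name, mimeType, modifiedTime)",
    ).execute()
    return json.dumps(response.get("files", []), indent=2)

# Example: all PDFs whose name mentions "invoice"
print(build_drive_query(name_contains="invoice", mime_type="application/pdf"))
# → name contains 'invoice' and mimeType='application/pdf'
```

Wire `list_drive_files` into an MCP server the same way as the filesystem tools: declare it in `list_tools` and dispatch to it in `call_tool`.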

Section 3: AWS S3 / DigitalOcean Spaces MCP

Both S3 and DigitalOcean Spaces use the same boto3 interface (Spaces is S3-compatible):

python
import os

import boto3

s3 = boto3.client(
    "s3",
    endpoint_url=os.getenv("S3_ENDPOINT", "https://s3.amazonaws.com"),
    aws_access_key_id=os.getenv("AWS_ACCESS_KEY"),
    aws_secret_access_key=os.getenv("AWS_SECRET_KEY"),
    region_name=os.getenv("AWS_REGION", "ap-south-1")  # Mumbai region, closest to Pakistan
)

# Tool: list_objects — list files in a bucket/prefix
# Tool: read_object — download and return file content as text
# Tool: upload_object — upload a string as a file to S3

Pakistan Context — DigitalOcean Spaces: Many Pakistani developers prefer DigitalOcean Spaces over AWS S3 because:

  • Cheaper: $5/month for 250GB vs AWS S3's variable pricing
  • Singapore region (SGP1) gives better latency from Pakistan than Mumbai (ap-south-1)
  • Same S3-compatible API — no code changes needed
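The bullets above can be put in numbers. A back-of-envelope sketch; the per-GB rates are assumptions based on published pricing and should be checked against the current rate cards before you rely on them:

```python
# Hypothetical monthly usage for a small app storing user uploads
STORAGE_GB = 100
EGRESS_GB = 200

# Assumed rates (verify against current AWS/DigitalOcean pricing pages)
S3_STORAGE_PER_GB = 0.025   # approx. S3 Standard, ap-south-1, USD/GB-month
S3_EGRESS_PER_GB = 0.09     # approx. S3 egress to internet, USD/GB
SPACES_FLAT = 5.00          # Spaces base plan: 250 GB storage + 1 TB transfer

s3_cost = STORAGE_GB * S3_STORAGE_PER_GB + EGRESS_GB * S3_EGRESS_PER_GB
# Usage here fits inside the Spaces base allowance, so the flat fee applies
spaces_cost = SPACES_FLAT

print(f"S3: ${s3_cost:.2f}/month, Spaces: ${spaces_cost:.2f}/month")
# → S3: $20.50/month, Spaces: $5.00/month
```

With these assumed rates, egress dominates the S3 bill, which is why a flat-fee plan with bundled transfer can win for download-heavy workloads.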

Practice Lab

Exercise 1: Build the local File System MCP server with just the search_files and get_directory_tree tools. Set FS_ROOT to your projects folder. Connect it to Claude Code and ask: "Give me a tree view of my projects directory. What projects do I have? Identify any Python files that import the requests library." Verify Claude finds real files without you telling it where to look.

Exercise 2: Add the bulk_rename tool and test it on a folder of test files. Create 10 files named image1.jpg through image10.jpg in a test directory. Ask Claude to "rename all .jpg files in this directory to follow the pattern 2026-03-{original} to add date prefixes." Verify the renaming works correctly before running on real files.

Exercise 3: Research DigitalOcean Spaces pricing and compare it to AWS S3 for a Pakistani use case (storing 100GB of user-uploaded images for a Daraz-style app). Write a brief comparison (3-4 sentences) in your CLAUDE.md noting which you'd recommend and why. This is the design documentation habit that separates professional developers from hobbyists.

Key Takeaways

  • Path safety checks (is_safe_path) are essential for any File System MCP — without them, Claude could accidentally read or modify files outside your project directory
  • Google Drive MCP requires a Service Account JSON file (not OAuth — service accounts don't need browser authorization flows, making them better for automation)
  • DigitalOcean Spaces is the recommended object storage for Pakistani developers — it uses the same boto3 API as AWS S3 but with a flat $5/month price and better Pakistani latency from the Singapore region
  • The get_directory_tree tool is often the first tool Claude calls when onboarding to a new project — it gives an immediate structural overview that guides all subsequent work

Lesson Summary

Includes a hands-on practice lab, 3 runnable code examples, and a 4-question knowledge check below.

Quiz: File System & Cloud Storage MCP Servers

4 questions to test your understanding. Score 60% or higher to pass.