
GCS / S3

Set up a fully managed data pipeline to export data to a GCS / S3 bucket in a one-off or recurring fashion.


Last updated 7 months ago

The GCS / S3 integration is included on paid Shadow Fork plans.

The GCS / S3 integration is available for one-off and recurring exports. It's ideal for teams that need historical blockchain data for analytics or research use cases that don't require real-time data freshness. Data exports to GCS / S3 offer the highest flexibility, letting you load data into tools you already use, such as Snowflake or Metabase.

When you set up a data export to GCS / S3, you’ll be able to specify:

  1. Which contracts and events to export

  2. The file format (CSV or Parquet)

  3. How often you want the data delivered to your bucket (daily, hourly, every 15 min)

  4. How to partition the data (daily, hourly, or no partitioning)
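
The partitioning choice in step 4 determines the object-key layout in your bucket. The exact layout Shadow writes is not documented on this page; the sketch below illustrates one common date-based convention, with the prefix and function names being hypothetical:

```python
from datetime import datetime

def partition_prefix(base: str, dt: datetime, scheme: str) -> str:
    """Build an illustrative bucket key prefix for an exported file.

    scheme is one of "daily", "hourly", or anything else for no
    partitioning. This is a sketch of a typical date-partitioned
    layout, not the layout Shadow necessarily produces.
    """
    if scheme == "daily":
        # One folder per day, e.g. exports/transfers/2024-05-01/
        return f"{base}/{dt:%Y-%m-%d}/"
    if scheme == "hourly":
        # One folder per hour within the day
        return f"{base}/{dt:%Y-%m-%d}/{dt:%H}/"
    # No partitioning: all files land under the base prefix
    return f"{base}/"
```

Daily or hourly partitioning like this lets downstream tools (e.g. Snowflake external tables) prune files by date instead of scanning the whole bucket.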

See the One-off Exports or Recurring Exports pages for additional information.
