How to Migrate Google Cloud Storage (GCS) to Vultr Object Storage

Updated on 03 April, 2025

Object Storage is a storage solution that lets you store and continuously access data, a critical capability for businesses and developers. Buckets store unstructured data, such as images, videos, logs, or backups, and offer a scalable and efficient way to handle large volumes of data. Migrating data from Google Cloud Storage buckets to Vultr Object Storage lets you expand your storage across multiple locations with performance options that match your business or project needs.

Follow this guide to migrate data from your Google Cloud Storage bucket to a Vultr Object Storage bucket. You will use Rclone or gsutil to transfer your existing buckets to the destination Vultr Object Storage subscription.

Prerequisites

Before you begin, you need to:

  • Have access to a Google Cloud account that contains the source Cloud Storage bucket.
  • Deploy a Vultr Object Storage subscription and note its hostname, access key, and secret key.
  • Have access to a Linux workstation, such as an Ubuntu 24.04 server, to use as the migration workstation.

Verify the Source Google Cloud Storage Bucket Information

Follow the steps below to verify the source Google Cloud Storage bucket information required to migrate to Vultr Object Storage.

  1. Log in to Google Cloud Console.

  2. Select your target project that contains your Google Cloud Storage bucket.

  3. Copy your project number to the clipboard.

  4. Click Cloud Storage.

  5. Click Buckets on the left navigation menu.

  6. Note your target bucket's Location type, Location, and Default storage class values. You need this information when configuring the migration tool on your workstation.

Choose a Migration Method

You can migrate data from Google Cloud Storage buckets to Vultr Object Storage using one of two methods: Rclone or gsutil.

  • Rclone: An open-source tool used to manage files on cloud storage. You can use Rclone to migrate cloud storage buckets and verify the data integrity for each transfer.
  • gsutil: A legacy Google Cloud command-line tool used to perform bucket and object management tasks, including creating, uploading, listing, moving, and deleting objects and buckets.

Choose a migration method, install the matching tool on your migration workstation, and use it to transfer data to your Vultr Object Storage subscription.

Set Up the Migration Workstation

Follow the steps below to install and configure Rclone or gsutil, depending on your selected migration method.

Install and Configure Rclone
  1. Update the APT package index.

    console
    $ sudo apt update
    
  2. Download the latest Rclone installation script.

    console
    $ curl -O https://rclone.org/install.sh
    
  3. Run the installation script.

    console
    $ sudo bash install.sh
    
  4. Verify the installed Rclone version.

    console
    $ rclone version
    

    Your output should be similar to the one below.

    rclone v1.69.0
    - os/version: ubuntu 24.04 (64 bit)
    - os/kernel: 6.8.0-52-generic (x86_64)
    - os/type: linux
    - os/arch: amd64
    - go/version: go1.23.4
    - go/linking: static
    - go/tags: none
  5. Display the path of the Rclone configuration file.

    console
    $ rclone config file
    

    Note the configuration file path in your output, similar to the one below.

    Configuration file doesn't exist, but rclone will use this path:
    /home/linuxuser/.config/rclone/rclone.conf
  6. Open the Rclone configuration file.

    console
    $ nano ~/.config/rclone/rclone.conf
    
  7. Add the following configuration to the rclone.conf file. Replace YOUR_ACCESS_KEY, YOUR_SECRET_ACCESS_KEY, and YOUR_ENDPOINT_URL with your actual Vultr Object Storage details.

    ini
    [vultr]
    type = s3
    provider = Other
    env_auth = false
    access_key_id = YOUR_ACCESS_KEY
    secret_access_key = YOUR_SECRET_ACCESS_KEY
    region =
    endpoint = YOUR_ENDPOINT_URL
    location_constraint =
    acl = private
    server_side_encryption =
    storage_class =
    

    Save and exit the file.

    The above configuration defines a new remote named vultr that connects to your Vultr Object Storage subscription using the specified credentials.
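
    YOUR_ENDPOINT_URL is the hostname displayed on your Vultr Object Storage subscription's Overview page. For example, assuming a subscription with the hostname ewr1.vultrobjects.com, the endpoint entry would look like the sketch below:

    ini
    endpoint = ewr1.vultrobjects.com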

  8. List all buckets in your Vultr Object Storage subscription to test the connection.

    console
    $ rclone lsd vultr:
    

    Your output should be similar to the one below.

            -1 2000-01-01 00:00:00       -1 example_destination_bucket
  9. Open the interactive Rclone configuration.

    console
    $ rclone config
    

    Output:

    Current remotes:
    
    Name                 Type
    ====                 ====
    
    e) Edit existing remote
    n) New remote
    d) Delete remote
    r) Rename remote
    c) Copy remote
    s) Set configuration password
    q) Quit config
    • Enter n to create a new remote configuration.

      e/n/d/r/c/s/q> n
    • Specify a new name for the remote connection. For example, gcs.

      Enter name for new remote.
      name> 
    • Find Google Cloud Storage (this is not Google Drive) from the list of cloud storage providers and enter its number to configure it.

      Option Storage.
      Type of storage to configure.
      Choose a number from below, or type in your own value.
      ...
      19 / Google Cloud Storage (this is not Google Drive)
         \ (google cloud storage)
      ...
      Storage>

      At the prompt, enter the corresponding number (for instance, 19). The exact number may vary depending on your Rclone version.

    • Press Enter to keep the OAuth client ID empty.

      Option client_id.
      OAuth Client Id.
      Leave blank normally.
      Enter a value. Press Enter to leave empty.
      client_id>
    • Press Enter to keep the OAuth client secret empty.

      Option client_secret.
      OAuth Client Secret.
      Leave blank normally.
      Enter a value. Press Enter to leave empty.
      client_secret>
    • Enter your GCP project number.

      Option project_number.
      Project number.
      Optional - needed only for list/create/delete buckets - see your developer console.
      Enter a value. Press Enter to leave empty.
      project_number>
    • Press Enter to keep the user project empty.

      Option user_project.
      User project.
      Optional - needed only for requester pays.
      Enter a value. Press Enter to leave empty.
      user_project>
    • Press Enter to keep the service account path empty.

      Option service_account_file.
      Service Account Credentials JSON file path.
      Leave blank normally.
      Needed only if you want use SA instead of interactive login.
      Leading ~ will be expanded in the file name as will environment variables such as ${RCLONE_CONFIG_DIR}.
      Enter a value. Press Enter to leave empty.
      service_account_file>
    • Press Enter to keep anonymous access disabled (the default).

      Option anonymous.
      Access public buckets and objects without credentials.
      Set to 'true' if you just want to download files and don't configure credentials.
      Enter a boolean value (true or false). Press Enter for the default (false).
      anonymous>
    • Press Enter to keep the access control list for new objects empty.

      Option object_acl.
      Access Control List for new objects.
      Choose a number from below, or type in your own value.
      Press Enter to leave empty.
         / Object owner gets OWNER access.
       1 | All Authenticated Users get READER access.
         \ (authenticatedRead)
         / Object owner gets OWNER access.
       2 | Project team owners get OWNER access.
         \ (bucketOwnerFullControl)
         / Object owner gets OWNER access.
       3 | Project team owners get READER access.
         \ (bucketOwnerRead)
         / Object owner gets OWNER access.
       4 | Default if left blank.
         \ (private)
         / Object owner gets OWNER access.
       5 | Project team members get access according to their roles.
         \ (projectPrivate)
         / Object owner gets OWNER access.
       6 | All Users get READER access.
         \ (publicRead)
      object_acl>
    • Press Enter to keep the access control list for new buckets empty.

      Option bucket_acl.
      Access Control List for new buckets.
      Choose a number from below, or type in your own value.
      Press Enter to leave empty.
         / Project team owners get OWNER access.
       1 | All Authenticated Users get READER access.
         \ (authenticatedRead)
         / Project team owners get OWNER access.
       2 | Default if left blank.
         \ (private)
      ...
      bucket_acl>
    • Press Enter to continue with the bucket policy only set to false.

      Option bucket_policy_only.
      Access checks should use bucket-level IAM policies.
      If you want to upload objects to a bucket with Bucket Policy Only set
      then you will need to set this.
      When it is set, rclone:
      - ignores ACLs set on buckets
      - ignores ACLs set on objects
      - creates buckets with Bucket Policy Only set
      Docs: https://cloud.google.com/storage/docs/bucket-policy-only
      Enter a boolean value (true or false). Press Enter for the default (false).
      bucket_policy_only>
    • Find your Google Cloud Storage bucket location from the list, enter the corresponding location number, and press Enter.

      Option location.
      Location for the newly created buckets.
      Choose a number from below, or type in your own value.
      Press Enter to leave empty.
      1 / Empty for default location (US)
        \ ()
      2 / Multi-regional location for Asia
        \ (asia)
      3 / Multi-regional location for Europe
        \ (eu)
      4 / Multi-regional location for United States
      ...
      location>
    • Find your storage class in the list and enter the corresponding number. If your bucket uses the Standard storage class, select one of the following options based on your bucket's location type:

      • Regional storage class: For buckets stored in a single region.
      • Multi-regional storage class: For buckets distributed across multiple regions.

      Option storage_class.
      The storage class to use when storing objects in Google Cloud Storage.
      Choose a number from below, or type in your own value.
      Press Enter to leave empty.
      1 / Default
        \ ()
      2 / Multi-regional storage class
        \ (MULTI_REGIONAL)
      3 / Regional storage class
        \ (REGIONAL)
      4 / Nearline storage class
        \ (NEARLINE)
      5 / Coldline storage class
        \ (COLDLINE)
      6 / Archive storage class
        \ (ARCHIVE)
      7 / Durable reduced availability storage class
        \ (DURABLE_REDUCED_AVAILABILITY)
      storage_class> 
    • Enter 1 to continue and manually enter your credentials.

      Option env_auth.
      Get GCP IAM credentials from runtime (environment variables or instance meta data if no env vars).
      Only applies if service_account_file and service_account_credentials is blank.
      Choose a number from below, or type in your own boolean value (true or false).
      Press Enter for the default (false).
      1 / Enter credentials in the next step.
        \ (false)
      2 / Get GCP IAM credentials from the environment (env vars or IAM).
        \ (true)
      env_auth>
    • Enter n to proceed without editing the advanced configuration.

      Edit advanced config?
      y/n>
    • Enter n to start the Google Cloud account authentication without a local web browser.

      Use web browser to automatically authenticate rclone with remote?
      * Say Y if the machine running rclone has a web browser you can use
      * Say N if running rclone on a (remote) machine without web browser access
      If not sure try Y. If Y failed, try N.
      y) Yes (default)
      n) No
      y/n>

      Your output should be similar to the one below.

      2025/03/19 18:30:43 NOTICE: Make sure your Redirect URL is set to "http://127.0.0.1:53682/" in your custom config.
      2025/03/19 18:30:43 ERROR : Failed to open browser automatically (exec: "xdg-open": executable file not found in $PATH) - please go to the following link: http://127.0.0.1:53682/auth?state=Yi582ar0tYjlvqze3y_3sQ
      2025/03/19 18:30:43 NOTICE: Log in and authorize rclone for access
      2025/03/19 18:30:43 NOTICE: Waiting for code...
    • Copy the generated URL from the second line of the command output.

    • Open a new terminal on your local workstation.

    • Set up port forwarding to your remote migration workstation. Replace 192.0.2.1 with the IP address of your migration server and linuxuser with your non-root username.

      console
      $ ssh -L 53682:localhost:53682 linuxuser@192.0.2.1
      

      The above command opens an SSH session to your server and forwards port 53682. Port forwarding routes all requests to localhost:53682 on your local machine to port 53682 on the remote server.
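
      If you do not need an interactive shell on the migration server, you can optionally add ssh's -N flag to forward the port without running a remote command. For example:

      console
      $ ssh -N -L 53682:localhost:53682 linuxuser@192.0.2.1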

      • Enter your user's SSH password when prompted.

        linuxuser@192.0.2.1's password:
    • Paste the Rclone URL you copied earlier into a web browser, such as Chrome, to log in to your Google account.

      http://127.0.0.1:53682/auth?state=asLQ...
    • Sign in with your Google Cloud account.

    • Grant Rclone access to your account.

    • Verify that a success message displays in your web browser to complete the authentication.

    • Close the web page.

    • Press Ctrl + C in your terminal to exit the port forwarding session.

    • Return to the terminal session running the Rclone configuration.

    • Verify that Rclone successfully adds your login credentials to the configuration.

      2025/03/19 18:31:32 NOTICE: Got code
      Configuration complete.
      Options:
      - type: google cloud storage
      - project_number: 14082920324
      - location: asia-southeast1
      - storage_class: REGIONAL
      - token: {"access_token":"ya29.a0AeXRPp50uCYeI..............","token_type":"Bearer","refresh_token":"1//0eutCa31ItSEKCgYIARAAGA4SNwF-L9IrgXJDM5I4Jr1AokMIxKpT_5siQgC5c6cf9TybC_Gt6094pBmLCC0x_-m4E63CuRzByfs","expiry":"2025-03-19T19:31:31.410881765Z"}
      Keep this "gcs" remote?
      y) Yes this is OK (default)
      e) Edit this remote
      d) Delete this remote
      y/e/d>
      ...

      Enter y and press Enter to save the configuration.

    • Enter q to quit the configuration.

      e) Edit existing remote
      n) New remote
      d) Delete remote
      r) Rename remote
      c) Copy remote
      s) Set configuration password
      q) Quit config
      e/n/d/r/c/s/q>
  10. List all available Google Cloud Storage buckets to test your configuration.

    console
    $ rclone lsd gcs:
    

    Your output should be similar to the one below when successful.

              -1 2000-01-01 00:00:00        -1 example_source_bucket
  11. List all Rclone remotes and verify that your configuration is active.

    console
    $ rclone listremotes
    

    Output:

    vultr:
    gcs:

Migrate Google Cloud Storage Buckets to Vultr Object Storage

Follow the steps below to migrate your Google Cloud Storage buckets to Vultr Object Storage using your desired migration method.

Migrate with Rclone
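If your destination bucket does not exist yet, you can create it from your workstation before migrating. For example, using this guide's placeholder bucket name:

    console
    $ rclone mkdir vultr:example_destination_bucket
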
  1. Transfer all buckets and objects from your GCS remote to your Vultr Object Storage subscription.

    console
    $ rclone sync gcs: vultr: -v
    

    Monitor the migration process and wait for the transfer to complete. Transfer time depends on the amount of data in your buckets.
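
    To preview the transfer without copying any data, you can first run the same command with Rclone's --dry-run flag:

    console
    $ rclone sync gcs: vultr: --dry-run -v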

  2. Transfer objects from your GCS bucket to a specific Vultr Object Storage bucket. Replace example_source_bucket with your GCS bucket name and example_destination_bucket with your destination bucket name.

    console
    $ rclone sync gcs:example_source_bucket vultr:example_destination_bucket -v
    

    Your output should be similar to the one below when successful.

    2025/03/19 19:39:21 INFO  : test.txt: Copied (new)
    2025/03/19 19:39:21 INFO  : image.png: Copied (new)
    2025/03/19 19:39:21 INFO  : web_app/index.html: Copied (new)
    2025/03/19 19:39:21 INFO  : 
    Transferred:             22 B / 22 B, 100%, 0 B/s, ETA -
    Transferred:            3 / 3, 100%
    Elapsed time:         0.4s

    If the transfer is interrupted or stops unexpectedly due to an error, run the rclone sync command again to resume the migration. Rclone skips files that already match at the destination and transfers only the missing or changed files.

    console
    $ rclone sync gcs:example_source_bucket vultr:example_destination_bucket -v
    
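
    You can also tune the transfer with additional Rclone flags. For example, --progress displays live transfer statistics and --transfers sets how many files copy in parallel; adjust the value to suit your bandwidth:

    console
    $ rclone sync gcs:example_source_bucket vultr:example_destination_bucket --progress --transfers 8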

Test the Destination Vultr Object Storage Subscription

Follow the steps below to test your Vultr Object Storage subscription and verify the integrity of the migrated data, depending on your migration method.

Test Vultr Object Storage with Rclone
  1. Check for file differences between the source GCS bucket and the destination Vultr Object Storage bucket.

    console
    $ rclone check gcs:example_source_bucket/ vultr:example_destination_bucket/
    

    If the above command reports 0 differences found, the buckets match and your migration is successful.

    2025/01/03 19:44:55 NOTICE: S3 bucket example_destination_bucket: 0 differences found
    2025/01/03 19:44:55 NOTICE: S3 bucket example_destination_bucket: 3 matching files

    If the output reports differences similar to the ones below, run the rclone sync command again to transfer the missing or mismatched files.

    2025/03/19 20:14:20 ERROR : test.txt: sizes differ
    2025/03/19 20:14:20 NOTICE: S3 bucket testbucket3434: 1 differences found
    2025/03/19 20:14:20 NOTICE: S3 bucket testbucket3434: 1 errors while checking
    2025/03/19 20:14:20 NOTICE: S3 bucket testbucket3434: 2 matching files
    2025/03/19 20:14:20 NOTICE: Failed to check: 1 differences found
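
    As an additional check, you can compare the total object count and size of both buckets with rclone size. The totals should match when the migration is complete:

    console
    $ rclone size gcs:example_source_bucket
    $ rclone size vultr:example_destination_bucket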

Conclusion

You have migrated data from your Google Cloud Storage bucket to a Vultr Object Storage subscription. You can use Rclone or gsutil to migrate all existing data, verify its integrity, and cut your applications over to Vultr Object Storage. Visit the Vultr Object Storage documentation for more bucket configuration options.
