How to Use NeevCloud GPUs with VS Code

Connecting VS Code to a NeevCloud GPU instance allows you to write code on your laptop while executing it on a high-performance cloud GPU.

This setup provides a seamless experience: you use your local VS Code editor—with all your extensions, themes, and settings—while the heavy computational work is handled remotely by the GPU. You don't need to switch contexts or use a browser-based console; it feels just like coding locally.

This guide walks you through the setup and explains how to verify your code is running on the GPU.


Prerequisites

Make sure you have:

  • A GPU instance deployed using the NeevCloud GPU AI Service

  • A public IP and SSH access enabled (the default)

  • The ability to SSH in from a terminal:

    ssh root@<GPU_PUBLIC_IP>

  • VS Code with the Remote – SSH extension installed

Once SSH works, VS Code can connect to the GPU.
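
If you want a quick sanity check before opening VS Code, you can run nvidia-smi over SSH in a single step. (nvidia-smi is assumed to be present; it ships with the NVIDIA drivers included in NeevCloud GPU templates.)

    ssh root@<GPU_PUBLIC_IP> nvidia-smi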


Step 1: Install Remote SSH extension in VS Code

  1. Open VS Code

  2. Go to Extensions (left sidebar)

  3. Search for Remote – SSH

  4. Install Remote – SSH by Microsoft

This extension allows VS Code to open and edit files on a remote GPU as if they were local.
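
If you prefer the command line and have the code CLI on your PATH, the same extension can also be installed by its marketplace ID (assumed here to be ms-vscode-remote.remote-ssh):

    code --install-extension ms-vscode-remote.remote-ssh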


Step 2: Add your NeevCloud GPU as an SSH host

  1. Press Ctrl + Shift + P (or Cmd + Shift + P on Mac)

  2. Select Remote-SSH: Add New SSH Host

  3. Enter your SSH command:
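
     ssh root@<GPU_PUBLIC_IP>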

  4. Save the config when prompted (the default location, usually ~/.ssh/config, is fine)

VS Code now remembers your NeevCloud GPU as a reusable host.
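
If you ever want to review or edit the saved host, it is stored as a plain OpenSSH entry in that config file. A minimal sketch of what the entry looks like (the Host alias and IP are placeholders):

    Host neevcloud-gpu
        HostName <GPU_PUBLIC_IP>
        User root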


Step 3: Connect VS Code to the GPU instance

  1. Open Command Palette again

  2. Select Remote-SSH: Connect to Host

  3. Choose your GPU host, e.g. 10.99.1.26

VS Code will:

  • Open a new window

  • Install a small VS Code Server on the GPU

First-time setup may take up to a minute.

You are connected when:

  • The bottom-left corner of the window shows SSH: followed by your host (for example SSH: 10.99.1.26)

  • Any terminal you open runs on the GPU
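
A quick way to verify the second point: run the commands below in the integrated terminal. They should print the GPU instance's hostname and the root user you logged in as, not your laptop's details.

    hostname
    whoami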


How your code uses GPU compute after connection (Important)

Once connected via Remote SSH:

VS Code does NOT run code on your laptop anymore

What actually happens

  • Your code files live on the GPU instance

  • When you click Run, Debug, or open a terminal:

    • Commands execute on the NeevCloud GPU

    • GPU drivers, CUDA, and AI libraries are used

  • Your local machine is only:

    • Displaying the editor

    • Sending keystrokes and commands

Think of it as:

VS Code UI on your laptop + GPU compute on NeevCloud
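
To make this visible, here is a small sketch you can save on the remote instance and launch with the Run button (the filename check_remote.py is just an example, and the torch import assumes a PyTorch-based template):

    # check_remote.py, run from VS Code after connecting over Remote SSH
    import socket

    import torch  # assumes a PyTorch-based NeevCloud template

    print("Host:", socket.gethostname())                 # the GPU instance, not your laptop
    print("CUDA available:", torch.cuda.is_available())  # True when drivers and CUDA are present
    if torch.cuda.is_available():
        print("GPU:", torch.cuda.get_device_name(0))     # name of the attached NVIDIA GPU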


Running GPU workloads from VS Code

After SSH connection, you can run GPU workloads exactly like you would on a local machine.

Run from Terminal

Open a terminal in VS Code and run:
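
    nvidia-smi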

This confirms the GPU is available.

Run training or inference (the script names below are placeholders for your own code):
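
    python train.py            # placeholder: your own training script
    python run_inference.py    # placeholder: your own inference script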

These commands execute on the NeevCloud GPU.


Run using VS Code Run / Debug

  • Click Run on a Python file

  • Or press F5 to debug

The process:

  • Starts on the GPU

  • Uses CUDA, PyTorch, TensorFlow, vLLM, etc.

  • Consumes GPU memory and compute

You can debug GPU-backed code just like local code.
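
For debugging, a minimal .vscode/launch.json sketch is shown below. It assumes the Python extension (which provides the debugpy debugger) is installed in the remote VS Code window:

    {
        "version": "0.2.0",
        "configurations": [
            {
                "name": "Python: Current File",
                "type": "debugpy",
                "request": "launch",
                "program": "${file}",
                "console": "integratedTerminal"
            }
        ]
    }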


Using frameworks and libraries

NeevCloud GPU AI Service provides pre-built templates that already include:

  • CUDA & NVIDIA drivers

  • PyTorch / TensorFlow / vLLM (depending on the template)

  • Python / Conda environments

As long as:

  • Your code uses GPU-enabled libraries

  • The environment supports CUDA

Your workloads will automatically run on the GPU.

Example (assuming a PyTorch-based template):
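
    import torch

    print(torch.cuda.is_available())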

This will print True on a NeevCloud GPU instance.


Using VS Code + Remote SSH with the NeevCloud GPU AI Service gives you:

  • Full IDE experience

  • True remote GPU execution

  • Faster iteration for AI/ML workflows

  • Support for training, inference, and experimentation

This is the recommended developer workflow for NeevCloud GPU deployments.
