
a2-highgpu-8g by Google Cloud Platform

a2-highgpu-8g is an accelerator-optimized server offered by Google Cloud Platform with 96 vCPUs, 680 GiB of memory, 8 NVIDIA Tesla A100 GPUs, and no local storage included. Pricing starts at 2.5459 USD per hour.
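As a quick sanity check on the listed rate, the hourly price can be converted to an approximate monthly figure; this sketch assumes the common cloud-billing convention of ~730 hours per month, which is not stated on this page:

```python
# Rough monthly cost at the listed starting hourly rate.
# 730 hours/month is an assumed convention (365 * 24 / 12), not from the listing.
HOURLY_USD = 2.5459            # starting price from this listing
HOURS_PER_MONTH = 730

monthly_usd = HOURLY_USD * HOURS_PER_MONTH
print(f"~{monthly_usd:.2f} USD/month")   # ~1858.51 USD/month
```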
  • 96 vCPUs
  • 680 GiB memory
  • 8 GPUs
  • SCore: 75,800 (all-cores), 1,201 (single-core)

Specifications

Server Details

General
  Vendor ID: gcp
  Server ID: 1000096
  Name: a2-highgpu-8g
  API Reference: a2-highgpu-8g
  Display Name: a2-highgpu-8g
  Description: Accelerator Optimized: 8 NVIDIA Tesla A100 GPUs, 96 vCPUs, 680GB RAM
  Family: a2
  Status: active
  Observed At: 2026-02-11T23:04:59.698551

CPU
  vCPUs: 96
  CPU Allocation: Dedicated
  CPU Cores: 48
  CPU Speed: 2.2 GHz
  CPU Architecture: x86_64
  CPU Manufacturer: Intel
  CPU Family: Xeon
  CPU L1 Cache: 2 MiB
  CPU L2 Cache: 48 MiB
  CPU L3 Cache: 77 MiB
  CPU Flags: fpu, vme, de, pse, tsc, msr, pae, mce, cx8, apic, sep, mtrr, pge, mca, cmov, pat, pse36, clflush, mmx, fxsr, sse, sse2, ss, ht, syscall, nx, pdpe1gb, rdtscp, lm, constant_tsc, rep_good, nopl, xtopology, nonstop_tsc, cpuid, tsc_known_freq, pni, pclmulqdq, ssse3, fma, cx16, pcid, sse4_1, sse4_2, x2apic, movbe, popcnt, aes, xsave, avx, f16c, rdrand, hypervisor, lahf_lm, abm, 3dnowprefetch, ssbd, ibrs, ibpb, stibp, ibrs_enhanced, fsgsbase, tsc_adjust, bmi1, hle, avx2, smep, bmi2, erms, invpcid, rtm, mpx, avx512f, avx512dq, rdseed, adx, smap, clflushopt, clwb, avx512cd, avx512bw, avx512vl, xsaveopt, xsavec, xgetbv1, xsaves, arat, avx512_vnni, md_clear, arch_capabilities
Memory
  Memory Amount: 680 GiB

GPU
  GPU Count: 8
  GPU Memory Min: 40 GiB
  GPU Memory Total: 320 GiB
  GPU Manufacturer: NVIDIA
  GPU Family: Ampere
  GPU Model: A100
  GPUs:
  • 8 × NVIDIA A100-SXM4-40GB (Ampere; memory amount: 40960 MiB each; firmware version: 535.183.01; BIOS version: 92.00.45.00.03; clock rate: 1410 MHz)
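The aggregate GPU memory figures can be cross-checked against the per-device listing, which reports 40960 MiB for each of the 8 identical A100 cards:

```python
# Cross-check the listed GPU memory totals from the per-device data.
GPU_COUNT = 8
PER_GPU_MIB = 40960                     # "Memory amount" per A100-SXM4-40GB entry

per_gpu_gib = PER_GPU_MIB / 1024        # 40.0  -> matches "GPU Memory Min: 40 GiB"
total_gib = GPU_COUNT * per_gpu_gib     # 320.0 -> matches "GPU Memory Total: 320 GiB"
print(per_gpu_gib, total_gib)
```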
Storage
  Storage Size: 0 GB (no local storage included)

Network
  Inbound Traffic: 0 GB/month
  Outbound Traffic: 0 GB/month
  IPv4: 0

Availability

Region / ID                           Spot            On-demand
Council Bluffs (US) / us-central1     2.958 USD/h     5.9158 USD/h
The Dalles (US) / us-west1            2.958 USD/h     5.9158 USD/h
Moncks Corner (US) / us-east1         2.958 USD/h     5.9158 USD/h
Tel Aviv (IL) / me-west1              2.5541 USD/h    6.5074 USD/h
Eemshaven (NL) / europe-west4         2.5459 USD/h    6.5125 USD/h
Las Vegas (US) / us-west4             2.6645 USD/h    6.6624 USD/h
Salt Lake City (US) / us-west3        3.0891 USD/h    7.1056 USD/h
Jurong West (SG) / asia-southeast1    3.5322 USD/h    7.2976 USD/h
Seoul (KR) / asia-northeast3          2.7381 USD/h    7.5842 USD/h
Tokyo (JP) / asia-northeast1          3.1835 USD/h    7.5842 USD/h
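The cheapest region differs between spot and on-demand pricing. A small sketch (prices hard-coded from the availability table above) to find each, and the spot discount in the cheapest spot region:

```python
# (spot, on-demand) hourly prices in USD per region, copied from the table.
prices = {
    "us-central1":     (2.9580, 5.9158),
    "us-west1":        (2.9580, 5.9158),
    "us-east1":        (2.9580, 5.9158),
    "me-west1":        (2.5541, 6.5074),
    "europe-west4":    (2.5459, 6.5125),
    "us-west4":        (2.6645, 6.6624),
    "us-west3":        (3.0891, 7.1056),
    "asia-southeast1": (3.5322, 7.2976),
    "asia-northeast3": (2.7381, 7.5842),
    "asia-northeast1": (3.1835, 7.5842),
}

cheapest_spot = min(prices, key=lambda r: prices[r][0])   # europe-west4
cheapest_od = min(prices, key=lambda r: prices[r][1])     # a US region at 5.9158
spot, od = prices[cheapest_spot]
print(cheapest_spot, cheapest_od)
print(f"spot discount in {cheapest_spot}: {1 - spot / od:.0%}")   # ~61%
```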

Economics

Average Price per Region

Prices per Zone

Lowest Prices

Performance

Alternatives

Servers of the Same Family

Instance          vCPUs   Memory      GPUs
a2-highgpu-1g     12      85 GiB      1
a2-ultragpu-1g    12      170 GiB     1
a2-highgpu-2g     24      170 GiB     2
a2-ultragpu-2g    24      340 GiB     2
a2-highgpu-4g     48      340 GiB     4
a2-ultragpu-4g    48      680 GiB     4
a2-ultragpu-8g    96      1360 GiB    8
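The a2 family scales linearly with GPU count: every size carries 12 vCPUs per GPU, with highgpu sizes at 85 GiB of memory per GPU and ultragpu sizes at 170 GiB per GPU. A quick check, with the family table values (plus this server) hard-coded:

```python
# (instance, vCPUs, memory GiB, GPUs) from the family table and this listing.
family = [
    ("a2-highgpu-1g",  12,   85, 1),
    ("a2-highgpu-2g",  24,  170, 2),
    ("a2-highgpu-4g",  48,  340, 4),
    ("a2-highgpu-8g",  96,  680, 8),
    ("a2-ultragpu-1g", 12,  170, 1),
    ("a2-ultragpu-2g", 24,  340, 2),
    ("a2-ultragpu-4g", 48,  680, 4),
    ("a2-ultragpu-8g", 96, 1360, 8),
]

for name, vcpus, mem_gib, gpus in family:
    assert vcpus == 12 * gpus                                   # 12 vCPUs per GPU
    assert mem_gib == (85 if "highgpu" in name else 170) * gpus # GiB per GPU
print("a2 family scales linearly per GPU")
```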

Similar Servers

a2-highgpu-8g FAQs

  • What is a2-highgpu-8g?
  • What are the specs of the a2-highgpu-8g server?
  • How fast is the a2-highgpu-8g server?
  • How much does the a2-highgpu-8g server cost?
  • Who is the provider of the a2-highgpu-8g server?
  • Where is the a2-highgpu-8g server available?
  • Are there any other sized servers in the a2 server family?
  • What other servers offer similar performance to a2-highgpu-8g?