
a2-highgpu-8g by Google Cloud Platform

a2-highgpu-8g is an Accelerator Optimized server offered by Google Cloud Platform with 8 NVIDIA Tesla A100 GPUs, 96 vCPUs, 680 GiB of memory, and no included local storage. Pricing starts at 2.36604 USD per hour.
96 vCPUs
680 GiB Memory
8 GPUs
Spare SCore: 75746 (all-cores), 1199 (single-core)
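For orientation, the sketch below shows one way to provision a VM of this machine type with the google-cloud-compute Python client. The project ID, zone, VM name, boot image and disk size are illustrative placeholders, not values from this page; running it also requires GCP credentials and A100 quota in the chosen zone.

```python
# Minimal sketch: create an a2-highgpu-8g VM with the google-cloud-compute
# client library. Project, zone, name, image and disk size are placeholders.
from google.cloud import compute_v1

PROJECT = "my-gcp-project"   # placeholder project ID
ZONE = "us-central1-a"       # a zone in one of the regions listed below
NAME = "a100-training-vm"    # placeholder VM name


def create_a2_highgpu_8g() -> compute_v1.Instance:
    boot_disk = compute_v1.AttachedDisk(
        boot=True,
        auto_delete=True,
        initialize_params=compute_v1.AttachedDiskInitializeParams(
            source_image="projects/debian-cloud/global/images/family/debian-12",
            disk_size_gb=200,
        ),
    )
    instance = compute_v1.Instance(
        name=NAME,
        machine_type=f"zones/{ZONE}/machineTypes/a2-highgpu-8g",
        disks=[boot_disk],
        network_interfaces=[
            compute_v1.NetworkInterface(network="global/networks/default")
        ],
        # GPU VMs cannot be live-migrated, so host maintenance must terminate the VM.
        scheduling=compute_v1.Scheduling(on_host_maintenance="TERMINATE"),
    )
    client = compute_v1.InstancesClient()
    operation = client.insert(project=PROJECT, zone=ZONE, instance_resource=instance)
    operation.result()  # block until the insert operation completes
    return client.get(project=PROJECT, zone=ZONE, instance=NAME)
```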

Specifications

Server Details

General
Vendor ID
gcp
Server ID
1000096
Name
a2-highgpu-8g
API Reference
a2-highgpu-8g
Display Name
a2-highgpu-8g
Description
Accelerator Optimized: 8 NVIDIA Tesla A100 GPUs, 96 vCPUs, 680GB RAM
Family
a2
Status
active
Observed At
2024-09-19T22:21:34.489876
CPU
vCPUs
96
CPU Allocation
Dedicated
CPU Speed
2.2 GHz
CPU Architecture
x86_64
CPU Manufacturer
Intel
CPU Family
Xeon
CPU L1 Cache
3 MiB
CPU L2 Cache
48 MiB
CPU L3 Cache
77 MiB
CPU Flags
fpu, vme, de, pse, tsc, msr, pae, mce, cx8, apic, sep, mtrr, pge, mca, cmov, pat, pse36, clflush, mmx, fxsr, sse, sse2, ss, ht, syscall, nx, pdpe1gb, rdtscp, lm, constant_tsc, rep_good, nopl, xtopology, nonstop_tsc, cpuid, tsc_known_freq, pni, pclmulqdq, ssse3, fma, cx16, pcid, sse4_1, sse4_2, x2apic, movbe, popcnt, aes, xsave, avx, f16c, rdrand, hypervisor, lahf_lm, abm, 3dnowprefetch, ssbd, ibrs, ibpb, stibp, ibrs_enhanced, fsgsbase, tsc_adjust, bmi1, hle, avx2, smep, bmi2, erms, invpcid, rtm, mpx, avx512f, avx512dq, rdseed, adx, smap, clflushopt, clwb, avx512cd, avx512bw, avx512vl, xsaveopt, xsavec, xgetbv1, xsaves, arat, avx512_vnni, md_clear, arch_capabilities
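The flag list above can be checked from inside a running guest; below is a minimal, Linux-only sketch that parses /proc/cpuinfo for a few of the listed features.

```python
# Quick check of selected CPU feature flags on a Linux guest by parsing
# /proc/cpuinfo; the flags of interest are taken from the list above.
def cpu_flags() -> set[str]:
    with open("/proc/cpuinfo") as f:
        for line in f:
            if line.startswith("flags"):
                return set(line.split(":", 1)[1].split())
    return set()


if __name__ == "__main__":
    flags = cpu_flags()
    for wanted in ("avx2", "avx512f", "avx512_vnni", "aes"):
        print(f"{wanted}: {'present' if wanted in flags else 'missing'}")
```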
Memory
Memory Amount
680 GiB
GPU
GPU Count
8
GPU Manufacturer
Nvidia
GPU Family
Ampere
GPU Model
NVIDIA A100-SXM4-40GB
GPUs
  • Nvidia Ampere NVIDIA A100-SXM4-40GB (Memory amount: 40960 MiB, Firmware version: 535.183.01, BIOS version: 92.00.45.00.03, Clock rate: 1410 MHz)
  • Nvidia Ampere NVIDIA A100-SXM4-40GB (Memory amount: 40960 MiB, Firmware version: 535.183.01, BIOS version: 92.00.45.00.03, Clock rate: 1410 MHz)
  • Nvidia Ampere NVIDIA A100-SXM4-40GB (Memory amount: 40960 MiB, Firmware version: 535.183.01, BIOS version: 92.00.45.00.03, Clock rate: 1410 MHz)
  • Nvidia Ampere NVIDIA A100-SXM4-40GB (Memory amount: 40960 MiB, Firmware version: 535.183.01, BIOS version: 92.00.45.00.03, Clock rate: 1410 MHz)
  • Nvidia Ampere NVIDIA A100-SXM4-40GB (Memory amount: 40960 MiB, Firmware version: 535.183.01, BIOS version: 92.00.45.00.03, Clock rate: 1410 MHz)
  • Nvidia Ampere NVIDIA A100-SXM4-40GB (Memory amount: 40960 MiB, Firmware version: 535.183.01, BIOS version: 92.00.45.00.03, Clock rate: 1410 MHz)
  • Nvidia Ampere NVIDIA A100-SXM4-40GB (Memory amount: 40960 MiB, Firmware version: 535.183.01, BIOS version: 92.00.45.00.03, Clock rate: 1410 MHz)
  • Nvidia Ampere NVIDIA A100-SXM4-40GB (Memory amount: 40960 MiB, Firmware version: 535.183.01, BIOS version: 92.00.45.00.03, Clock rate: 1410 MHz)
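On a running instance, the GPU inventory above can be confirmed with NVML; the sketch below assumes the NVIDIA driver and the pynvml package are installed on the VM.

```python
# Enumerate GPUs via NVML (pynvml) and print model, memory, and max SM clock.
import pynvml

pynvml.nvmlInit()
try:
    count = pynvml.nvmlDeviceGetCount()  # expected to report 8 on a2-highgpu-8g
    print(f"GPUs detected: {count}")
    for i in range(count):
        handle = pynvml.nvmlDeviceGetHandleByIndex(i)
        name = pynvml.nvmlDeviceGetName(handle)
        if isinstance(name, bytes):  # older pynvml versions return bytes
            name = name.decode()
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        sm_clock = pynvml.nvmlDeviceGetMaxClockInfo(handle, pynvml.NVML_CLOCK_SM)
        print(f"GPU {i}: {name}, {mem.total // 2**20} MiB, max SM clock {sm_clock} MHz")
finally:
    pynvml.nvmlShutdown()
```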
Storage
Storage Size
0 GB
Storages
Network
Inbound Traffic
0 GB/month
Outbound Traffic
0 GB/month
IPv4
0

Availability

REGION / ID | SPOT | ON-DEMAND
Council Bluffs (US) / us-central1 | 2.36604 USD/h | 5.91582 USD/h
The Dalles (US) / us-west1 | 2.36604 USD/h | 5.91582 USD/h
Moncks Corner (US) / us-east1 | 2.36604 USD/h | 5.91582 USD/h
Tel Aviv (IL) / me-west1 | 2.60288 USD/h | 6.5074 USD/h
Eemshaven (NL) / europe-west4 | 2.6052 USD/h | 6.51251 USD/h
Las Vegas (US) / us-west4 | 2.66448 USD/h | 6.66236 USD/h
Salt Lake City (US) / us-west3 | 2.55808 USD/h | 7.10564 USD/h
Jurong West (SG) / asia-southeast1 | 2.85392 USD/h | 7.29758 USD/h
Seoul (KR) / asia-northeast3 | 2.6686 USD/h | 7.58425 USD/h
Tokyo (JP) / asia-northeast1 | 3.03424 USD/h | 7.58425 USD/h
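For a rough sense of scale, the us-central1 hourly prices above translate into the approximate monthly figures below, assuming a 730-hour month and ignoring sustained-use or committed-use discounts.

```python
# Back-of-the-envelope monthly cost from the us-central1 hourly prices above.
HOURS_PER_MONTH = 730
spot_hourly = 2.36604
on_demand_hourly = 5.91582

spot_monthly = spot_hourly * HOURS_PER_MONTH            # ~1727 USD
on_demand_monthly = on_demand_hourly * HOURS_PER_MONTH  # ~4319 USD
savings = 1 - spot_hourly / on_demand_hourly            # ~60% cheaper on spot

print(f"Spot:      {spot_monthly:,.2f} USD/month")
print(f"On-demand: {on_demand_monthly:,.2f} USD/month")
print(f"Spot discount vs on-demand: {savings:.0%}")
```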

Economics

Average Price per Region

Prices per Zone

Lowest Prices

Performance

Alternatives

Servers of the Same Family

INSTANCE | vCPUs | MEMORY | GPUs
a2-highgpu-1g | 12 | 85 GiB | 1
a2-ultragpu-1g | 12 | 170 GiB | 1
a2-highgpu-2g | 24 | 170 GiB | 2
a2-ultragpu-2g | 24 | 340 GiB | 2
a2-highgpu-4g | 48 | 340 GiB | 4
a2-ultragpu-4g | 48 | 680 GiB | 4
a2-ultragpu-8g | 96 | 1360 GiB | 8
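The family table above can also be used programmatically; the illustrative helper below (values copied from this page, with a2-highgpu-8g added) picks the smallest a2 machine type satisfying GPU and memory requirements.

```python
# a2 family sizes as listed on this page: (vCPUs, memory in GiB, GPU count).
A2_FAMILY = {
    "a2-highgpu-1g":  (12, 85, 1),
    "a2-ultragpu-1g": (12, 170, 1),
    "a2-highgpu-2g":  (24, 170, 2),
    "a2-ultragpu-2g": (24, 340, 2),
    "a2-highgpu-4g":  (48, 340, 4),
    "a2-ultragpu-4g": (48, 680, 4),
    "a2-highgpu-8g":  (96, 680, 8),
    "a2-ultragpu-8g": (96, 1360, 8),
}


def smallest_fit(min_gpus: int, min_memory_gib: int = 0) -> str | None:
    """Return the smallest a2 machine type meeting the GPU/memory requirements."""
    candidates = [
        (vcpus, mem, name)
        for name, (vcpus, mem, gpus) in A2_FAMILY.items()
        if gpus >= min_gpus and mem >= min_memory_gib
    ]
    return min(candidates)[2] if candidates else None


print(smallest_fit(min_gpus=8))                      # a2-highgpu-8g
print(smallest_fit(min_gpus=4, min_memory_gib=500))  # a2-ultragpu-4g
```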

Similar Servers


a2-highgpu-8g FAQs

What is a2-highgpu-8g?

a2-highgpu-8g is an Accelerator Optimized machine type offered by Google Cloud Platform with 8 NVIDIA Tesla A100 GPUs, 96 vCPUs and 680 GiB of RAM.

What are the specs of the a2-highgpu-8g server?

It comes with 96 dedicated x86_64 vCPUs (Intel Xeon, 2.2 GHz), 680 GiB of memory, 8 NVIDIA A100-SXM4-40GB GPUs (40960 MiB each) and no included local storage.

How fast is the a2-highgpu-8g server?

Its Spare SCore benchmark results are 75746 for all cores and 1199 for a single core.

How much does the a2-highgpu-8g server cost?

Pricing starts at 2.36604 USD/h for spot capacity and 5.91582 USD/h on demand; exact rates vary by region, as shown in the Availability table above.

Who is the provider of the a2-highgpu-8g server?

Google Cloud Platform (vendor ID: gcp).

Where is the a2-highgpu-8g server available?

It is offered in us-central1, us-west1, us-east1, us-west3, us-west4, me-west1, europe-west4, asia-southeast1, asia-northeast1 and asia-northeast3.

Are there any other sized servers in the a2 server family?

Yes. The a2 family also includes a2-highgpu-1g, a2-highgpu-2g and a2-highgpu-4g, as well as the a2-ultragpu variants with 1, 2, 4 or 8 GPUs.

What other servers offer similar performance to a2-highgpu-8g?

No similar servers are listed on this page.