
a2-highgpu-4g by Google Cloud Platform

a2-highgpu-4g is an accelerator-optimized server offered by Google Cloud Platform with 4 NVIDIA Tesla A100 GPUs, 48 vCPUs, 340 GiB of memory, and no local storage. Pricing starts at 1.18302 USD per hour.
48 vCPU
340 GiB Memory
4 GPU
SCore
37914 (all-cores)
1202 (single-core)

Specifications

Server Details

General
Vendor ID
gcp
Server ID
1000048
Name
a2-highgpu-4g
API Reference
a2-highgpu-4g
Display Name
a2-highgpu-4g
Description
Accelerator Optimized: 4 NVIDIA Tesla A100 GPUs, 48 vCPUs, 340GB RAM
Family
a2
Status
active
Observed At
2024-09-19T22:21:34.489816
CPU
vCPUs
48
CPU Allocation
Dedicated
CPU Speed
2.2 GHz
CPU Architecture
x86_64
CPU Manufacturer
Intel
CPU Family
Xeon
CPU L1 Cache
2 MiB
CPU L2 Cache
24 MiB
CPU L3 Cache
39 MiB
CPU Flags
fpu, vme, de, pse, tsc, msr, pae, mce, cx8, apic, sep, mtrr, pge, mca, cmov, pat, pse36, clflush, mmx, fxsr, sse, sse2, ss, ht, syscall, nx, pdpe1gb, rdtscp, lm, constant_tsc, rep_good, nopl, xtopology, nonstop_tsc, cpuid, tsc_known_freq, pni, pclmulqdq, ssse3, fma, cx16, pcid, sse4_1, sse4_2, x2apic, movbe, popcnt, aes, xsave, avx, f16c, rdrand, hypervisor, lahf_lm, abm, 3dnowprefetch, ssbd, ibrs, ibpb, stibp, ibrs_enhanced, fsgsbase, tsc_adjust, bmi1, hle, avx2, smep, bmi2, erms, invpcid, rtm, mpx, avx512f, avx512dq, rdseed, adx, smap, clflushopt, clwb, avx512cd, avx512bw, avx512vl, xsaveopt, xsavec, xgetbv1, xsaves, arat, avx512_vnni, md_clear, arch_capabilities
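The flag list above can be checked programmatically, for example to confirm AVX-512 support before deploying vectorized workloads. A minimal Python sketch against a space-separated flags string in the `/proc/cpuinfo` format (the excerpt below is taken from the list above; the function name is illustrative):

```python
# AVX-512 subsets listed in the CPU flags for this instance type.
AVX512_FLAGS = {"avx512f", "avx512dq", "avx512cd", "avx512bw", "avx512vl", "avx512_vnni"}

def supports_avx512(flags: str) -> bool:
    """Return True if every expected AVX-512 flag appears in the flag string."""
    return AVX512_FLAGS.issubset(flags.split())

# Excerpt of the flags reported for a2-highgpu-4g:
reported = "sse4_2 avx avx2 avx512f avx512dq avx512cd avx512bw avx512vl avx512_vnni"
print(supports_avx512(reported))  # True
```

On a live instance, the same check can be run against the `flags` line of `/proc/cpuinfo`.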
Memory
Memory Amount
340 GiB
GPU
GPU Count
4
GPU Manufacturer
Nvidia
GPU Family
Ampere
GPU Model
NVIDIA A100-SXM4-40GB
GPUs
  • Nvidia Ampere NVIDIA A100-SXM4-40GB (memory: 40960 MiB, firmware: 535.183.01, BIOS: 92.00.45.00.03, clock: 1410 MHz)
  • Nvidia Ampere NVIDIA A100-SXM4-40GB (memory: 40960 MiB, firmware: 535.183.01, BIOS: 92.00.45.00.03, clock: 1410 MHz)
  • Nvidia Ampere NVIDIA A100-SXM4-40GB (memory: 40960 MiB, firmware: 535.183.01, BIOS: 92.00.45.00.03, clock: 1410 MHz)
  • Nvidia Ampere NVIDIA A100-SXM4-40GB (memory: 40960 MiB, firmware: 535.183.01, BIOS: 92.00.45.00.03, clock: 1410 MHz)
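The four GPUs above each report 40960 MiB of memory, which can be summed to get the machine's aggregate GPU memory. A small Python sketch (the dictionary field names are illustrative, not from any API):

```python
# Per-GPU figures as listed above; memory is in MiB, clock in MHz.
gpus = [
    {"model": "NVIDIA A100-SXM4-40GB", "memory_mib": 40960, "clock_mhz": 1410}
    for _ in range(4)
]

total_mib = sum(g["memory_mib"] for g in gpus)
total_gib = total_mib / 1024
print(total_mib, total_gib)  # 163840 160.0
```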
Storage
Storage Size
0 GB
Storages
Network
Inbound Traffic
0 GB/month
Outbound Traffic
0 GB/month
IPv4
0

Availability

REGION / ID                         SPOT           ON-DEMAND
Council Bluffs (US) / us-central1   1.18302 USD/h  2.95791 USD/h
The Dalles (US) / us-west1          1.18302 USD/h  2.95791 USD/h
Moncks Corner (US) / us-east1       1.18302 USD/h  2.95791 USD/h
Tel Aviv (IL) / me-west1            1.30144 USD/h  3.2537 USD/h
Eemshaven (NL) / europe-west4       1.3026 USD/h   3.25626 USD/h
Las Vegas (US) / us-west4           1.33224 USD/h  3.33118 USD/h
Salt Lake City (US) / us-west3      1.27904 USD/h  3.55282 USD/h
Jurong West (SG) / asia-southeast1  1.42696 USD/h  3.64879 USD/h
Seoul (KR) / asia-northeast3        1.3343 USD/h   3.79212 USD/h
Tokyo (JP) / asia-northeast1        1.51712 USD/h  3.79212 USD/h
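The hourly rates above translate to monthly figures as follows; a sketch using the common 730-hours-per-month approximation and the us-central1 prices from the table (actual GCP billing may differ, and spot prices fluctuate):

```python
HOURS_PER_MONTH = 730  # common approximation: 365 * 24 / 12

spot, ondemand = 1.18302, 2.95791  # USD/h, us-central1

monthly_spot = spot * HOURS_PER_MONTH          # ~863.60 USD/month
monthly_ondemand = ondemand * HOURS_PER_MONTH  # ~2159.27 USD/month
spot_savings = 1 - spot / ondemand             # ~60% cheaper than on-demand
print(round(monthly_spot, 2), round(monthly_ondemand, 2), round(spot_savings, 2))
```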

Economics

(Interactive charts on the original page: Average Price per Region, Prices per Zone, Lowest Prices)

Performance

Alternatives

Servers of the Same Family

INSTANCE        vCPUs  MEMORY    GPUs
a2-highgpu-1g   12     85 GiB    1
a2-ultragpu-1g  12     170 GiB   1
a2-highgpu-2g   24     170 GiB   2
a2-ultragpu-2g  24     340 GiB   2
a2-ultragpu-4g  48     680 GiB   4
a2-highgpu-8g   96     680 GiB   8
a2-ultragpu-8g  96     1360 GiB  8

Similar Servers

INSTANCE  VENDOR  vCPUs  MEMORY  GPUs

a2-highgpu-4g FAQs

What is a2-highgpu-4g?

What are the specs of the a2-highgpu-4g server?

How fast is the a2-highgpu-4g server?

How much does the a2-highgpu-4g server cost?

Who is the provider of the a2-highgpu-4g server?

Where is the a2-highgpu-4g server available?

Are there any other sized servers in the a2 server family?

What other servers offer similar performance to a2-highgpu-4g?