
p3.16xlarge by Amazon Web Services

p3.16xlarge is a GPU-accelerated Gen3 16xlarge server offered by Amazon Web Services with 64 vCPUs, 488 GiB of memory, 8 GPUs, and no local storage. Pricing starts at 5.578 USD per hour.
  • 64 vCPUs
  • 488 GiB memory
  • 8 GPUs
  • SCore: 29310 (all-cores), 861 (single-core)

Specifications

Server Details

General
Vendor ID: aws
Server ID: p3.16xlarge
Name: p3.16xlarge
API Reference: p3.16xlarge
Display Name: p3.16xlarge
Description: GPU accelerated Gen3 16xlarge
Family: p3
Status: active
Observed At: 2024-09-19T18:19:58.560209
CPU
vCPUs: 64
Hypervisor: xen
CPU Allocation: Dedicated
CPU Cores: 32
CPU Speed: 2.3 GHz
CPU Architecture: x86_64
CPU Manufacturer: Intel
CPU Family: Xeon
CPU Model: E5-2686 v4
CPU L1 Cache: 2 MiB
CPU L2 Cache: 8 MiB
CPU L3 Cache: 90 MiB
CPU Flags: fpu, vme, de, pse, tsc, msr, pae, mce, cx8, apic, sep, mtrr, pge, mca, cmov, pat, pse36, clflush, mmx, fxsr, sse, sse2, ht, syscall, nx, pdpe1gb, rdtscp, lm, constant_tsc, arch_perfmon, rep_good, nopl, xtopology, nonstop_tsc, cpuid, aperfmperf, tsc_known_freq, pni, pclmulqdq, monitor, est, ssse3, fma, cx16, pcid, sse4_1, sse4_2, x2apic, movbe, popcnt, tsc_deadline_timer, aes, xsave, avx, f16c, rdrand, hypervisor, lahf_lm, abm, 3dnowprefetch, cpuid_fault, pti, fsgsbase, bmi1, hle, avx2, smep, bmi2, erms, invpcid, rtm, rdseed, adx, xsaveopt, ida
Memory
Memory Amount: 488 GiB
GPU
GPU Count: 8
GPU Memory Min: 16 GiB
GPU Memory Total: 128 GiB
GPU Manufacturer: Nvidia
GPU Family: Volta
GPU Model: Tesla V100-SXM2-16GB
GPUs:
  • Nvidia Volta Tesla V100-SXM2-16GB (memory: 16384 MiB, BIOS version: 88.00.4F.00.09, clock rate: 1530 MHz)
  • Nvidia Volta Tesla V100-SXM2-16GB (memory: 16384 MiB, BIOS version: 88.00.4F.00.09, clock rate: 1530 MHz)
  • Nvidia Volta Tesla V100-SXM2-16GB (memory: 16384 MiB, BIOS version: 88.00.4F.00.09, clock rate: 1530 MHz)
  • Nvidia Volta Tesla V100-SXM2-16GB (memory: 16384 MiB, BIOS version: 88.00.4F.00.09, clock rate: 1530 MHz)
  • Nvidia Volta Tesla V100-SXM2-16GB (memory: 16384 MiB, BIOS version: 88.00.4F.00.09, clock rate: 1530 MHz)
  • Nvidia Volta Tesla V100-SXM2-16GB (memory: 16384 MiB, BIOS version: 88.00.4F.00.09, clock rate: 1530 MHz)
  • Nvidia Volta Tesla V100-SXM2-16GB (memory: 16384 MiB, BIOS version: 88.00.4F.00.09, clock rate: 1530 MHz)
  • Nvidia Volta Tesla V100-SXM2-16GB (memory: 16384 MiB, BIOS version: 88.00.4F.00.09, clock rate: 1530 MHz)
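To check that the eight advertised V100s are actually visible from inside a running instance, a quick test can be run on the box itself. A minimal sketch, assuming PyTorch with CUDA support is installed on the instance (which is not part of this listing):

```python
# Minimal sketch: confirm the advertised GPU configuration (8x Tesla V100-SXM2-16GB)
# from inside a running p3.16xlarge instance. Assumes PyTorch with CUDA is installed,
# which the listing itself does not imply.
import torch

assert torch.cuda.is_available(), "No CUDA devices visible"

count = torch.cuda.device_count()
print(f"GPUs visible: {count}")  # expected: 8

for i in range(count):
    props = torch.cuda.get_device_properties(i)
    # total_memory is reported in bytes; roughly 16 GiB per card is expected here
    print(f"  GPU {i}: {props.name}, {props.total_memory / 2**30:.1f} GiB")
```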
Storage
Storage Size: 0 GB
Storages: (none)
Network
Network Speed: 5 Gbps
Inbound Traffic: 0 GB/month
Outbound Traffic: 0 GB/month
IPv4: 0
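The hardware details listed above can also be retrieved programmatically from AWS. A minimal sketch using boto3 and the EC2 DescribeInstanceTypes API; boto3, configured AWS credentials, and the chosen region are assumptions, not part of this listing:

```python
# Minimal sketch: fetch the published p3.16xlarge hardware details via the
# EC2 DescribeInstanceTypes API. Assumes boto3 is installed and AWS credentials
# are configured; the region choice here is arbitrary.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")
resp = ec2.describe_instance_types(InstanceTypes=["p3.16xlarge"])
it = resp["InstanceTypes"][0]

print("vCPUs:     ", it["VCpuInfo"]["DefaultVCpus"])   # 64
print("Memory MiB:", it["MemoryInfo"]["SizeInMiB"])    # 488 GiB = 499712 MiB
for gpu in it["GpuInfo"]["Gpus"]:
    print("GPU:", gpu["Manufacturer"], gpu["Name"],
          gpu["Count"], "x", gpu["MemoryInfo"]["SizeInMiB"], "MiB")
```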

Availability

Region / ID                           Spot              On-demand
Northern Virginia (US) / us-east-1    7.914925 USD/h    24.48 USD/h
Oregon (US) / us-west-2               8.362733 USD/h    24.48 USD/h
Ohio (US) / us-east-2                 9.1879 USD/h      24.48 USD/h
Dublin (IE) / eu-west-1               11.67605 USD/h    26.44 USD/h
Quebec (CA) / ca-central-1            14.8125 USD/h     26.928 USD/h
London (GB) / eu-west-2               21.28165 USD/h    28.712 USD/h
Frankfurt (DE) / eu-central-1         8.5751 USD/h      30.584 USD/h
Tokyo (JP) / ap-northeast-1           12.31485 USD/h    33.552 USD/h
Seoul (KR) / ap-northeast-2           19.725 USD/h      33.872 USD/h
Singapore (SG) / ap-southeast-1       12.62525 USD/h    33.872 USD/h
Sydney (AU) / ap-southeast-2          13.2276 USD/h     33.872 USD/h
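Spot prices like those in the table above change frequently, so a current quote is best pulled from the EC2 DescribeSpotPriceHistory API at request time. A minimal sketch, again assuming boto3 and configured AWS credentials:

```python
# Minimal sketch: query recent spot prices for p3.16xlarge in one region via the
# EC2 DescribeSpotPriceHistory API. Spot prices vary by availability zone and over
# time, so output will differ from the table above.
from datetime import datetime, timezone
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")
resp = ec2.describe_spot_price_history(
    InstanceTypes=["p3.16xlarge"],
    ProductDescriptions=["Linux/UNIX"],
    StartTime=datetime.now(timezone.utc),  # asking for "now" yields the latest price per AZ
)
for entry in resp["SpotPriceHistory"][:10]:
    print(entry["AvailabilityZone"], entry["SpotPrice"], "USD/h")
```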

Economics

Charts: Average Price per Region, Prices per Zone, Lowest Prices.

Performance

Alternatives

Servers of the Same Family

Instance      vCPUs   Memory    GPUs
p3.2xlarge    8       61 GiB    1
p3.8xlarge    32      244 GiB   4

Similar Servers

Instance   Vendor   vCPUs   Memory   GPUs

p3.16xlarge FAQs

  • What is p3.16xlarge?
  • What are the specs of the p3.16xlarge server?
  • How fast is the p3.16xlarge server?
  • How much does the p3.16xlarge server cost?
  • Who is the provider of the p3.16xlarge server?
  • Where is the p3.16xlarge server available?
  • Are there any other sized servers in the p3 server family?
  • What other servers offer similar performance to p3.16xlarge?