Selecting Instance Sizes
Choose an instance with enough memory for your workflow. Look for GPU instance types to speed up deep neural network workloads.
![Instance size selector](https://i0.wp.com/www.jetml.com/wp-content/uploads/2022/03/Instance-size-selector.jpg?fit=580%2C466&ssl=1)
Selecting the right instance size
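
If you start a GPU instance, a quick sanity check from a notebook cell confirms the GPU is actually visible to your framework. This is a minimal sketch assuming PyTorch is installed; TensorFlow offers an equivalent check via `tf.config.list_physical_devices("GPU")`.

```python
import torch

# Confirm the GPU attached to the instance is visible to PyTorch.
if torch.cuda.is_available():
    print(f"GPU detected: {torch.cuda.get_device_name(0)}")
else:
    print("No GPU detected; computations will run on the CPU")
```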
| | Small instances (1–4 GB) | Medium instances (8–32 GB) | Large instances (64–512 GB) | GPU instances (16 GB + GPU and 61 GB + GPU) |
|---|---|---|---|---|
| Resources similar to | smartphone | laptop | workstation | graphics-accelerated workstation |
| Jupyter notebook usage | light | heavy | heavy | heavy |
| Good for deep learning (TensorFlow, Keras, PyTorch, images, video, audio) or traditional workloads (CSV, Parquet, databases, scikit-learn, pandas) | traditional | traditional | traditional | deep learning |
| Datasets you can load into memory at once | small datasets (~1–4 GB) | large datasets (~4–32 GB) | large datasets (~32–512 GB) | large datasets (~16–61 GB) |
| Capable of batch processing very large datasets with Python | yes | yes | yes | yes |
| Cost to run per hour | $0.02 to $0.08 | $0.16 to $0.64 | $1.28 to $10.24 | $0.50 to $2.00 |
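
To see which column of the table your current instance falls in, you can check its total memory from a notebook cell. This is a minimal sketch assuming the `psutil` package is available (it is not part of the standard library, though many notebook images preinstall it).

```python
import psutil

# Report total and currently available RAM on this instance, in GB.
mem = psutil.virtual_memory()
print(f"Total memory: {mem.total / 1024**3:.1f} GB")
print(f"Available right now: {mem.available / 1024**3:.1f} GB")
```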
Tip: Running out of memory with your current instance type? Start a larger instance with enough memory to handle your workload, or refactor your code to keep memory usage low, for example by processing large datasets in batches.
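
Below is a minimal sketch of the batch-processing approach the tip describes, using pandas' `chunksize` option so the file never has to fit in memory at once. The file name `events.csv` and the `value` column are purely illustrative.

```python
import pandas as pd

row_count = 0
value_total = 0.0

# chunksize makes read_csv return an iterator of DataFrames instead of
# loading the whole file; only ~100k rows are held in memory at a time.
for chunk in pd.read_csv("events.csv", chunksize=100_000):
    row_count += len(chunk)
    value_total += chunk["value"].sum()  # "value" is a hypothetical column

print(f"Processed {row_count:,} rows; sum of value column = {value_total}")
```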