View of server racks in a modern data center, showcasing networking equipment and infrastructure.

Snowflake, AWS, NVIDIA spark Data Cloud AI shift

Snowflake, Amazon Web Services, and NVIDIA unveiled a partnership Monday to bring GPU-accelerated computing directly to where enterprise data is stored, reducing the security risks that come with moving sensitive information between systems. The collaboration centers on AWS’s new EC2 G7e instances, powered by NVIDIA’s latest Blackwell GPUs, which Snowflake will integrate into its AI Data Cloud to accelerate artificial intelligence workloads while maintaining strict data governance.

The new Amazon EC2 G7e instances are now generally available in the US East (N. Virginia) and US East (Ohio) AWS Regions, according to AWS News Blog, offering enterprises immediate access through On-Demand, Savings Plans, or Spot Instance models. Each instance can be configured with up to eight NVIDIA RTX PRO 6000 Blackwell Server Edition GPUs, delivering a total of 768 GB of GPU memory per instance.
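Dividing the published totals, 768 GB across eight GPUs implies 96 GB of memory per RTX PRO 6000 Blackwell GPU. A minimal sketch of requesting such an instance via boto3 might look like the following; note that the instance size name `g7e.48xlarge` and the AMI placeholder are illustrative assumptions, not specifics confirmed by AWS:

```python
# Illustrative sketch: deriving per-GPU memory from the published totals
# and building boto3 RunInstances parameters for a G7e instance.
# The size name "g7e.48xlarge" is an assumption for illustration.

GPUS_PER_INSTANCE = 8       # up to eight RTX PRO 6000 Blackwell GPUs
TOTAL_GPU_MEMORY_GB = 768   # total GPU memory per instance (per AWS)

per_gpu_memory_gb = TOTAL_GPU_MEMORY_GB // GPUS_PER_INSTANCE
print(per_gpu_memory_gb)  # 96 GB per GPU

# Hypothetical request parameters; an actual launch requires valid AWS
# credentials and a real AMI ID (e.g. a Deep Learning AMI).
run_instances_params = {
    "InstanceType": "g7e.48xlarge",  # assumed size name
    "ImageId": "ami-xxxxxxxxxxxx",   # placeholder AMI ID
    "MinCount": 1,
    "MaxCount": 1,
}
# import boto3
# ec2 = boto3.client("ec2", region_name="us-east-1")
# ec2.run_instances(**run_instances_params)
```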


The technical specifications represent a significant leap in enterprise AI capabilities. According to NVIDIA, the Blackwell GPUs feature fifth-generation Tensor Cores with new FP4 precision for enhanced AI performance and fourth-generation RT Cores for accelerated graphics rendering. The architecture includes PCI Express Gen 5 for rapid data transfer and supports Multi-Instance GPU technology, allowing a single GPU to be partitioned into up to four fully isolated instances.
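The practical appeal of FP4 is memory footprint: dropping from FP16’s 2 bytes per weight to FP4’s half byte roughly quarters the memory a model’s weights occupy. A back-of-the-envelope sketch, where the 70-billion-parameter model size is an illustrative assumption rather than a figure from the announcement:

```python
# Back-of-the-envelope memory math for storing model weights at
# different precisions. The 70B model size is an illustrative
# assumption; real deployments also need memory for activations,
# KV cache, and runtime overhead.

BYTES_PER_WEIGHT = {"fp16": 2.0, "fp8": 1.0, "fp4": 0.5}

def weight_memory_gb(num_params: float, precision: str) -> float:
    """Approximate GB needed to store the model weights alone."""
    return num_params * BYTES_PER_WEIGHT[precision] / 1e9

params = 70e9  # hypothetical 70B-parameter model
print(weight_memory_gb(params, "fp16"))  # 140.0 GB -- exceeds one 96 GB GPU
print(weight_memory_gb(params, "fp4"))   # 35.0 GB -- fits on a single GPU
```

Under these assumptions, a model that would need two GPUs at FP16 fits comfortably on one GPU at FP4, which is the kind of consolidation the new Tensor Cores are aimed at.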

Market Impact


The partnership addresses a critical challenge in enterprise AI adoption: the security risks and performance penalties of moving sensitive data between systems. By bringing GPU computing directly to where data resides in Snowflake’s environment, companies can process everything from large language model training to engineering simulations without exposing their information to additional security vulnerabilities.


This architectural approach could reshape how enterprises deploy AI workloads. Rather than transferring data between different cloud services or virtual private clouds, organizations can now keep their data within Snowflake’s established governance framework while accessing the computational power needed for demanding AI applications, including generative AI, agentic AI systems, and professional 3D rendering.

Security and Compliance Advantages

The integration maintains Snowflake’s existing security certifications, including SOC 2 Type II, HIPAA, and GDPR compliance, according to the companies. The NVIDIA Multi-Instance GPU feature provides hardware-level isolation, preventing workloads from different users or departments from interfering with one another, a critical requirement for enterprises running multiple sensitive AI applications on shared infrastructure.


AWS plans to expand software support beyond current Deep Learning AMIs, Amazon Elastic Container Service, and Amazon Elastic Kubernetes Service, with Amazon SageMaker integration planned for the future, according to AWS News Blog.

Sources

  • aws.amazon.com/blogs/aws
  • nvidia.com