When working with AI development, AWS customers often need to restrict outbound and inbound internet traffic because of the sensitive data they handle. Transmitting such data across the public internet is typically not secure enough, so accessing AWS services without leaving the AWS network enhances security. AWS users can strengthen their AI development environments by creating Amazon SageMaker notebook instances inside a virtual private cloud (VPC) with direct internet access disabled. However, disabling internet access also blocks API calls to other AWS services, which is a challenge for developers building production architectures that depend on numerous AWS services.
The solution presented configures SageMaker notebook instances to connect to AWS services such as Amazon Bedrock through AWS PrivateLink and Amazon Elastic Compute Cloud (Amazon EC2) security groups. The security groups must be created before the SageMaker notebook instance, because the instance's network settings, including its security groups, cannot be changed after the instance is created. Two groups are needed: one for outbound traffic from the notebook instance and another for inbound traffic to the VPC endpoint, as sketched below.
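The post does not prescribe a specific way to create the groups; as one possibility, the following boto3 sketch creates both groups and allows HTTPS from the notebook group into the endpoint group. The region, VPC ID, and group names are placeholder assumptions.

```python
import boto3

# Placeholder region and VPC ID; substitute your own values.
ec2 = boto3.client("ec2", region_name="us-east-1")
vpc_id = "vpc-0123456789abcdef0"

# Security group attached to the SageMaker notebook instance (outbound side)
notebook_sg = ec2.create_security_group(
    GroupName="sagemaker-notebook-sg",
    Description="Outbound security group for the SageMaker notebook instance",
    VpcId=vpc_id,
)["GroupId"]

# Security group attached to the VPC interface endpoint (inbound side)
endpoint_sg = ec2.create_security_group(
    GroupName="bedrock-endpoint-sg",
    Description="Inbound security group for the VPC interface endpoint",
    VpcId=vpc_id,
)["GroupId"]

# Allow HTTPS (port 443) from the notebook security group into the endpoint security group
ec2.authorize_security_group_ingress(
    GroupId=endpoint_sg,
    IpPermissions=[{
        "IpProtocol": "tcp",
        "FromPort": 443,
        "ToPort": 443,
        "UserIdGroupPairs": [{"GroupId": notebook_sg}],
    }],
)
```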
A SageMaker notebook instance is then created with the outbound security group attached, in the same VPC used to create the two security groups. Next, an interface VPC endpoint is created in Amazon VPC; it uses PrivateLink so that calls from the notebook instance to AWS services never leave the AWS network.
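The post walks through these steps in the console; a programmatic equivalent might look like the boto3 sketch below, where the subnet, VPC ID, security group IDs, IAM role ARN, and region are placeholder assumptions.

```python
import boto3

region = "us-east-1"  # placeholder region
sagemaker = boto3.client("sagemaker", region_name=region)
ec2 = boto3.client("ec2", region_name=region)

# Notebook instance in a private subnet of the same VPC, with direct internet
# access disabled and the outbound security group created earlier attached.
sagemaker.create_notebook_instance(
    NotebookInstanceName="bedrock-private-notebook",
    InstanceType="ml.t3.medium",
    RoleArn="arn:aws:iam::111122223333:role/SageMakerExecutionRole",  # placeholder role
    SubnetId="subnet-0123456789abcdef0",            # placeholder private subnet
    SecurityGroupIds=["sg-0notebookoutbound00000"],  # outbound security group
    DirectInternetAccess="Disabled",
)

# Interface VPC endpoint for the Bedrock runtime; PrivateLink keeps the traffic
# on the AWS network instead of the public internet.
ec2.create_vpc_endpoint(
    VpcEndpointType="Interface",
    VpcId="vpc-0123456789abcdef0",
    ServiceName=f"com.amazonaws.{region}.bedrock-runtime",
    SubnetIds=["subnet-0123456789abcdef0"],
    SecurityGroupIds=["sg-0endpointinbound000000"],  # inbound security group
    PrivateDnsEnabled=True,
)
```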
The configuration can be tested with Python API calls that invoke Amazon Bedrock models. The client must be directed to the VPC endpoint by passing its endpoint URL when the client is instantiated.
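A minimal sketch of such a test, assuming the Bedrock runtime client and an Anthropic Claude model on Bedrock; the endpoint DNS name (copied from the interface endpoint's details page) and the model ID are placeholders.

```python
import json
import boto3

# Point the client at the interface endpoint; this DNS name is a placeholder.
bedrock_runtime = boto3.client(
    "bedrock-runtime",
    region_name="us-east-1",
    endpoint_url="https://vpce-0123456789abcdef0-abcdefgh.bedrock-runtime.us-east-1.vpce.amazonaws.com",
)

# Request body in the model's native format (placeholder model ID).
body = json.dumps({
    "anthropic_version": "bedrock-2023-05-31",
    "max_tokens": 200,
    "messages": [{"role": "user", "content": "Explain AWS PrivateLink in one sentence."}],
})

response = bedrock_runtime.invoke_model(
    modelId="anthropic.claude-3-sonnet-20240229-v1:0",
    body=body,
)
print(json.loads(response["body"].read()))
```

If the call succeeds without the notebook instance having internet access, the traffic is flowing through the VPC endpoint as intended.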
The post also provides steps to clean up the created resources, such as navigating to the notebook instance's configuration page, stopping the instance, and then choosing Delete to remove it.
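Those steps are described for the console; a programmatic cleanup along the same lines could look like the following sketch, with the notebook instance name and endpoint ID as placeholders.

```python
import boto3

sagemaker = boto3.client("sagemaker", region_name="us-east-1")
ec2 = boto3.client("ec2", region_name="us-east-1")

# Stop the notebook instance, wait until it is stopped, then delete it.
sagemaker.stop_notebook_instance(NotebookInstanceName="bedrock-private-notebook")
sagemaker.get_waiter("notebook_instance_stopped").wait(
    NotebookInstanceName="bedrock-private-notebook"
)
sagemaker.delete_notebook_instance(NotebookInstanceName="bedrock-private-notebook")

# Remove the interface endpoint so it no longer accrues charges.
ec2.delete_vpc_endpoints(VpcEndpointIds=["vpce-0123456789abcdef0"])
```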
While the post demonstrates how to connect SageMaker to Amazon Bedrock, the same steps can be replicated for other services, and developers building AI applications on AWS are encouraged to get started.
About the authors:
Francisco Calderon, a Data Scientist at the AWS Generative AI Innovation Center, helps solve business problems for AWS customers using generative AI. Sungmin Hong, an Applied Scientist at the AWS Generative AI Innovation Center, expedites a variety of use cases for AWS customers. Yash Shah, a Science Manager in the AWS Generative AI Innovation Center, leads a team focused on machine learning use cases across healthcare, sports, automotive, and manufacturing. Anila Joshi has more than ten years of experience building AI solutions; as an Applied Science Manager at the AWS Generative AI Innovation Center, she pioneers innovative applications of AI that help customers strategically plot a path to the future of AI.