AWS: example of using a SageMaker Jupyter notebook for a simple training run

This is an example procedure for a simple deep learning case using a Jupyter notebook on AWS SageMaker. I followed the tutorial in ref. [1].

Summary of the procedure

  • The deep learning case used here is bank data containing personal customer information.
  • The AWS SageMaker components involved are creation of a SageMaker Jupyter notebook instance, use of an EC2 instance for training, and creation of S3 bucket objects for the training/test input data and the output model.
  • Input data loading and deep learning model building.

SageMaker Jupyter notebook instance creation
An AWS SageMaker Jupyter notebook instance is created using the ml.t3.medium instance type, as sketched below.
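A notebook instance of this type can also be created programmatically with the boto3 SageMaker client. This is a minimal sketch, assuming boto3 credentials are already configured; the notebook name and IAM role ARN are placeholders, not values from the tutorial.

```python
# Minimal sketch (not from the tutorial): creating a SageMaker notebook
# instance of type ml.t3.medium with the boto3 SageMaker client.
import boto3

sm = boto3.client("sagemaker")

sm.create_notebook_instance(
    NotebookInstanceName="bank-data-notebook",  # placeholder name
    InstanceType="ml.t3.medium",                # instance type used in this example
    RoleArn="arn:aws:iam::123456789012:role/SageMakerExecutionRole",  # placeholder role ARN
)
```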

S3 bucket creation for data
An S3 bucket object is created for the training/test input data and the model output data. Here the boto3 module is used to access S3; a sketch follows.
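A minimal sketch of this step, assuming boto3 is available in the notebook; the bucket name, key prefixes, and local file names are placeholders rather than the values used in ref. [3].

```python
# Minimal sketch: create an S3 bucket and upload the training/test CSV files.
# Bucket name, prefixes, and file names are illustrative placeholders.
import boto3

bucket_name = "sagemaker-bank-data-example"  # placeholder bucket name
s3 = boto3.resource("s3")

# Note: regions other than us-east-1 also require a CreateBucketConfiguration.
s3.create_bucket(Bucket=bucket_name)

# Upload the local training and test files under a common prefix.
s3.Bucket(bucket_name).upload_file("train.csv", "xgboost/train/train.csv")
s3.Bucket(bucket_name).upload_file("test.csv", "xgboost/test/test.csv")
```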

Data loading
The pandas module is used to parse the training and test data, and the
SageMaker input wrapper (the sagemaker.inputs module in SDK v2) is used to load the data into the model.
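A minimal sketch of the data-loading step, assuming SageMaker Python SDK v2, where the input wrapper is sagemaker.inputs.TrainingInput; the file paths, S3 URIs, and content type are placeholders.

```python
# Minimal sketch: parse the data with pandas and wrap the S3 locations
# as input channels for the estimator. Paths and URIs are placeholders.
import pandas as pd
from sagemaker.inputs import TrainingInput

# Parse local copies of the training and test data.
train_df = pd.read_csv("train.csv")
test_df = pd.read_csv("test.csv")

# Point the training/validation channels at the uploaded S3 objects.
s3_train = TrainingInput(
    s3_data="s3://sagemaker-bank-data-example/xgboost/train/",
    content_type="text/csv",
)
s3_test = TrainingInput(
    s3_data="s3://sagemaker-bank-data-example/xgboost/test/",
    content_type="text/csv",
)
```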

Model building
The SageMaker built-in XGBoost algorithm [2] is used to build the model, and
an ml.m5.2xlarge compute instance is used for the training, as sketched below.
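A minimal sketch of the training step with the built-in XGBoost container [2] on an ml.m5.2xlarge instance; the hyperparameters, output path, and container version are assumptions, not necessarily the values used in ref. [3].

```python
# Minimal sketch: train with the SageMaker built-in XGBoost algorithm on an
# ml.m5.2xlarge instance. Hyperparameters, paths, and version are placeholders.
import sagemaker
from sagemaker import image_uris
from sagemaker.estimator import Estimator

session = sagemaker.Session()
role = sagemaker.get_execution_role()  # IAM role attached to the notebook instance

# Resolve the built-in XGBoost container image for the current region.
container = image_uris.retrieve("xgboost", session.boto_region_name, version="1.5-1")

xgb = Estimator(
    image_uri=container,
    role=role,
    instance_count=1,
    instance_type="ml.m5.2xlarge",  # training instance used in this example
    output_path="s3://sagemaker-bank-data-example/xgboost/output/",
    sagemaker_session=session,
)

# Example hyperparameters for a binary classification objective (placeholders).
xgb.set_hyperparameters(objective="binary:logistic", num_round=100)

# Launch the training job using the channels defined in the data-loading step.
xgb.fit({"train": s3_train, "validation": s3_test})
```

When the job completes, the trained model artifact is written to the S3 output path given to the estimator.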

The full code used in this example is shown in ref. [3].

References

[1] SageMaker introduction: https://gmoein.github.io/files/Amazon%20SageMaker.pdf
[2] SageMaker XGBoost documentation: Use the XGBoost algorithm with Amazon SageMaker - Amazon SageMaker
[3] GitHub: dchoi/sagemakerModelPractice