s3#

predict_backend.utils.s3.create_buckets()#
predict_backend.utils.s3.read_data_from_s3(bucket_name, key, **kwargs)#

Read data from s3, letting boto3 locate credentials automatically (e.g. environment credentials, an assumed role, or an EC2 instance IAM role).

Parameters:
  • bucket_name (str) – the s3 bucket name

  • key (str) – the path in s3 from which data will be read

  • kwargs – additional kwargs to pass to s3 client (endpoint_url, region, etc)

Returns:

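As a hedged usage sketch, assuming read_data_from_s3 returns the raw object bytes and that objects written by the write_data_to_s3_* helpers are restored with dill.loads:

```python
def load_object(bucket_name, key, **kwargs):
    """Illustrative helper: fetch an s3 object and unpickle it with dill.

    Assumes read_data_from_s3 returns the raw bytes of the object and
    that the object was written by one of the write_data_to_s3_* helpers.
    """
    import dill  # third-party; mirrors the stdlib pickle dumps/loads API
    from predict_backend.utils.s3 import read_data_from_s3

    raw = read_data_from_s3(bucket_name, key, **kwargs)
    return dill.loads(raw)
```

This is a sketch, not the library's own API; if the stored object was already bytes, skip the dill.loads step.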
predict_backend.utils.s3.read_data_from_s3_using_aws_credentials(aws_access_key_id, aws_secret_access_key, bucket_name, key, **kwargs)#

Read data from s3 using supplied AWS credentials. Objects may need to be unpickled before use, depending on how they were created: the write_data_to_s3_* functions convert objects to binary data by pickling them with dill, so those objects must be deserialized with dill.loads.

Parameters:
  • aws_access_key_id (str) – The access key to use when creating the client

  • aws_secret_access_key (str) – The secret key to use when creating the client

  • bucket_name (str) – the s3 bucket name

  • key (str) – the path in s3 from which data will be read

  • kwargs – additional kwargs to pass to s3 client (endpoint_url, region, etc)

Returns:

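Since the write_data_to_s3_* helpers serialize non-bytes objects with dill, a reader must reverse that step. The round trip looks like this (dill mirrors the stdlib pickle dumps/loads API):

```python
import dill

payload = {"model": "v1", "weights": [0.1, 0.2]}

blob = dill.dumps(payload)   # what write_data_to_s3_* stores for non-bytes data
restored = dill.loads(blob)  # what a caller must do after reading it back

assert restored == payload
```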
predict_backend.utils.s3.read_data_from_s3_using_aws_profile(profile_name, bucket_name, key, **kwargs)#

Read data from s3 using a configured AWS profile. Objects may need to be unpickled before use, depending on how they were created: the write_data_to_s3_* functions convert objects to binary data by pickling them with dill, so those objects must be deserialized with dill.loads.

Parameters:
  • profile_name (str) – the name of a profile in the user’s .aws/credentials file from which credentials are read, if available

  • bucket_name (str) – the s3 bucket name

  • key (str) – the path in s3 from which data will be read

  • kwargs – additional kwargs to pass to s3 client (endpoint_url, region, etc)

Returns:

predict_backend.utils.s3.read_data_from_s3_using_connection_store(connection_name, bucket_name, key, store_interface, connection_owner=None, **kwargs)#

Read data from s3 using the connection store to read AWS credentials. Objects may need to be unpickled before use, depending on how they were created: the write_data_to_s3_* functions convert objects to binary data by pickling them with dill, so those objects must be deserialized with dill.loads.

Parameters:
  • connection_name (str) – the name of the connection in the connection store from which credentials are read

  • bucket_name (str) – the s3 bucket name

  • key (str) – the path in s3 from which data will be read

  • store_interface (StoreInterface) – a store interface object from the flow step where this utility is being called

  • connection_owner (Optional[str]) – a user_id of the user that owns the connection

  • kwargs – additional kwargs to pass to s3 client (endpoint_url, region, etc)

Returns:

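Inside a flow step, a call might look like the following sketch; the connection name, bucket, and key here are hypothetical placeholders, and step_store_interface stands in for the StoreInterface the flow step provides:

```python
def read_from_connection(step_store_interface):
    """Illustrative flow-step helper; the names below are placeholders."""
    import dill
    from predict_backend.utils.s3 import read_data_from_s3_using_connection_store

    raw = read_data_from_s3_using_connection_store(
        connection_name="team-s3",     # hypothetical connection name
        bucket_name="example-bucket",  # hypothetical bucket
        key="models/latest.pkl",       # hypothetical key
        store_interface=step_store_interface,
    )
    return dill.loads(raw)  # reverse the dill pickling done on write
```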
predict_backend.utils.s3.write_data_to_s3(data, bucket_name, key, **kwargs)#

Write data to s3, letting boto3 locate credentials automatically (e.g. environment credentials, an assumed role, or an EC2 instance IAM role).

Parameters:
  • data (Any) – the Python object to store in s3; if it is not bytes/bytearray it will be pickled with dill

  • bucket_name (str) – the s3 bucket name

  • key (str) – the path in s3 where data will be written

  • kwargs – additional kwargs to pass to s3 client (endpoint_url, region, etc)

Returns:

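A hedged sketch of a write helper: because write_data_to_s3 dill-pickles any non-bytes object automatically, the caller passes the object as-is and no explicit dumps call is needed:

```python
def save_object(obj, bucket_name, key, **kwargs):
    """Illustrative helper: store any Python object in s3.

    write_data_to_s3 dill-pickles obj when it is not already
    bytes/bytearray, so no serialization step is needed here.
    """
    from predict_backend.utils.s3 import write_data_to_s3

    write_data_to_s3(obj, bucket_name, key, **kwargs)
```

A later reader of the same key would apply dill.loads to the bytes it gets back.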
predict_backend.utils.s3.write_data_to_s3_using_aws_credentials(data, aws_access_key_id, aws_secret_access_key, bucket_name, key, **kwargs)#

Write a Python object to s3 using supplied AWS credentials.

Parameters:
  • data (Any) – the Python object to store in s3; if it is not bytes/bytearray it will be pickled with dill

  • aws_access_key_id (str) – The access key to use when creating the client

  • aws_secret_access_key (str) – The secret key to use when creating the client

  • bucket_name (str) – the s3 bucket name

  • key (str) – the path in s3 where data will be written

  • kwargs – additional kwargs to pass to s3 client (endpoint_url, region, etc)

Returns:

predict_backend.utils.s3.write_data_to_s3_using_aws_profile(data, profile_name, bucket_name, key, **kwargs)#

Write a Python object to s3 using a configured AWS profile.

Parameters:
  • data (Any) – the Python object to store in s3; if it is not bytes/bytearray it will be pickled with dill

  • profile_name (str) – the name of a profile in the user’s .aws/credentials file from which credentials are read, if available

  • bucket_name (str) – the s3 bucket name

  • key (str) – the path in s3 where data will be written

  • kwargs – additional kwargs to pass to s3 client (endpoint_url, region, etc)

Returns:

predict_backend.utils.s3.write_data_to_s3_using_connection_store(data, connection_name, bucket_name, key, store_interface, connection_owner=None, **kwargs)#

Write a Python object to s3 using the connection store to read AWS credentials.

Parameters:
  • data (Any) – the Python object to store in s3; if it is not bytes/bytearray it will be pickled with dill

  • connection_name (str) – the name of the connection in the connection store from which credentials are read

  • bucket_name (str) – the s3 bucket name

  • key (str) – the path in s3 where data will be written

  • store_interface (StoreInterface) – a store interface object from the flow step where this utility is being called

  • connection_owner (Optional[str]) – a user_id of the user that owns the connection

  • kwargs – additional kwargs to pass to s3 client (endpoint_url, region, etc)

Returns: