Reusable Initialization of aiobotocore S3 Client #1111

Closed
all4one-max opened this issue Apr 25, 2024 Discussed in #1110 · 7 comments

@all4one-max

Discussed in #1110

Originally posted by all4one-max April 21, 2024
I'm in the process of migrating our Python package, used by various applications within our organization, from boto3 to aiobotocore. In our existing implementation with boto3, we ensure singleton initialization of the S3 client to optimize resource usage and avoid redundant client creation.

Existing Implementation with boto3:

    from typing import Any, Dict

    import boto3


    class AWSMeta(type):
        _instances: Dict[Any, Any] = {}

        def __call__(cls) -> Any:
            if cls not in cls._instances:
                instance = super().__call__()
                cls._instances[cls] = instance
            return cls._instances[cls]


    class S3ClientSingleton(metaclass=AWSMeta):
        _s3_client = None

        @classmethod
        def get_s3_client(cls) -> Any:
            if not cls._s3_client:
                # Lazy initialization of the S3 client
                cls._s3_client = boto3.client(
                    "s3",
                    aws_access_key_id=AWS_ACCESS_KEY,
                    aws_secret_access_key=AWS_ACCESS_KEY_SECRET,
                    region_name="us-east-1",
                )
            return cls._s3_client

However, transitioning to aiobotocore poses a challenge as it requires using context managers for client creation, and the client doesn't exist outside the context. I aim to encapsulate this singleton logic within our package itself, rather than relying on configuration in the application lifecycle.
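
(For context, the context-managed pattern being described looks roughly like the sketch below in plain aiobotocore; put_object_example is just an illustrative name, and AWS_ACCESS_KEY / AWS_ACCESS_KEY_SECRET are the same placeholder credentials as above.)

    from aiobotocore.session import get_session


    async def put_object_example(bucket: str, key: str, data: bytes) -> None:
        # The client only exists inside this async-with block.
        session = get_session()
        async with session.create_client(
            "s3",
            aws_access_key_id=AWS_ACCESS_KEY,
            aws_secret_access_key=AWS_ACCESS_KEY_SECRET,
            region_name="us-east-1",
        ) as client:
            await client.put_object(Bucket=bucket, Key=key, Body=data)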

I've explored several sources, including the aiobotocore documentation and relevant discussions on GitHub and Stack Overflow, but haven't found a satisfactory solution yet.

Sources referred:

  1. https://github.com/aio-libs/aiobotocore
  2. Feature: Get a long-lived client object #928
  3. https://stackoverflow.com/questions/77095898/reuse-create-client-in-aiobotocore-for-better-performences (unanswered, but more or less the same use case, except that I need to do this inside my package logic itself)

@all4one-max (Author) commented Apr 25, 2024

I tried implementing this with the code below, but it did not work: I was getting RuntimeError: Event loop is closed. I am executing operations such as download and upload in my Celery tasks.

    from typing import Any

    import aioboto3


    class AsyncS3ClientSingleton(metaclass=AWSMeta):
        _async_s3_client = None

        @classmethod
        async def get_s3_client(cls) -> Any:
            if not cls._async_s3_client:
                # Lazy initialization of the S3 client: enter the context
                # manager manually and cache the resulting client.
                session = aioboto3.Session()
                ctx = session.client(
                    "s3",
                    aws_access_key_id=AWS_ACCESS_KEY,
                    aws_secret_access_key=AWS_ACCESS_KEY_SECRET,
                    region_name="us-east-1",
                )
                cls._async_s3_client = await ctx.__aenter__()
            return cls._async_s3_client

But when I create the session and client for every operation, as shown below, it works fine, though it defeats the purpose of reusing the client.

    session = aioboto3.Session()
    async with session.client(
        "s3",
        aws_access_key_id=AWS_ACCESS_KEY,
        aws_secret_access_key=AWS_ACCESS_KEY_SECRET,
        region_name="us-east-1",
    ) as async_s3_client:
        await async_s3_client.upload_file(file_name, bucket, key)

Please help me fix this. I feel it has something to do with the session being tied to an event loop: every time a new Celery task is initiated, a new event loop is created. Only speculation, though.
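
(A minimal sketch of that speculation, assuming each Celery task drives the coroutine with asyncio.run, which creates a fresh event loop and closes it when the coroutine finishes; that assumption is mine, not confirmed above.)

    import asyncio


    async def use_cached_client() -> None:
        # The first call enters the client context and caches the client;
        # later calls reuse it even though the loop it was created on has
        # since been closed.
        client = await AsyncS3ClientSingleton.get_s3_client()
        await client.list_buckets()


    asyncio.run(use_cached_client())  # task 1: creates and caches the client
    asyncio.run(use_cached_client())  # task 2: RuntimeError: Event loop is closed (as reported above)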

@thehesiod (Collaborator)

You need to track the lifetimes of your event loops. The client is only valid for the life of the event loop it was created on.
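
One way to do that (a sketch only, not an official aiobotocore/aioboto3 API; PerLoopS3Client is an illustrative name, and the credential constants are the same placeholders as in the snippets above) is to key the cached client by the running event loop and tear it down before that loop shuts down:

    import asyncio
    from contextlib import AsyncExitStack
    from typing import Any, Dict

    import aioboto3


    class PerLoopS3Client:
        """Caches one S3 client per event loop instead of one per process."""

        _clients: Dict[asyncio.AbstractEventLoop, Any] = {}
        _stacks: Dict[asyncio.AbstractEventLoop, AsyncExitStack] = {}

        @classmethod
        async def get(cls) -> Any:
            loop = asyncio.get_running_loop()
            if loop not in cls._clients:
                stack = AsyncExitStack()
                client = await stack.enter_async_context(
                    aioboto3.Session().client(
                        "s3",
                        aws_access_key_id=AWS_ACCESS_KEY,
                        aws_secret_access_key=AWS_ACCESS_KEY_SECRET,
                        region_name="us-east-1",
                    )
                )
                cls._clients[loop] = client
                cls._stacks[loop] = stack
            return cls._clients[loop]

        @classmethod
        async def close(cls) -> None:
            # Call this before the current loop shuts down (e.g. task teardown).
            loop = asyncio.get_running_loop()
            stack = cls._stacks.pop(loop, None)
            cls._clients.pop(loop, None)
            if stack is not None:
                await stack.aclose()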

@thehesiod (Collaborator)

Sorry it took so long to get back; I somehow missed this.

@hugoncosta

Is tracking an obligation? Especially in runtimes that are short-lived (e.g. AWS Lambda), this shouldn't really be a concern. I also use the singleton pattern (i.e. initialising one client and using it for everything) in Kotlin and other languages, and am looking for an equivalent. Is it possible to achieve this with aiobotocore, or should we just initialize these connections with the async enter/exit pattern?

@github-actions bot

This issue has been marked as stale because it has been inactive for more than 60 days. Please update this pull request or it will be automatically closed in 7 days.

github-actions bot added the Stale label Oct 11, 2024
@thehesiod (Collaborator)

For Lambdas you basically just use a context from your main, so when main exits the client is unwound.
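
(A minimal sketch of that pattern, assuming a handler that drives one asyncio.run per invocation; do_work and the handler name are illustrative, not part of aiobotocore.)

    import asyncio

    from aiobotocore.session import get_session


    async def main(event):
        session = get_session()
        # The client lives only as long as main(); when main() returns,
        # the async-with block closes it on the same event loop.
        async with session.create_client("s3", region_name="us-east-1") as s3:
            return await do_work(s3, event)


    async def do_work(s3, event):
        # Pass the already-open client down to whatever needs it.
        resp = await s3.list_buckets()
        return [bucket["Name"] for bucket in resp["Buckets"]]


    def lambda_handler(event, context):
        # One event loop per invocation, living exactly as long as main().
        return asyncio.run(main(event))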

github-actions bot removed the Stale label Oct 12, 2024
@thehesiod (Collaborator)

Please re-post with a full example. This should be posted in Discussions, as this is not an issue with aiobotocore.
