Data NFT Streaming Automation - Multiple files

Brief:

To mint a Data NFT, the minter must provide a Data Stream URL. This page is a brief on how a user can set up an automation sequence to publish and maintain a dynamic Data Stream: the process below uploads the file assets that make up the Data Stream to AWS S3, and the resulting link can then be used to mint the Data NFT.

Requirements:

  • GitHub account (create one if needed)

  • AWS account (create one if needed)

  • Git and GitHub CLI installed locally

  • Python script that generates output files

Steps:

  1. Initial Git setup and GitHub account login:

    • Install and configure Git on your local machine, and authenticate with GitHub (for example, via the GitHub CLI) using your GitHub account credentials.

  2. Create an AWS S3 bucket:

    • If you don't already have an AWS S3 bucket, create one as part of the setup process.

      • Note: If you would like to use a custom domain name in front of your S3 bucket (e.g., mydomainname.com/my_data_stream.json), please read this guide for specific naming instructions for your new S3 bucket: Task 2: Convert your AWS S3 Bucket into a "website"
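
    • Optional: if you prefer to create the bucket programmatically rather than through the AWS console, a minimal boto3 sketch is shown below. The bucket name and region are placeholder values, not part of the official setup; pick whatever matches your own account (and the naming rules above if you plan to use a custom domain).

```python
# Optional sketch: create the S3 bucket with boto3 instead of the console.
# "my-data-stream-bucket" and "eu-central-1" are placeholder values.
import boto3

s3 = boto3.client("s3", region_name="eu-central-1")
s3.create_bucket(
    Bucket="my-data-stream-bucket",
    CreateBucketConfiguration={"LocationConstraint": "eu-central-1"},
)
# Note: for the us-east-1 region, omit CreateBucketConfiguration entirely.
```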

  3. Create a GitHub repository:

    • Use the provided template to create a GitHub repository.

    • Clone the repository to your local machine for further configuration.

  4. Add secrets to your GitHub repository so it can access AWS:

    • In the repository's "Secrets and variables" section, configure three secrets:

      • S3_BUCKET_NAME: Provide the name of your AWS S3 bucket.

      • S3_KEY_ID: Set the AWS S3 access key ID.

      • S3_ACCESS_KEY: Set the AWS secret access key that pairs with the access key ID above.
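
    • Before adding these values to GitHub, you can optionally sanity-check them locally; the sketch below assumes the three values are exported as environment variables with the same names as the secrets, which is an assumption about your local shell, not part of the template.

```python
# Sketch: verify the credential pair can reach the bucket, assuming the
# three secret values are exported locally as environment variables of
# the same names (S3_BUCKET_NAME, S3_KEY_ID, S3_ACCESS_KEY).
import os
import boto3

s3 = boto3.client(
    "s3",
    aws_access_key_id=os.environ["S3_KEY_ID"],
    aws_secret_access_key=os.environ["S3_ACCESS_KEY"],
)
# A successful head_bucket call confirms the keys and bucket name line up.
s3.head_bucket(Bucket=os.environ["S3_BUCKET_NAME"])
```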

  5. Customize the script:

    • Adjust your Python script according to the provided template.

    • Ensure that the output files generated by the script are saved in the repository's "output" folder (see the sketch below).
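
    • As a rough illustration only (not the template's actual script), a generator could look like the sketch below; the file name my_data_stream.json and its fields are placeholders to be replaced with your own data.

```python
# Minimal sketch of a generator script: write the Data Stream file into
# the repository's "output" folder. File name and fields are placeholders.
import json
from datetime import datetime, timezone
from pathlib import Path

output_dir = Path("output")
output_dir.mkdir(exist_ok=True)

data_stream = {
    "generated_at": datetime.now(timezone.utc).isoformat(),
    "items": [{"id": 1, "value": "example"}],
}

(output_dir / "my_data_stream.json").write_text(json.dumps(data_stream, indent=2))
```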

  6. Push changes to GitHub:

    • Push your script changes to the GitHub repository.

    • Pushing triggers the GitHub Action workflow, which runs the script whenever new code is pushed and on a schedule every day at 23:30 UTC, then publishes the generated output to your S3 bucket (see the sketch below).
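
    • Conceptually, each workflow run ends by syncing the "output" folder to your S3 bucket using the secrets from step 4. The template may do this with the AWS CLI or a dedicated action rather than Python; the sketch below is only an equivalent illustration, and the environment variable names are assumptions.

```python
# Sketch of the upload step the workflow effectively performs: push every
# file in "output" to the S3 bucket so the Data Stream URL stays current.
# Environment variable names mirror the secrets configured in step 4.
import os
from pathlib import Path
import boto3

s3 = boto3.client(
    "s3",
    aws_access_key_id=os.environ["S3_KEY_ID"],
    aws_secret_access_key=os.environ["S3_ACCESS_KEY"],
)
bucket = os.environ["S3_BUCKET_NAME"]

for path in Path("output").iterdir():
    if path.is_file():
        s3.upload_file(str(path), bucket, path.name)
```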

For more details and customization options, refer to the full documentation available at https://github.com/Itheum/template-datastream-aws-s3
