
Task 2: Convert your AWS S3 Bucket into a "website"


Last updated 1 year ago

If you followed the guides on Storage : AWS S3 and used AWS S3 to store your data assets for Data Streams, this section details how you can "slightly" update your process to convert your S3 bucket into a "website". This allows you to connect the domain you now own (as per Task 1: Use a domain name to "sit in front" of your AWS S3 Bucket Public URL) to your AWS S3 bucket.

Step 1) Name your AWS S3 Bucket the exact same as your full domain name

Note that in our previous task, we procured the domain alice-datanft-bucket.com, which we intend to use to load content from our S3 bucket.

We recommend that you also use a subdomain to point to your S3 bucket, as this gives you a lot more flexibility to grow the content on your domain in future. Based on this, let's pick a subdomain called dataassets. The full domain dataassets.alice-datanft-bucket.com will now be set up to load content from your S3 bucket.

Let's now proceed with the naming of the AWS S3 bucket.

In the steps mentioned in Storage : AWS S3 where it says "Create an AWS S3 bucket", make sure to use your full domain, exactly as-is, as the bucket name. So, name your S3 bucket dataassets.alice-datanft-bucket.com.

Note that this instruction only concerns the name of the S3 bucket; you will still need to follow the other S3 bucket creation instructions mentioned here: https://github.com/Itheum/template-datastream-aws-s3#b-creating-an-aws-s3-bucket
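As a quick sanity check before creating the bucket, the naming rule above can be sketched in Python. The helper below is purely illustrative (it is not part of any Itheum or AWS tooling) and encodes only the basic S3 bucket naming constraints: 3-63 characters, lowercase letters, digits, dots, and hyphens, starting and ending with a letter or digit.

```python
import re

def bucket_name_for_domain(subdomain: str, domain: str) -> str:
    """Compose the S3 bucket name, which must exactly match the full domain."""
    name = f"{subdomain}.{domain}"
    # Basic S3 bucket naming rules: 3-63 chars; lowercase letters, digits,
    # dots, and hyphens; must start and end with a letter or digit.
    if not 3 <= len(name) <= 63:
        raise ValueError("bucket name must be 3-63 characters")
    if not re.fullmatch(r"[a-z0-9][a-z0-9.-]*[a-z0-9]", name):
        raise ValueError("bucket name contains invalid characters")
    return name

print(bucket_name_for_domain("dataassets", "alice-datanft-bucket.com"))
# → dataassets.alice-datanft-bucket.com
```

If the helper raises, your intended domain cannot double as a bucket name and you should pick a different subdomain.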

Step 2) Enable "Static website hosting" for your bucket

Once you have created your S3 bucket, the next step is to configure it for website hosting. To do this, navigate to the "Properties" tab of your bucket and click "Static website hosting". Select the "Host a static website" option and enter an "index document" (a dummy index.html will do; this file does not need to exist). Click the "Save changes" button to save your settings.

Step 3) Add appropriate Bucket policy

Finally, navigate to the "Permissions" tab of your bucket, scroll down to the "Bucket Policy" section, click "Edit", and enter the following.

Make sure to change the bucket name in Resource to match your own bucket name.

{
	"Version": "2012-10-17",
	"Statement": [
		{
			"Sid": "PublicReadForGetBucketObjects",
			"Effect": "Allow",
			"Principal": "*",
			"Action": "s3:GetObject",
			"Resource": "arn:aws:s3:::dataassets.alice-datanft-bucket.com/*"
		}
	]
}

Click the “Save changes” button to save your settings.
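If you prefer to script your bucket setup, the same public-read policy can be generated for any bucket name. This is a minimal Python sketch under our own naming; the public_read_policy helper is an illustration, not an AWS or Itheum API.

```python
import json

def public_read_policy(bucket_name: str) -> str:
    """Return the public-read bucket policy JSON for the given bucket name."""
    policy = {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "PublicReadForGetBucketObjects",
                "Effect": "Allow",
                "Principal": "*",
                "Action": "s3:GetObject",
                # The trailing /* grants read access to every object in the bucket
                "Resource": f"arn:aws:s3:::{bucket_name}/*",
            }
        ],
    }
    return json.dumps(policy, indent=2)

print(public_read_policy("dataassets.alice-datanft-bucket.com"))
```

The output can be pasted straight into the "Bucket Policy" editor, which avoids typos in the Resource ARN.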

Step 4) Upload your data assets to this new bucket

As per the instructions in the guides on Storage : AWS S3, you can now upload the data assets you want to use as Data NFT Data Streams into this new bucket. Again, we recommend that you create a new "folder" inside the S3 bucket and place the data assets in it; this gives you a lot of flexibility if you intend to mint multiple Data NFTs and host them all in a single AWS S3 bucket. This is optional, though, and you can skip it if you prefer.

For example, we created a new folder called file_storage inside our dataassets.alice-datanft-bucket.com bucket and uploaded two data assets to it. These files can be used for two separate Data Streams if needed, which is the benefit of having the separate folder as described. The final Data Stream URLs for both Data Streams will eventually be:

  1. https://dataassets.alice-datanft-bucket.com/file_storage/stream_1.json

  2. https://dataassets.alice-datanft-bucket.com/file_storage/stream_2.svg
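The URL pattern above is simple concatenation of the full domain, the folder, and the file name. A hypothetical Python helper makes the pattern explicit:

```python
def data_stream_url(full_domain: str, folder: str, filename: str) -> str:
    """Build the final HTTPS Data Stream URL served via the custom domain."""
    return f"https://{full_domain}/{folder}/{filename}"

# The two assets uploaded to the file_storage folder in this guide:
for name in ["stream_1.json", "stream_2.svg"]:
    print(data_stream_url("dataassets.alice-datanft-bucket.com", "file_storage", name))
```

Note that these URLs will only resolve once the Cloudflare setup in the next task is complete.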

We are now ready to move on to the final step to complete this process...
