To see a small sample collection of my results on the OpenSea test instance, click here!
To interact with the smart contract directly, and mint your own NFT (supply is limited) click here.
So how does a developer generate thousands of unique pieces of art programmatically? First, it’s important to understand that most NFTs are not generated on-chain at all. Solidity gives us a means to ‘set the rules’ of our smart contract on-chain, but scripting tools like Python are required to generate the art and interact with our contract.
As for the art itself, we will need to create layers of interchangeable characteristics as PNGs; as long as we are careful in our design, we can generate and merge these layers into a final, random character that we can deploy on-chain.
First things first, I needed a basis for my design. Since I am by no means an artist, I chose to create some very simple pixelated character art, similar to the original CryptoPunks. To generate the PNG files I used Pixlr, a free tool you can find here; to achieve the desired effect I set my image size to 25×25.
Now, you’ve likely noticed that most NFT collections share a base body or shape, typically including a color; this is then followed by a variety of characteristics ranging from appendages to accessories: arms, clothing, jewelry, and weapons. In my case I started with several required layers: Body, Eyes, Mouth, and Hair. I followed these with optional characteristics: Facial Hair, Jewelry, Hat, Glasses, and Mask, in that order. Each layer is represented by a folder, and each folder contains a range of characteristics as PNG files.
I chose to number my images, but since we’ll be leveraging a Map object you can just as easily use a named file for each characteristic — any key-value pair will do.
We start with a base layer, our ‘Body’ type; this is the base upon which everything else is added, as progressive layers.
Here you can see our second layer, our ‘Eyes’, which can be merged or pasted atop our base to create the image you see below.
With regard to the selection process, we will leverage a random number generator and a ‘map’ of characteristics that aligns with our PNG files. If, for example, our random number generator gives us a 3, we will programmatically open the file 3.png in the Hair folder; our path will be something like ../Assets/Hair/3.png. The map depicted below provides both a numbered key and a string name value, our key-value pair; this will come into play later in the process:
Interactions with these files will come courtesy of the pathlib and os Python libraries.
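To make the key-value mapping and path handling concrete, here is a minimal sketch. The folder layout, the trait names, and the trait_path helper are my own illustrative assumptions, not the article’s exact code:

```python
from pathlib import Path

# Hypothetical key-value map for one layer; keys match the PNG file names.
HAIR = {1: "buzzcut", 2: "mohawk", 3: "long"}

def trait_path(layer: str, key: int, root: str = "Assets") -> Path:
    """Build the path to one characteristic, e.g. Assets/Hair/3.png."""
    return Path(root) / layer / f"{key}.png"
```

So a roll of 3 for the Hair layer resolves to Assets/Hair/3.png, and the map tells us that key 3 is the “long” hair trait.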
Randomization
Randomization is an interesting topic in and of itself. Because computers are deterministic by nature, they are greatly limited when it comes to generating truly random numbers; for the purposes of this project, pseudo-random numbers generated via Python’s NumPy library will suffice.
Please refer to ‘Randomization, A Deep Dive’ at the end of the article if you are interested in leveraging the Chainlink VRF (Verifiable Random Function) to generate a cryptographically verifiable random number.
The following snippet returns a random number in the desired range. Here our SEED value is optional; most randomization strategies allow for the integration of some form of seed, typically a timestamp or the OS current time. This adds variety but does not guarantee randomness.
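A minimal sketch of such a generator using NumPy; the function name and the optional-seed interface are assumptions for illustration:

```python
import numpy as np

def random_trait(n_options: int, seed=None) -> int:
    """Return a pseudo-random trait number in 1..n_options.
    The seed is optional; reusing a seed reproduces the same result."""
    rng = np.random.default_rng(seed)
    return int(rng.integers(1, n_options + 1))  # upper bound is exclusive
```

Passing the same seed twice yields the same trait, which is useful for reproducible test runs.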
Alternatively, you can return multiple numbers, or return a single (large) number and use some math to divide it into many smaller random numbers. For example, a very large number randomNumber = 1234567899999 can be divided into smaller chunks for consumption by performing the following (using integer division):
randomNumber
randomNumber // 10
randomNumber // 10000
All provide viable candidates; we can take this a step further and use the same method to generate multiple SEEDs, for even more randomization.
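The chunking idea can be sketched with integer division and modulo; the chunk size and helper name are my own illustrative choices:

```python
def split_number(big: int, parts: int = 3, digits: int = 3) -> list:
    """Peel `parts` chunks of `digits` digits each off a large random number."""
    chunks = []
    for _ in range(parts):
        chunks.append(big % 10 ** digits)  # take the lowest `digits` digits
        big //= 10 ** digits               # drop them and continue
    return chunks
```

For example, split_number(1234567899999) yields the three chunks 999, 899, and 567, each usable as an independent small random number or seed.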
Optional & Required Characteristics
Now, when it comes to optional characteristics, I expanded the range to include 0, meaning that if my random number was 0, I would bypass the characteristic altogether.
For required characteristics (body, eyes, mouth) I excluded 0 from the range, limiting it to 1–N options.
To reduce the chance that any one group of optional characteristics is generated, we can simply extend the range well beyond our valid options. In the case below we have 5 options but generate a number between 0 and 20, meaning there is a ~25% chance we will get a Mask and a ~75% chance we will not. The following is my Mask characteristic, adjusted for increased rarity:
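A sketch of that inflated-range trick; any roll outside 1..n_options counts as “no trait” (returned here as 0). The names are illustrative:

```python
import numpy as np

def pick_optional(n_options: int, inflated_max: int, seed=None) -> int:
    """Roll 0..inflated_max; only 1..n_options select a trait, otherwise 0 (skip).
    With 5 options and inflated_max=20, a trait appears roughly 5/21 of the time."""
    rng = np.random.default_rng(seed)
    roll = int(rng.integers(0, inflated_max + 1))
    return roll if 1 <= roll <= n_options else 0
```

Widening inflated_max further makes the whole group rarer without touching the individual PNG files.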
Modulo (%) is another concept that may be helpful here, as it reduces any large number to a desired circular range:
Ex: 12345678 % 8 yields a value in the range 0–7 (here, 6)
If we have leveraged modulo (%) and we are working with a required characteristic, we can add 1 to the result to shift the range right:
Optional characteristics: randomNumber % 8 = range from 0–7
Required characteristics: (randomNumber % 8) + 1 = range from 1–8
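Both cases can be captured in one small helper; the names are mine:

```python
def to_range(n: int, size: int, required: bool = False) -> int:
    """Map any large number into 0..size-1 (optional trait)
    or 1..size (required trait) via modulo and an optional shift."""
    return (n % size) + (1 if required else 0)
```

For example, to_range(12345678, 8) gives 6, while to_range(12345678, 8, required=True) gives 7.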
Image — Background
You may be wondering about the background: don’t we start there? Well, yes and no. Markets like OpenSea frame, center, and size our images to some degree to provide an appealing and versatile interface. You can see the result of a ‘background included’ image below; we want to avoid this:
In fact, OpenSea expects a PNG of our character, with the background provided as a hex color code in our metadata; this means the background isn’t necessarily part of the image at all, as it is rendered by the browser or application. That said, some projects do establish backgrounds as a characteristic; at the end of the day, this is a design decision for each project. More on metadata to come…
Image-Merge
Briefly, I leverage the Python library PIL (Pillow) for the image-merge capability we require; there are a great number of ways this can be achieved, with a great many tools and languages.
One challenge you may face, at least with a digitized 25×25 pixelated image, is retaining clarity and sharpness when you resize for the OpenSea market. OpenSea recommends a 350×350 image, and uploading without accounting for this will blur your image considerably; in my case, resampling with NEAREST allowed me to retain the proper resolution while resizing.
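A minimal Pillow sketch of the merge-and-resize steps; I assume the base layer comes first in the list, and NEAREST resampling keeps the pixel edges crisp on upscale:

```python
from PIL import Image

def merge_layers(paths):
    """Paste each transparent PNG layer atop the base, in order."""
    base = Image.open(paths[0]).convert("RGBA")
    for p in paths[1:]:
        layer = Image.open(p).convert("RGBA")
        base.paste(layer, (0, 0), layer)  # third arg uses the layer's alpha as mask
    return base

def upscale(img, size=(350, 350)):
    """Resize for the OpenSea market without blurring the pixel art."""
    return img.resize(size, Image.NEAREST)
```

Using the layer’s own alpha channel as the paste mask is what keeps transparent regions from overwriting the layers beneath.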
Earlier I mentioned metadata, let me explain: this may come as a surprise but NFT image data isn’t generally stored on-chain as it is prohibitively expensive to do so. To solve this we provide a URI or link for each ‘token’/NFT that points to some form of storage. This storage hosts our metadata as a .json file; furthermore, this metadata includes a second URI or ‘link’ to our image.
Essentially, our JSON file tells each marketplace everything it needs to know about our NFT, including where to find the image. This approach aligns with the ERC 721 standard, which allows developers and marketplaces to agree beforehand on the required functionality, for example the transfer of ownership.
In accordance with these standards, OpenSea offers additional instructions regarding metadata expectations and functionality. To make sure you are providing what is required on the Solidity side, you can import a library like OpenZeppelin’s ERC 721 implementation, which allows us to inherit all the required functionality, with the added ability to override functions as need be. These functions will appear in Etherscan as if you had implemented them yourself. Your Solidity file should include the following:
import "@openzeppelin/contracts/token/ERC721/ERC721.sol";
OpenSea expands upon this by providing some additional UI functionality, including ‘boosts’ and stats. You can find the OpenSea metadata documentation here.
The following is a function to generate a template JSON object for our metadata; this sets generic values as well as the tokenId (NFT number), which in my case is determined by iteration:
Once my NFT image generation is complete, I update the image value of my metadata object and append the attributes section with details of my randomized characteristics, which we could not know prior to this step.
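A sketch of such a template-and-update flow; the generic values and the add_attribute helper are illustrative, though the name/description/image/attributes keys follow OpenSea’s documented metadata schema:

```python
def metadata_template(token_id: int, name_prefix: str = "MyNFT") -> dict:
    """Generic metadata; image and attributes are filled in after generation."""
    return {
        "name": f"{name_prefix} #{token_id}",
        "description": "A randomly generated character.",
        "image": "",        # later set to ipfs://<hash> of the uploaded PNG
        "attributes": [],   # appended once the random traits are known
    }

def add_attribute(meta: dict, trait_type: str, value: str) -> dict:
    """Record one randomized characteristic in OpenSea's attribute format."""
    meta["attributes"].append({"trait_type": trait_type, "value": value})
    return meta
```

After generation, you would set meta["image"] to the uploaded PNG’s URI and call add_attribute once per rolled characteristic.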
You are likely thinking: doesn’t this mean our decentralized NFT is centralized? Well, exactly how decentralized your project is will depend entirely on your use case. You will find that many NFT projects point to some centralized file system; this is not recommended but is not strictly enforced. In fact, some valuable features may require a more centralized approach.
In my case, I programmatically uploaded my metadata and image to IPFS immediately after creation. IPFS, or the Interplanetary File System, is the decentralized internet; it provides an immutable, decentralized means of storing our data.
You will find more on IPFS, including instructions in ‘IPFS, A Deep Dive’, at the end of this article.
The details of our smart contract are incredibly important; for the most part, once a contract is on-chain it is immutable, meaning we want to get it right the first time.
Being that this project was a POC for me to explore the tools and techniques used in NFT generation, I wanted to explore some more in-depth concepts beyond simply getting an NFT on the OpenSea market.
Delayed Reveal
The first topic I explored was how to implement a delayed reveal; you may have seen this in other projects like Adam Bomb Squad, where ‘minting’ simply left you with a ‘pending’ image and blank metadata. This seems to be a great way to encourage trading on factors other than rarity.
The problem here lies with decentralization: unless you want a centralized application that can receive commands from your smart contract to generate the NFT, upload it to IPFS, and ‘mint’ a new token, you have to have all of this done pre-deployment.
To solve for this I did not delay the generation of the NFT image and metadata; rather, I delayed the reveal by initially supplying a URI pointing to generic metadata and the image you see below. This pending image was uploaded to IPFS manually beforehand, and this single metadata/image pair is used for ALL tokens I wish to have a delayed reveal.
So how then do we store the final URI?
A contract typically uses mappings to track all kinds of things back to our tokenId. This means we have a way of storing NFT details; if we set this mapping rather than setting the final tokenURI on creation, we retain the ability to ‘set’ or ‘resolve’ the URI from the mapping at a later date.
Notice our mapping tokenIdToURI below; this will store the URI values for our NFT tokens.
Our constructor expects a value loadingURI, which is our pre-set pending image and metadata URI. We supply this URI when we deploy our contract, which triggers the constructor.
You will notice the mintPending function accepts a URI, but rather than setting the tokenURI (_setTokenURI), it simply stores this in the token mapping, associated with the tokenId.
User Mint
I also wanted to make sure users (non-contract-owners) were able to mint new NFTs; since I didn’t want a centralized application triggering the generation and upload of metadata, I needed a way to do this beforehand.
I required a setTokenMapping() function; this would skip token creation altogether. To keep track of iteration I created a second Counter, mappedCounter. Now I have a way to track the total number of tokens, as well as the total number of potential tokens for mint (mapped).
At the time of mint, a new token would be created, and the URI would be set from our mapping; this is similar to our delayed reveal, the difference being that the actual minting is also delayed here.
As with any minting, the user address that triggers the mint takes ownership of the previously generated but newly minted NFT.
To interact with and manage our contract we create Python scripts; two examples follow. First, we have a function to return our tokenCount (minted) as well as our mappedCount (mapped).
Now, with regard to testing, the ecosystem provides us a great many options. In my case, I used OpenSea’s test instance and the Rinkeby (Ethereum) testnet. This allows us to use fake accounts with fake (free) ETH and view our NFT on the test version of the OpenSea marketplace.
You will want to install the MetaMask browser extension, where you will see several testnet options; I used Rinkeby. To fund your test account you will want to visit a ‘faucet’ (basically a source of fake Ethereum for test usage). You can find one such faucet here; follow the instructions on the site.
The details of your implementation depend on the tools or libraries you decide to use; most provide the means to interact with these testnets.
One point I will stress is where to place your private key: since this provides access to both your test funds and real funds, you will want to keep it OUT of your code, especially if the code is being uploaded to GitHub.
You can save this in a .env file that is cited in your .gitignore file; better yet, you can export the key to a PRIVATE_KEY environment variable and repeat the process to switch from account to account, and from testnet to mainnet, like so:
$ export PRIVATE_KEY='0x………………….'
$ echo $PRIVATE_KEY  # display the current key
The InterPlanetary File System (IPFS) is a protocol and peer-to-peer network for storing and sharing data in a distributed file system. IPFS uses content-addressing to uniquely identify each file in a global namespace connecting all computing devices — Wikipedia
To leverage IPFS we need two things: (1) the IPFS command-line tool, and (2) a pinning service that allows for the retention of our data across multiple nodes.
You will have to install IPFS, which you can find here. Alternatively, you can install IPFS Desktop; the two can be used together or separately.
I used the Pinata service for remote-pinning capability. You can find instructions here; essentially, we need to sign up and generate an API key and secret in Pinata, which will be required by our local IPFS service for the two to interact.
For convenience, you may consider installing the IPFS Companion extension in Chrome; this will allow you to resolve your IPFS URIs for viewing (both your image and your JSON metadata file).
Don’t forget to initialize your IPFS instance by running ipfs init as per the instructions; otherwise you may see a ‘no IPFS repo’ error.
Next, you will need to start the IPFS daemon by typing ipfs daemon; by default this will run on http://localhost:5001. Alternatively, you can start IPFS by launching IPFS Desktop prior to use; if one is running, the other will not start and will not be required (it’s already on!).
With a local instance of IPFS running, whether via IPFS Desktop or ipfs daemon, you will be able to run the following code; this will both upload our files to IPFS and pin them, via Pinata. This is best practice, but note that OpenSea will cache these images, and it even offers a freeze method to ensure your data is stored and remains available should you choose to leverage another storage method.
IPFS Hashing & Preloaded Folders
Without too much detail here: each IPFS file is associated with a hash rather than a traditional ‘address’ or folder path, and this hash is generated from the data itself; any change to the data changes the hash.
If you are especially astute, you may notice the limitation this hash creates: you can’t pre-upload your metadata and image, because the image and metadata themselves determine the hash. This is NOT like a typical folder to which we can add files; in this case, adding a file or appending the PNG’s URI would change the metadata, therefore change the hash, and render our ipfs://<hash> useless.
There are some ways around this: you will want to preload your PNGs and use consistent file naming. I encourage you to explore this capability further; you will want to research the setBaseURI function in Solidity, which allows you to set the URI in two parts, baseURI + tokenURI. This way you can establish a baseURI and simply use the setTokenURI function to append the file name, using your expected file-naming scheme (1.png, 2.png, etc.).
“You can also include filenames inside the path component of an IPFS URI. For example, if you’ve stored your token’s metadata on IPFS wrapped in a directory, your URI might be: ipfs://bafybeibnsoufr2renqzsh347nrx54wcubt5lgkeivez63xvivplfwhtpym/metadata.json”
https://docs.ipfs.io/how-to/best-practices-for-nft-data/#types-of-ipfs-links-and-when-to-use-them
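The two-part URI scheme amounts to simple string assembly; the helper name and the .json extension choice here are illustrative:

```python
def token_uri(base_uri: str, token_id: int, ext: str = "json") -> str:
    """Combine a fixed baseURI with a predictable per-token file name,
    e.g. ipfs://<directory-hash>/1.json."""
    return f"{base_uri.rstrip('/')}/{token_id}.{ext}"
```

Because the file names are predictable (1.json, 2.json, …), the contract only needs the directory hash plus the token number to resolve any token’s metadata.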
In my case, I simply generated the image and the metadata dynamically and uploaded (and pinned) them to IPFS during each iteration. The drawback here is that each NFT requires a full iteration, with an upload to IPFS and an Ethereum transaction; again, there are pros and cons to any strategy, and your use case should determine what approach works best for you.
Earlier I mentioned the Chainlink VRF, or Verifiable Random Function. The documentation can be found HERE, but I do want to expand on this to some degree.
As mentioned earlier, programming languages are deterministic; our Python random number generation, even with a proper seed, will start to exhibit a pattern when run many times over. This is also true on-chain, and any block data we might leverage is not only deterministic in nature but entirely visible to the public, which is a potential point of exploit. This may not seem critical, but consider the implications for an on-chain lottery. This is where the Chainlink VRF comes in.
“Chainlink VRF generates a random number and cryptographic proof of how that number was determined. The proof is published and verified on-chain before it can be used by any consuming applications. This process ensures that the results cannot be tampered with nor manipulated by anyone, including oracle operators, miners, users and even smart contract developers.” — Chainlink docs
I won’t go into too much detail here; the main takeaway I hope to share is that there are intelligent ways to solve for randomness in Solidity, on-chain. A few things you should consider:
- To leverage this feature we will have to fund our smart contract with LINK; this allows our contract to reach out to the Chainlink VRF and pay the fee for doing so. To fund the contract we first need to add LINK tokens to our own account in MetaMask; as soon as we have deployed our contract, we can then send LINK to the new contract address (our deployment script should return this address).
- The VRF obviously requires some computation and a reply, meaning we need to account for the asynchronous nature of this call. Our minting function will need to call the VRF’s requestRandomness() function, which in turn calls back to a second function, fulfillRandomness(); the minting actually takes place there, with a delay.
A note: the VRF does also provide a means to generate many random numbers — I encourage you to explore this further if it fits your desired outcome.
- Python for the deployment of, and interaction with our smart contract; image & metadata generation, and programmatic upload/pin to IPFS.
- Solidity with the OpenZeppelin ERC 721 library for NFTs.
- Brownie (Python-based development framework).
- IPFS Companion browser extension to view IPFS files.
- IPFS command line and/or IPFS Desktop.
- MetaMask Browser extension to provide us with public & private keys for both test and main net.
- OpenSea Testnet & Mainnet.
- Remix Ethereum IDE for contract validation.
- Etherscan — contract approval and testing.
For a full video walkthrough of a chainlink VRF NFT project, click here.
Thanks for reading.